Researchers have long been interested in discovering the ways that human brains represent thoughts through a complex interplay of electrical signals. Recent improvements in brain recording and statistical methods have given researchers unprecedented insight into the physical processes underlying thoughts. For example, researchers have begun to show that it is possible to use brain recordings to reconstruct aspects of an image or movie clip someone is viewing, a sound someone is hearing or even the text someone is reading.
A new study by University of Pennsylvania and Thomas Jefferson University scientists brings this work one step closer to actual mind reading by using brain recordings to infer the way people organize associations between words in their memories.
The research was conducted by professor Michael J. Kahana of the Department of Psychology in Penn’s School of Arts and Sciences and graduate student Jeremy R. Manning, then a member of the Neuroscience Graduate Group in Penn’s Perelman School of Medicine. They collaborated with other members of Kahana’s laboratory, as well as with research faculty at Thomas Jefferson University Hospital.
Their study was published in The Journal of Neuroscience.
The brain recordings necessary for the study were made possible by the fact that the participants were epilepsy patients who volunteered while awaiting brain surgery. These participants had tiny electrodes implanted in their brains, which allowed the researchers to precisely record electrical signals that could not have been measured from outside the skull. While recording these signals, the researchers asked the participants to study lists of 15 randomly chosen words and, a minute later, to repeat the words back in whatever order they came to mind.
The researchers examined the brain recordings as the participants studied each word to home in on signals in the participants’ brains that reflected the meanings of the words. About a second before the participants recalled each word, these same “meaning signals,” identified during the study phase, were spontaneously reactivated in the participants’ brains.
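To make the idea of reactivation concrete, the sketch below (a simplified illustration in Python, not the study’s actual analysis pipeline) correlates the neural pattern recorded just before each recall with the pattern recorded when that same word was studied. The variable names and data shapes are assumptions for illustration only.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical inputs (names and shapes are illustrative assumptions):
#   study_patterns:     (n_words, n_features) neural feature pattern recorded
#                       while each list word was studied
#   prerecall_patterns: (n_recalls, n_features) pattern recorded ~1 s before
#                       each word was spoken during recall
#   recalled_idx:       study-list index of each recalled word

def reactivation_scores(study_patterns, prerecall_patterns, recalled_idx):
    """Correlate each pre-recall pattern with the study-phase pattern of the
    word about to be recalled -- a simple index of how strongly that word's
    'meaning signal' was reactivated."""
    scores = []
    for pattern, idx in zip(prerecall_patterns, recalled_idx):
        r, _ = pearsonr(pattern, study_patterns[idx])
        scores.append(r)
    return np.array(scores)
```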
Because the participants were not seeing, hearing or speaking any words at the times these patterns were reactivated, the researchers could be sure they were observing the neural signatures of the participants’ self-generated, internal thoughts.
Critically, differences across participants in the way these meaning signals were reactivated predicted the order in which the participants would recall the words. In particular, the degree to which the meaning signals were reactivated before recalling each word reflected each participant’s tendency to group similar words (like “duck” and “goose”) together in their recall sequence. Since the participants were instructed to say the words in the order they came to mind, the specific sequence of recalls a participant makes provides insights into how the words were organized in that participant’s memory.
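One common way to quantify that tendency is a semantic clustering score: for each transition in a participant’s recall sequence, rank the semantic similarity of the word actually recalled next against the similarities of the other words that could still have been recalled. The sketch below is a simplified version of this kind of analysis; the similarity matrix (e.g., cosine similarity between word vectors) and the variable names are assumptions for illustration, not the study’s exact method.

```python
import numpy as np
from scipy.stats import spearmanr

def semantic_clustering_score(recall_order, similarity):
    """Mean percentile rank of each recall transition's semantic similarity
    among the transitions that were still possible at that point.

    recall_order: study-list indices in the order the words were recalled
    similarity:   (n_words, n_words) semantic similarity between list words
                  (e.g., cosine similarity of word vectors -- an assumption)
    """
    remaining = set(range(similarity.shape[0])) - {recall_order[0]}
    ranks = []
    for prev, curr in zip(recall_order, recall_order[1:]):
        others = [similarity[prev, w] for w in remaining if w != curr]
        if others:
            # Fraction of still-available words that were less similar to the
            # previous recall than the word actually recalled next
            ranks.append(np.mean(similarity[prev, curr] > np.array(others)))
        remaining.discard(curr)
    return float(np.mean(ranks)) if ranks else np.nan

# Across participants, one could then ask whether stronger pre-recall
# reactivation goes with stronger semantic clustering, e.g.:
#   rho, p = spearmanr(mean_reactivation_per_subject,
#                      mean_clustering_per_subject)
```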
In an earlier study, Manning and Kahana used a similar technique to predict participants’ tendencies to organize learned information according to the time at which it was learned. Their new study adds to this research by elucidating the neural signature of organizing learned information by meaning.
“Each person’s brain patterns form a sort of ‘neural fingerprint’ that can be used to read out the ways they organize their memories through associations between words,” Manning said.
The techniques the researchers developed in this study could also be adapted to analyze many different ways of mentally organizing studied information.
“In addition to looking at memories organized by time, as in our previous study, or by meaning, as in our current study, one could use our technique to identify neural signatures of how individuals organize learned information according to appearance, size, texture, sound, taste, location or any other measurable property,” Manning said.
Such studies would paint a more complete picture of a fundamental aspect of human behavior.
“Spontaneous verbal recall is a form of memory that is both pervasive in our lives and unique to the human species,” Kahana said. “Yet, this aspect of human memory is the least well understood in terms of brain mechanisms. Our data show a direct correspondence between patterns of brain activity and the meanings of individual words and show how this neural representation of meaning predicts the way in which one item cues another during spontaneous recall.
“Given the critical role of language in human thought and communication, identifying a neural representation that reflects the meanings of words as they are spontaneously recalled brings us one step closer to the elusive goal of mapping thoughts in the human brain.”
In addition to Kahana and Manning, the research was conducted by Emily Rosenberg, then a research assistant in Kahana’s laboratory, as well as Michael Sperling, director of the Jefferson Epilepsy Center, and Ashwini Sharan, director of functional neurosurgery at Jefferson Hospital. Manning is currently a postdoctoral research associate at the Princeton Neuroscience Institute and the Department of Computer Science at Princeton University.
The research was supported by the National Institute of Mental Health.