UC Berkeley neuroscientists have tracked the progress of a thought through the brain, showing clearly how the prefrontal cortex at the front of the brain coordinates activity to help us act in response to a perception.
Recording the electrical activity of neurons directly from the surface of the brain, the scientists found that for a simple task, such as repeating a word presented visually or aurally, the visual and auditory cortices reacted first to perceive the word. The prefrontal cortex then kicked in to interpret the meaning, followed by activation of the motor cortex in preparation for a response. During the half-second between stimulus and response, the prefrontal cortex remained active to coordinate all the other brain areas.
For a particularly hard task, such as determining the antonym of a word, the brain required several seconds to respond, during which the prefrontal cortex recruited other areas of the brain, presumably including memory networks not visible to the electrodes. Only then did the prefrontal cortex hand off to the motor cortex to generate a spoken response. The quicker the brain's handoff, the faster people responded.
Interestingly, the researchers found that the brain began to prepare the motor areas to respond very early, during initial stimulus presentation, suggesting that we get ready to respond even before we know what the response will be.
“This might explain why people sometimes say things before they think,” said Avgusta Shestyuk, a senior researcher in UC Berkeley’s Helen Wills Neuroscience Institute and lead author of a paper reporting the results in the current issue of Nature Human Behaviour.
The findings, including the key role played by the prefrontal cortex in coordinating all the activated regions of the brain, are in line with what neuroscientists have pieced together over the past decades from studies in monkeys and humans.
“These very selective studies have found that the frontal cortex is the orchestrator, linking things together for a final output,” said co-author Robert Knight, a UC Berkeley professor of psychology and neuroscience and a professor of neurology and neurosurgery at UCSF. “Here we have eight different experiments, some where the patients have to talk and others where they have to push a button, where some are visual and others auditory, and all found a universal signature of activity centered in the prefrontal lobe that links perception and action. It’s the glue of cognition.”
While other neuroscientists have used functional magnetic resonance imaging (fMRI) and electroencephalography (EEG) to record activity in the thinking brain, the UC Berkeley scientists employed a much more precise technique, electrocorticography (ECoG), which records from several hundred electrodes placed on the brain surface and detects activity in the thin outer region, the cortex, where thinking occurs. ECoG provides better time resolution than fMRI and better spatial resolution than EEG, but requires access to epilepsy patients undergoing highly invasive surgery involving opening the skull to pinpoint the location of seizures.
Clues from epilepsy patients
The current study involved 16 epilepsy patients who agreed to participate in experiments while undergoing epilepsy surgery at UC San Francisco and California Pacific Medical Center in San Francisco, Stanford University in Palo Alto and Johns Hopkins University in Baltimore.
“This is the first step in looking at how people think and how people come up with different decisions; how people basically behave,” said Shestyuk, who recorded from the first patient 10 years ago. “We are trying to look at that little window of time between when things happen in the environment and us behaving in response to it.”
Once the electrodes were placed on the brains of each patient, Shestyuk and her colleagues conducted a series of eight tasks that included visual and auditory stimuli. The tasks ranged from simple, such as repeating a word or identifying the gender of a face or a voice, to complex, such as determining a facial emotion, uttering the antonym of a word or assessing whether an adjective describes the patient’s personality.
During these tasks, the brain showed four types of neural activity. First, sensory areas of the auditory and visual cortices activated to process the audible or visual cue. Next, areas primarily in the sensory and prefrontal cortices activated to extract the meaning of the stimulus. Throughout, the prefrontal cortex remained active, coordinating input from the different areas of the brain. Finally, the prefrontal cortex stood down as the motor cortex activated to generate a spoken response or an action, such as pushing a button.
“This persistent activity, primarily seen in the prefrontal cortex, is a multitasking activity,” Shestyuk said. “fMRI studies often find that when a task gets progressively harder, we see more activity in the brain, and the prefrontal cortex in particular. Here, we are able to see that this is not because the neurons are working really, really hard and firing all the time, but rather, more areas of the cortex are getting recruited.”
In sum, Knight said, “Sustained activity in the prefrontal cortex is what guides a perception into an action.”
Other co-authors of the paper are first author Matar Haller, who obtained her Ph.D. in neuroscience from UC Berkeley and is now a researcher at SparkBeyond in Israel, former UC Berkeley undergraduate John Case, neurologist and epileptologist Nathan Crone of Johns Hopkins University, neurosurgeon Eddie Chang of UCSF, epileptologists David King-Stephens, Kenneth Laxer and Peter Weber of CPMC and epileptologist Josef Parvizi of Stanford.
The work was supported by the National Science Foundation, National Institute of Mental Health (F32MH75317) and National Institute of Neurological Disorders and Stroke (R37NS21135).