Investigators at Stanford University have identified a small group of nerve cells in a specific brain region of rats whose signaling activity, or lack of it, explains the vast bulk of differences in risk-taking preferences among the animals.
That activity not only predicts but effectively determines whether an animal decides to take a chance or stick with the safe choice.
The findings expand on noninvasive research conducted previously in humans. “Humans and rats have similar brain structures involved,” said Karl Deisseroth, MD, PhD, professor of bioengineering and of psychiatry and behavioral sciences. “And we found that a drug known to increase risk preference in people had the same effect on the rats. So every indication is that these findings are relevant to humans.
“Risky behavior has its moments where it’s valuable,” he added. “As a species, we wouldn’t have come as far as we have without it.”
But a propensity for high-risk behavior can be damaging, too, said Deisseroth, a practicing psychiatrist. “I’ve seen patients whose aberrantly high-risk-seeking activity resulted in accidents, addictions and social, financial or occupational failures that exposed them to a lot of harm and blame.”
The research is described in a paper published online March 23 in Nature. Deisseroth is the senior author. The lead author is graduate student Kelly Zalocusky.
By throwing light not only on how individual decisions are made but on why individuals differ in their overall risk-taking profiles, the study could provide a better understanding of some psychiatric conditions and lead to better medications to treat them. And, for that matter, it could help researchers mitigate the effect of drugs that themselves influence risk preferences. For example, a drug called pramipexole, prescribed for Parkinson’s disease and other brain disorders, can cause problem gambling.
Appetite for risk varies
Individuals vary in their appetite for risk, said Deisseroth, the D.H. Chen Professor and a Howard Hughes Medical Institute investigator. Most adult humans are relatively risk-averse. Given a choice between, say, a stable salary or fluctuating freelance income that’s likely to wind up being about the same or even somewhat larger in the long run, individuals will usually pick the salaried option.
That makes evolutionary sense, Deisseroth said. “One can’t always take the long view. In an always-changing world filled with dangers ranging from starvation to predators, even if a riskier option has a higher expected return over time, one can’t always live long enough to take advantage of it,” he said.
However, a minority within each species studied tends to prefer risk. And even largely risk-averse individuals sometimes choose riskier options.
The researchers focused on a complex of brain circuitry known as the reward system that is shared by every living creature from flies to humans. This circuitry’s evolutionary conservation is due to its essential role in guiding individuals’ behavior, and ensuring species’ survival, by inducing pleasurable sensations and boosting motivation in response to the anticipation or realization of behaviors such as eating and mating.
Reward system’s key nerve tract
A core feature of the reward system is a nerve tract projecting from a deep-brain structure called the ventral tegmental area to another structure in the forebrain, the nucleus accumbens. Nerve cells in this tract can secrete a chemical called dopamine that binds to surface receptors residing on some nerve cells in the nucleus accumbens. This, in turn, ignites activity within the cells that harbor dopamine receptors. The receptors fall mainly into two categories, DR1 and DR2, that are mostly found on different cells.
Drawing on hints from the medical literature — including previous human brain-imaging research by study co-author and associate professor of psychology Brian Knutson, PhD, indicating increased activity in the nucleus accumbens when people were considering taking risks — the researchers zeroed in on activity in DR2-containing nerve cells in the nucleus accumbens during the decision-making process. They used a single, hair-thin optical fiber implanted in the rats’ nucleus accumbens to both monitor these nerve cells’ signaling activity — a technique called fiber photometry — and precisely duplicate these naturally occurring signals’ timing and magnitude by stimulating the cells with light — a technique called optogenetics. Both techniques were pioneered in Deisseroth’s lab.
The scientists targeted DR2 cells in rats that had been trained and fitted for both fiber photometry and optogenetics with the thin, implanted optical fiber, which allowed the rats to move freely. The experiments that followed were designed by Zalocusky and her colleagues, including Knutson and Deisseroth.
Mmmmm, sugar water
The rats could initiate a session by poking their nose into a hole, at which point two levers would pop out. Pulling one lever, the rats soon learned, resulted in a dependable dose of sugar water, always the same size. Pulling the other lever would yield a much smaller sugar-water dose most of the time, but a much larger one every so often. The system was set up so that either lever would earn a rat the same total payoff, eventually.
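To make the payoff logic concrete, here is a minimal sketch of such a schedule in Python. The dose sizes and the probability of a large payout are illustrative assumptions chosen so the two levers have equal expected value; they are not the values used in the study.

```python
import random

# Hypothetical payoff schedule (illustrative values, not taken from the study):
# the "safe" lever always pays the same dose; the "risky" lever usually pays a
# small dose but occasionally a large one, tuned so both levers have the same
# expected value.
SAFE_DOSE = 50                      # sugar-water dose per pull (arbitrary units)
RISKY_SMALL, RISKY_LARGE = 10, 170  # small and large risky payouts
P_LARGE = 0.25                      # chance the risky lever pays out big

def pull(lever: str) -> float:
    """Return the sugar-water dose earned by one lever pull."""
    if lever == "safe":
        return SAFE_DOSE
    return RISKY_LARGE if random.random() < P_LARGE else RISKY_SMALL

# Expected values match: 0.25 * 170 + 0.75 * 10 = 50, same as the safe lever.
trials = 100_000
print(sum(pull("safe") for _ in range(trials)) / trials)   # ~50
print(sum(pull("risky") for _ in range(trials)) / trials)  # ~50
```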
Once trained, about two-thirds of the rats proved risk-averse, consistently choosing the steady-paying “salary.” The remaining one-third were risk-seeking “freelance” types. If the researchers tricked the rats by reversing the levers’ payoffs, the rats responded by switching levers, each adhering to its own preferred reward schedule.
Occasionally, though, a rat of either type would check out the neglected option. If a risk-averse rat experimenting in this fashion happened to get lucky and reap a windfall, it would try that lever again; if it received a pittance, it quickly returned to the “salary” lever. The easy-come, easy-go risk-seekers were relatively unfazed by smaller-than-anticipated rewards. Like some people, a risk-seeking rat on a losing streak doesn’t give up so easily.
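The outcome-sensitive pattern described above can be summarized as a simple decision rule. The sketch below is a toy model for illustration only, with an assumed 5 percent exploration rate; it is not the model used by the researchers.

```python
import random

SAFE_DOSE = 50  # same illustrative safe payout as in the sketch above

def next_choice(profile: str, last_lever: str, last_dose: float) -> str:
    """Toy decision rule (illustrative, not the authors' model).

    A risk-averse animal abandons the risky lever after a disappointing payout,
    while a risk-seeking animal keeps gambling despite small rewards.
    """
    if last_lever == "risky":
        if last_dose < SAFE_DOSE and profile == "risk-averse":
            return "safe"   # a pittance sends the risk-averse animal back
        return "risky"      # a windfall, or a risk-seeking temperament, invites another try
    # Otherwise stay with the safe lever, occasionally sampling the neglected option.
    return "risky" if random.random() < 0.05 else "safe"
```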
Altering rats’ risk preferences
Fiber-photometric observation indicated that — during a roughly 1-second period after a rat initiated the trial but before it was allowed to pull one or the other lever — activity in DR2-containing nerve cells of the nucleus accumbens was significantly elevated in risk-averse, but not risk-seeking, rats. Mimicking this signaling pattern by optogenetically stimulating DR2 cells with laser-light pulses, the researchers caused risk-seeking rats to become risk-averse. Their gambling penchant returned as soon as the laser pulses were halted. Stimulating the same cells in rats that were already risk-averse produced essentially no change in their behavior.
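As a rough illustration of the kind of measurement involved, the sketch below averages a photometry trace over the roughly 1-second window between trial initiation and lever presentation. The sampling rate, window length and synthetic data are assumptions for demonstration, not the study's actual analysis pipeline.

```python
import numpy as np

SAMPLE_RATE_HZ = 100   # assumed photometry sampling rate
WINDOW_S = 1.0         # assumed pre-choice window length

def prechoice_signal(trace: np.ndarray, init_times_s: np.ndarray) -> np.ndarray:
    """Mean fluorescence in the ~1-s window following each trial-initiation nose poke."""
    window = int(WINDOW_S * SAMPLE_RATE_HZ)
    starts = (init_times_s * SAMPLE_RATE_HZ).astype(int)
    return np.array([trace[s:s + window].mean() for s in starts])

# Synthetic example: per the observations above, this per-trial value would be
# expected to run higher in risk-averse animals than in risk-seeking ones.
rng = np.random.default_rng(0)
trace = rng.normal(size=60 * SAMPLE_RATE_HZ)          # 60 s of fake signal
init_times = np.array([5.0, 15.0, 25.0, 35.0, 45.0])  # fake trial-start times (s)
print(prechoice_signal(trace, init_times))
```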
In contrast, delivering pramipexole (a DR2-stimulating drug that promotes risky behavior in people) directly to the rats’ nucleus accumbens temporarily converted risk-averse rats into risk-seekers and also reduced the size of the pre-decision signal in their nucleus accumbens. A DR1-stimulating compound had no such effect.
“It looks as though we have found a brain signal that, in most individuals, corresponds to a memory of a failed risky choice,” said Deisseroth. “It seems to represent the memory of that recent unfavorable outcome, manifested later at just the right time when it can, and does, modify an upcoming decision.”
The signal was highest in risk-averse rats that had been dealt a disappointing outcome on the previous trial, and was weak in risk-seeking rats, unless forced into existence by optogenetic stimulation. This signal could serve as a guide for understanding interpersonal variability in risk-seeking. “It also might be possible to use this animal assay to predict how different drugs can influence human risk-taking,” Zalocusky said.
Other Stanford co-authors of the study are lab manager Charu Ramakrishnan and postdoctoral scholars Talia Lerner, PhD, and Thomas Davidson, PhD.
The study was funded by the National Institutes of Health (grants 2R01MH086373, 1F31MH105151 and 5R37DA035377), the National Science Foundation, the Defense Advanced Research Projects Agency, the Stanford Neuroscience Institute Big Ideas Fund, the Stanford Neuroscience Program, the Wiegers Family Fund, the Nancy and James Grosfeld Foundation and the H.L. Snyder Medical Foundation.
Stanford’s Departments of Bioengineering and of Psychiatry and Behavioral Sciences and the university’s Cracking the Neural Code Program also supported the work. The Department of Bioengineering is jointly operated by the School of Medicine and the School of Engineering.