Posts tagged neuroscience

Researchers survey protein family that helps the brain form synapses
Neuroscientists and bioengineers at Stanford are working together to solve a mystery: How does nature construct the different types of synapses that connect neurons – the brain cells that transmit nerve impulses, control muscles and form thoughts?
In a paper published in the Proceedings of the National Academy of Sciences, Thomas C. Südhof, M.D., a professor of molecular and cellular physiology, and Stephen R. Quake, a professor of bioengineering, describe the diversity of the neurexin family of proteins.
Neurexins help to create the synapses that connect neurons. Think of synapses as switchboards or control panels that connect specific neurons when these brain cells must work together to perform a given task.
Neurexins play a key role in the formation and functioning of synaptic connections. Past human genetics studies have linked neurexins to a variety of cognitive disorders, such as autism and schizophrenia.
Südhof, the Avram Goldstein Professor in the School of Medicine and a winner of the 2013 Nobel Prize in Physiology or Medicine, has spent years studying the many different forms, or isoforms, of neurexin proteins. He has postulated that different isoforms of neurexins may help to create different types of synaptic connections with distinct properties and functions, and thus enable neurons to do so many complex tasks.
But Südhof had no way to know exactly how many isoforms of neurexins existed until he sat down last year with Quake, the Lee Otterson Professor in the School of Engineering. Quake has pioneered new ways to sequence DNA – the master blueprint that nature follows when making proteins.
The study being published in PNAS represents the results of a year-long collaboration between neuroscientists and bioengineers to better understand how different neurexin proteins affect the behavior of synapses and, ultimately, normal brain functions and neurological conditions such as autism.
Though this will not be the last word on the subject, the findings help illuminate how the brain works and improve our understanding of neurological disorders.
Inside cells, a molecular machine unzips a double-stranded DNA molecule to create an RNA molecule. The RNA molecule is a copy of all the genetic instructions encoded into the DNA. But only specific regions of this RNA molecule contain instructions for making a specific protein. The cell has ways to remove the unnecessary regions and splice the protein-coding regions into a shorter RNA molecule called messenger RNA or mRNA. Thus, each mRNA contains the full instructions for making a specific protein.
To begin this experiment, Ozgun Gokce, a postdoctoral scholar in molecular and cellular physiology in Südhof’s lab, and Barbara Treutlein, a postdoctoral scholar in Quake’s lab, extracted brain cells from the prefrontal cortex of a mouse, then isolated the RNA contained in this tissue.
From this large pool of RNAs they then identified the mRNAs for neurexins. They ran those messenger molecules through equipment designed to read the entire long sequence of chemical instructions for making a specific isoform in the neurexin family of proteins.
Quake’s lab is adept at using new instruments that allow researchers to read the long sequence of chemicals in an mRNA strand, allowing them to ascertain exactly what directions this messenger is carrying to the cell’s protein-making machinery.
“This experiment couldn’t have been done even a few years ago,” Treutlein explained.
The mRNAs for neurexins are very long chains of nucleotides – the chemicals that encode genetic information. Only recently have instruments been capable of reading the exact sequence of such long nucleotide chains.
The ability to read the entire sequence of each mRNA was essential because neurexins have 25 constituent parts. But not all of these parts are used each time neurons produce a copy of the protein. Isoforms of neurexin have different combinations of these 25 possible parts. This experiment was designed to discover how many isoforms of neurexin existed and how prevalent each of these isoforms was.
The researchers analyzed more than 25,000 full-length neurexin mRNAs. They found 450 variants. Each variant omitted one or more of the 25 possible components. Most of these isoforms occurred infrequently. A handful accounted for the predominant isoforms.
Although the Stanford scientists sequenced 25,000 mRNAs to discover 450 variants, they believe that if they were to sequence even more mRNAs they would discover more isoforms – their estimate is that at least 2,500 isoforms of the neurexin family exist.
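The counting step at the heart of the experiment is easy to sketch: once each full-length read is reduced to the combination of the 25 possible components it includes, distinct isoforms are simply distinct combinations. A minimal illustration in Python, with invented component names and reads (the data below are hypothetical, not the study's):

```python
from collections import Counter

# Each full-length mRNA read, reduced to the combination of components
# (of the 25 possible) that it includes. These reads are invented for
# illustration; real reads come from long-read sequencing.
reads = [
    ("c1", "c2", "c4"),
    ("c1", "c2", "c4"),
    ("c1", "c4"),
    ("c2", "c3", "c4"),
    ("c1", "c2", "c4"),
]

# Distinct isoforms are distinct component combinations; the counts
# show how prevalent each isoform is in the pool of reads.
isoform_counts = Counter(reads)

print(len(isoform_counts))            # number of distinct isoforms seen
print(isoform_counts.most_common(1))  # the predominant isoform
```

Scaled up to 25,000 real reads, the same tally yields the 450 observed variants and their relative abundances; the estimate of at least 2,500 isoforms comes from extrapolating how quickly new combinations keep appearing as more reads are counted.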
“The fact that we see so many isoforms supports the theory that these protein variants contribute to the huge diversity of synaptic connections that neuroscientists have observed,” Treutlein said.
The experiment raises many questions for future study. For instance, what functions are performed by the predominant isoforms versus the rare variants? How does the inclusion or exclusion of components affect an isoform and the synapse in which it works?
“This experiment was like a flight over the terrain,” Gokce said. “Now we have to go down and look at the details.”

How age opens the gates for Alzheimer’s
With advancing age, highly evolved brain circuits become susceptible to molecular changes that can lead to neurofibrillary tangles – a hallmark of Alzheimer’s disease, Yale researchers report the week of March 17 in the Proceedings of the National Academy of Sciences.
The findings help to explain not only why age is such a large risk factor for Alzheimer’s, but also why the higher brain circuits regulating cognition are so vulnerable to degeneration while the sensory cortex remains unaffected.
“We hope that understanding the key molecular alterations that occur with advancing age can provide new strategies for disease prevention,” said Amy F.T. Arnsten, professor of neurobiology and one of the senior authors of the study.
Neurofibrillary tangles are made from a protein called tau, which becomes sticky and clumps together when modified in a process called phosphorylation. The Yale study found that phosphorylated tau collects in neurons in higher brain circuits of the aging primate brain, but does not accumulate in neurons of the sensory cortex. Phosphorylated tau collects in and near synapses – the excitatory connections where neurons communicate – and can spread between cells in higher brain circuits, the study found.
The study led by Yale researchers Becky C. Carlyle, Angus Nairn, Arnsten and Constantinos D. Paspalas found clues about what causes tau to become phosphorylated with advancing age. They uncovered age-related changes in the molecular signals that control the strength of higher cortical connections. In young brains, an enzyme called phosphodiesterase PDE4A sits near the synapse where it inhibits a chemical “vicious cycle” that disconnects higher brain circuits when we are in danger, switching control of behavior to more primitive brain areas. They further found that PDE4A is lost in the aged prefrontal association cortex, unleashing a chemical cascade of events that increase the phosphorylation of tau. This process may be amplified in humans, where high order cortical neurons have even more excitatory connections, leading to tangle formation and ultimately cell death.
“This insight into one pathway by which tau may influence the onset and progression of Alzheimer’s disease takes us a step closer to unraveling this complex and devastating disorder,” said Dr. Molly Wagster, of the National Institutes of Health, a co-funder of the research.
The new study may also help to explain why head injury is a risk factor for Alzheimer’s, as it may also increase the activity of the chemical “vicious cycle.”
“Now that we begin to see what makes neurons vulnerable, we may be able to protect cells with treatments that mimic the protective effects of PDE4A,” said Arnsten.
Blocking the immune response reduces stroke damage in animal study

A new study in animals shows that using a compound to block the body’s immune response greatly reduces disability after a stroke.

The study by scientists from the University of Wisconsin School of Medicine and Public Health also showed that particular immune cells – CD4+ T-cells – produce a mediator called interleukin-21 (IL-21) that can cause further damage in stroke tissue.
In the experiments, normal mice – ordinarily killed or disabled by an ischemic stroke – were given a shot of a compound that blocks the action of IL-21. Brain scans and brain sections showed that the treated mice suffered little or no stroke damage.
“This is very exciting because we haven’t had a new drug for stroke in decades, and this suggests a target for such a drug,” says lead author Dr. Zsuzsanna Fabry, professor of pathology and laboratory medicine.
Stroke is the fourth-leading killer in the world and an important cause of permanent disability. In an ischemic stroke, a clot blocks the flow of oxygen-rich blood to the brain. But Fabry explains that much of the damage to brain cells occurs after the clot is removed or dissolved by medicine. Blood rushes back into the brain tissue, bringing with it immune cells called T-cells, which flock to the source of an injury.
The study shows that after a stroke, the injured brain cells provoke the CD4+ T-cells to produce a substance, IL-21, that kills the neurons in the blood-deprived tissue of the brain. The work gives new insight into how stroke induces neural injury.
Similar Findings in Humans
Fabry’s co-author Dr. Matyas Sandor, professor of pathology and laboratory medicine, says that the final part of the study looked at brain tissue from people who had died following ischemic strokes. It found that CD4+ T-cells and their protein, IL-21, are present in high concentrations in areas of the brain damaged by the stroke.
Sandor says the similarity suggests that the protein that blocks IL-21 could become a treatment for stroke, and would likely be administered at the same time as the current blood-clot dissolving drugs.
“We don’t have proof that it will work in humans,” he says, “but similar accumulation of IL-21 producing cells suggests that it might.”
The paper was published this week in the Journal of Experimental Medicine.
(Source: med.wisc.edu)

Children’s preferences for sweeter and saltier tastes are linked to each other
Scientists from the Monell Chemical Senses Center have found that children who most prefer high levels of sweet tastes also most prefer high levels of salt taste and that, in general, children prefer sweeter and saltier tastes than do adults. These preferences relate not only to food intake but also to measures of growth and can have important implications for efforts to change children’s diets.
Many illnesses of modern society are related to poor food choices. Because children consume far more sugar and salt than recommended, which contributes to poor health, understanding the biology behind children’s preferences for these tastes is a crucial first step to reducing their intake.
"Our research shows that the liking of salty and sweet tastes reflects in part the biology of the child," said study lead author Julie Mennella, PhD, a biopsychologist at Monell. Biology predisposes us to like and consume calorie-rich sweet foods and sodium-rich salty foods, and this is especially true for children. "Growing children’s heightened preferences for sweet and salty tastes make them more vulnerable to the modern diet, which differs from the diet of our past, when salt and sugars were once rare and expensive commodities."
In the study, published online in PLOS ONE, Mennella and colleagues tested 108 children between 5 and 10 years old, and their mothers, for salt and sweet taste preferences. The same testing method was used for both children and their mothers, who tasted broth and crackers that varied in salt content, and sugar water and jellies that varied in sugar content. The method, developed by Mennella and her colleagues at Monell, scientifically determines taste preferences, even for very young children, by having them compare two different levels of a taste, pick their favorite, and then compare that favorite with another, over and over until the most-preferred level is identified.
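The tracking logic of that paired-comparison method can be sketched in a few lines: the current favorite is repeatedly pitted against the next candidate level until one concentration remains. The concentrations and the simulated "taster" below are hypothetical, for illustration only:

```python
def track_preference(levels, prefers):
    """Paired-comparison tracking: pit the running favorite against
    each remaining level; the last survivor is the preferred level.

    levels:  candidate concentrations (e.g. percent sucrose)
    prefers: function (a, b) -> the level the taster likes more
    """
    favorite = levels[0]
    for challenger in levels[1:]:
        favorite = prefers(favorite, challenger)
    return favorite

# Simulated taster whose ideal sweetness is 24% sucrose: given any
# pair, they pick the level closer to their ideal. (Invented values.)
ideal = 24.0
prefers = lambda a, b: a if abs(a - ideal) < abs(b - ideal) else b

sucrose_levels = [3.0, 6.0, 12.0, 24.0, 36.0]
print(track_preference(sucrose_levels, prefers))  # -> 24.0
```

The appeal of the procedure for young children is that each step is a single two-alternative choice, with no rating scales or numbers involved.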
Mennella and colleagues also had mothers and children list foods and beverages they consumed in the past 24 hours, from which daily sodium, calorie, and added sugar intakes were estimated. Subjects then gave a saliva sample, which was genotyped for a sweet receptor gene, and a urine sample to measure levels of NTx, a marker for bone growth. Weight, height, and percent body fat were measured for all subjects.
Analyses of all these data showed that not only were sweet and salty preferences correlated in children, and higher overall than those in adults, but also children’s taste preferences related to measures of growth and development: children who were tall for their age preferred sweeter solutions, and children with higher amounts of body fat preferred saltier soups. There was also some indication that higher sweet liking related to spurts in bone growth, but that result needs confirmation in a larger group of children.
Sweet and salty preferences were correlated in adults as well. And in adults, but not in children, sweet receptor genotype was related to the most preferred level of sweetness. “There are inborn genetic differences that affect the liking for sweet by adults,” says collaborator Danielle Reed, PhD, “but for children, other factors – perhaps the current state of growth – are stronger influences than genetics.”
Both children and adults who preferred higher levels of salt in food also reported consuming more dietary salt in the past 24 hours, but no such relationship was found between sweet preferences and sugar intake. This difference may reflect parents exerting greater control in their children’s diet for added sugar than for added salt. Or it could reflect increased use of non-nutritive sweeteners in foods geared for children – in other words, the sweetness of some foods doesn’t reflect their sugar content.
Current intakes of sodium and added sugars among US children are well in excess of recommendations. For almost all 2- to 8-year-olds, added sugars account for more than half of their discretionary calories (130 total discretionary calories are allowed for children of this age). For 4- to 13-year-olds, sodium intake is more than twice adequate levels (1200-1500 mg/day is allowed for children of this age). The children studied by Mennella and colleagues, two-thirds of whom were overweight or obese, also consumed twice adequate levels of sodium, and their added sugar intake averaged almost 20 teaspoons, or 300 calories, each day.
Guidelines from leading authorities, including the World Health Organization, American Heart Association, U.S. Department of Agriculture, and Institute of Medicine, recommend significantly cutting sugar and salt intake for children, but this can be a daunting task. Commenting on the implications of her research, lead author Mennella noted, “The present findings reveal that the struggle parents have in modifying their children’s diets to comply with recommendations appears to have a biological basis.”
Understanding the basic biology that drives the desire for sweet and salty tastes in children illustrates their vulnerability to the current food environment. But on a positive note, Mennella observed, “it also paves the way toward developing more insightful and informed strategies for promoting healthy eating that meet the particular needs of growing children.”
Scientists slow development of Alzheimer’s trademark cell-killing plaques
University of Michigan researchers have learned how to fix a cellular structure called the Golgi that mysteriously becomes fragmented in all Alzheimer’s patients and appears to be a major cause of the disease.
They say that understanding this mechanism helps decode amyloid plaque formation in the brains of Alzheimer’s patients—plaques that kill cells and contribute to memory loss and other Alzheimer’s symptoms.
The researchers discovered the molecular process behind Golgi fragmentation, and also developed two techniques to ‘rescue’ the Golgi structure.
"We plan to use this as a strategy to delay the disease development," said Yanzhuang Wang, U-M associate professor of molecular, cellular and developmental biology. "We have a better understanding of why plaque forms fast in Alzheimer’s and found a way to slow down plaque formation."
The paper appears in an upcoming edition of the Proceedings of the National Academy of Sciences. Gunjan Joshi, a research fellow in Wang’s lab, is the lead author.
Wang said scientists have long recognized that the Golgi becomes fragmented in the neurons of Alzheimer’s patients, but until now they didn’t know how or why this fragmentation occurred.
The Golgi structure has the important role of sending molecules to the right places in order to make functional cells, Wang said. The Golgi is analogous to a post office of the cell, and when the Golgi becomes fragmented, it’s like a post office gone haywire, sending packages to the wrong places or not sending them at all.
U-M researchers found that the accumulation of the Abeta peptide—the primary culprit in forming plaques that kill cells in Alzheimer’s brains—triggers Golgi fragmentation by activating an enzyme called cdk5 that modifies Golgi structural proteins such as GRASP65.
Wang and colleagues rescued the Golgi structure in two ways: they either inhibited cdk5 or expressed a mutant of GRASP65 that cannot be modified by cdk5. Both rescue measures decreased the harmful Abeta secretion by about 80 percent.
The next step is to see if Golgi fragmentation can be delayed or reversed in mice, Wang said. This involves a collaboration with the Michigan Alzheimer’s Disease Center at the U-M Health System, directed by Dr. Henry Paulson, professor of neurology, and Geoffrey Murphy, assistant professor of physiology and research professor at the U-M Molecular and Behavioral Neuroscience Institute.
Childhood’s end: ADHD, autism and schizophrenia tied to stronger inhibitory interactions in adolescent prefrontal cortex
Key cognitive functions such as working memory (which combines temporary storage and manipulation of information) and executive function (a set of mental processes that helps connect past experience with present action) are associated with the brain’s prefrontal cortex. Unlike other brain regions, the prefrontal cortex does not mature until early adulthood, with the most pronounced changes being seen between its peripubertal (onset of puberty) and postpubertal developmental states. Moreover, this maturation period is correlated with cognitive maturation – but the physical neuronal changes during this transition have remained for the most part unknown. Recently, however, scientists at the Wake Forest School of Medicine in Winston-Salem, NC, recorded and compared prefrontal cortical activity in peripubertal and adult monkeys.
The researchers found that compared with adults, peripubertal monkeys showed lower connectivity due to stronger inhibitory interactions, suggesting that intrinsic (or resting state) inhibitory connections – that is, inhibitory neural connections that are active in the absence of any particular task – decline with maturation. The scientists then concluded that prefrontal intrinsic connectivity changes are a possible substrate for cognitive maturation.
Prof. Christos Constantinidis discusses the paper that he, Dr. Xin Zhou and their co-authors published in Proceedings of the National Academy of Sciences. The team compared the functional connectivity between pairs of neurons recorded from the prefrontal cortex of peripubertal and adult monkeys, and evaluated the developmental stage of the peripubertal rhesus monkeys with a series of morphometric, hormonal and radiographic measures. Constantinidis tells Medical Xpress that a major challenge was to obtain neural activity from the brain of monkeys around the time of puberty. “We needed to make ourselves experts in the developmental trajectories of monkeys and conduct experiments just at the right time relative to the onset of puberty,” he explains.
Stumbling Fruit Flies Lead Scientists to Discover Gene Essential for Sensing Joint Position
Scientists at The Scripps Research Institute (TSRI) have discovered an important mechanism underlying sensory feedback that guides balance and limb movements.
The finding, which the TSRI team uncovered in fruit flies, centers on a gene and a type of nerve cell required for detection of leg-joint angles. “These cells resemble human nerve cells that innervate joints,” said team leader Boaz Cook, an assistant professor at TSRI, “and they encode joint-angle information in the same way.”
If the findings can be fully replicated in humans, they could lead to a better understanding of, as well as treatments for, disorders arising from faulty proprioception, the detection of body position.
A report of the findings appears in the March 14, 2014 issue of the journal Science.
A Mystery of Sensation
The proprioceptive sense of how the limbs are positioned is what enables a person, even with eyes closed, to touch the tip of the nose with the tip of a finger—an ability easily impaired by alcohol, which is why traffic police often test suspected drunk drivers this way.
Scientists have known that proprioceptive signals originate from so-called mechanosensory neurons, whose nerve ends are embedded in muscles, skin and other tissues. The stretching or compression of these tissues opens ion channels in the nerve membrane, which results in a signal to the brain.
What hasn’t been clear is how such a neuron can specialize in sensing just one type of membrane-distorting stimulus—such as the angle of a limb joint—yet exclude others, such as impact pressures.
In the new study, Cook and two members of his laboratory, first author Bela S. Desai, a postdoctoral fellow, and graduate student Abhishek Chadha, sought to shed some light on this mystery with a study of Drosophila fruit flies. Quickly maturing and easily studied, Drosophila often are analyzed for clues to the genetic underpinnings of basic animal behaviors.
Following the Trail
Cook and his colleagues began with a special collection of Drosophila containing a variety of uncatalogued mutations. The scientists sifted through the collection looking for mutant flies with walking impairments and soon zeroed in on several impaired walkers that turned out to have mutations in the same gene.
The scientists named the gene stumble (stum for short) for the abnormality caused by its absence.
Using a fluorescent tracer, they then localized the expression of stum in normal flies to neurons that lay close to the three main leg joints. Each neuron’s input-sensing tendril (dendrite) grew right up to the joints—a sign that its evolved function is to detect joint angle.
The researchers also found that the protein specified by the stum gene normally migrates to the tip of each dendrite. With high-resolution microscopy, they imaged each of these tips and observed an extra branch extending more or less sideways at the joint.
At ordinary, at-rest joint angles, the relative positions of the main dendrite tip and its side branch stayed more or less the same; however, at extreme joint angles, the pair stretched out. As they did, the level of calcium ions in the neuron rose sharply, suggesting that ion channels had opened and the neuron was becoming active.
Cook noted the results show how a seemingly general mechanosensory, membrane-stretch-sensitive neuron can evolve a specificity for a particular type of proprioceptive signal. “It’s a nice example of how you can create that specificity from something that only stretches mechanically,” he said.
The team is now trying to nail down the specific role of stum proteins in Drosophila and to determine whether the human version of stum—which has never been characterized—also works in joint angle sensing. Some sensory role for the human version of stum is likely, as the stum gene has been remarkably well conserved throughout animal evolution. Cook and his colleagues were even able to restore some normal walking ability to stum-mutant flies by adding the mouse version of the stum gene. “Stum is probably doing the same thing in all animals,” he said.
A novel protein may explain how biological clocks regulate human sleep cycles

In a series of experiments sparked by fruit flies that couldn’t sleep, Johns Hopkins researchers say they have identified a mutant gene — dubbed “Wide Awake” — that sabotages how the biological clock sets the timing for sleep. The finding also led them to the protein made by a normal copy of the gene that promotes sleep early in the night and properly regulates sleep cycles.
Because genes and the proteins they code for are often highly conserved across species, the researchers suspect their discoveries — boosted by preliminary studies in mice — could lead to new treatments for people whose insomnia or off-hours work schedules keep them awake long after their heads hit the pillow.
“We know that the timing of sleep is regulated by the body’s internal biological clock, but just how this occurs has been a mystery,” says study leader Mark N. Wu, M.D., Ph.D., an assistant professor of neurology, medicine, genetic medicine and neuroscience at the Johns Hopkins University School of Medicine. “We have now found the first protein ever identified that translates timing information from the body’s circadian clock and uses it to regulate sleep.”
A report on the work was published online March 13 in the journal Neuron.
In their hunt for the molecular roots of sleep regulation, Wu and his colleagues studied thousands of fruit fly colonies, each with a different set of genetic mutations, and analyzed their sleep patterns. They found that one group of flies, with a mutation in the gene they would later call Wide Awake (or Wake for short), had trouble falling asleep at night, a malady that looked a lot like sleep-onset insomnia in humans. The investigators say Wake appears to be the messenger from the circadian clock to the brain, telling it that it’s time to shut down and sleep.
After isolating the gene, Wu’s team determined that when working properly, Wake helps shut down clock neurons of the brain that control arousal by making them more responsive to signals from the inhibitory neurotransmitter called GABA. Wake does this specifically in the early evening, thus promoting sleep at the right time. Levels of Wake cycle during the day, peaking near dusk in good sleepers.
Flies with a mutated Wake gene that couldn’t get to sleep were not getting enough GABA signal to quiet their arousal circuits at night, keeping the flies agitated.
The researchers found the same gene in every animal they studied: humans, mice, rabbits, chickens, even worms.
Importantly, when Wu’s team looked to see where Wake was located in the mouse brain, they found that it was expressed in the suprachiasmatic nucleus (SCN), the master clock in mammals. Wu says the fact that the Wake protein was expressed in high concentrations in the SCN of mice is significant.
“Sometimes we discover things in flies that have no direct relevance in higher order animals,” Wu says. “In this case, because we found the protein in a location where it likely plays a role in circadian rhythms and sleep, we are encouraged that this protein may do the same thing in mice and people.”
The hope is that someday, by manipulating Wake, possibly with a medication, shift workers, military personnel and sleep-onset insomniacs could sleep better.
“This novel pathway may be a place where we can intervene,” Wu says.
(Source: hopkinsmedicine.org)
What happened when? How the brain stores memories by time
Before I left the house this morning, I let the cat out and started the dishwasher. Or was that yesterday? Very often, our memories must distinguish not just what happened and where, but when an event occurred — and what came before and after. New research from the University of California, Davis, Center for Neuroscience shows that a part of the brain called the hippocampus stores memories by their “temporal context” — what happened before, and what came after.
"We need to remember not just what happened, but when," said graduate student Liang-Tien (Frank) Hsieh, first author on the paper published March 5 in the journal Neuron.
The hippocampus is thought to be involved in forming memories. But it’s not clear whether the hippocampus stores representations of specific objects, or if it represents them in context.
Hsieh and Charan Ranganath, professor in the Department of Psychology and the Center for Neuroscience, looked for hippocampus activity linked to particular memories. First, they showed volunteers a series of pictures of animals and objects. Then they scanned the volunteers’ brains as they showed them the same series again, with questions such as, “is this alive?” or “does this generate heat?”
The questions prompted the volunteers to search their memories for information. When the images were shown in the same sequence as before, the volunteers could anticipate the next image, making for a faster response.
From brain scans of the hippocampus as the volunteers were answering questions, Hsieh and Ranganath could identify patterns of activity specific to each image. But when they showed the volunteers the same images in a different sequence, they got different patterns of activity.
In other words, the coding of the memory in the hippocampus was dependent on its context, not just on content.
"It turns out that when you take the image out of sequence, the pattern disappears," Ranganath said. "For the hippocampus, context is critical, not content, and it’s fairly unique in how it pulls things together."
Other parts of the brain store memories of objects that are independent of their context, Ranganath noted.
"For patients with memory problems this is a big deal," Ranganath said. "It’s not just something that’s useful in understanding healthy memory, but allows us to understand and intervene in memory problems."
Researchers identify decision-making center of brain
Although choosing to do something because the perceived benefit outweighs the financial cost is something people do daily, little is known about what happens in the brain when a person makes these kinds of decisions. Studying how these cost-benefit decisions are made when choosing to consume alcohol, University of Georgia associate professor of psychology James MacKillop identified distinct profiles of brain activity that are present when making these decisions.
"We were interested in understanding how the brain makes decisions about drinking alcohol. Particularly, we wanted to clarify how the brain weighs the pros and cons of drinking," said MacKillop, who directs the Experimental and Clinical Psychopharmacology Laboratory in the UGA Franklin College of Arts and Sciences.
The study combined functional magnetic resonance imaging and a bar laboratory alcohol procedure to see how the cost of alcohol affected people’s preferences. The study group included 24 men, ages 21-31, who were heavy drinkers. Participants were given a $15 bar tab and then were asked to make decisions in the fMRI scanner about how many drinks they would choose at varying prices, from very low to very high. Their choices translated into real drinks – at most eight – which they received in the bar immediately after the scan. Any money not spent on drinks was theirs to keep.
The study applied a neuroeconomic approach, which integrates concepts and methods from psychology, economics and cognitive neuroscience to understand how the brain makes decisions. In this study, participants’ cost-benefit decisions were categorized into those in which drinking was perceived to have all benefit and no cost, to have both benefits and costs, and to have all costs and no benefits. In doing so, MacKillop could dissect the neural mechanisms responsible for different types of cost-benefit decision-making.
"We tried to span several levels of analysis, to think about clinical questions, like why do people choose to drink or not drink alcohol, and then unpack those choices into the underlying units of the brain that are involved," he said.
When participants decided to drink in general, activation was seen in several areas of the cerebral cortex, such as the prefrontal and parietal cortices. However, when the decision to drink was affected by the cost of alcohol, activation involved frontostriatal regions, which are important for the interplay between deliberation and reward value, suggesting suppression resulting from greater cognitive load. This is the first study of its kind to examine cost-benefit decision-making for alcohol and was the first to apply a framework from economics, called demand curve analysis, to understanding cost-benefit decision making.
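Demand curve analysis reduces each participant's price-by-price choices to a few summary indices. A hedged sketch of the usual ones – intensity (consumption at the lowest price), Omax (peak expenditure), Pmax (the price at peak expenditure) and breakpoint (the first price producing zero consumption) – using invented price and consumption values, not the study's data:

```python
def demand_indices(prices, drinks):
    """Summarize a price-consumption series with standard
    behavioral-economic demand indices."""
    expenditure = [p * q for p, q in zip(prices, drinks)]
    return {
        "intensity": drinks[0],                      # drinks at lowest price
        "Omax": max(expenditure),                    # peak spending
        "Pmax": prices[expenditure.index(max(expenditure))],
        "breakpoint": next(                          # first price with 0 drinks
            (p for p, q in zip(prices, drinks) if q == 0), None),
    }

prices = [0.25, 0.50, 1.00, 2.00, 4.00, 8.00]  # dollars per drink (invented)
drinks = [8, 8, 6, 4, 1, 0]                    # drinks chosen at each price

print(demand_indices(prices, drinks))
```

In this toy series, spending peaks at $2.00 per drink and consumption stops entirely at $8.00; indices like these are what let choices be sorted into the all-benefit, mixed, and all-cost categories the study analyzed.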
"The brain activity was most differentially active during the suppressed consumption choices, suggesting that participants were experiencing the most conflict," MacKillop said. "We had speculated during the design of the study that the choices not to drink at all might require the most cognitive effort, but that didn’t seem to be the case. Once people decided that the cost of drinking was too high, they didn’t appear to experience a great deal of conflict in terms of the associated brain activity."
These conflicted decisions appeared to be represented by activity in the anterior insula, which has been linked in previous addiction studies to the motivational circuitry of the brain. Beyond encoding how much people crave or value drugs, this portion of the brain is believed to be responsible for processing interoceptive experiences – a person’s visceral physiological responses.
"It was interesting that the insula was sensitive to escalating alcohol costs especially when the costs of drinking outweighed the benefits," MacKillop said. "That means this could be the region of the brain at the intersection of how our rational and irrational systems work with one another. In general, we saw the choices associated with differential brain activity were those choices in the middle, where people were making choices that reflect the ambivalence between cost and benefits. Where we saw that tension, we saw the most brain activity."
While MacKillop acknowledges the impact this research could have on neuromarketing – understanding how the brain makes decisions about what to buy – he is more interested in how this research can help people with alcohol addictions.
"These findings reveal the distinct neural signatures associated with different kinds of consumption preferences. Now that we have established a way of studying these choices, we can apply this approach to better understanding substance use disorders and improving treatment," he said, adding that comparing fMRI scans from alcoholics with those of people with normal drinking habits could potentially tease out brain patterns that show what is different between healthy and unhealthy drinkers. "In the past, we have found that behavioral indices of alcohol value predict poor treatment prognosis, but this would permit us to understand the neural basis for negative outcomes."
The research was published in the journal Neuropsychopharmacology March 3. A podcast highlighting this work is available at http://www.nature.com/multimedia/podcast/npp/npp_030314_alcohol.mp3.