Posts tagged neuroscience

Activating a mother’s immune system during her pregnancy disrupts the development of neural cells in the brain of her offspring and damages the cells’ ability to transmit signals and communicate with one another, researchers with the UC Davis Center for Neuroscience and Department of Neurology have found. They said the finding suggests how maternal viral infection might increase the risk of having a child with autism spectrum disorder or schizophrenia.

The research, “MHCI Requires MEF2 Transcription Factors to Negatively Regulate Synapse Density during Development and in Disease,” is published in the Journal of Neuroscience.
The study’s senior author is Kimberley McAllister, professor in the Center for Neuroscience with appointments in the departments of Neurology and Neurobiology, Physiology and Behavior, and a researcher with the UC Davis MIND Institute.
“This is the first evidence that neurons in the developing brain of newborn offspring are altered by maternal immune activation,” McAllister said. “Until now, very little has been known about how maternal immune activation leads to autism spectrum disorder and schizophrenia-like pathophysiology and behaviors in the offspring.”
The study was conducted in mice and rats and compared the brains of the offspring of rodents whose immune systems had been activated and those of animals whose immune systems had not been activated. The pups of animals that were exposed to viral infection had much higher brain levels of immune molecules known as the major histocompatibility complex I (MHCI) molecules.
“This is the first evidence that MHCI levels on the surface of young cortical neurons in offspring are altered by maternal immune activation,” McAllister said.
The researchers found that the high MHCI levels impaired the ability of the neurons from the newborn mice’s brains to form synapses, the tiny gaps separating brain cells through which signals are transmitted. Earlier research has suggested that ASD and schizophrenia may be caused by changes in the development of connections in the brain, especially the cerebral cortex.
The researchers experimentally reduced MHCI to normal levels in neurons from offspring following maternal immune activation.
“Remarkably, synapse density returned to normal levels in those neurons,” McAllister said.
“These results indicate that maternal immune activation does indeed alter connectivity during prenatal development, causing a profound deficit in the ability of cortical neurons to form synapses that is caused by changes in levels of MHCI on the neurons,” she said.
MHCI did not work alone to limit the development of synapses. In a series of experiments, the UC Davis researchers determined that MHCI interacted with calcineurin and myocyte enhancer factor-2 (Mef2), a protein that is a critical determinant of neuronal specialization.
MHCI, calcineurin and Mef2 form a biological signaling pathway that had not been previously identified. McAllister’s team showed that this novel signaling pathway was far more active in the offspring of mothers with maternal immune activation (MIA) than in the offspring of non-MIA animals.
“This finding provides a potential mechanism linking maternal immune activation to disease-linked behaviors,” McAllister said.
It also is a mechanism that may help McAllister and other scientists to develop diagnostic tests and eventually therapies to improve the lives of individuals with these neurodevelopmental disorders.
(Source: ucdmc.ucdavis.edu)
Soldiers with blast injuries suffer pituitary hormone problems
Researchers studying British soldiers who fought in Afghanistan have highlighted hormonal problems that commonly result from blast injuries.
Soldiers with injuries affecting the pituitary gland may suffer psychological and metabolic symptoms which impede their recovery.
The researchers, from Imperial College London and the Royal Centre for Defence Medicine, say identifying these sufferers will enable them to receive appropriate hormone replacement therapy.
The research was funded by the Medical Research Council and is published in the journal Annals of Neurology.
The study looked at 19 British soldiers with moderate to severe brain injury caused by blasts from improvised explosive devices (IEDs) while on duty in Afghanistan, and a group of 39 individuals with moderate to severe traumatic brain injuries caused by road traffic accidents, falls and assaults.
It found that a much higher proportion of the soldiers with blast injuries had pituitary hormone problems (32 per cent) than of the non-blast control group (2.6 per cent).
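Those percentages are consistent with counts of 6 of 19 blast-injured soldiers and 1 of 39 controls (an inference from the rounded figures, not counts stated in the release). A minimal sketch of how one might check that such a difference is unlikely to be chance, using Fisher’s exact test:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]."""
    n, row1, row2, col1 = a + b + c + d, a + b, c + d, a + c

    def p(x):
        # Hypergeometric probability of a table whose top-left cell is x,
        # given the fixed row and column totals.
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = p(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    # Sum the probabilities of all tables at least as extreme as the observed one.
    return sum(p(x) for x in range(lo, hi + 1) if p(x) <= p_obs * (1 + 1e-9))

# Inferred counts: 6/19 with hormone problems vs. 1/39 in the control group.
p_value = fisher_exact_two_sided(6, 13, 1, 38)
print(f"p = {p_value:.4f}")  # well below the conventional 0.05 threshold
```

With these inferred counts the difference is statistically robust despite the small sample, which is consistent with the authors’ caution that larger studies are still needed to pin down the prevalence rate.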
One in five of the soldiers ended up receiving hormone treatment with growth hormone, testosterone and/or hydrocortisone – a replacement for the stress hormone cortisol.
The study also showed that the soldiers who had pituitary dysfunction following blast injury had more severe damage to white matter connections within the brain, and more severe cognitive problems, such as being slow in processing information, than those who did not have hormone problems.
The recent conflicts in Iraq and Afghanistan have seen rapid advances in personal protective equipment and in the medical management of severe trauma. These gains have meant that increasing numbers of soldiers are surviving previously fatal and complex injuries.
Injuries caused by IEDs are so numerous that they have been called the ‘signature injury’ of these conflicts. Between December 2009 and March 2012, 183 UK soldiers survived a moderate to severe blast traumatic brain injury in Afghanistan. The number of such injuries among US troops is much higher. The complex physical forces involved in a blast have led to much speculation about how the blast wave itself causes brain injury.
Dr Tony Goldstone, from the MRC Clinical Sciences Centre at Imperial College London, who led the study, said: “This study was set up to see if there were facets unique to the kind of trauma caused to the brain by IEDs. We found that there was a high prevalence of hormonal problems in soldiers with these kinds of injuries.
“This study involved a relatively small number of soldiers, and so assessment of additional patients will be needed to confirm such a prevalence rate. However the results do emphasise the importance of actively screening for pituitary problems in all soldiers and others who have had moderate to severe brain injury from exposure to blast. This will enable identification of those who may benefit from hormonal treatments to aid their rehabilitation, recovery and quality of life.”
The patients were treated in the multi-disciplinary traumatic brain injury clinic at the Imperial Centre for Endocrinology at Imperial College Healthcare NHS Trust and scanned at the Computational, Cognitive and Clinical Neuroimaging Laboratory at Imperial College London by Professor David Sharp and Major David Baxter.
Air Marshal Paul Evans, Surgeon General said: “I fully support the research that has been undertaken by Imperial College London and the Ministry of Defence. As Surgeon General, I am committed to ensuring Service personnel benefit from the latest advances in medical research and we continue to conduct research into traumatic brain injury with colleagues at Imperial College London as well as our US and other NATO partners. A Defence Medical Services working group identifies priority areas for TBI research and MOD policy continues to be reviewed in light of emerging best practice. Working in partnership will ensure our personnel benefit as well as enable best practice to be shared between the MOD and NHS.”
Professor David Lomas, Chair of the MRC’s Population and Systems Medicine Board, which funded the research, said: “Trauma is a serious health problem that has a major impact on people in both a civilian and military setting. By linking academic and military research programmes through studies such as this we will build a greater understanding of acute trauma that will inform future approaches to trauma management, to ensure that people suffering major injury receive the most advanced specialist care.”
During pregnancy, the bone hormone osteocalcin is produced by the mother; it crosses the placenta to reach the fetus, where it promotes the formation of the hippocampus and the development of spatial learning and memory. Postnatally, osteocalcin crosses the blood-brain barrier (BBB) to act in various regions of the brain, including the hippocampus, where it causes changes in brain chemistry that help prevent anxiety and depression and improve spatial learning and memory.
Image credit: Gerard Karsenty, MD, PhD and Franck Oury, PhD/Columbia University Medical Center
Bone Hormone Influences Brain Development and Cognition
Findings could lead to new treatments for memory loss, anxiety, and depression
Researchers from Columbia University Medical Center (CUMC) have found that the skeleton, acting through the bone-derived hormone osteocalcin, exerts a powerful influence on prenatal brain development and cognitive functions such as learning, memory, anxiety, and depression in adult mice. Findings from the mouse study could lead to new approaches to the prevention and treatment of neurologic disorders. The study was published today in the online edition of Cell.
“The brain is commonly viewed as an organ that influences other organs and parts of the body, but less often as the recipient of signals coming from elsewhere, least of all, the bones,” said study leader Gerard Karsenty, MD, PhD, Paul A. Marks Professor of Genetics and Development, professor of medicine, and chair of the Department of Genetics and Development.
“In an earlier study, we showed that the brain is a powerful inhibitor of bone mass accrual,” he said. “This effect was so powerful that it immediately raised the question, ‘Does the bone signal back to the brain to limit this negative influence?’ ‘If so, what signals does it use and how do they work?’”
Dr. Karsenty suspected that osteocalcin, a hormone recently identified by his lab and secreted by osteoblasts, might be involved in such bone-to-brain signaling. Earlier studies had shown that osteocalcin affects a variety of processes, such as energy expenditure, glucose balance, and male fertility. “Since most hormones influence a range of physiological processes, it was reasonable to assume that the endocrine functions of osteocalcin were even broader than what was already known,” he said.
To determine whether osteocalcin did indeed play a role in the brain, Dr. Karsenty and his team studied “osteocalcin-null” mice (mice that have been genetically engineered to not produce any osteocalcin). Using these mice, they were able to show unambiguously that osteocalcin can cross the blood-brain barrier; binds to neurons in the brainstem, midbrain, and hippocampus (which is responsible for learning and memory); promotes the birth of neurons; and increases the synthesis of several neurotransmitters, including serotonin, dopamine, and noradrenaline. They also found that osteocalcin-null mice had abnormally small hippocampi.
The researchers then hypothesized that the changes in neurotransmitter synthesis should alter the animals’ behavior. In a series of behavioral tests, they confirmed that osteocalcin-null mice exhibit increased anxiety and depression-like behaviors, as well as impaired learning and memory, compared with normal mice.
These changes are similar to those seen in the aging population. “As we age, bone mass decreases, and the production of osteocalcin probably does, too,” said Dr. Karsenty. “We’re currently looking into this. It is not inconceivable that treatments that boost osteocalcin levels or stimulate osteocalcin receptors could help counter the cognitive effects of aging and aging-related diseases such as Alzheimer’s.”
When adult osteocalcin-null mice were infused with osteocalcin, their anxiety and depression did decrease, “but the infusions didn’t affect learning and memory or the size of the hippocampus,” said Dr. Karsenty. “This was perplexing, so we did another experiment—a postnatal knockout of osteocalcin (a genetically engineered model in which the synthesis of osteocalcin is blocked after birth). These mice were anxious and depressed but had normal memory and hippocampus structure. The unavoidable conclusion of the two experiments was that osteocalcin must act during development.” This led to the second part of their study.
In subsequent experiments, the researchers showed that osteocalcin crosses the placenta from mother to fetus and that this maternal pool of osteocalcin is necessary for formation of the hippocampus and the establishment of memory. Lastly, they showed that once-a-day injections of osteocalcin in osteocalcin-null mothers during pregnancy could prevent the development of behavioral abnormalities in their offspring.
“This finding could explain some of the effects observed in children born from undernourished mothers who develop, with an unusually high frequency, metabolic and psychiatric disorders just as osteocalcin-null mice do,” said Dr. Karsenty. “Malnutrition decreases the activity of bone cells; as a result, undernourished mothers have low bone mass, which affects osteocalcin production. This has clinical relevance even today, in developing countries, where maternal malnutrition is still common.”
Any therapies related to osteocalcin are still years away, however, he added.

Scientists identify brain circuitry that triggers overeating
The finding shows that specific connections between brain cells could play a critical role in anorexia, bulimia, binge eating disorder, and obesity.
Sixty years ago, scientists could electrically stimulate a region of a mouse’s brain, causing the mouse to eat whether it was hungry or not. Now researchers from the UNC School of Medicine have pinpointed the precise cellular connections responsible for triggering that behavior. The finding, published September 27 in the journal Science, lends insight into a cause of obesity and could lead to treatments for anorexia, bulimia nervosa, and binge eating disorder, the most prevalent eating disorder in the United States.
“The study underscores that obesity and other eating disorders have a neurological basis,” said senior study author Garret Stuber, PhD, assistant professor in the department of psychiatry and department of cell biology and physiology. He’s also a member of the UNC Neuroscience Center. “With further study, we could figure out how to regulate the activity of cells in a specific region of the brain and develop treatments.”
Cynthia Bulik, PhD, Distinguished Professor of Eating Disorders at UNC School of Medicine and the Gillings School of Global Public Health, said, “Stuber’s work drills down to the precise biological mechanisms that drive binge eating and will lead us away from stigmatizing explanations that invoke blame and a lack of willpower.” Bulik was not part of the research team.
Back in the 1950s, when scientists electrically stimulated a region of the brain called the lateral hypothalamus, they knew that they were stimulating many different types of brain cells. Stuber wanted to focus on one cell type – GABA neurons in the bed nucleus of the stria terminalis, or BNST. The BNST is an outcropping of the amygdala, the part of the brain associated with emotion. The BNST also forms a bridge between the amygdala and the lateral hypothalamus, the brain region that drives primal functions such as eating, sexual behavior, and aggression.
The BNST GABA neurons have a cell body and a long strand ending in branched synapses that transmit electrical signals into the lateral hypothalamus. Stuber and his team wanted to stimulate those synapses using an optogenetic technique, an involved process that would let them activate BNST cells simply by shining light on their synapses.
Typically, brain cells don’t respond to light. So Stuber’s team used genetically engineered proteins — from algae — that are sensitive to light and used genetically engineered viruses to deliver them into the brains of mice. Those proteins then get expressed only in the BNST cells, including in the synapses that connect to the hypothalamus.
His team then implanted fiber optic cables in the brains of these specially-bred mice, and this allowed the researchers to shine light through the cables and onto BNST synapses. As soon as the light hit BNST synapses the mice began to eat voraciously even though they had already been well fed. Moreover, the mice showed a strong preference for high-fat foods.
“They would essentially eat up to half their daily caloric intake in about 20 minutes,” Stuber said. “This suggests that this BNST pathway could play a role in food consumption and pathological conditions such as binge eating.”
Stimulating the BNST also led the mice to exhibit behaviors associated with reward, suggesting that shining light on BNST cells enhanced the pleasure of eating. On the flip side, shutting down the BNST pathway caused mice to show little interest in eating, even if they had been deprived of food.
“We were able to really home in on the precise neural circuit connection that was causing this phenomenon that’s been observed for more than 50 years,” Stuber said.
The study, which uses technologies highlighted in the new National Institutes of Health Brain Initiative, suggests that faulty wiring in BNST cells could interfere with hunger or satiety cues and contribute to human eating disorders, leading people to eat even when they are full or to avoid food when they are hungry. Further research is needed to determine whether it would be possible to develop drugs that correct a malfunctioning BNST circuit.
“We want to actually observe the normal function of these cell types and how they fire electrical signals when the animals are feeding or hungry,” Stuber said. “We want to understand their genetic characteristics – what genes are expressed. For example, if we find cells that become really activated after binge eating, can we look at the gene expression profile to find out what makes those cells unique from other neurons?”
And that, Stuber said, could lead to potential targets for drugs to treat certain populations of patients with eating disorders.
Ballet dancers’ brains adapt to stop them feeling dizzy
Scientists have discovered differences in the brain structure of ballet dancers that may help them avoid feeling dizzy when they perform pirouettes.
The research suggests that years of training can enable dancers to suppress signals from the balance organs in the inner ear.
The findings, published in the journal Cerebral Cortex, could help to improve treatment for patients with chronic dizziness. Around one in four people experience this condition at some time in their lives.
Normally, the feeling of dizziness stems from the vestibular organs in the inner ear. These fluid-filled chambers sense rotation of the head through tiny hairs that sense the fluid moving. After turning around rapidly, the fluid continues to move, which can make you feel like you’re still spinning.
Ballet dancers can perform multiple pirouettes with little or no feeling of dizziness. The findings show that this feat isn’t just down to spotting, a technique dancers use that involves rapidly moving the head to fix their gaze on the same spot as much as possible.
Researchers at Imperial College London recruited 29 female ballet dancers and, as a comparison group, 20 female rowers whose age and fitness levels matched the dancers’.
The volunteers were spun around in a chair in a dark room. They were asked to turn a handle in time with how quickly they felt like they were still spinning after they had stopped. The researchers also measured eye reflexes triggered by input from the vestibular organs. Later, they examined the participants’ brain structure with MRI scans.
In dancers, both the eye reflexes and their perception of spinning lasted a shorter time than in the rowers.
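One way to picture that result (all numbers here are assumptions for illustration, not measurements from the study): after the chair stops, the lingering sense of rotation is commonly modeled as an exponentially decaying signal, and a shorter decay constant means the sensation ends sooner.

```python
import math

# Toy model: after an abrupt stop, perceived spin velocity decays roughly
# exponentially, v(t) = v0 * exp(-t / tau). Every parameter value below is
# hypothetical, chosen only to illustrate the comparison.
def spin_duration(v0_deg_s, tau_s, threshold_deg_s=2.0):
    """Seconds until perceived rotation drops below a detection threshold."""
    return tau_s * math.log(v0_deg_s / threshold_deg_s)

rower_s = spin_duration(v0_deg_s=90.0, tau_s=15.0)  # assumed decay constant
dancer_s = spin_duration(v0_deg_s=90.0, tau_s=8.0)  # assumed shorter constant
print(f"rower: {rower_s:.0f} s, dancer: {dancer_s:.0f} s")
```

Under this sketch, the dancers’ shorter-lived eye reflexes and spin perception correspond to a faster-decaying internal signal, which is the kind of suppression the training is thought to produce.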
Dr Barry Seemungal, from the Department of Medicine at Imperial, said: “Dizziness, which is the feeling that we are moving when in fact we are still, is a common problem. I see a lot of patients who have suffered from dizziness for a long time. Ballet dancers seem to be able to train themselves not to get dizzy, so we wondered whether we could use the same principles to help our patients.”
The brain scans revealed differences between the groups in two parts of the brain: an area in the cerebellum where sensory input from the vestibular organs is processed and in the cerebral cortex, which is responsible for the perception of dizziness.
The area in the cerebellum was smaller in dancers. Dr Seemungal thinks this is because dancers would be better off not using their vestibular systems, relying instead on highly co-ordinated pre-programmed movements.
“It’s not useful for a ballet dancer to feel dizzy or off balance. Their brains adapt over years of training to suppress that input. Consequently, the signal going to the brain areas responsible for perception of dizziness in the cerebral cortex is reduced, making dancers resistant to feeling dizzy.
“If we can target that same brain area or monitor it in patients with chronic dizziness, we can begin to understand how to treat them better.”
Another finding in the study may be important for how chronic dizzy patients are tested in the clinic. In the control group, the perception of spinning closely matched the eye reflexes triggered by vestibular signals, but in dancers, the two were uncoupled.
“This shows that the sensation of spinning is separate from the reflexes that make your eyes move back and forth,” Dr Seemungal said. “In many clinics, it’s common to only measure the reflexes, meaning that when these tests come back normal the patient is told that there is nothing wrong. But that’s only half the story. You need to look at tests that assess both reflex and sensation.”
Experiments with neutrons at the Technische Universität München (TUM) show that the antidepressant lithium accumulates more strongly in white matter of the brain than in grey matter. This leads to the conclusion that it works differently from synthetic psychotropic drugs. The tissue samples were examined at the Research Neutron Source Heinz Maier-Leibnitz (FRM II) with the aim of developing a better understanding of the effects this substance has on the human psyche.
At present, lithium is best known for its use in rechargeable batteries. But for decades lithium has also been used to treat psychological illnesses such as depression, mania and bipolar disorder. Yet its exact biological mode of action in particular brain regions is barely understood. It is well known that lithium lightens mood and reduces the potential for aggression.
Because it is so hard to dose, doctors have been reluctant to prescribe this “universal drug”. Nonetheless, a number of international studies have shown that a higher natural lithium content in drinking water is associated with a lower suicide rate in the general population. Lithium also accumulates in the brains of people who have never been treated with it. This suggests that lithium, so far regarded as unimportant, could be an essential trace element for humans.
Lithium detection with neutrons
This is what Josef Lichtinger is studying in his doctoral thesis at the Chair for Hadron and Nuclear Physics (E12) at the Technische Universität München. From the Institute for Forensic Medicine at the Ludwig-Maximilians-Universität Munich (LMU) he received brain tissue samples from patients treated with lithium, from untreated patients and from healthy control subjects. The physicist exposed these to a highly intense, focused beam of cold neutrons at the measuring station for prompt gamma activation analysis at FRM II.
Lithium reacts with neutrons in a very specific manner: the lithium-6 isotope captures a neutron and splits into a helium atom and a tritium atom. Using a special detector developed by Josef Lichtinger, traces as low as 0.45 nanograms of lithium per gram of tissue can be measured. “It is impossible to make measurements as precise as this with any other method,” says Jutta Schöpfer, forensic scientist at the LMU in charge of several research projects on lithium distribution in the human body.
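For scale, that detection limit still corresponds to an enormous number of atoms. A back-of-the-envelope sketch using textbook constants (Avogadro’s number, the molar mass of natural lithium, and the roughly 7.6 per cent natural abundance of the neutron-active lithium-6 isotope):

```python
AVOGADRO = 6.022e23      # atoms per mole
M_LI = 6.94              # g/mol, molar mass of natural lithium
LI6_FRACTION = 0.0759    # natural abundance of 6Li, the isotope that captures neutrons

mass_g = 0.45e-9         # detection limit: 0.45 ng of lithium per gram of tissue
atoms_total = mass_g / M_LI * AVOGADRO
atoms_li6 = atoms_total * LI6_FRACTION

# Each capture, 6Li + n -> 4He + 3H, releases about 4.78 MeV,
# which the detector registers as a distinctive signal.
print(f"{atoms_total:.1e} lithium atoms per gram of tissue ({atoms_li6:.1e} of them 6Li)")
```

So even at the detection limit, tens of trillions of lithium atoms per gram of tissue are available to react with the beam, which is why the neutron method can be so sensitive.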
Lithium concentrates in the nerve tracts
Lichtinger’s results are surprising: only in the samples from a depressive patient treated with lithium did he observe an elevated accumulation of lithium in the so-called white matter, the area of the brain where nerve tracts run. The lithium content in the neighboring grey matter was three to four times lower. No such accumulation in white matter was observed in untreated depressive patients. This suggests that lithium does not act in the space between nerve cells, as other psychotropic drugs do, but within the nerve tracts themselves.
In a next step Josef Lichtinger plans to examine further tissue samples at TUM’s Research Neutron Source in order to confirm and expand his results. The goal is a space-resolved map showing lithium accumulation in the brain of a healthy and a depressive patient. This would allow the universal drug lithium to be prescribed for psychological disorders with greater precision and control. The project is funded by the German Research Foundation (DFG).
Publication:
J. Lichtinger et al., “Position sensitive measurement of lithium traces in brain tissue with neutrons,” Med. Phys. 40, 023501 (2013)
A new experimental approach to treating a type of brain cancer called medulloblastoma has been developed by researchers at Sanford-Burnham. The method targets cancer stem cells—the cells that are critical for maintaining tumor growth—and halts their ability to proliferate by inhibiting enzymes that are essential for tumor progression. The process destroys the ability of the cancer cells to grow and divide, paving the way for a new type of treatment for patients with this disease.

The research team, led by Robert Wechsler-Reya, Ph.D., professor in Sanford-Burnham’s NCI-Designated Cancer Center and director of the Tumor Initiation and Maintenance Program, discovered that the medulloblastoma cancer cells responsible for tumor growth and progression (called cancer stem cells or tumor-propagating cells—TPCs) divide more quickly than normal cells. Correspondingly, they have higher levels of certain enzymes that regulate the cell cycle (Aurora and Polo-like kinases). By using small-molecule inhibitors to stop the action of these enzymes, the researchers were able to block the growth of tumor cells from mice as well as humans. The research findings are described in an online paper published today by Cancer Research.
“One tumor can have many different types of cells in it, and they can grow at different rates. By targeting fast-growing TPCs with cell-cycle inhibitors, we have developed a new route to assault medulloblastoma. In this study, we have shown that cell-cycle inhibitors essentially block medulloblastoma tumor progression by halting TPC expansion, and have opened the window to preventing cancer recurrence,” said Wechsler-Reya.
The team’s first set of experiments used a mouse model for medulloblastoma. In-vitro studies of mouse tumor cells showed that cell-cycle inhibitors caused tumor cell death. In vivo, mice that were treated with the inhibitor had smaller tumors that weighed less compared to mice that were not treated, essentially halting the progression of the tumor.
The second set of experiments used human medulloblastoma cells. When the researchers treated these human tumor cells with cell-cycle inhibitors, they also observed a significant reduction in tumor growth and progression.
Finally, when the scientists combined cell-cycle inhibitors with treatments currently used for medulloblastoma, they found that the combination worked together to produce results that were greater than either inhibitor alone.
“These results strongly support an approach to treatment that combines current therapies with cell-cycle inhibitors to treat medulloblastoma. Our hope is that the combination of these inhibitors will prevent tumor progression and drug resistance, and improve the overall effectiveness of current treatment options. We look forward to clinical studies in human medulloblastoma patients as well as other cancers that are suitable for this approach,” Wechsler-Reya said.
(Source: beaker.sanfordburnham.org)
Proteins play important roles in the human body, particularly neuroproteins that maintain proper brain function.
Brain diseases such as ALS, Alzheimer’s, and Parkinson’s are known as “tangle diseases” because they are characterized by misfolded and tangled proteins which accumulate in the brain.
A team of Australian and American scientists discovered that an unusual amino acid called BMAA can be inserted into neuroproteins, causing them to misfold and aggregate. BMAA is produced by cyanobacteria, photosynthetic bacteria that form scums or mats in polluted lakes or estuaries.
BMAA has been detected in the brain tissues of ALS patients.
In an article published in PLOS ONE, scientists at the University of Technology Sydney and the Institute for Ethnomedicine in Jackson Hole, Wyoming, report that BMAA mimics a dietary amino acid, L-Serine, and is mistakenly incorporated into neuroproteins, causing the proteins to misfold. The misfolded proteins build up in cells, eventually killing them.
"We found that BMAA inserts itself by seizing the transfer RNA for L-Serine. This, in essence, puts a kink in the protein causing it to misfold," says lead author Dr. Rachael Dunlop, a cell biologist in Sydney working in the laboratory of Dr. Ken Rodgers.
“The cells then begin programmed cell death, called apoptosis.”
Even more importantly, the scientists found that extra L-Serine added to the cell culture can prevent the insertion of BMAA into neuroproteins. The possibility that L-Serine could be used to prevent or slow ALS is now being studied.
“Even though L-serine occurs in our diet, its safety and efficacy for ALS patients should be properly determined through FDA-approved clinical trials before anyone advocates its use,” says American co-author Dr. Paul Cox.
In ALS, motor neurons in the brain and spinal cord die, progressively paralyzing the body until even swallowing and breathing become impossible.
The disease is relatively rare but has affected a number of high-profile people including Professor Stephen Hawking and Yankee baseball player Lou Gehrig.
"For many years scientists have linked BMAA to an increased risk of motor neuron disease but the missing pieces of the puzzle relate to how this might occur. Finally, we have one of those pieces," said Dr Sandra Banack, a co-author on the paper.
(Source: eurekalert.org)
Bad experiences enhance memory formation about places, scientists at The University of Queensland have found.

Dr Oliver Baumann from the Queensland Brain Institute found that associating negative imagery with specific locations activates a part of the brain responsible for forming memory of places during navigation – the parahippocampal cortex.
“This heightened recall occurs automatically, without people even being aware that the negative imagery is affecting their memories,” said Dr Baumann, who worked on the study in the QBI’s Mattingley lab.
“It could serve as a cue for avoiding potential threats,” Dr Baumann said.
“Our findings show that emotions can exert a powerful influence on spatial and navigational memory for places.
“In future we might be able to boost memory functions by triggering the positive side-effects of emotional arousal, while avoiding the need for negative experiences.”
For the research, Professor Jason Mattingley built a “virtual house” and staged events in each room that were unrelated to the participant’s task of navigating the house.
The events were designed to elicit an emotional response – positive, negative, or neutral – and varied in their rate of occurrence.
“The events were illustrated using images from the International Affective Picture System library and included dramatic scenes of attack and threat, as well as more pleasant imagery,” Dr Baumann said.
The day after navigating through the house, participants viewed static images of the house without the emotional imagery, while their neural activity was recorded using an MRI scanner.
“The results showed that emotional arousal exerted a powerful influence on memory by enhancing parahippocampal activity,” Dr Baumann said.
The study was published in the Journal of Cognitive Neuroscience.
(Source: uq.edu.au)
Several studies have shown that expecting a reward or punishment can affect brain activity in areas responsible for processing different senses, including sight or touch. For example, research shows that these brain regions light up on brain scans when humans are expecting a treat. However, researchers know less about what happens when the reward is actually received—or an expected reward is denied. Insight on these scenarios can help researchers better understand how we learn in general.

To get a better grasp on how the brain behaves when people who are expecting a reward actually receive it, or conversely are denied it, Tina Weis of Carl-von-Ossietzky University and her colleagues monitored the auditory cortex – the part of the brain that processes and interprets sounds – while volunteers solved a task in which each round, signaled by a specific sound, offered a chance of winning 50 Euro cents. Their findings show that auditory cortex activity increased both when participants were expecting a reward and received it, and when their expectation of receiving no reward was confirmed.
The article is entitled “Feedback that Confirms Reward Expectation Triggers Auditory Cortex Activity.” It appears in the Articles in Press section of the Journal of Neurophysiology, published by the American Physiological Society.
Methodology
The researchers worked with 105 healthy adult volunteers with normal hearing. While each volunteer underwent functional MRI (fMRI) – a brain scan that measures brain activity during tasks – the researchers had them solve a task with sounds in which they had the chance of winning money at the end of each round. At the beginning of a round, participants heard a sound and had to learn whether this sound signified that they could win a 50 Euro cent reward or not. They then saw a number on a screen and had to press a button to indicate whether the number was greater or smaller than 5. If the preceding sound indicated that they could receive a reward and they solved the number task quickly and correctly, an image of a 50 Euro cent coin appeared on the screen. The researchers monitored brain activity in the subjects’ auditory cortex throughout the task, paying special attention to what happened when they received the reward, or not, at the end of the round.
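The trial logic described above can be sketched as follows (the cue labels and the response deadline are invented for illustration; only the reward rule follows the description):

```python
# Hypothetical cue labels and deadline; only the reward rule follows the text.
REWARD_CUE, NO_REWARD_CUE = "tone_A", "tone_B"

def trial_rewarded(cue, number, pressed_greater, reaction_time_s, deadline_s=1.0):
    """True if the trial ends with the 50-cent coin image on screen."""
    correct = pressed_greater == (number > 5)   # greater/smaller-than-5 judgement
    fast_enough = reaction_time_s <= deadline_s
    return cue == REWARD_CUE and correct and fast_enough

assert trial_rewarded(REWARD_CUE, 7, pressed_greater=True, reaction_time_s=0.6)
assert not trial_rewarded(NO_REWARD_CUE, 7, True, 0.6)   # cue signalled no reward
assert not trial_rewarded(REWARD_CUE, 3, True, 0.6)      # wrong judgement
assert not trial_rewarded(REWARD_CUE, 7, True, 1.4)      # too slow
```

The point of this structure is that the sound cue sets up an expectation, and the end-of-round outcome either confirms or violates it, which is exactly the contrast examined in the results below.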
Results
The study authors found that when the volunteers were expecting a reward and finally received it, their auditory cortex was activated. Similarly, there was an increase in brain activity in this area when the subjects weren’t expecting a reward and didn’t get one. There was no additional activity when they were expecting a reward and didn’t get one.
Importance of the Findings
These findings add to accumulating evidence that the auditory cortex performs a role beyond just processing sound. Rather, this area of the brain appears to be activated during other activities that require learning and thought, such as confirming expectations of receiving a reward.
"Our findings thus support the view of a highly cognitive role of the auditory cortex," the study authors say.
(Source: eurekalert.org)