Neuroscience

Articles and news from the latest research reports.

EEG Study Findings Reveal How Fear is Processed in the Brain
An estimated 8% of Americans will suffer from post-traumatic stress disorder (PTSD) at some point during their lifetime. Brought on by an overwhelming or stressful event or events, PTSD is the result of altered chemistry and physiology of the brain. Understanding how threat is processed in a normal brain versus one altered by PTSD is essential to developing effective interventions.
New research from the Center for BrainHealth at The University of Texas at Dallas, published online today in Brain and Cognition, illustrates how fear arises in the brain when individuals are exposed to threatening images. This novel study is the first to separate emotion from threat by controlling for the dimension of arousal – the intensity of the emotional reaction, whether positive or negative, that a stimulus provokes. Building on previous animal and human research, the study identifies an electrophysiological marker for threat in the brain.
“We are trying to find where thought exists in the mind,” explained John Hart, Jr., M.D., Medical Science Director at the Center for BrainHealth. “We know that groups of neurons firing on and off create a frequency and pattern that tell other areas of the brain what to do. By identifying these rhythms, we can correlate them with a cognitive unit such as fear.”
Utilizing electroencephalography (EEG), Dr. Hart’s research team identified theta and beta wave activity that signifies the brain’s reaction to visually threatening images. 
“We have known for a long time that the brain prioritizes threatening information over other cognitive processes,” explained Bambi DeLaRosa, study lead author. “These findings show us how this happens. Theta wave activity starts in the back of the brain, in its fear center – the amygdala – and then interacts with the brain’s memory center – the hippocampus – before traveling to the frontal lobe where thought processing areas are engaged. At the same time, beta wave activity indicates that the motor cortex is revving up in case the feet need to move to avoid the perceived threat.”
For the study, 26 adults (19 female, 7 male), ages 19-30 were shown 224 randomized images that were either unidentifiably scrambled or real pictures. Real pictures were separated into two categories: threatening (weapons, combat, nature or animals) and non-threatening (pleasant situations, food, nature or animals). 
While wearing an EEG cap, participants were asked to push a button with their right index finger for real items and another button with their right middle finger for nonreal/scrambled items. Shorter response times were recorded for scrambled images than the real images. There was no difference in reaction time for threatening versus non-threatening images. 
EEG results revealed that threatening images evoked an early increase in theta activity in the occipital lobe (the area in the brain where visual information is processed), followed by a later increase in theta power in the frontal lobe (where higher mental functions such as thinking, decision-making, and planning occur). A left lateralized desynchronization of the beta band, the wave pattern associated with motor behavior (like the impulse to run), also consistently appeared in the threatening condition.
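An occipital theta-power increase of the kind reported here can be quantified with standard spectral methods. The following Python sketch is illustrative only (it is not the study's pipeline; the sampling rate, the synthetic signals, and the `band_power` helper are all invented for the example): band power is estimated with a Welch periodogram and averaged over the 4–8 Hz theta range.

```python
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, lo, hi):
    """Average power spectral density within a frequency band (Welch estimate)."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

fs = 250  # sampling rate in Hz (a typical EEG rate, assumed here)
rng = np.random.default_rng(0)
t = np.arange(0, 2, 1 / fs)

# Synthetic "occipital channel": background noise, plus a 6 Hz theta
# burst in the simulated threat condition
baseline = rng.normal(size=t.size)
threat = baseline + 1.5 * np.sin(2 * np.pi * 6 * t)

theta_base = band_power(baseline, fs, 4, 8)
theta_threat = band_power(threat, fs, 4, 8)
print(theta_threat > theta_base)  # True: the theta burst raises band power
```

The same helper with `lo=13, hi=30` would give a beta-band estimate; a decrease there relative to baseline is what "desynchronization" refers to.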
This study will serve as a foundation for future work that will explore normal versus abnormal fear associated with an object in other atypical populations including individuals with PTSD.


Filed under fear PTSD emotions EEG brainwaves amygdala motor cortex hippocampus neuroscience science

Brain Development in Schizophrenia Strays from the Normal Path
Schizophrenia is generally considered to be a disorder of brain development and it shares many risk factors, both genetic and environmental, with other neurodevelopmental disorders such as autism and intellectual disability.
The normal path for brain development is determined by the combined effects of a complex network of genes and a wide range of environmental factors.
However, longitudinal brain imaging studies in both healthy and patient populations are required in order to map the disturbances in brain structures as they emerge, i.e., the disturbed trajectories of brain development.
A new study by an international, collaborative group of researchers has measured neurodevelopment in schizophrenia, by studying brain development during childhood and adolescence in people with and without this disorder. With access to new statistical approaches and long-term follow-up with participants, in some cases over more than a decade, the researchers were able to describe brain development patterns associated with schizophrenia.
"Specifically, this paper shows that parts of the brain’s cortex develop differently in people with schizophrenia," said first author Dr. Aaron F. Alexander-Bloch, from the National Institute of Mental Health.
"The mapping of the path that the brain follows in deviating from normal development provides important clues to the underlying causes of the disorder," said Dr. John Krystal, Editor of Biological Psychiatry.
The findings were derived by investigating the trajectory of cortical thickness growth curves in 106 patients with childhood-onset schizophrenia and a comparison group of 102 healthy volunteers.
Each participant, ranging from 7–32 years of age, had repeated imaging scans over the course of several years. Then, using over 80,000 vertices across the cortex, the researchers modeled the effect of schizophrenia on the growth curve of cortical thickness.
This revealed differences that occur within a specific group of highly-connected brain regions that mature in synchrony during typical development, but follow altered trajectories of growth in schizophrenia.
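Growth-curve modeling of this sort amounts to fitting, at each cortical vertex, a curve of thickness against age and comparing the fitted trajectories between groups. A toy Python illustration for a single vertex follows; the slopes, noise levels, and sample sizes are invented for the example, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_growth_curve(ages, thickness, degree=2):
    """Fit a polynomial growth curve of cortical thickness against age."""
    return np.polynomial.Polynomial.fit(ages, thickness, degree)

# Toy data for one vertex: thickness (in mm) declines with age in both
# groups, but more steeply in the patient group (slopes are illustrative)
ages = rng.uniform(7, 32, size=100)
controls = 3.0 - 0.010 * ages + rng.normal(0, 0.02, 100)
patients = 3.0 - 0.018 * ages + rng.normal(0, 0.02, 100)

curve_c = fit_growth_curve(ages, controls)
curve_p = fit_growth_curve(ages, patients)

# Predicted thickness at age 20 for each group; the gap between the two
# fitted curves is the kind of trajectory difference the study maps
print(curve_c(20.0), curve_p(20.0))
```

In the actual study this fit would be repeated at each of the 80,000+ vertices, with group as a model term rather than two separate fits.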
"These findings show a relationship between the hypothesis that schizophrenia is a neurodevelopmental disorder and the longstanding hypothesis – first articulated by the German anatomist Karl Wernicke in the late 19th century – that it is a disease of altered connectivity between regions of the brain," added Alexander-Bloch.
This theoretical consistency is important, as it allows researchers to better focus future studies of brain connectivity in schizophrenia, by targeting the brain regions known to be affected.


Filed under schizophrenia brain development neuroimaging cortical thickness neuroscience science

(Image caption: Shown are fMRI scans across all subjects in the study. The yellow and red areas in Section A represent parts of the brain that are activated while subjects are forming “gist memories” of pictures viewed. Section B represents areas of increased activation, shown in yellow and red, as detailed memories are being formed. Credit: Image courtesy of Jagust Lab)
Researchers find neural compensation in people with Alzheimer’s-related protein
The human brain is capable of a neural workaround that compensates for the buildup of beta-amyloid, a destructive protein associated with Alzheimer’s disease, according to a new study led by UC Berkeley researchers.
The findings, published today (Sunday, Sept. 14) in the journal Nature Neuroscience, could help explain how some older adults with beta-amyloid deposits in their brain retain normal cognitive function while others develop dementia.
“This study provides evidence that there is plasticity or compensation ability in the aging brain that appears to be beneficial, even in the face of beta-amyloid accumulation,” said study principal investigator Dr. William Jagust, a professor with joint appointments at UC Berkeley’s Helen Wills Neuroscience Institute, the School of Public Health and Lawrence Berkeley National Laboratory.
Previous studies have shown a link between increased brain activity and beta-amyloid deposits, but it was unclear whether the activity was tied to better mental performance.
The study included 22 healthy young adults and 49 older adults who had no signs of mental decline. Brain scans showed that 16 of the older subjects had beta-amyloid deposits, while the remaining 55 adults did not.
The researchers used functional magnetic resonance imaging (fMRI) to track the brain activity of subjects in the process of memorizing pictures of various scenes. Afterwards, the researchers tested the subjects’ “gist memory” by asking them to confirm whether a written description of a scene – such as a boy doing a handstand – corresponded to a picture previously viewed. Subjects were then asked to confirm whether specific written details of a scene – such as the color of the boy’s shirt – were true.
“Generally, the groups performed equally well in the tasks, but it turned out that for people with beta-amyloid deposits in the brain, the more detailed and complex their memory, the more brain activity there was,” said Jagust. “It seems that their brain has found a way to compensate for the presence of the proteins associated with Alzheimer’s.”
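The pattern Jagust describes (activity scaling with memory detail in the amyloid-positive group) would show up as a group difference in the correlation between detail scores and activation. A minimal sketch with simulated numbers, none of which come from the study:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy per-subject data: memory-detail scores and fMRI activation levels.
# In the simulated amyloid-positive group, activation scales with detail
# (the compensation pattern); in the amyloid-negative group it does not.
detail = rng.uniform(0, 1, 33)
activation_pos = 0.9 * detail + rng.normal(0, 0.05, 33)  # amyloid-positive
activation_neg = rng.normal(0.5, 0.05, 33)               # amyloid-negative

# Pearson correlation between memory detail and brain activity per group
r_pos = np.corrcoef(detail, activation_pos)[0, 1]
r_neg = np.corrcoef(detail, activation_neg)[0, 1]
print(round(r_pos, 2), round(r_neg, 2))
```

A strongly positive `r_pos` alongside a near-zero `r_neg` is the simulated analogue of the reported group difference.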
What remains unclear, said Jagust, is why some people with beta-amyloid deposits are better at using different parts of their brain than others. Previous studies suggest that people who engage in mentally stimulating activities throughout their lives have lower levels of beta-amyloid.
“I think it’s very possible that people who spend a lifetime involved in cognitively stimulating activity have brains that are better able to adapt to potential damage,” said Jagust.


Filed under beta amyloid brain activity cognitive function dementia alzheimer's disease neuroscience science

Schizophrenia not a single disease but multiple genetically distinct disorders

New research shows that schizophrenia isn’t a single disease but a group of eight genetically distinct disorders, each with its own set of symptoms. The finding could be a first step toward improved diagnosis and treatment for the debilitating psychiatric illness.

The research at Washington University School of Medicine in St. Louis is reported online Sept. 15 in The American Journal of Psychiatry.

About 80 percent of the risk for schizophrenia is known to be inherited, but scientists have struggled to identify specific genes for the condition. Now, in a novel approach analyzing genetic influences on more than 4,000 people with schizophrenia, the research team has identified distinct gene clusters that contribute to eight different classes of schizophrenia.

“Genes don’t operate by themselves,” said C. Robert Cloninger, MD, PhD, one of the study’s senior investigators. “They function in concert much like an orchestra, and to understand how they’re working, you have to know not just who the members of the orchestra are but how they interact.”

Cloninger, the Wallace Renard Professor of Psychiatry and Genetics, and his colleagues matched precise DNA variations in people with and without schizophrenia to symptoms in individual patients. In all, the researchers analyzed nearly 700,000 sites within the genome where a single unit of DNA is changed, often referred to as a single nucleotide polymorphism (SNP). They looked at SNPs in 4,200 people with schizophrenia and 3,800 healthy controls, learning how individual genetic variations interacted with each other to produce the illness.

In some patients with hallucinations or delusions, for example, the researchers matched distinct genetic features to patients’ symptoms, demonstrating that specific genetic variations interacted to create a 95 percent certainty of schizophrenia. In another group, they found that disorganized speech and behavior were specifically associated with a set of DNA variations that carried a 100 percent risk of schizophrenia.

“What we’ve done here, after a decade of frustration in the field of psychiatric genetics, is identify the way genes interact with each other, how the ‘orchestra’ is either harmonious and leads to health, or disorganized in ways that lead to distinct classes of schizophrenia,” Cloninger said. 

Although individual genes have only weak and inconsistent associations with schizophrenia, groups of interacting gene clusters create an extremely high and consistent risk of illness, on the order of 70 to 100 percent. That makes it almost impossible for people with those genetic variations to avoid the condition. In all, the researchers identified 42 clusters of genetic variations that dramatically increased the risk of schizophrenia.

“In the past, scientists had been looking for associations between individual genes and schizophrenia,” explained Dragan Svrakic, PhD, MD, a co-investigator and a professor of psychiatry at Washington University. “When one study would identify an association, no one else could replicate it. What was missing was the idea that these genes don’t act independently. They work in concert to disrupt the brain’s structure and function, and that results in the illness.”
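The "orchestra" point, that interacting variants can carry risk no single gene does, can be illustrated with a deliberately extreme toy example: two hypothetical SNPs, each only weakly associated with case status on its own, that fully determine it in combination. All data here are simulated and have no relation to the study's actual variants.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy genotypes: two SNPs coded 0/1 for 10,000 hypothetical individuals.
# Illness status is driven by the *interaction* (both variants present),
# so each SNP alone looks like a weak, inconsistent association.
n = 10_000
snp_a = rng.integers(0, 2, n)
snp_b = rng.integers(0, 2, n)
case = (snp_a & snp_b).astype(bool)  # interaction determines status

def risk_given(mask):
    """Fraction of cases among individuals matching a genotype mask."""
    return case[mask].mean()

print(risk_given(snp_a == 1))                   # single SNP: ~0.5
print(risk_given((snp_a == 1) & (snp_b == 1)))  # interacting pair: 1.0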

Svrakic said it was only when the research team was able to organize the genetic variations and the patients’ symptoms into groups that they could see that particular clusters of DNA variations acted together to cause specific types of symptoms.

Then they divided patients according to the type and severity of their symptoms, such as different types of hallucinations or delusions, and other symptoms, such as lack of initiative, problems organizing thoughts or a lack of connection between emotions and thoughts. The results indicated that those symptom profiles describe eight qualitatively distinct disorders based on underlying genetic conditions.

The investigators also replicated their findings in two additional DNA databases of people with schizophrenia, an indicator that identifying the gene variations that are working together is a valid avenue to explore for improving diagnosis and treatment.

By identifying groups of genetic variations and matching them to symptoms in individual patients, it soon may be possible to target treatments to specific pathways that cause problems, according to co-investigator Igor Zwir, PhD, research associate in psychiatry at Washington University and associate professor in the Department of Computer Science and Artificial Intelligence at the University of Granada, Spain.

And Cloninger added it may be possible to use the same approach to better understand how genes work together to cause other common but complex disorders.

“People have been looking at genes to get a better handle on heart disease, hypertension and diabetes, and it’s been a real disappointment,” he said. “Most of the variability in the severity of disease has not been explained, but we were able to find that different sets of genetic variations were leading to distinct clinical syndromes. So I think this really could change the way people approach understanding the causes of complex diseases.”

(Source: news.wustl.edu)

Filed under schizophrenia mental illness genes genetic variations genetics genomics neuroscience science

Sometimes, adolescents just can’t resist
Don’t get mad the next time you catch your teenager texting when he promised to be studying.
He simply may not be able to resist.
A University of Iowa study found teenagers are far more sensitive than adults to the immediate effect or reward of their behaviors. The findings may help explain, for example, why the initial rush of texting may be more enticing for adolescents than the long-term payoff of studying.
“The rewards have a strong, perceptional draw and are more enticing to the teenager,” says Jatin Vaidya, a professor of psychiatry at the UI and corresponding author of the study, which appeared online this week in the journal Psychological Science. “Even when a behavior is no longer in a teenager’s best interest to continue, they will because the effect of the reward is still there and lasts much longer in adolescents than in adults.”
For parents, that means limiting distractions so teenagers can make better choices. Take the homework and social media dilemma: At 9 p.m., shut off everything except a computer that has no access to Facebook or Twitter, the researchers advise.
“I’m not saying they shouldn’t be allowed access to technology,” Vaidya says. “But they need help in regulating their attention so they can develop those impulse-control skills.”
In their study, “Value-Driven Attentional Capture in Adolescence,” Vaidya and co-authors Shaun Vecera, a professor of psychology, and Zachary Roper, a graduate student in psychology, note researchers generally believe teenagers are impulsive, make bad decisions, and engage in risky behavior because the frontal lobes of their brains are not fully developed.
But the UI researchers wondered whether something more fundamental was going on with adolescents to trigger behaviors independent of higher-level reasoning.
“We wanted to try to understand the brain’s reward system and how it changes from childhood to adulthood,” says Vaidya, who adds the reward trait in the human brain is much more primitive than decision-making. “We’ve been trying to understand the reward process in adolescence and whether there is more to adolescent behavior than an under-developed frontal lobe,” he adds.
For their study, the researchers recruited 40 adolescents, ages 13 to 16, and 40 adults, ages 20 to 35. First, participants were asked to find a red or green ring hidden within an array of rings on a computer screen. Once they identified it, they reported whether the white line inside the ring was vertical or horizontal. If they were right, they received a reward between 2 and 10 cents, depending on the color. For some participants, the red ring paid the highest reward; for others, it was the green. None was told which color would pay the most.
After 240 trials, the participants were asked whether they noticed anything about the colors. Most made no association between a color and reward, which researchers say indicates the ring exercise didn’t involve high-level decision-making.
In the next stage, participants showed they had developed an intuitive association when they were asked to find a diamond-shaped target. This time, the red and green rings were used as decoys.
At first, the adolescents and adults selected the color ring that garnered them the highest monetary reward, the goal of the first trial. But in short order, the adults adjusted and selected the diamond. The adolescents did not.
Even after 240 trials, the adolescents were still more apt to pick the colored rings.
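One way to caricature this group difference is as two different rates at which the learned reward association fades. The simulation below is a sketch, not a model fitted to the study's data; the starting capture probability and the decay constants are invented.

```python
import numpy as np

def capture_rate(decay, trials=240):
    """Mean probability of selecting the formerly rewarded color ring
    across all trials, with the initial attentional capture fading at a
    group-specific rate (parameters are illustrative, not fitted)."""
    t = np.arange(trials)
    p = 0.8 * np.exp(-decay * t)  # capture probability fades with practice
    return p.mean()

adult_rate = capture_rate(decay=0.05)        # association unlearned quickly
adolescent_rate = capture_rate(decay=0.005)  # reward association persists

print(adolescent_rate > adult_rate)  # True
```

A slower decay constant yields far more distractor selections over the same 240 trials, which is the qualitative pattern the study reports for adolescents.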
“Even though you’ve told them, ‘You have a new target,’ the adolescents can’t get rid of the association they learned before,” Vecera says. “It’s as if that association is much more potent for the adolescent than for the adult.
“If you give the adolescent a reward, it will persist longer,” he adds. “The fact that the reward is gone doesn’t matter. They will act as if the reward is still there.”
Researchers say that inability to readily adjust behavior explains why, for example, a teenager may continue to make inappropriate comments in class long after friends stopped laughing.
In the future, researchers hope to delve into the psychological and neurological aspects of their results.
“Are there certain brain regions or circuits that continue to develop from adolescence to adulthood that play a role in directing attention away from reward stimuli that are not task relevant?” Vaidya asks. “Also, what sorts of life experiences and skills help to improve performance on this task?”

Sometimes, adolescents just can’t resist

Don’t get mad the next time you catch your teenager texting when he promised to be studying.

He simply may not be able to resist.

A University of Iowa study found teenagers are far more sensitive than adults to the immediate effect or reward of their behaviors. The findings may help explain, for example, why the initial rush of texting may be more enticing for adolescents than the long-term payoff of studying.

“The rewards have a strong, perceptional draw and are more enticing to the teenager,” says Jatin Vaidya, a professor of psychiatry at the UI and corresponding author of the study, which appeared online this week in the journal Psychological Science. “Even when a behavior is no longer in a teenager’s best interest to continue, they will because the effect of the reward is still there and lasts much longer in adolescents than in adults.”

For parents, that means limiting distractions so teenagers can make better choices. Take the homework and social media dilemma: At 9 p.m., shut off everything except a computer that has no access to Facebook or Twitter, the researchers advise.

“I’m not saying they shouldn’t be allowed access to technology,” Vaidya says. “But they need help in regulating their attention so they can develop those impulse-control skills.”

In their study, “Value-Driven Attentional Capture in Adolescence,” Vaidya and co-authors Shaun Vecera, a professor of psychology, and Zachary Roper, a graduate student in psychology, note researchers generally believe teenagers are impulsive, make bad decisions, and engage in risky behavior because the frontal lobes of their brains are not fully developed.

But the UI researchers wondered whether something more fundamental was going on with adolescents to trigger behaviors independent of higher-level reasoning.

“We wanted to try to understand the brain’s reward system and how it changes from childhood to adulthood,” says Vaidya, who adds the reward trait in the human brain is much more primitive than decision-making. “We’ve been trying to understand the reward process in adolescence and whether there is more to adolescent behavior than an under-developed frontal lobe,” he adds.

For their study, the researchers recruited 40 adolescents, ages 13 and 16, and 40 adults, ages 20 and 35. First, participants were asked to find a red or green ring hidden within an array of rings on a computer screen. Once identified, they reported whether the white line inside the ring was vertical or horizontal. If they were right, they received a reward between 2 and 10 cents, depending on the color. For some participants, the red ring paid the highest reward; for others, it was the green. None was told which color would pay the most.

After 240 trials, the participants were asked whether they noticed anything about the colors. Most made no association between a color and reward, which researchers say proves the ring exercise didn’t involve high-level, decision-making.

In the next stage, participants showed they had developed an intuitive association when they were asked to find a diamond-shaped target. This time, the red and green rings were used as decoys.

At first, the adolescents and adults selected the color ring that garnered them the highest monetary reward, the goal of the first trial. But in short order, the adults adjusted and selected the diamond. The adolescents did not.

Even after 240 trials, the adolescents were still more apt to pick the colored rings.

“Even though you’ve told them, ‘You have a new target,’ the adolescents can’t get rid of the association they learned before,” Vecera says. “It’s as if that association is much more potent for the adolescent than for the adult.

“If you give the adolescent a reward, it will persist longer,” he adds. “The fact that the reward is gone doesn’t matter. They will act as if the reward is still there.”

Researchers say that inability to readily adjust behavior explains why, for example, a teenager may continue to make inappropriate comments in class long after friends stopped laughing.

In the future, researchers hope to delve into the psychological and neurological aspects of their results.

“Are there certain brain regions or circuits that continue to develop from adolescence to adulthood that play a role in directing attention away from reward stimuli that are not task relevant?” Vaidya asks. “Also, what sorts of life experiences and skills help to improve performance on this task?”

Filed under adolescence attentional capture reward frontal lobe learning psychology neuroscience science

143 notes

Nicotine withdrawal reduces response to rewards across species
Cigarette smoking is a leading cause of preventable death worldwide and is associated with approximately 440,000 deaths in the United States each year, according to the U.S. Centers for Disease Control and Prevention, but nearly 20 percent of the U.S. population continues to smoke cigarettes. While more than half of U.S. smokers try to quit every year, less than 10 percent are able to remain smoke-free, and relapse commonly occurs within 48 hours of smoking cessation. Learning about withdrawal and difficulty of quitting can lead to more effective treatments to help smokers quit.
In a first-of-its-kind study on nicotine addiction, scientists measured a behavior that can be quantified in the same way across species, in this case humans and rats: the response to rewards during nicotine withdrawal. Findings from this study were published online on Sept. 10, 2014 in JAMA Psychiatry.
Response to reward is the brain’s ability to derive and recognize pleasure from natural things such as food, money and sex. The reduced ability to respond to rewards is a behavioral process associated with depression in humans. In prior studies of nicotine withdrawal, investigators used very different behavioral measurements across humans and rats, limiting our understanding of this important brain reward system.
Using a translational behavioral approach, Michele Pergadia, Ph.D., associate professor of clinical biomedical science in the Charles E. Schmidt College of Medicine at Florida Atlantic University, who completed the human study while at Washington University School of Medicine, and Andre Der-Avakian, Ph.D., who completed the rat study at the University of California San Diego (UCSD), together with colleagues including senior collaborators Athina Markou, Ph.D., at UCSD and Diego Pizzagalli, Ph.D., at Harvard Medical School, found that nicotine withdrawal similarly reduced reward responsiveness in human smokers, particularly those with a history of depression, and in nicotine-treated rats.
Pergadia, one of the lead authors, notes that replication of experimental results across species is a major step forward, because it allows for greater generalizability and a more reliable means for identifying behavioral and neurobiological mechanisms that explain the complicated behavior of nicotine withdrawal in humans addicted to tobacco.
"The fact that the effect was similar across species using this translational task not only provides us with a ready framework to proceed with additional research to better understand the mechanisms underlying withdrawal of nicotine, and potentially new treatment development, but it also makes us feel more confident that we are actually studying the same behavior in humans and rats as the studies move forward," said Pergadia.
Pergadia and colleagues plan to pursue future studies that will include a systematic study of depression vulnerability as it relates to reward sensitivity, the course of withdrawal-related reward deficits, including effects on relapse to smoking, and identification of processes in the brain that lead to these behaviors.
Pergadia emphasizes that the ultimate goal of this line of research is to improve treatments that manage nicotine withdrawal-related symptoms and thereby increase success during efforts to quit.
"Many smokers are struggling to quit, and there is a real need to develop new strategies to aid them in this process. Therapies targeting this reward dysfunction during withdrawal may prove to be useful," said Pergadia.

Filed under nicotine nicotine withdrawal reward system tobacco smoking neuroscience science

165 notes

Cells Put Off Protein Production During Times of Stress

Living cells are like miniature factories, responsible for the production of more than 25,000 different proteins with very specific 3-D shapes. And just as an overwhelmed assembly line can begin making mistakes, a stressed cell can end up producing misshapen proteins that are unfolded or misfolded.

(Image caption: A color-enhanced electron micrograph shows the nucleus of a cell (blue) adjacent to the rough endoplasmic reticulum (green), where proteins are manufactured from mRNA templates produced by the nucleus. Credit: University of Edinburgh, via the Wellcome Trust)

Now Duke University researchers in North Carolina and Singapore have shown that the cell recognizes the buildup of these misfolded proteins and responds by reshuffling its workload, much like a stressed out employee might temporarily move papers from an overflowing inbox into a junk drawer. 

The study, which appears Sept. 11, 2014 in Cell, could lend insight into diseases that result from misfolded proteins piling up, such as Alzheimer’s disease, ALS, Huntington’s disease, Parkinson’s disease, and type 2 diabetes.

“We have identified an entirely new mechanism for how the cell responds to stress,” said Christopher V. Nicchitta, Ph.D., a professor of cell biology at Duke University School of Medicine. “Essentially, the cell remodels the organization of its protein production machinery in order to compartmentalize the tasks at hand.” 

The general architecture and workflow of these cellular factories has been understood for decades. First, DNA’s master blueprint, which is locked tightly in the nucleus of each cell, is transcribed into messenger RNA or mRNA. Then this working copy travels to the ribosomes standing on the surface of a larger accordion-shaped structure called the endoplasmic reticulum (ER). The ribosomes on the ER are tiny assembly lines that translate the mRNAs into proteins.

When a cell gets stressed, either by overheating or starvation, its proteins no longer fold properly. These unfolded proteins can set off an alarm, called the unfolded protein response (UPR), to slow down the assembly line and clean up the improperly folded products. Nicchitta wondered if the stress response might also employ other tactics to deal with the problem.

In this study, Nicchitta and his colleagues treated tissue culture cells with a stress-inducing agent called thapsigargin. They then separated the cells’ mRNAs into two groups: those associated with ribosomes on the endoplasmic reticulum, and those associated with free-floating ribosomes in the neighboring fluid-filled space known as the cytosol.

The researchers found that when the cells were stressed, they quickly moved mRNAs from the endoplasmic reticulum to the cytosol. Once the stress was resolved, the mRNAs went back to their spots on the production floor of the endoplasmic reticulum. 

“You can slow down protein production, but sometimes slowing down the workflow is not enough,” Nicchitta said. “You can activate genes to help chew up the misfolded proteins, but sometimes they are accumulating too quickly. Here we have discovered a mechanism that does one better — it effectively puts everything on hold. Once things get back to normal, the mRNAs are released from the holding pattern.” 

Interestingly, the researchers found that shuttling mRNAs between the ER and the cytosol during stress affected only the subset of mRNAs that give rise to secreted proteins, such as hormones, or membrane proteins, such as growth factor receptors: the types of proteins that set off the stress response if they’re misfolded. They aren’t sure yet what this means.

Nicchitta is currently searching for the factors that ultimately determine which mechanisms cells employ during the stress response. He has already pinpointed one promising candidate, and is looking to see how cells respond to stress when that factor is manipulated.

(Source: today.duke.edu)

Filed under neurodegenerative diseases stress endoplasmic reticulum thapsigargin cytoplasm neuroscience science

501 notes

Breast milk is brain food
You are what you eat, the saying goes, and now a study conducted by researchers at UC Santa Barbara and the University of Pittsburgh suggests that the oft-repeated adage applies not just to physical health but to brain power as well.
In a paper published in the early online edition of the journal Prostaglandins, Leukotrienes and Essential Fatty Acids, the researchers compared the fatty acid profiles of breast milk from women in over two dozen countries with how well children from those same countries performed on academic tests.

Their findings show that the amount of omega-3 docosahexaenoic acid (DHA) in a mother’s milk, a fat found primarily in certain fish, nuts and seeds, is the strongest predictor of test performance. It outweighs national income and the number of dollars spent per pupil in schools.
DHA alone accounted for about 20 percent of the differences in test scores among countries, the researchers found.
On the other hand, the amount of omega-6 fat in mother’s milk, which comes from vegetable oils such as corn and soybean, predicts lower test scores. When the amounts of DHA and linoleic acid (LA), the most common omega-6 fat, were considered together, they explained nearly half of the differences in test scores. In countries where mothers’ diets contain more omega-6, the beneficial effects of DHA seem to be reduced.
More omega-3, less omega-6
“Human intelligence has a physical basis in the huge size of our brains — some seven times larger than would be expected for a mammal with our body size,” said Steven Gaulin, UCSB professor of anthropology and co-author of the paper. “Since there is never a free lunch, those big brains need lots of extra building materials — most importantly, they need omega-3 fatty acids, especially DHA. Omega-6 fats, however, undermine the effects of DHA and seem to be bad for brains.”
Both kinds of omega fat must be obtained through diet. But because diets vary from place to place, for their study Gaulin and his co-author, William D. Lassek, M.D., a professor at the University of Pittsburgh’s Graduate School of Public Health and a retired assistant surgeon general, estimated the DHA and LA content — the good fat and the bad fat — in diets in 50 countries by examining published studies of the fatty acid profiles of women’s breast milk.
The profiles are a useful measure for two reasons, according to Gaulin. First, because various kinds of fats interfere with one another in the body, breast milk DHA shows how much of this brain-essential fat survives competition with omega-6. Second, children receive their brain-building fats from their mothers. Breast milk profiles indicate the amount of DHA children in each region receive in the womb, through breastfeeding, and from the local diet available to their mothers and to them after they are weaned.
The academic test results came from the Programme for International Student Assessment (PISA), which administers standardized tests in 58 nations. Gaulin and Lassek averaged the three PISA tests — math, science and reading ability — as their measure of cognitive performance. There were 28 countries for which the researchers found information about both breast milk and test scores.
DHA content: best predictor of math test performance
“Looking at those 28 countries, the DHA content of breast milk was the single best predictor of math test performance,” Gaulin said. The second-best indicator was the amount of omega-6, and its effect is the opposite. “Considering the benefits of omega-3 and the detriment of omega-6, we can get pretty darn close to explaining half the difference in scores between countries,” he added. When DHA and LA are considered together, they are twice as effective at predicting test scores as either is alone, Gaulin said.
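The kind of “variance explained” comparison reported here can be illustrated with a toy regression: fit scores on one predictor, then on two, and compare the R² values. The data below are synthetic (not from the study); only the statistical pattern, that adding a second, opposite-signed predictor can roughly double the variance explained, mirrors the article.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 28  # number of countries in the study

# Synthetic predictors: DHA (positive effect on scores), LA (negative effect)
dha = rng.normal(size=n)
la = rng.normal(size=n)
scores = 1.0 * dha - 1.0 * la + rng.normal(scale=1.0, size=n)

def r_squared(X, y):
    """Ordinary least-squares R^2, with an intercept column added."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

r2_dha = r_squared(dha[:, None], scores)            # DHA alone
r2_both = r_squared(np.column_stack([dha, la]), scores)  # DHA and LA together
# With two independent predictors of similar strength, r2_both is roughly
# double r2_dha, echoing the pattern the authors describe.
```

Note that R² never decreases when a predictor is added, so a comparison like this is only meaningful when the second predictor adds a substantial amount, as LA does in the study.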
Gaulin and Lassek considered two economic factors as well: per capita gross domestic product (a measure of average wealth in each nation) and per student expenditures on education. “Each of these factors helps explain some of the differences between nations in test scores, but the fatty acid profile of the average mother’s milk in a given country is a better predictor of the average cognitive performance in that country than is either of the conventional socioeconomic measures people use,” said Gaulin.
From their analysis, the researchers conclude that both economic wellbeing and diet make a difference in cognitive test performance, and children are best off when they have both factors in their favor. “But if you had to choose one, you should choose the better diet rather than the better economy,” Gaulin said.
The current research follows a study published in 2008 that showed that the children of women who had larger amounts of gluteofemoral fat “depots” performed better on academic tests than those of mothers with less. “At that time we weren’t trying to identify the dietary cause,” explained Gaulin. “We found that this depot that has been evolutionarily elaborated in women is important to building a good brain. We were content at that time to show that as a way of understanding why the female body is as evolutionarily distinctive as it is.”
Now the researchers are looking at diet as the key to brain-building fat, since mothers need to acquire these fats in the first place.
Their results are particularly interesting in 21st-century North America, Gaulin noted, because our current agribusiness-based diets provide very low levels of DHA — among the lowest in the world. Thanks to two heavily government-subsidized crops — corn and soybeans — the average U.S. diet is heavy in the bad omega-6 fatty acids and far too light on the good omega-3s, Gaulin said.
Wrong kind of polyunsaturated fat
“Back in the 1960s, in the middle of the cardiovascular disease epidemic, people got the idea that saturated fats were bad and polyunsaturated fats were good,” he explained. “That’s one reason margarine became so popular. But the polyunsaturated fats that were increased were the ones with omega-6, not omega-3. So our message is that not only is it advisable to increase omega 3 intake, it’s highly advisable to decrease omega-6 — the very fats that in the 1960s and ’70s we were told we should be eating more of.”
Gaulin added that mayonnaise is, in general, the most omega-6-laden food in the average person’s refrigerator. “If you have too much of one (omega-6) and too little of the other (omega-3), you’re going to end up paying a price cognitively,” he said.
The issue is a huge concern for women, Gaulin noted, because “that’s where kids’ brains come from. But it’s important for men as well because they have to take care of the brains their moms gave them.
“Just like a racecar burns up some of its motor oil with every lap, your brain burns up omega-3 and you need to replenish it every day,” he said.
(Image: Stacy Librandi)

Filed under breast milk breastfeeding omega-3 cognitive performance health psychology neuroscience science

91 notes

Sleep disorders widely undiagnosed in individuals with multiple sclerosis

In what may be the largest study of sleep problems among individuals with multiple sclerosis (MS), researchers at UC Davis have found that widely undiagnosed sleep disorders may be at the root of the most common and disabling symptom of the disease: fatigue.

Conducted in over 2,300 individuals in Northern California with multiple sclerosis, the large, population-based study found that, overall, more than 70 percent of participants screened positive for one or more sleep disorders.

The research highlights the importance of diagnosing the root causes of fatigue among individuals with MS, as sleep disorders may affect the course of the disease as well as the overall health and well-being of sufferers, the authors said.

The study “The Underdiagnosis of Sleep Disorders in Patients with Multiple Sclerosis,” is published online today in the Journal of Clinical Sleep Medicine.

“A large percentage of MS subjects in our study are sleep deprived and screened positive for one or more sleep disorders,” said Steven Brass, associate clinical professor and director of the Neurology Sleep Clinical Program and co-medical director of the UC Davis Sleep Medicine Laboratory.

“The vast majority of these sleep disorders are potentially undiagnosed and untreated,” he said. “This work suggests that patients with MS may have sleep disorders requiring independent diagnosis and management.”

Fatigue is the hallmark of multiple sclerosis, an inflammatory disease affecting the white matter and spinal cord of sufferers. MS symptoms include loss of vision, vertigo, weakness and numbness. Patients also may experience psychiatric symptoms. Disease onset generally is between the ages of 20 and 50 years. The cause of MS is not known, although it is believed to be an autoimmune condition.

Sleep disorders are known to occur more frequently among patients with MS. To gauge the extent of sleep disorders, such as obstructive sleep apnea and insomnia, Brass and his colleagues surveyed members of the Northern California Chapter of the National MS Society. Subjects were recruited in 2011.

More than 11,000 surveys were mailed to prospective participants. Of those, 2,375 met criteria and were included in the study. Consistent with the reported epidemiology of multiple sclerosis, the majority (81 percent) were female and Caucasian (88 percent). The mean age of the participants was 54.

Participants were asked to complete a 10-page survey, which included a detailed sleep history and questions assessing obstructive sleep apnea, daytime sleepiness, insomnia and restless legs syndrome.

Most of the participants, nearly 52 percent, said it took them more than half an hour to fall asleep at night, and nearly 11 percent reported taking a medication to fall asleep. Close to 38 percent of participants screened positive for obstructive sleep apnea. Nearly 32 percent had moderate to severe insomnia, and nearly 37 percent had restless legs syndrome.

However, most of the participants had not been diagnosed with a sleep disorder by a physician. While nearly 38 percent screened positive for obstructive sleep apnea, only a little more than 4 percent reported having been diagnosed with the condition by a physician. Similar gaps were seen for the other sleep disorders.
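Taking the article’s rounded figures at face value, the size of the diagnosis gap for sleep apnea follows from simple arithmetic. This is a back-of-envelope sketch using the reported percentages, not a calculation from the paper.

```python
# Rates reported in the article (percent of the 2,375 respondents; a
# physician-diagnosis figure is given only for sleep apnea).
screened_osa = 38   # "close to 38 percent" screened positive
diagnosed_osa = 4   # "a little more than 4 percent" had a diagnosis

# Rough share of positive screens that lacked a physician diagnosis
undiagnosed_share = (screened_osa - diagnosed_osa) / screened_osa
print(f"~{undiagnosed_share:.0%} of positive screens lacked a diagnosis")
```

By this estimate, roughly nine in ten respondents who screened positive for sleep apnea had never been diagnosed, which is the "hidden epidemic" the authors describe.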

“This study shows that sleep disorder frequency, sleep patterns and complaints of excessive daytime sleepiness suggest that sleep problems may be a hidden epidemic in the MS population, separate from MS fatigue,” Brass said.

(Source: ucdmc.ucdavis.edu)

Filed under MS sleep sleep problems daytime sleepiness sleep apnea neuroscience science

82 notes

Brain inflammation dramatically disrupts memory retrieval networks
Brain inflammation can rapidly disrupt our ability to retrieve complex memories of similar but distinct experiences, according to UC Irvine neuroscientists Jennifer Czerniawski and John Guzowski.
Their study, which appears today in The Journal of Neuroscience, specifically identifies how immune system signaling molecules called cytokines impair communication among neurons in the hippocampus, an area of the brain critical for discrimination memory. The findings offer insight into why cognitive deficits occur in people undergoing chemotherapy and those with autoimmune or neurodegenerative diseases.
Moreover, since cytokines are elevated in the brain in each of these conditions, the work suggests potential therapeutic targets to alleviate memory problems in these patients.
“Our research provides the first link among immune system activation, altered neural circuit function and impaired discrimination memory,” said Guzowski, the James L. McGaugh Chair in the Neurobiology of Learning & Memory. “The implications may be beneficial for those who have chronic diseases, such as multiple sclerosis, in which memory loss occurs and even for cancer patients.”
What Guzowski found interesting is that increased cytokine levels in the hippocampus affected only complex discrimination memory, the type that lets us differentiate among generally similar experiences, such as what we did at work or ate at dinner. A simpler form of memory processed by the hippocampus, akin to remembering where you work, was not altered by brain inflammation.
In the study, Czerniawski, a UCI postdoctoral scholar, exposed rats to two similar but discernable environments over several days. They received a mild foot shock daily in one, making them apprehensive about entering that specific site. Once the rodents showed that they had learned the difference between the two environments, some were given a low dose of a bacterial agent to induce a neuroinflammatory response, leading to cytokine release in the brain. Those animals were then no longer able to distinguish between the two environments.
Afterward, the researchers explored the activity patterns of neurons – the primary cell type for information processing – in the rats’ hippocampi using a gene-based cellular imaging method developed in the Guzowski lab. In the rodents that received the bacterial agent (and exhibited memory deterioration), the networks of neurons activated in the two environments were very similar, unlike those in the animals not given the agent (whose memories remained strong). This finding suggests that cytokines impaired recall by disrupting the function of these specific neuron circuits in the hippocampus.
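One simple way to picture the comparison the imaging data supports (this is an assumed, simplified sketch, not the Guzowski lab's actual method) is to treat each environment's active-neuron ensemble as a set of cell IDs and measure their overlap. Two distinct memories should recruit partially distinct ensembles (lower overlap); if inflammation collapses the distinction, the two ensembles look nearly identical (higher overlap):

```python
def ensemble_overlap(cells_a: set, cells_b: set) -> float:
    """Jaccard similarity between two active-cell ensembles:
    |A ∩ B| / |A ∪ B|, ranging from 0 (disjoint) to 1 (identical)."""
    if not cells_a and not cells_b:
        return 0.0
    return len(cells_a & cells_b) / len(cells_a | cells_b)

# Hypothetical cell-ID sets for the two environments (illustrative only):
control = ensemble_overlap({1, 2, 3, 4}, {3, 4, 5, 6})   # partly distinct
inflamed = ensemble_overlap({1, 2, 3, 4}, {1, 2, 3, 5})  # largely the same
print(control, inflamed)
```

Here the control ensembles overlap by about a third, while the "inflamed" pair overlaps far more, mirroring the finding that treated animals' networks responded to the two environments as if they were one.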
“The cytokines caused the neural network to react as if no learning had taken place,” said Guzowski, associate professor of neurobiology & behavior. “The neural circuit activity was back to the pattern seen before learning.”
The work may also shed light on a chemotherapy-related mental phenomenon known as “chemo brain,” in which cancer patients find it difficult to efficiently process information. UCI neuro-oncologists have found that chemotherapeutic agents destroy stem cells in the brain that would have become neurons for creating and storing memories.
Dr. Daniela Bota, who co-authored that study, is currently collaborating with Guzowski’s research group to see if brain inflammation may be another of the underlying causes of “chemo brain” symptoms.
She said they’re looking for a simple intervention, such as an anti-inflammatory or steroid drug, that could lessen post-chemo inflammation. Bota will test this approach on patients, pending the outcome of animal studies.
“It will be interesting to see if limiting neuroinflammation will give cancer patients fewer or no problems,” she said. “It’s a wonderful idea, and it presents a new method to limit brain cell damage, improving quality of life. This is a great example of basic science and clinical ideas coming together to benefit patients.”
