Neuroscience

Articles and news from the latest research reports.

Posts tagged cortisol

Brain Imaging Research Pinpoints Neurobiological Basis for Key Symptoms Associated with Post-Traumatic Stress Disorder Like Listlessness and Emotional Detachment in Trauma Victims
In a novel brain-imaging study among trauma victims, researchers at NYU Langone Medical Center have linked an opioid receptor in the brain  — associated with emotions — to a narrow cluster of trauma symptoms, including sadness, emotional detachment and listlessness. The study, published online today in the journal JAMA Psychiatry, holds important implications for targeted, personalized treatment of post-traumatic stress disorder, or PTSD, a psychiatric condition affecting more than 8 million Americans that can cause a wide range of debilitating psychiatric symptoms.
“Our study points toward a more personalized treatment approach for people with a specific symptom profile that’s been linked to a particular neurobiological abnormality,” says lead author Alexander Neumeister, MD, director of the molecular imaging program in the Departments of Psychiatry and Radiology at NYU School of Medicine, and Co-Director of NYU Langone’s Steven and Alexandra Cohen Veterans Center for the Study of Post-Traumatic Stress Disorder and Traumatic Brain Injury. “Understanding more about where and how symptoms of PTSD manifest in the brain is a critical part of research efforts to develop more effective medications and treatment modalities.”
The new study confirms a growing body of evidence linking a particular set of symptoms to specific brain circuits and chemicals, and bolsters a shift within the field of psychiatry away from “one-size-fits-all treatments” and toward more individualized medication regimens that target highly specific neurobiological components. “We know from previous clinical trials that antidepressants, for example, do not work well for dysphoria and the numbing symptoms often found in PTSD,” Dr. Neumeister added. “Currently available antidepressants are just not linked specifically enough to the neurobiological basis of these symptoms in PTSD. Going forward, our study will help pave the way toward development of better options.”
“People with cancer have a variety of different treatment options available based on the type of cancer that they have,” adds Dr. Neumeister. “We aim to do the same thing in psychiatry. We’re deconstructing PTSD symptoms, linking them to different brain dysfunction, and then developing treatments that target those symptoms. It’s really a revolutionary step forward that has been supported by the National Institute of Mental Health (NIMH) over the past few years in their Research Domain Criteria Project.”
The study, funded by the National Institute of Mental Health (NIMH), compared the brain scans of healthy volunteers with those of clinically diagnosed trauma victims with PTSD, major depression, and generalized anxiety disorder whose symptoms ranged from emotional detachment to isolation. Participants received a harmless radioactive tracer that binds to and illuminates a class of opioid receptors, known as kappa, when exposed to high-resolution positron emission tomography (PET). Kappa opioid receptors bind a potent natural opioid known as dynorphin, which is released by the body during times of stress to help relieve dysphoria or numbing.
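PET studies of this kind typically report receptor availability as a binding potential: tracer uptake in a receptor-rich target region relative to a reference region with negligible specific binding. A minimal sketch of that standard quantity, with invented uptake values (the article does not specify the study's actual quantification method or regions):

```python
def binding_potential(target_uptake, reference_uptake):
    """Nondisplaceable binding potential: BP_ND = (C_target / C_reference) - 1."""
    return target_uptake / reference_uptake - 1.0

# Invented regional tracer concentrations (arbitrary units); the study's
# actual regions and values are not given in the article.
amygdala_uptake = 2.4
reference_uptake = 1.5  # reference region with negligible kappa receptors
print(f"BP_ND = {binding_potential(amygdala_uptake, reference_uptake):.2f}")
```

Lower binding potential corresponds to fewer receptors available for dynorphin to dock with, which is the quantity the dysphoria findings below turn on.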
Chronic exposure to stress, as occurs in PTSD, taxes kappa opioid receptors, causing them to retract inside cells and leaving dynorphin without a place to dock. As a result, patients can experience dysphoria, characterized by feelings of hopelessness, detachment and emotional unease.
Results showed that fewer available kappa opioid receptors in the brain regions believed to govern emotions were associated with more intense feelings of dysphoria, but not feelings of anxious arousal. The findings confirm previous studies in animals linking the opioid-receptor system expressed in these specific brain regions to symptoms of dysphoria. The study also found an association between lower levels of cortisol, a stress hormone, and unavailable kappa opioid receptors, suggesting a new role for cortisol as a biomarker for certain types of PTSD symptoms.
“This is the first brain-imaging study to explore any psychiatric condition using a protein that binds to the kappa opioid receptor system,” notes Dr. Neumeister, who says the data support clinical trials under way at NYU Langone and other institutions of new medications that target kappa opioid receptors and other brain systems that can be linked to specific symptoms in trauma survivors. Such medications could be widely available for the treatment of PTSD in the future if ongoing clinical trials yield encouraging results.
(Image: Alamy)

Filed under PTSD amygdala kappa opioid receptors cortisol neuroimaging neuroscience science

Maturing brain flips function of amygdala in regulating stress hormones

In contrast to evidence that the amygdala stimulates stress responses in adults, researchers at Yerkes National Primate Research Center, Emory University have found that the amygdala has an inhibitory effect on stress hormones during the early development of nonhuman primates.

The results are published this week in Journal of Neuroscience.

The amygdala is a region of the brain known to be important for responses to threatening situations and learning about threats. Alterations in the amygdala have been reported in psychiatric disorders such as depression, anxiety disorders like PTSD, schizophrenia and autism spectrum disorder. However, much of what is known about the amygdala comes from research on adults.

"Our findings fit into an emerging theme in neuroscience research: that during childhood, there is a switch in amygdala function and connectivity with other brain regions, particularly the prefrontal cortex,” says Mar Sanchez, PhD, neuroscience researcher at Yerkes and associate professor of psychiatry and behavioral sciences at Emory University School of Medicine. The first author of the paper is postdoctoral fellow Jessica Raper, PhD.

The findings are part of a larger longitudinal study at Yerkes National Primate Research Center, examining how amygdala damage within the first month of life affects the development of social and emotional behaviors and neuroendocrine systems in rhesus monkeys from infancy through adulthood. The laboratories of Sanchez and Yerkes researchers Jocelyne Bachevalier, PhD and Kim Wallen, PhD are collaborating on this project.

Previous investigations at Yerkes found that as infants, monkeys with amygdala damage showed higher levels of the stress hormone cortisol. This surprising result contrasted with previous research on adults, which showed that amygdala damage results in lower levels of cortisol.

The team hypothesized that damage to the amygdala generated changes in the HPA axis: a network of endocrine interactions between the hypothalamus within the brain, the pituitary and the adrenal glands, critical for reactions to stress.
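The HPA axis the team refers to can be sketched as a simple signaling chain. The pituitary's intermediate hormone, ACTH, is not named in the article but is the standard link in this cascade:

```python
# Minimal sketch of the HPA-axis signaling chain: each structure
# releases a hormone that drives the next stage. ACTH is the standard
# intermediate, though it is not named in the article itself.
hpa_axis = [
    ("hypothalamus", "CRF"),         # corticotropin-releasing factor
    ("pituitary gland", "ACTH"),     # adrenocorticotropic hormone
    ("adrenal glands", "cortisol"),  # corticosterone in rodents
]

for organ, hormone in hpa_axis:
    print(f"{organ} releases {hormone}")
```

Damage upstream of this chain, as in the amygdala-lesioned infants, can therefore shift both CRF and cortisol levels at once.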

"We wanted to examine whether the alterations in stress hormones seen during infancy persisted, and what brain changes were responsible for them," Sanchez says. "In studies of adults, the amygdala and its connections are fully formed at the time of the manipulation, but here neither the amygdala or its connections were fully matured when the damage occurred."

In the current paper, the authors demonstrated that in contrast with adult animals with amygdala damage, juvenile monkeys with early amygdala damage had increased levels of cortisol in the blood, compared to controls. In their cerebrospinal fluid, they also had elevated levels of corticotropin releasing factor (CRF), the neuropeptide that initiates the stress response in the brain. Elevated CRF and cortisol are linked to anxiety and emotional dysregulation in patients with mood disorders.

Despite the increased levels of stress hormones, monkeys with early amygdala damage exhibit a blunted emotional reactivity to threats, including decreased fear and aggression, and reduced anxiety in response to stress. Still, monkeys with neonatal amygdala damage remain competent in interacting with others in their large social groups. These findings are consistent with reports of human patients with damage to the amygdala, Raper says.

"We speculate that the rich social environment provided to the monkeys promotes compensatory mechanisms in cortical regions implicated in the regulation of social behavior," she says. "But neonatal amygdala damage seems more detrimental for the development of stress neuroendocrine circuits in other areas of the brain."

The investigators plan to follow the animals into adulthood to investigate the long-term effects of early amygdala damage on stress hormones, behavior and physiological systems possibly affected by chronically high cortisol levels, such as immune, growth and reproductive functions.

(Source: news.emory.edu)

Filed under amygdala stress cortisol prefrontal cortex HPA axis neuroscience science

Researcher shows how stress hormones promote brain’s building of negative memories
When a person experiences a devastating loss or tragic event, why does every detail seem burned into memory whereas a host of positive experiences simply fade away?
It’s a bit more complicated than scientists originally thought, according to a study recently published in the journal Neuroscience by ASU researcher Sabrina Segal.
When people experience a traumatic event, the body releases two major stress hormones: norepinephrine and cortisol. Norepinephrine boosts heart rate and controls the fight-or-flight response, commonly rising when individuals feel threatened or experience highly emotional reactions. It is chemically similar to the hormone epinephrine – better known as adrenaline.
In the brain, norepinephrine in turn functions as a powerful neurotransmitter or chemical messenger that can enhance memory.
Research on cortisol has demonstrated that this hormone can also have a powerful effect on strengthening memories. However, studies in humans up until now have been inconclusive – with cortisol sometimes enhancing memory, while at other times having no effect.
A key factor in whether cortisol has an effect on strengthening certain memories may rely on activation of norepinephrine during learning, a finding previously reported in studies with rats.
In her study, Segal, an assistant research professor at the Institute for Interdisciplinary Salivary Bioscience Research at ASU, and her colleagues at the University of California-Irvine showed that human memory enhancement functions in a similar way.
Conducted in the laboratory of Larry Cahill at U.C. Irvine, Segal’s study included 39 women who viewed 144 images from the International Affective Picture Set. This set is a standardized picture set used by researchers to elicit a range of responses, from neutral to strong emotional reactions, upon view.
Segal and her colleagues gave each of the study’s subjects either a dose of hydrocortisone – to simulate stress – or a placebo just prior to viewing the picture set. Each woman then rated her feelings at the time she was viewing the image, in addition to giving saliva samples before and after. One week later, a surprise recall test was administered.
What Segal’s team found was that “negative experiences are more readily remembered when an event is traumatic enough to release cortisol after the event, and only if norepinephrine is released during or shortly after the event.”
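In statistical terms, this conditional finding is an interaction: cortisol's effect on recall depends on whether norepinephrine is also elevated. A minimal sketch with invented recall scores (none of these numbers come from the study):

```python
# Hypothetical mean recall scores (proportion of negative images
# remembered) in a 2x2 design: cortisol (placebo vs. hydrocortisone)
# crossed with norepinephrine arousal (low vs. high).
# All values are invented for illustration only.
recall = {
    ("placebo", "low_NE"): 0.40,
    ("placebo", "high_NE"): 0.45,
    ("cortisol", "low_NE"): 0.42,   # cortisol alone: little effect
    ("cortisol", "high_NE"): 0.65,  # cortisol plus norepinephrine: boost
}

# The interaction contrast: does cortisol's effect depend on NE level?
cortisol_effect_high_NE = recall[("cortisol", "high_NE")] - recall[("placebo", "high_NE")]
cortisol_effect_low_NE = recall[("cortisol", "low_NE")] - recall[("placebo", "low_NE")]
interaction = cortisol_effect_high_NE - cortisol_effect_low_NE

print(f"Cortisol effect with high NE: {cortisol_effect_high_NE:+.2f}")
print(f"Cortisol effect with low NE:  {cortisol_effect_low_NE:+.2f}")
print(f"Interaction contrast:         {interaction:+.2f}")
```

A large interaction contrast with a near-zero low-NE effect is the pattern the quoted finding describes: cortisol strengthens negative memories only in the presence of norepinephrine.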
“This study provides a key component to better understanding how traumatic memories may be strengthened in women,” Segal added, “because it suggests that if we can lower norepinephrine levels immediately following a traumatic event, we may be able to prevent this memory enhancing mechanism from occurring, regardless of how much cortisol is released following a traumatic event.”
Further studies are needed to explore the extent to which the relationship between these two stress hormones differs between males and females, particularly because women are twice as likely as men to develop stress- and trauma-related disorders that affect memory, such as post-traumatic stress disorder (PTSD). In the meantime, the team’s findings are a first step toward a better understanding of the neurobiological mechanisms that underlie traumatic disorders such as PTSD.
(Image: Wikimedia Commons)

Filed under stress negative emotions PTSD memory learning norepinephrine cortisol neuroscience science

Hormones affect voting behavior
Researchers from the University of Nebraska at Omaha (UNO), the University of Nebraska-Lincoln (UNL) and Rice University have released a study that shows hormone levels can affect voter turnout.
As witnessed by recent voter turnout in primary elections, participation in U.S. national elections is low relative to other Western democracies. In fact, voter turnout in biennial national elections ranges from only 40 to 60 percent of eligible voters.
The study, published June 22 in Physiology and Behavior, reports that while participation in electoral politics is affected by a host of social and demographic variables, there are also biological factors that may play a role, as well. Specifically, the paper points to low levels of the stress hormone cortisol as a strong predictor of actual voting behavior, determined via voting records maintained by the Secretary of State.
"Politics and political participation is an inherently stressful activity," explained the paper’s lead author, Jeff French, Varner Professor of Psychology and Biology and director of UNO’s neuroscience program. "It would logically follow that those individuals with low thresholds for stress might avoid engaging in that activity and our study confirmed that hypothesis."
Additional authors on the paper are Adam Guck and Andrew K. Birnie from UNO’s Department of Psychology; Kevin B. Smith and John R. Hibbing from UNL’s Department of Political Science; and John R. Alford from the Department of Political Science at Rice University.
The study is part of a larger body of research exploring connections between biology and political orientation, led by Smith and Hibbing. Previous studies have involved twins, eye-tracking equipment and skin conductance in their efforts to identify physical and genetic links to political beliefs.
"It’s one more piece of solid evidence that there are biological markers for political attitudes and behavior," said Smith. "It’s long been known that cortisol levels are associated with your willingness to interact socially – that’s something fairly well established in the research literature. The big contribution here is that nobody really looked at politics and voting behaviors before."
"This research shows that cortisol is related to a willingness to participate in politics," he said.
To reach their conclusion, the researchers collected saliva from more than 100 participants who identified themselves as highly conservative, highly liberal, or uninterested in politics altogether, and analyzed the cortisol levels in the samples.
Cortisol was measured in saliva collected from the participants before and during activities designed to raise and lower stress. These data were then compared against the participants’ earlier responses regarding involvement in political activities (voting and nonvoting) and religious participation.
"Not only did the study show, expectedly, that high-stress activities led to higher levels of cortisol production, but that political participation was significantly correlated with low baseline levels of cortisol," French explained. "Participation in another group-oriented activity, specifically religious participation, was not as strongly associated with cortisol levels. Involvement in nonvoting political activities, such as volunteering for a campaign, financial political contributions, or correspondence with elected officials, was not predicted by levels of stress hormones."
According to the study, the only other factor that was predictive of voting behavior was age; older adults were likely to have voted more often than younger adults. Research from other groups has also pointed to education, income, and race as important predictors of voting behavior.
In explaining why elevated cortisol could be linked with lower rates of participation in elections, French cited previous experiments in which high levels of afternoon cortisol are linked to major depressive disorder, social withdrawal, separation anxiety and enhanced memory for fearful stimuli.
"High afternoon cortisol is reflective of a variety of social, cognitive, and emotional processes, and may also influence a trait as complex as voting behavior," French suggested.
"The key takeaway from this research, I believe, is that while social scientists have spent decades trying to predict voting behavior based on demographic information, there is much to be learned from looking at biological differences as well," he said. "Many factors influence the decision to participate in the most important political activity in our democracy, and our study demonstrates that stress physiology is an important biological factor in this decision. Our experiment helps to more fully explain why some people engage in electoral politics and others do not."

Filed under stress cortisol voting behavior psychology neuroscience science

Stress hormone linked to short-term memory loss as we age
A new study at the University of Iowa reports a potential link between stress hormones and short-term memory loss in older adults.
The study, published in the Journal of Neuroscience, reveals that having high levels of cortisol—a natural hormone in our body whose levels surge when we are stressed—can lead to memory lapses as we age.
Short-term increases in cortisol are critical for survival. They promote coping and help us respond to life’s challenges by making us more alert and able to think on our feet. But abnormally high or prolonged spikes in cortisol—as happens when we are dealing with long-term stress—can have negative consequences that numerous studies have shown to include digestive problems, anxiety, weight gain, and high blood pressure.
In this study, the UI researchers linked elevated amounts of cortisol to the gradual loss of synapses in the prefrontal cortex, the region of the brain that houses short-term memory. Synapses are the connections that help us process, store, and recall information. And when we get older, repeated and long-term exposure to cortisol can cause them to shrink and disappear.
“Stress hormones are one mechanism that we believe leads to weathering of the brain,” says Jason Radley, assistant professor in psychology at the UI and corresponding author on the paper. Like a rock on the shoreline, after years and years it will eventually break down and disappear.
While previous studies have shown cortisol to produce similar effects in other regions of the aging brain, this was the first study to examine its impact on the prefrontal cortex.
And although preliminary, the findings raise the possibility that short-term memory decline in aging adults may be slowed or prevented by treatments that decrease levels of cortisol in susceptible individuals, says Radley. That could mean treating people who have naturally high levels of cortisol—such as those who are depressed—or those who experience repeated, long-term stress due to traumatic life events like the death of a loved one.
According to Radley and Rachel Anderson, the paper’s lead author and a second-year graduate student in psychology at the UI, short-term memory lapses related to cortisol start around age 65. That’s about the equivalent of the 21-month-old rats the pair studied to make their discovery.
The UI scientists compared the elderly rats to four-month-old rats, which are roughly the same age as a 20-year-old person. The young and elderly groups were then separated further according to whether the rats had naturally high or naturally low levels of corticosterone—the hormone in rats comparable to cortisol in humans.
The researchers subsequently placed the rats in a T-shaped maze that required them to use their short-term memory. In order to receive a treat, they needed to recall which direction they had turned at the top of the T just 30, 60, or 120 seconds ago and then turn the opposite way each time they ran the maze.
Though memory declined across all groups as the time rats waited before running the maze again increased, older rats with high corticosterone levels consistently performed the worst. They chose the correct direction only 58 percent of the time, compared to their older peers with low corticosterone levels who chose it 80 percent of the time.
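The scoring rule in this delayed-alternation task is simple enough to sketch in code. The snippet below is purely illustrative (it is not the study's analysis code; the function name and example choice sequences are invented): it scores a run of T-maze arm entries by counting the trials on which the animal turned opposite to its previous choice.

```python
def score_alternation(choices):
    """Fraction of trials on which the animal alternated from its prior turn.

    choices: sequence of 'L'/'R' arm entries; the first entry only sets the
    reference and is not itself scored.
    """
    if len(choices) < 2:
        return 0.0
    correct = sum(1 for prev, cur in zip(choices, choices[1:]) if prev != cur)
    return correct / (len(choices) - 1)

# Hypothetical runs: an animal that alternates perfectly vs. one that
# often perseverates (keeps re-entering the same arm).
good = score_alternation("LRLRLRLRLR")  # alternates on every trial -> 1.0
poor = score_alternation("LLRLLLRRLL")  # alternates on 4 of 9 trials
```

On this scale, the high-corticosterone older rats described above would score around 0.58 and their low-corticosterone peers around 0.80.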
When researchers took tissue samples from the rats’ prefrontal cortexes and examined them under a microscope, they found the poor performers had smaller and 20 percent fewer synapses than all other groups, indicating memory loss.
In contrast, older rats with low corticosterone levels showed little memory loss and ran the maze nearly as well as the younger rats, who were not affected by any level of corticosterone—low or high.
Still, researchers say it’s important to remember that stress hormones are only one of a host of factors when it comes to mental decline and memory loss as we age.

Filed under stress memory cortisol STM prefrontal cortex synapses aging neuroscience science

180 notes

Children with autism have elevated levels of steroid hormones in the womb 
Children who later develop autism are exposed to elevated levels of steroid hormones (for example testosterone, progesterone and cortisol) in the womb, according to scientists from the University of Cambridge and the Statens Serum Institute in Copenhagen, Denmark. The finding may help explain why autism is more common in males than females. However, the researchers caution it should not be used to screen for the condition.
The team of researchers, led by Professor Simon Baron-Cohen and Dr Michael Lombardo in Cambridge and Professor Bent Nørgaard-Pedersen in Denmark, utilized approximately 19,500 amniotic fluid samples stored in a Danish biobank from individuals born between 1993 and 1999. Amniotic fluid surrounds the baby in the womb during pregnancy and is collected when some women choose to have an amniocentesis around 15-16 weeks of pregnancy. This coincides with a critical period for early brain development and sexual differentiation, and thus gives scientists access to this important window in fetal development. The researchers identified amniotic fluid samples from 128 males later diagnosed with an autism spectrum condition and matched these up with information from a central register of all psychiatric diagnoses in Denmark.
Within the amniotic fluid the researchers looked at four key ‘sex steroid’ hormones that are each synthesized, step-by-step from the preceding one*. They also tested the steroid hormone cortisol that lies outside this pathway. The researchers found that levels of all steroid hormones were highly associated with each other and most importantly, that the autism group on average had higher levels of all steroid hormones, compared to a typically developing male comparison group. The results of the study, which was funded by the Medical Research Council, are published today in the journal Molecular Psychiatry.
Professor Baron-Cohen said: “This is one of the earliest non-genetic biomarkers that has been identified in children who go on to develop autism. We previously knew that elevated prenatal testosterone is associated with slower social and language development, better attention to detail, and more autistic traits. Now, for the first time, we have also shown that these steroid hormones are elevated in children clinically diagnosed with autism. Because some of these hormones are produced in much higher quantities in males than in females, this may help us explain why autism is more common in males.”
He added: “These new results are particularly striking because they are found across all the subgroups on the autism spectrum, for the first time uniting those with Asperger Syndrome, classic autism, or Pervasive Developmental Disorder Not-Otherwise-Specified. We now want to test if the same finding is found in females with autism.”
Dr Michael Lombardo said: “This result potentially has very important implications about the early biological mechanisms that alter brain development in autism and also pinpoints an important window in fetal development when such mechanisms exert their effects.”
Steroid hormones are particularly important because they exert influence on the process of how instructions in the genetic code are translated into building proteins. The researchers believe that altering this process during periods when the building blocks for the brain are being laid down may be particularly important in explaining how genetic risk factors for autism get expressed.
Dr Lombardo adds: “Our discovery here meshes nicely with other recent findings that highlight the prenatal period around 15 weeks gestation as a key period when important genetic risk mechanisms for autism are working together to be expressed in the developing brain.”
Professor Baron-Cohen said: “These results should not be taken as a reason to jump to steroid hormone blockers as a treatment as this could have unwanted side effects and may have little to no effect in changing the potentially permanent effects that fetal steroid hormones exert during the early foundational stages of brain development.”
He cautioned further: “Nor should these results be taken as a promising prenatal screening test. There is considerable overlap between the groups and our findings showed differences found at an average group level, rather than at the level of accurately predicting diagnosis for individuals. The value of the new results lies in identifying key biological mechanisms during fetal development that could play important roles in atypical brain development in autism.”
*Within the amniotic fluid the researchers looked at 4 key ‘sex steroid’ hormones that are each synthesized, step-by-step from the preceding one, in the ‘Δ4 sex steroid’ pathway: progesterone, 17α-hydroxy-progesterone, androstenedione and testosterone.

Filed under autism steroid hormones cortisol testosterone psychology neuroscience science

203 notes

Migraine Attacks Increase Following Stress
Migraine sufferers who experienced reduced stress from one day to the next are at significantly increased risk of migraine onset on the subsequent day, according to a new study conducted by researchers at the Montefiore Headache Center and Albert Einstein College of Medicine at Yeshiva University. Stress has long been believed to be a common headache trigger. In this study, researchers found that relaxation following heightened stress was an even more significant trigger for migraine attacks. Findings may aid in recommending preventive treatments and behavioral interventions. The study was published online today in Neurology®, the medical journal of the American Academy of Neurology.
Migraine is a chronic condition that affects approximately 38 million Americans. To examine headache triggers, investigators at the Montefiore Headache Center and Einstein conducted a three-month electronic daily diary study that captured 2,011 diary records and 110 eligible migraine attacks in 17 participants. The study compared levels of stress and reduction in stress as predictors of headache.
“This study demonstrates a striking association between reduction in perceived stress and the occurrence of migraine headaches,” said study lead author Richard Lipton, M.D., director, Montefiore Headache Center, professor and vice chair of neurology and the Edwin S. Lowe Chair in Neurology, Einstein. “Results were strongest during the first six hours where decline in stress was associated with a nearly five-fold increased risk of migraine onset. The hormone cortisol, which rises during times of stress and reduces pain, may contribute to the triggering of headache during periods of relaxation.”
Data were collected using a custom-programmed electronic diary. Each day participants recorded information about migraine attacks, two types of stress ratings and common migraine triggers, such as hours of sleep, certain foods, drinks and alcohol consumed, and menstrual cycle. They also recorded their mood each day, including feeling happy, sad, relaxed, nervous, lively and bored.
“This study highlights the importance of stress management and healthy lifestyle habits for people who live with migraine,” said Dawn Buse, Ph.D., director, Behavioral Medicine, Montefiore Headache Center, associate professor, Clinical Neurology, Einstein, and study co-author. “It is important for people to be aware of rising stress levels and attempt to relax during periods of stress rather than allowing a major build-up to occur. This could include exercising or attending a yoga class, or may be as simple as taking a walk or focusing on one’s breath for a few minutes.”

Filed under migraines headaches stress stress management cortisol neuroscience science

558 notes

Your stress is my stress
Stress is contagious. Observing another person in a stressful situation can be enough to make our own bodies release the stress hormone cortisol. This is the conclusion reached by scientists involved in a large-scale cooperation project between the departments of Tania Singer at the Max Planck Institute for Cognitive and Brain Sciences in Leipzig and Clemens Kirschbaum at the Technische Universität Dresden. Empathic stress arose primarily when the observer and stressed individual were partners in a couple relationship and the stressful situation could be directly observed through a one-way mirror. However, even the observation of stressed strangers via video transmission was enough to put some people on red alert. In our stress-ridden society, empathic stress is a phenomenon that should not be ignored by the health care system.
Stress is a major health threat in today’s society. It causes a range of psychological problems like burnout, depression and anxiety. Even those who lead relatively relaxed lives constantly come into contact with stressed individuals. Whether at work or on television: someone is always experiencing stress, and this stress can affect the general environment in a physiologically quantifiable way through increased concentrations of the stress hormone cortisol.
“The fact that we could actually measure this empathic stress in the form of a significant hormone release was astonishing,” says Veronika Engert, one of the study’s first authors. This is particularly striking considering that many studies have difficulty inducing firsthand stress in the first place. The authors found that empathic stress reactions could be independent of (“vicarious stress”) or proportional to (“stress resonance”) the stress reactions of the actively stressed individuals. “There must be a transmission mechanism via which the target’s state can elicit a similar state in the observer, down to the level of a hormonal stress response.”
During the stress test, the test subjects had to struggle with difficult mental arithmetic tasks and interviews, while two supposed behavioural analysts assessed their performance. Only five percent of the directly stressed test subjects managed to remain calm; the others displayed a physiologically significant increase in their cortisol levels.
In total, 26 percent of observers who were not directly exposed to any stress whatsoever also showed a significant increase in cortisol. The effect was particularly strong when observer and stressed individual were partners in a couple relationship (40 percent). However, even when watching a complete stranger, the stress was transmitted to ten percent of the observers. Accordingly, emotional closeness is a facilitator but not a necessary condition for the occurrence of empathic stress.
When the observers watched the events directly through a one-way mirror, 30 percent of them experienced a stress response. However, even presenting the stress test only virtually via video transmission was sufficient to significantly increase the cortisol levels of 24 percent of the observers. “This means that even television programmes depicting the suffering of other people can transmit that stress to viewers,” says Engert. “Stress has enormous contagion potential.”
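As a back-of-the-envelope illustration of how such response rates are tallied, here is a short Python sketch. The data below are invented to match the percentages reported above; they are not the study's records, and the field name `cortisol_rise` is assumed.

```python
def response_rate(observers):
    """Percent of observers showing a significant cortisol increase."""
    responders = sum(1 for o in observers if o["cortisol_rise"])
    return 100.0 * responders / len(observers)

# Hypothetical records matching the article's proportions: 40% of partners
# and 10% of strangers showed an empathic cortisol response.
partners = [{"cortisol_rise": i < 4} for i in range(10)]
strangers = [{"cortisol_rise": i < 1} for i in range(10)]
print(response_rate(partners), response_rate(strangers))  # 40.0 10.0
```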
Stress becomes a problem primarily when it is chronic. “A hormonal stress response has an evolutionary purpose, of course. When you are exposed to danger, you want your body to respond with an increase in cortisol,” explains Engert. “However, permanently elevated cortisol levels are not good. They have a negative impact on the immune system and neurotoxic properties in the long term.” Thus, individuals working as caregivers, or the family members of chronically stressed individuals, are at increased risk of suffering the potentially harmful consequences of empathic stress. Anyone who is confronted with the suffering and stress of another person, particularly when sustained, has a higher risk of being affected by it themselves.
The results of the study also debunked a common prejudice: men and women actually experience empathic stress reactions with equal frequency. “In surveys, however, women tend to assess themselves as more empathic than men assess themselves. This self-perception does not seem to hold when probed with implicit measures.”
Future studies are intended to reveal exactly how the stress is transmitted and what can be done to reduce its potentially negative influence on society.

Filed under empathy cortisol stress empathic stress HPA axis neuroscience psychology science

163 notes

Fight Memory Loss with a Smile (or Chuckle) 
Too much stress can take its toll on the body, mood, and mind. As we age it can contribute to a number of health problems, including high blood pressure, diabetes, and heart disease. Recent research has shown that the stress hormone cortisol damages certain neurons in the brain and can negatively affect memory and learning ability in the elderly. Researchers at Loma Linda University have delved deeper into cortisol’s relationship to memory and whether humor and laughter—a well-known stress reliever—can help lessen the damage that cortisol can cause. Their findings were presented on Sunday, April 27, at the Experimental Biology meeting.
Gurinder Singh Bains et al. showed a 20-minute laugh-inducing funny video to a group of healthy elderly individuals and a group of elderly people with diabetes. The groups were then asked to complete a memory assessment that measured their learning, recall, and sight recognition. Their performance was compared to a control group of elderly people who also completed the memory assessment but were not shown a funny video. Cortisol concentrations for both groups were also recorded at the beginning and end of the experiment.
The research team found a significant decrease in cortisol concentrations among both groups who watched the video. Video-watchers also showed greater improvement in all areas of the memory assessment when compared to controls, with the diabetic group seeing the most dramatic benefit in cortisol level changes and the healthy elderly seeing the most significant changes in memory test scores.
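The design described above is a classic pre/post comparison: cortisol is sampled before and after the intervention, and the within-subject decrease is tested. As a rough illustration of the analysis, the hypothetical sketch below (the values are invented, not the study’s data) computes a paired t-statistic from pre- and post-video cortisol measurements using only the Python standard library:

```python
from statistics import mean, stdev
from math import sqrt

def paired_t(pre, post):
    """Paired t-statistic for pre/post measurements on the same subjects.
    A positive value means the measure decreased, on average, after the
    intervention."""
    diffs = [a - b for a, b in zip(pre, post)]
    n = len(diffs)
    # t = mean difference divided by its standard error
    return mean(diffs) / (stdev(diffs) / sqrt(n))

# Hypothetical salivary cortisol values (nmol/L) for six subjects,
# before and after watching the video
pre = [14.1, 12.8, 15.3, 13.7, 16.0, 12.2]
post = [11.9, 11.0, 13.1, 12.6, 13.8, 11.5]

t = paired_t(pre, post)
print(f"paired t = {t:.2f}")
```

A large positive t here would be consistent with the significant cortisol decrease the team reported; in practice one would compare t against the t-distribution with n − 1 degrees of freedom to obtain a p-value.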
From the authors: “Our research findings offer potential clinical and rehabilitative benefits that can be applied to wellness programs for the elderly,” Dr. Bains said. “The cognitive components—learning ability and delayed recall—become more challenging as we age and are essential to older adults for an improved quality of life: mind, body, and spirit. Although older adults have age-related memory deficits, complementary, enjoyable, and beneficial humor therapies need to be implemented for these individuals.”
Study co-author and long-time psychoneuroimmunology humor researcher, Dr. Lee Berk, added, “It’s simple: the less stress you have, the better your memory. Humor reduces detrimental stress hormones like cortisol that damage memory-related hippocampal neurons, lowers your blood pressure, and improves blood flow and your mood state. The act of laughter—or simply enjoying some humor—increases the release of endorphins and dopamine in the brain, which provides a sense of pleasure and reward. These positive and beneficial neurochemical changes, in turn, make the immune system function better. There are even changes in brain wave activity toward what’s called the ‘gamma wave band frequency,’ which also amps up memory and recall. So, indeed, laughter is turning out to be not only a good medicine, but also a memory enhancer adding to our quality of life.”

Filed under aging memory memory loss laughter stress cortisol Experimental Biology Meeting 2014 neuroscience science

98 notes

Exposure to Cortisol-Like Medications Before Birth May Contribute to Emotional Problems and Brain Changes

Neonatologists seem to perform miracles in the fight to support the survival of babies born prematurely.

To promote their survival, cortisol-like drugs called glucocorticoids are frequently administered to women in preterm labor to accelerate their babies’ lung maturation prior to birth. Cortisol is a substance naturally released by the body when stressed. But the levels of glucocorticoids administered to promote lung development are higher than those achieved during typical stress, perhaps mirrored only in the body’s reaction to extreme stressors.

The benefit of glucocorticoids is undisputed and has certainly saved the lives of countless babies, but this exposure also may have some negative consequences. Indeed, excessive glucocorticoid levels may have effects on brain development, perhaps contributing to emotional problems later in life.

In this issue of Biological Psychiatry, Dr. Elysia Davis at the University of Denver and her colleagues report new findings on the effects of synthetic glucocorticoid on human brain development. Their study focused on healthy children who were born full-term, avoiding the confounding effects of premature birth.

The investigators conducted brain imaging sessions with, and carefully assessed, 54 children aged 6 to 10 years. The mothers of the participating children also completed reports on their child’s behavior. The researchers then divided the children into two groups: those who were exposed to glucocorticoids prenatally and those who were not.

In this study, children with fetal glucocorticoid exposure showed significant cortical thinning, and a thinner cortex in turn predicted more emotional problems. One particularly affected region, the rostral anterior cingulate cortex, was 8-9% thinner among children exposed to glucocorticoids. Interestingly, other studies have shown that this brain region is affected in individuals diagnosed with mood and anxiety disorders.

"Fetal exposure to a frequently administered stress hormone is associated with consequences for child brain development that persist for at least 6 to 10 years. These neurological changes are associated with increased risk for stress and emotional problems," Davis explained of their findings. "Importantly, these findings were observed among healthy children born full term."

Although such a finding does not indicate that glucocorticoids ‘caused’ these changes, the researchers did determine that the findings cannot be explained by any obvious confounding differences between the groups. The two groups did not differ in birth weight, gestational age at birth, Apgar scores, maternal factors, or any other basic demographics. Thus, the findings do suggest that glucocorticoid administration may somehow alter the trajectory of brain development in exposed children.

"This study provides evidence that prenatal exposure to stress hormones shapes the construction of the fetal nervous system with consequences for the developing brain that persist into the preadolescent period," she added.

"This study highlights potential links between early cortisol exposure, cortical thinning and mood symptoms in children. It may provide important insights into the development of the brain and the long-term impact of maternal stress," commented Dr. John Krystal, Editor of Biological Psychiatry.

(Source: elsevier.com)

Filed under stress glucocorticoids cortisol brain development psychology neuroscience science
