Neuroscience

Articles and news from the latest research reports.

Posts tagged emotions

Study suggests neurobiological basis of human-pet relationship
It has become common for people who have pets to refer to themselves as “pet parents,” but how closely does the relationship between people and their non-human companions mirror the parent-child relationship? A small study from a group of Massachusetts General Hospital (MGH) researchers contributes to answering this complex question by investigating differences in how important brain structures are activated when women view images of their children and of their own dogs. Their report is being published in the open-access journal PLOS ONE.
“Pets hold a special place in many people’s hearts and lives, and there is compelling evidence from clinical and laboratory studies that interacting with pets can be beneficial to the physical, social and emotional wellbeing of humans,” says Lori Palley, DVM, of the MGH Center for Comparative Medicine, co-lead author of the report. “Several previous studies have found that levels of neurohormones like oxytocin – which is involved in pair-bonding and maternal attachment – rise after interaction with pets, and new brain imaging technologies are helping us begin to understand the neurobiological basis of the relationship, which is exciting.”
In order to compare patterns of brain activation involved with the human-pet bond with those elicited by the maternal-child bond, the study enrolled a group of women with at least one child aged 2 to 10 years old and one pet dog that had been in the household for two years or longer. Participation consisted of two sessions, the first being a home visit during which participants completed several questionnaires, including ones regarding their relationships with both their child and pet dog. The participants’ dog and child were also photographed in each participant’s home.
The second session took place at the Athinoula A. Martinos Center for Biomedical Imaging at MGH, where functional magnetic resonance imaging (fMRI) – which indicates levels of activation in specific brain structures by detecting changes in blood flow and oxygen levels – was performed as participants lay in a scanner and viewed a series of photographs. The photos included images of each participant’s own child and own dog alternating with those of an unfamiliar child and dog belonging to another study participant. After the scanning session, each participant completed additional assessments, including an image recognition test to confirm she had paid close attention to photos presented during scanning, and rated several images from each category shown during the session on factors relating to pleasantness and excitement.
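The paper does not publish its presentation code, but a viewing schedule like the one described (own-child, own-dog, unfamiliar-child and unfamiliar-dog images alternating in pseudorandom order) could be generated along these lines. This is purely an illustrative sketch; the category names, block count, and no-immediate-repeat constraint are assumptions, not details taken from the study.

```python
import random

def build_stimulus_sequence(categories, n_blocks, seed=0):
    """Build a pseudorandom presentation order in which every
    category appears once per block and no category is shown
    twice in a row across block boundaries."""
    rng = random.Random(seed)
    sequence = []
    for _ in range(n_blocks):
        block = list(categories)
        rng.shuffle(block)
        # reshuffle if the new block would repeat the last image category
        while sequence and block[0] == sequence[-1]:
            rng.shuffle(block)
        sequence.extend(block)
    return sequence

order = build_stimulus_sequence(
    ["own_child", "own_dog", "unfamiliar_child", "unfamiliar_dog"],
    n_blocks=4,
)
```

With four categories and four blocks this yields sixteen trials, each category shown exactly four times with no back-to-back repeats.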
Of 16 women originally enrolled, complete information and MR data were available for 14 participants. The imaging studies revealed both similarities and differences in the way important brain regions reacted to images of a woman’s own child and own dog. Areas previously reported as important for functions such as emotion, reward, affiliation, visual processing and social interaction all showed increased activity when participants viewed either their own child or their own dog. A region known to be important to bond formation – the substantia nigra/ventral tegmental area (SNi/VTA) – was activated only in response to images of a participant’s own child. The fusiform gyrus, which is involved in facial recognition and other visual processing functions, actually showed greater response to own-dog images than own-child images.
“Although this is a small study that may not apply to other individuals, the results suggest there is a common brain network important for pair-bond formation and maintenance that is activated when mothers viewed images of either their child or their dog,” says Luke Stoeckel, PhD, MGH Department of Psychiatry, co-lead author of the PLOS ONE report. “We also observed differences in activation of some regions that may reflect variance in the evolutionary course and function of these relationships. For example, like the SNi/VTA, the nucleus accumbens has been reported to have an important role in pair-bonding in both human and animal studies. But that region showed greater deactivation when mothers viewed their own-dog images instead of greater activation in response to own-child images, as one might expect. We think the greater response of the fusiform gyrus to images of participants’ dogs may reflect the increased reliance on visual rather than verbal cues in human-animal communications.”
Co-author Randy Gollub, MD, PhD, of MGH Psychiatry adds, “Since fMRI is an indirect measure of neural activity and can only correlate brain activity with an individual’s experience, it will be interesting to see if future studies can directly test whether these patterns of brain activity are explained by the specific cognitive and emotional functions involved in human-animal relationships. Further, the similarities and differences in brain activity revealed by functional neuroimaging may help to generate hypotheses that eventually provide an explanation for the complexities underlying human-animal relationships.”
The investigators note that further research is needed to replicate these findings in a larger sample and to see if they are seen in other populations – such as women without children, fathers and parents of adopted children – and in relationships with other animal species. Combining fMRI studies with additional behavioral and physiological measures could obtain evidence to support a direct relationship between the observed brain activity and the purported functions.
(Image: Fotolia)

Filed under brain structure brain activity neuroimaging pets emotions neuroscience science

(Image caption: This is a coronal view of the hippocampus brain region of a patient with Alzheimer’s disease. Image courtesy of Daniel Tranel’s Laboratory at the UI’s Department of Neurology.)
Alzheimer’s patients can still feel the emotion long after the memories have vanished
A new University of Iowa study further supports an inescapable message: caregivers have a profound influence—good or bad—on the emotional state of individuals with Alzheimer’s disease. Patients may not remember a recent visit by a loved one or having been neglected by staff at a nursing home, but those actions can have a lasting impact on how they feel.
The findings of this study are published in the September 2014 issue of the journal Cognitive and Behavioral Neurology.
UI researchers showed individuals with Alzheimer’s disease clips of sad and happy movies. The patients experienced sustained states of sadness and happiness despite not being able to remember the movies.
“This confirms that the emotional life of an Alzheimer’s patient is alive and well,” says lead author Edmarie Guzmán-Vélez, a doctoral student in clinical psychology, a Dean’s Graduate Research Fellow, and a National Science Foundation Graduate Research Fellow.
Guzmán-Vélez conducted the study with Daniel Tranel, UI professor of neurology and psychology, and Justin Feinstein, assistant professor at the University of Tulsa and the Laureate Institute for Brain Research.
Tranel and Feinstein published a paper in 2010 that predicted the importance of attending to the emotional needs of people with Alzheimer’s, which is expected to affect as many as 16 million people in the United States by 2050 and cost an estimated $1.2 trillion.
“It’s extremely important to see data that support our previous prediction,” Tranel says. “Edmarie’s research has immediate implications for how we treat patients and how we teach caregivers.”
Despite the considerable amount of research aimed at finding new treatments for Alzheimer’s, no drug has succeeded at either preventing or substantially influencing the disease’s progression. Against this foreboding backdrop, the results of this study highlight the need to develop new caregiving techniques aimed at improving the well-being of, and minimizing the suffering of, the millions of individuals afflicted with Alzheimer’s.
For this behavioral study, Guzmán-Vélez and her colleagues invited 17 patients with Alzheimer’s disease and 17 healthy comparison participants to view 20 minutes of sad and then happy movies. These movie clips triggered the expected emotion: sorrow and tears during the sad films and laughter during the happy ones.
About five minutes after watching the movies, the researchers gave participants a memory test to see if they could recall what they had just seen. As expected, the patients with Alzheimer’s disease retained significantly less information about both the sad and happy films than the healthy people. In fact, four patients were unable to recall any factual information about the films, and one patient didn’t even remember watching any movies.
Before and after seeing the films, participants answered questions to gauge their feelings. Patients with Alzheimer’s disease reported elevated levels of either sadness or happiness for up to 30 minutes after viewing the films despite having little or no recollection of the movies.
Quite strikingly, the less the patients remembered about the films, the longer their sadness lasted. While sadness tended to last a little longer than happiness, both emotions far outlasted the memory of the films.
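The write-up reports this memory-emotion relationship only qualitatively. As an illustration of how such an inverse relationship could be quantified, the sketch below computes a Spearman rank correlation between recall and sadness duration; the numbers are invented for the example and are not the study's data.

```python
def _ranks(values):
    """Rank the data (1 = smallest); tied values share their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        for k in range(i, j + 1):
            r[order[k]] = (i + j) / 2 + 1
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = _ranks(x), _ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# invented example: worse recall going with longer-lasting sadness
recall_score = [0, 1, 2, 3, 5, 6, 8]           # film details remembered
sadness_minutes = [30, 28, 25, 20, 15, 12, 8]  # duration of sadness
rho = spearman(recall_score, sadness_minutes)  # -1.0 for these numbers
```

A strongly negative rho, as in this fabricated example, is the shape of the pattern the researchers describe: less memory, longer-lasting emotion.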
The fact that forgotten events can continue to exert a profound influence on a patient’s emotional life highlights the need for caregivers to avoid causing negative feelings and to try to induce positive feelings.
“Our findings should empower caregivers by showing them that their actions toward patients really do matter,” Guzmán-Vélez says. “Frequent visits and social interactions, exercise, music, dance, jokes, and serving patients their favorite foods are all simple things that can have a lasting emotional impact on a patient’s quality of life and subjective well-being.”

Filed under alzheimer's disease emotions emotional state psychology neuroscience science

EEG Study Findings Reveal How Fear is Processed in the Brain
An estimated 8% of Americans will suffer from post-traumatic stress disorder (PTSD) at some point during their lifetime. Brought on by an overwhelming or stressful event or events, PTSD is the result of altered brain chemistry and physiology. Understanding how threat is processed in a normal brain versus one altered by PTSD is essential to developing effective interventions.
New research from the Center for BrainHealth at The University of Texas at Dallas published online today in Brain and Cognition illustrates how fear arises in the brain when individuals are exposed to threatening images. This novel study is the first to separate emotion from threat by controlling for arousal (the emotional reaction provoked in response to stimuli, whether positive or negative). Building on previous animal and human research, the study identifies an electrophysiological marker for threat in the brain.
“We are trying to find where thought exists in the mind,” explained John Hart, Jr., M.D., Medical Science Director at the Center for BrainHealth. “We know that groups of neurons firing on and off create a frequency and pattern that tell other areas of the brain what to do. By identifying these rhythms, we can correlate them with a cognitive unit such as fear.”
Utilizing electroencephalography (EEG), Dr. Hart’s research team identified theta and beta wave activity that signifies the brain’s reaction to visually threatening images. 
“We have known for a long time that the brain prioritizes threatening information over other cognitive processes,” explained Bambi DeLaRosa, study lead author. “These findings show us how this happens. Theta wave activity starts in the back of the brain, in its fear center – the amygdala – and then interacts with the brain’s memory center – the hippocampus – before traveling to the frontal lobe, where thought processing areas are engaged. At the same time, beta wave activity indicates that the motor cortex is revving up in case the feet need to move to avoid the perceived threat.”
For the study, 26 adults (19 female, 7 male), ages 19-30 were shown 224 randomized images that were either unidentifiably scrambled or real pictures. Real pictures were separated into two categories: threatening (weapons, combat, nature or animals) and non-threatening (pleasant situations, food, nature or animals). 
While wearing an EEG cap, participants were asked to push a button with their right index finger for real items and another button with their right middle finger for nonreal/scrambled items. Shorter response times were recorded for scrambled images than the real images. There was no difference in reaction time for threatening versus non-threatening images. 
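The article reports these reaction-time comparisons without showing the statistics. One standard way to test such a within-subject difference is a paired t statistic, sketched below on made-up per-participant mean reaction times (in milliseconds); the numbers are illustrative, not the study's data.

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(cond_a, cond_b):
    """Paired t statistic: mean per-subject difference divided by
    the standard error of those differences."""
    diffs = [a - b for a, b in zip(cond_a, cond_b)]
    return mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

# hypothetical per-subject mean RTs (ms): real images vs scrambled images
rt_real = [520, 540, 510, 530, 500, 515]
rt_scrambled = [480, 500, 470, 495, 460, 475]
t_stat = paired_t(rt_real, rt_scrambled)  # positive: real images slower
```

The same function applied to threatening versus non-threatening RTs would, per the reported result, yield a t statistic near zero.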
EEG results revealed that threatening images evoked an early increase in theta activity in the occipital lobe (the area in the brain where visual information is processed), followed by a later increase in theta power in the frontal lobe (where higher mental functions such as thinking, decision-making, and planning occur). A left-lateralized desynchronization of the beta band, the wave pattern associated with motor behavior (like the impulse to run), also appeared consistently in the threatening condition.
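The paper's exact spectral pipeline isn't described in this summary. As a generic illustration of how theta (roughly 4–8 Hz) and beta (roughly 13–30 Hz) power can be read off an EEG trace, here is a minimal periodogram-based band-power estimate on a synthetic signal; the sampling rate, duration, and component frequencies are assumptions for the demo, not the study's parameters.

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Mean spectral power in the band [f_lo, f_hi] Hz,
    estimated from a simple periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[mask].mean()

fs = 256  # assumed sampling rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)
# synthetic "EEG": strong 6 Hz (theta) component plus weak 20 Hz (beta)
eeg = 3.0 * np.sin(2 * np.pi * 6 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)
theta = band_power(eeg, fs, 4, 8)
beta = band_power(eeg, fs, 13, 30)
```

Real EEG work would use windowed averaging (e.g. Welch's method) and artifact rejection, but the band-masking idea is the same.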
This study will serve as a foundation for future work that will explore normal versus abnormal fear associated with an object in other atypical populations including individuals with PTSD.

Filed under fear PTSD emotions EEG brainwaves amygdala motor cortex hippocampus neuroscience science

2-D or 3-D? That is the Question
The increased visual realism of 3-D films is believed to offer viewers a more vivid and lifelike experience—more thrilling and intense than 2-D because it more closely approximates real life. However, psychology researchers at the University of Utah, among those who use film clips routinely in the lab to study patients’ emotional conditions, have found that there is no significant difference between the two formats. The results were published recently in PLOS ONE.
The study aimed to validate the effectiveness of 3-D film, a newer technology, as compared to 2-D film that is currently widely used as a research tool. Film clips are used in psychological and neuroscience studies as a standardized method for assessing emotional development. Because it is less invasive than other methods, it is especially useful when studying the emotional responses of young people for whom emotional well-being is critical to healthy development.
Author Sheila Crowell, assistant professor of psychology at the U, says that results of the large and tightly controlled study also suggest that as an entertainment medium, 3-D may not provide a different experience from 2-D, insofar as evoking emotional responses go.
“We set out to learn whether technological advances like 3-D enhance the study of emotion, especially for young patients who are routinely exposed to high-tech devices and mediums in their daily lives,” says Crowell. “Both 2-D and 3-D are equally effective at eliciting emotional responses, which also may mean that the expense involved in producing 3-D films is not creating much more than novelty. Further studies are of course warranted, but our findings should be encouraging to researchers who cannot now afford 3-D technologies.”
How the study was conducted
Researchers looked at several measures of emotional state in 408 subjects, including palm sweat, breathing and cardiovascular responses, such as heart rate. These measures are commonly used to gauge emotional responses.
Four film clips were chosen because each prompted one discrete emotion intensely and in context without requiring viewing of the entire film. Study participants viewed approximately five-minute 3-D and 2-D clips of four films: “My Bloody Valentine” (fear), “Despicable Me” (amusement), “Tangled” (sadness) and “The Polar Express” (thrill or excitement). Participants were randomized to view the films in a design that balanced which pairs of films were watched, in which format, and in what order. The complex configurations allowed the researchers to compare not only emotional responses but also the effects of format and viewing order on the results.
Taken as a whole, the results showed few significant differences between physiological reactions to the films. When accounting for the large number of statistical tests, only one difference was seen between the formats—the number of electrodermal responses (palm sweat) during a thrilling scene from “The Polar Express” 3-D clip. The researchers believe that could be because the 3-D content of the film is of especially high quality, with more and a larger variety of 3-D effects than the others.
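The article says the analysis accounted for the large number of statistical tests without naming the method. The simplest such adjustment is a Bonferroni correction, sketched here with made-up p-values (the correction method and the numbers are assumptions for illustration).

```python
def bonferroni_survivors(p_values, alpha=0.05):
    """For each raw p-value, report whether it remains significant
    after dividing the alpha level by the number of tests run."""
    threshold = alpha / len(p_values)
    return [p < threshold for p in p_values]

# hypothetical raw p-values from five physiological comparisons
p_raw = [0.001, 0.03, 0.20, 0.04, 0.50]
survivors = bonferroni_survivors(p_raw)  # only 0.001 beats 0.05 / 5 = 0.01
```

This mirrors the pattern reported: comparisons that look significant in isolation can fail to survive once the full battery of tests is taken into account.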
Supporting the overall finding is that participants’ individual differences in anxiety, inability to control emotional responses or “thrill seeking” did not alter the psychological or physiological responses to 3-D viewing. In other words, personality differences did not change the results: 2-D is still equally effective for emotion elicitation. According to Crowell, “this could be good news for people who would rather not wear 3-D glasses or pay the extra money to see these types of films.”

2-D or 3-D? That is the Question

The increased visual realism of 3-D films is believed to offer viewers a more vivid and lifelike experience—more thrilling and intense than 2-D because it more closely approximates real life. However, psychology researchers at the University of Utah, among those who use film clips routinely in the lab to study patients’ emotional conditions, have found that there is no significant difference between the two formats. The results were published recently in PLOS ONE.

The study aimed to validate the effectiveness of 3-D film, a newer technology, as compared to 2-D film that is currently widely used as a research tool. Film clips are used in psychological and neuroscience studies as a standardized method for assessing emotional development. Because it is less invasive than other methods, it is especially useful when studying the emotional responses of young people for whom emotional well-being is critical to healthy development.

Author Sheila Crowell, assistant professor of psychology at the U, says that results of the large and tightly controlled study also suggest that as an entertainment medium, 3-D may not provide a different experience from 2-D, insofar as evoking emotional responses go.

“We set out to learn whether technological advances like 3-D enhance the study of emotion, especially for young patients who are routinely exposed to high-tech devices and mediums in their daily lives,” says Crowell. “Both 2-D and 3-D are equally effective at eliciting emotional responses, which also may mean that the expense involved in producing 3-D films is not creating much more than novelty. Further studies are of course warranted, but our findings should be encouraging to researchers who cannot now afford 3-D technologies.”

How the study was conducted

Researchers looked at several measures of emotional state in 408 subjects, including palm sweat, breathing and cardiovascular responses, such as heart rate. These measures are commonly used to gauge emotional responses.

Four film clips were chosen because each intensely prompted one discrete emotion, in context, without requiring viewers to watch the entire film. Participants viewed 3-D and 2-D clips of approximately five minutes from the films “My Bloody Valentine” (fear), “Despicable Me” (amusement), “Tangled” (sadness) and “The Polar Express” (thrill or excitement). Participants were randomized in a design that balanced which pairs of films were watched, the format of each, and the order of presentation. This configuration allowed the researchers to compare not only emotional responses, but also the effects of format and viewing order on the results.
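A counterbalanced design like the one described can be sketched in code. This is purely illustrative: the film titles come from the article, but the pairing scheme (two different films per participant, one in each format) and the `assign` helper are assumptions, not the study's actual randomization procedure.

```python
import itertools

# Film titles are from the article; the pairing scheme below is an assumption.
films = ["My Bloody Valentine", "Despicable Me", "Tangled", "The Polar Express"]
formats = ["2-D", "3-D"]

# Every ordered pair of (film, format) clips in which the two films differ,
# so each condition fixes: which films are paired, each film's format,
# and the order of presentation.
conditions = [
    (first, second)
    for first, second in itertools.permutations(
        itertools.product(films, formats), 2
    )
    if first[0] != second[0]
]

def assign(participant_id: int):
    """Cycle through conditions so each occurs equally often across participants."""
    return conditions[participant_id % len(conditions)]
```

Cycling deterministically through all 48 conditions guarantees that every film, format and presentation order appears equally often once the sample size is a multiple of 48.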

Taken as a whole, the results showed few significant differences between physiological reactions to the films. When accounting for the large number of statistical tests, only one difference was seen between the formats—the number of electrodermal responses (palm sweat) during a thrilling scene from “The Polar Express” 3-D clip. The researchers believe that could be because the 3-D content of the film is of especially high quality, with more and a larger variety of 3-D effects than the others.
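“Accounting for the large number of statistical tests” ordinarily means applying a multiple-comparison correction. The paper's exact procedure is not specified in this article, so the Bonferroni sketch below is only an example of the general idea:

```python
def bonferroni(p_values, alpha=0.05):
    """Return which tests survive a Bonferroni correction: each raw
    p-value is compared against alpha divided by the number of tests."""
    m = len(p_values)
    return [p < alpha / m for p in p_values]

# With 20 tests, only very small p-values survive the corrected
# threshold of 0.05 / 20 = 0.0025, so a lone strong effect (such as a
# single electrodermal difference) can remain significant while the
# rest do not.
raw_p = [0.001] + [0.04] * 19
survivors = bonferroni(raw_p)
```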

Supporting the overall finding is that participants’ individual differences in anxiety, inability to control emotional responses or “thrill seeking” did not alter the psychological or physiological responses to 3-D viewing. In other words, personality differences did not change the results: 2-D is still equally effective for emotion elicitation. According to Crowell, “this could be good news for people who would rather not wear 3-D glasses or pay the extra money to see these types of films.”

Filed under emotions 3-D films anxiety electrodermal activity heart rate psychology neuroscience science



(Image caption: This image depicts the injection sites and the expression of the viral constructs in the two areas of the brain studied: the Dentate Gyrus of the hippocampus (middle) and the Basolateral Amygdala (bottom corners). Image courtesy of the researchers)

Neuroscientists reverse memories’ emotional associations

Most memories have some kind of emotion associated with them: Recalling the week you just spent at the beach probably makes you feel happy, while reflecting on being bullied provokes more negative feelings.

A new study from MIT neuroscientists reveals the brain circuit that controls how memories become linked with positive or negative emotions. Furthermore, the researchers found that they could reverse the emotional association of specific memories by manipulating brain cells with optogenetics — a technique that uses light to control neuron activity.

The findings, described in the Aug. 27 issue of Nature, demonstrated that a neuronal circuit connecting the hippocampus and the amygdala plays a critical role in associating emotion with memory. This circuit could offer a target for new drugs to help treat conditions such as post-traumatic stress disorder, the researchers say.

“In the future, one may be able to develop methods that help people to remember positive memories more strongly than negative ones,” says Susumu Tonegawa, the Picower Professor of Biology and Neuroscience, director of the RIKEN-MIT Center for Neural Circuit Genetics at MIT’s Picower Institute for Learning and Memory, and senior author of the paper.

The paper’s lead authors are Roger Redondo, a Howard Hughes Medical Institute postdoc at MIT, and Joshua Kim, a graduate student in MIT’s Department of Biology.

Shifting memories

Memories are made of many elements, which are stored in different parts of the brain. A memory’s context, including information about the location where the event took place, is stored in cells of the hippocampus, while emotions linked to that memory are found in the amygdala.

Previous research has shown that many aspects of memory, including emotional associations, are malleable. Psychotherapists have taken advantage of this to help patients suffering from depression and post-traumatic stress disorder, but the neural circuitry underlying such malleability is not known.

In this study, the researchers set out to explore that malleability with an experimental technique they recently devised that allows them to tag neurons that encode a specific memory, or engram. To achieve this, they label hippocampal cells that are turned on during memory formation with a light-sensitive protein called channelrhodopsin. From that point on, any time those cells are activated with light, the mice recall the memory encoded by that group of cells.

Last year, Tonegawa’s lab used this technique to implant, or “incept,” false memories in mice by reactivating engrams while the mice were undergoing a different experience. In the new study, the researchers wanted to investigate how the context of a memory becomes linked to a particular emotion. First, they used their engram-labeling protocol to tag neurons associated with either a rewarding experience (for male mice, socializing with a female mouse) or an unpleasant experience (a mild electrical shock). In this first set of experiments, the researchers labeled memory cells in a part of the hippocampus called the dentate gyrus.

Two days later, the mice were placed into a large rectangular arena. For three minutes, the researchers recorded which half of the arena the mice naturally preferred. Then, for mice that had received the fear conditioning, the researchers stimulated the labeled cells in the dentate gyrus with light whenever the mice went into the preferred side. The mice soon began avoiding that area, showing that the reactivation of the fear memory had been successful.
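The baseline preference measure described here (which half of the arena a mouse occupies during a fixed period) reduces to a simple time-fraction index. A minimal sketch, with function names and numbers invented for illustration:

```python
def preference_index(time_left: float, time_right: float) -> float:
    """Fraction of the session spent on the left side, from 0.0 to 1.0.
    Values above 0.5 indicate a preference for the left zone."""
    total = time_left + time_right
    if total == 0:
        raise ValueError("no time recorded")
    return time_left / total

def preferred_side(time_left: float, time_right: float) -> str:
    """Label the preferred zone from the index."""
    return "left" if preference_index(time_left, time_right) > 0.5 else "right"

# A mouse spending 120 s of a 180 s baseline on the left prefers the left;
# after fear-memory reactivation on that side, the ratio should flip.
baseline = preference_index(120.0, 60.0)
```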

The reward memory could also be reactivated: For mice that were reward-conditioned, the researchers stimulated them with light whenever they went into the less-preferred side, and they soon began to spend more time there, recalling the pleasant memory.

A couple of days later, the researchers tried to reverse the mice’s emotional responses. For male mice that had originally received the fear conditioning, they activated the memory cells involved in the fear memory with light for 12 minutes while the mice spent time with female mice. For mice that had initially received the reward conditioning, memory cells were activated while they received mild electric shocks.

Next, the researchers again put the mice in the large two-zone arena. This time, the mice that had originally been conditioned with fear and had avoided the side of the chamber where their hippocampal cells were activated by the laser now began to spend more time in that side when their hippocampal cells were activated, showing that a pleasant association had replaced the fearful one. This reversal also took place in mice that went from reward to fear conditioning.

Altered connections

The researchers then performed the same set of experiments but labeled memory cells in the basolateral amygdala, a region involved in processing emotions. This time, they could not induce a switch by reactivating those cells — the mice continued to behave as they had been conditioned when the memory cells were first labeled.

This suggests that emotional associations, also called valences, are encoded somewhere in the neural circuitry that connects the dentate gyrus to the amygdala, the researchers say. A fearful experience strengthens the connections between the hippocampal engram and fear-encoding cells in the amygdala, but that connection can be weakened later on as new connections are formed between the hippocampus and amygdala cells that encode positive associations.

“That plasticity of the connection between the hippocampus and the amygdala plays a crucial role in the switching of the valence of the memory,” Tonegawa says.

These results indicate that while dentate gyrus cells are neutral with respect to emotion, individual amygdala cells are precommitted to encode fear or reward memory. The researchers are now trying to discover molecular signatures of these two types of amygdala cells. They are also investigating whether reactivating pleasant memories has any effect on depression, in hopes of identifying new targets for drugs to treat depression and post-traumatic stress disorder.

David Anderson, a professor of biology at the California Institute of Technology, says the study makes an important contribution to neuroscientists’ fundamental understanding of the brain and also has potential implications for treating mental illness.

“This is a tour de force of modern molecular-biology-based methods for analyzing processes, such as learning and memory, at the neural-circuitry level. It’s one of the most sophisticated studies of this type that I’ve seen,” he says.

Filed under optogenetics hippocampus memory emotions amygdala dentate gyrus neuroscience science



Study cracks how the brain processes emotions

Although feelings are personal and subjective, the human brain turns them into a standard code that objectively represents emotions across different senses, situations and even people, reports a new study by Cornell University neuroscientist Adam Anderson.

“We discovered that fine-grained patterns of neural activity within the orbitofrontal cortex, an area of the brain associated with emotional processing, act as a neural code which captures an individual’s subjective feeling,” says Anderson, associate professor of human development in Cornell’s College of Human Ecology and senior author of the study, “Population coding of affect across stimuli, modalities and individuals,” published online in Nature Neuroscience.

Their findings provide insight into how the brain represents our innermost feelings – what Anderson calls the last frontier of neuroscience – and upend the long-held view that emotion is represented in the brain simply by activation in specialized regions for positive or negative feelings, he says.

“If you and I derive similar pleasure from sipping a fine wine or watching the sun set, our results suggest it is because we share similar fine-grained patterns of activity in the orbitofrontal cortex,” Anderson says.

“It appears that the human brain generates a special code for the entire valence spectrum of pleasant-to-unpleasant, good-to-bad feelings, which can be read like a ‘neural valence meter’ in which the leaning of a population of neurons in one direction equals positive feeling and the leaning in the other direction equals negative feeling,” Anderson explains.
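The “neural valence meter” Anderson describes is, in decoding terms, a linear readout: project the population activity onto a valence axis and read off the sign. A toy sketch, with weights and activity patterns invented purely for illustration:

```python
def valence_readout(weights, activity):
    """Project population activity onto a valence axis: a positive
    projection is read as pleasant, a negative one as unpleasant."""
    score = sum(w * a for w, a in zip(weights, activity))
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

# Invented 4-neuron example: the same readout weights applied to two
# different activity patterns yield opposite subjective valences.
axis = [0.5, -0.3, 0.8, -0.1]
pleasant_pattern = [1.0, 0.2, 0.9, 0.1]
unpleasant_pattern = [0.1, 1.0, 0.0, 0.8]
```

The point of the sketch is only that one fixed set of weights can classify many different patterns, which is what makes a shared “code” across stimuli and people conceivable.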

For the study, the researchers presented participants with a series of pictures and tastes during functional neuroimaging, then analyzed participants’ ratings of their subjective experiences along with their brain activation patterns.

Anderson’s team found that valence was represented as sensory-specific patterns or codes in areas of the brain associated with vision and taste, as well as sensory-independent codes in the orbitofrontal cortices (OFC), suggesting, the authors say, that representation of our internal subjective experience is not confined to specialized emotional centers, but may be central to perception of sensory experience.

They also discovered that similar subjective feelings – whether evoked from the eye or tongue – resulted in a similar pattern of activity in the OFC, suggesting the brain contains an emotion code common across distinct experiences of pleasure (or displeasure), they say. Furthermore, these OFC activity patterns of positive and negative experiences were partly shared across people.

“Despite how personal our feelings feel, the evidence suggests our brains use a standard code to speak the same emotional language,” Anderson concludes.

Filed under emotions orbitofrontal cortex neural activity feelings neuroscience science



People with tinnitus process emotions differently from their peers

Patients with persistent ringing in the ears – a condition known as tinnitus – process emotions differently in the brain from those with normal hearing, researchers report in the journal Brain Research.

Tinnitus afflicts 50 million people in the United States, according to the American Tinnitus Association, and causes those with the condition to hear noises that aren’t really there. These phantom sounds are not speech, but rather whooshing noises, train whistles, cricket noises or whines. Their severity often varies day to day.

University of Illinois speech and hearing science professor Fatima Husain, who led the study, said previous studies showed that tinnitus is associated with increased stress, anxiety, irritability and depression, all of which are affiliated with the brain’s emotional processing systems.

“Obviously, when you hear annoying noises constantly that you can’t control, it may affect your emotional processing systems,” Husain said. “But when I looked at experimental work done on tinnitus and emotional processing, especially brain imaging work, there hadn’t been much research published.”

She decided to use functional magnetic resonance imaging (fMRI) brain scans to better understand how tinnitus affects the brain’s ability to process emotions. These scans show the areas of the brain that are active in response to stimulation, based upon blood flow to those areas.

Three groups of participants were used in the study: people with mild-to-moderate hearing loss and mild tinnitus; people with mild-to-moderate hearing loss without tinnitus; and a control group of age-matched people without hearing loss or tinnitus. Each person was put in an fMRI machine and listened to a standardized set of 30 pleasant, 30 unpleasant and 30 emotionally neutral sounds (for example, a baby laughing, a woman screaming and a water bottle opening). The participants pressed a button to categorize each sound as pleasant, unpleasant or neutral.

The tinnitus and normal-hearing groups responded more quickly to emotion-inducing sounds than to neutral sounds, while patients with hearing loss had similar response times across all three categories of sound. Overall, the tinnitus group’s reaction times were slower than those of participants with normal hearing.
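A reaction-time comparison like this one boils down to averaging response times per sound category. A sketch with invented numbers that merely mirror the reported pattern (these are not the study's data):

```python
from statistics import mean

def mean_rt_by_category(trials):
    """trials: list of (category, reaction_time_ms) pairs.
    Returns the mean reaction time for each sound category."""
    by_category = {}
    for category, rt in trials:
        by_category.setdefault(category, []).append(rt)
    return {cat: mean(rts) for cat, rts in by_category.items()}

# Hypothetical trials for a normal-hearing participant: faster responses
# to emotional sounds than to neutral ones, as the article reports.
normal_hearing = [
    ("pleasant", 620), ("pleasant", 600),
    ("unpleasant", 610), ("unpleasant", 590),
    ("neutral", 700), ("neutral", 720),
]
rts = mean_rt_by_category(normal_hearing)
```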

Activity in the amygdala, a brain region associated with emotional processing, was lower in the tinnitus and hearing-loss patients than in people with normal hearing. Tinnitus patients also showed more activity than normal-hearing people in two other brain regions associated with emotion, the parahippocampus and the insula. The findings surprised Husain.

“We thought that because people with tinnitus constantly hear a bothersome, unpleasant stimulus, they would have an even higher amount of activity in the amygdala when hearing these sounds, but it was lesser,” she said. “Because they’ve had to adjust to the sound, some plasticity in the brain has occurred. They have had to reduce this amygdala activity and reroute it to other parts of the brain because the amygdala cannot be active all the time due to this annoying sound.”

Because of the sheer number of people who suffer from tinnitus in the United States, a group that includes many combat veterans, Husain hopes her group’s future research will be able to increase tinnitus patients’ quality of life.

“It’s a communication issue and a quality-of-life issue,” she said. “We want to know how we can get better in the clinical realm. Audiologists and clinicians are aware that tinnitus affects emotional aspects, too, and we want to make them aware that these effects are occurring so they can better help their patients.”

Filed under tinnitus emotions amygdala neuroimaging hearing neuroscience science


Neural sweet talk: Taste metaphors emotionally engage the brain

So accustomed are we to metaphors related to taste that when we hear a kind smile described as “sweet,” or a resentful comment as “bitter,” we most likely don’t even think of those words as metaphors. But while it may seem to our ears that “sweet” by any other name means the same thing, new research shows that taste-related words actually engage the emotional centers of the brain more than literal words with the same meaning.

Researchers from Princeton University and the Free University of Berlin report in the Journal of Cognitive Neuroscience the first study to experimentally show that the brain processes these everyday metaphors differently than literal language. In the study, participants read 37 sentences that included common metaphors based on taste while the researchers recorded their brain activity. Each taste-related word was then swapped with a literal counterpart so that, for instance, “She looked at him sweetly” became “She looked at him kindly.”
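The word swapping used to build the literal control sentences can be pictured as a simple substitution table. Only “sweetly” → “kindly” is the article's example; the other pairs are hypothetical stand-ins, since the study's full stimulus list is not reproduced here:

```python
# "sweetly" -> "kindly" is the article's example; the other pairs are
# hypothetical stand-ins for the study's actual stimulus list.
TASTE_TO_LITERAL = {
    "sweetly": "kindly",
    "sweet": "kind",
    "bitter": "resentful",
}

def literal_counterpart(sentence: str) -> str:
    """Swap each taste word for its literal counterpart, keeping trailing
    punctuation. (Capitalization of replaced words is not preserved.)"""
    out = []
    for token in sentence.split():
        core = token.strip(".,!?").lower()
        replacement = TASTE_TO_LITERAL.get(core)
        if replacement is None:
            out.append(token)
        else:
            # Reattach whatever punctuation followed the original word.
            suffix = token[token.lower().index(core) + len(core):]
            out.append(replacement + suffix)
    return " ".join(out)
```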

The researchers found that the sentences containing words that invoked taste activated areas known to be associated with emotional processing, such as the amygdala, as well as the areas known as the gustatory cortices that allow for the physical act of tasting. Interestingly, the metaphorical and literal words only resulted in brain activity related to emotion when part of a sentence, but stimulated the gustatory cortices both in sentences and as stand-alone words.

Metaphorical sentences may spark increased brain activity in emotion-related regions because they allude to physical experiences, said co-author Adele Goldberg, a Princeton professor of linguistics in the Council of the Humanities. Human language frequently uses physical sensations or objects to refer to abstract domains such as time, understanding or emotion, Goldberg said. For instance, people liken love to a number of afflictions including being “sick” or shot through the heart with an arrow. Similarly, “sweet” has a much clearer physical component than “kind.” The new research suggests that these associations go beyond just being descriptive to engage our brains on an emotional level and potentially amplify the impact of the sentence, Goldberg said.

“You begin to realize when you look at metaphors how common they are in helping us understand abstract domains,” Goldberg said. “It could be that we are more engaged with abstract concepts when we use metaphorical language that ties into physical experiences.”

If metaphors in general elicit an emotional response from the brain that is similar to that caused by taste-related metaphors, then that could mean that figurative language presents a “rhetorical advantage” when communicating with others, explained co-author Francesca Citron, a postdoctoral researcher of psycholinguistics at the Free University’s Languages of Emotion research center.

"Figurative language may be more effective in communication and may facilitate processes such as affiliation, persuasion and support," Citron said. "Further, as a reader or listener, one should be wary of being overly influenced by metaphorical language."

Colloquially, metaphors seem to be employed precisely to evoke an emotional reaction, yet the actual emotional effect of figurative phrases on the person hearing them had not previously been explored in depth, said Benjamin Bergen, an associate professor of cognitive science at the University of California, San Diego, who studies language comprehension as well as metaphorical language and thought.

"There’s a lot of research on the conceptual effects of metaphors, such as how they allow people to think about new or abstract concepts in terms of concrete things they’re familiar with. But there’s very little work on the emotional impact of metaphor," said Bergen, who had no role in the research but is familiar with it.

"Emotional impact seems to be one of the main reasons people use metaphors to begin with. For instance, a senator might describe a bill as ‘job-killing’ to evoke an emotional reaction," he said. "These results suggest that using certain metaphorical expressions induces more of an emotional reaction than saying the same thing literally. Those expressions that have this property are likely to have the effects on reasoning, inference, judgment and decision-making that emotion is known to have."

Just as important, Citron said, are the brain areas that the taste-related words did not stimulate. Existing research on metaphors and neural processing has shown that figurative language generally requires more brainpower than literal language, Citron and Goldberg wrote. But those bursts of neural activity have been attributed to the higher-order processing needed to work through an unfamiliar metaphor.

The brain activity Citron and Goldberg observed did not reflect that kind of effortful processing. To create the metaphorical and literal sentence stimuli, they had a group of people separate from the study participants rate sentences for familiarity, apparent arousal, imageability (how easily a phrase can be pictured in the reader’s mind) and how positively or negatively each sentence was interpreted. The metaphorical and literal sentences were matched on all of these factors. In addition, each metaphorical phrase and its literal counterpart were rated as highly similar in meaning.

These steps helped ensure that the metaphorical and literal sentences were equally easy to comprehend. Thus, the brain activity the researchers recorded was unlikely to reflect any additional difficulty study participants had in understanding the metaphors.

"It is important to rule out possible effects of familiarity, since less familiar items may require more processing resources to be understood and elicit enhanced brain responses in several brain regions," Citron said.

Citron and Goldberg plan to follow up on their results by examining if figurative language is remembered more accurately than literal language, if metaphors are more physically stimulating, and if metaphors related to other senses also provoke an emotional response from the brain.

Filed under brain activity taste metaphorical expressions amygdala emotions psychology neuroscience science

367 notes

Dad’s Brain Becomes More ‘Maternal’ When He’s Primary Caregiver

Fathers who spend more time taking care of their newborn child undergo changes in brain activity that make them more apt to fret about their baby’s safety, a new study shows.

(Image: Shutterstock)

In particular, fathers who are the primary caregiver show increased activity in the amygdala and other emotion-processing systems, leading them to experience parental emotions similar to those typically seen in mothers, the researchers noted.

The findings suggest there is a neural network in the brain dedicated to parenting, and that the network responds to changes in parental roles, said study senior author Ruth Feldman, a researcher in the department of psychology and the Gonda Brain Sciences Center at Bar-Ilan University in Israel.

"Pregnancy, childbirth and lactation are very powerful primers in women to worry about their child’s survival," said Feldman, who also serves as an adjunct professor at the Yale Child Study Center at Yale University. "Fathers have the capacity to do it as well as mothers, but they need daily caregiving activities to ignite that mothering network."


Filed under parenting amygdala brain activity emotions psychology neuroscience science
