Posts tagged emotions

Unique multimedia eBook presents scientists’, practitioners’, and therapists’ experiences
Questions about the difference between empathy and compassion, or about whether compassion can be learned, are now answered by a newly published eBook. Edited by Tania Singer and Matthias Bolz from the Max Planck Institute for Human Cognitive and Brain Sciences, the book also explains how mental training transforms the human brain, and that compassion can reduce pain.
The eBook Compassion: Bridging Practice and Science has just been published and can be downloaded free of charge. It summarises fascinating results of the science of compassion, but also describes training programmes and practical experiences. The book thus provides not only a unique overview of current research into empathy and compassion, but also offers an exciting way of approaching the topic for interested readers—including useful advice for everyday life.
A major part of the eBook concerns the science of compassion. Tania Singer, director of the Department of Social Neuroscience, shows how empathy differs from compassion. In a recent study, she was able to show empirically that empathy—the ability to recognize emotions experienced by others—and compassion are supported by different biological systems and neuronal networks. In other chapters, researchers from Singer’s department explain how meditation-based compassion practices can reduce pain, and how compassion training can promote positive emotions and social closeness, which in turn can improve mental and physical health. In another chapter, the endocrinologist Charles Raison describes how compassion training can lead to a decrease in stress-related hormones such as cortisol. “With our research, and with this book, we hope to raise awareness of compassion in our society, and to support the development of a more caring and sustainable society which recognizes the importance of secular ethics and the interdependence of all beings”, Singer emphasises.
Moreover, scientifically validated compassion training programmes are introduced for the first time, and expert users describe their experiences with some of these in schools, therapy, or end-of-life care situations. These reports provide interesting, enlightening, but also touching insights into the everyday-life effects of compassion training. One chapter, for example, shows how compassion training gains increasing significance for clinical staff—not only for their interactions with terminally ill or dying patients, but also for their processing of daily events, thus helping to prevent burnout-related illnesses among physicians and caretakers.
The book also provides theories and concepts of compassion from different perspectives. Paul Gilbert presents an evolutionary model of compassion, which argues that compassion is deeply rooted in our caring system. From a cognitive neuroscientific point of view, compassion is based on attentional, cognitive, and socio-affective processes, each of which draws on specific neuronal networks. The book also offers a Buddhist perspective on compassion, which insists compassion must begin with the move from self- to other-centredness.
The eBook has evolved from a successful workshop, How to Train Compassion, which was organised by Singer’s department in artist Olafur Eliasson’s studio in Berlin back in 2011. After the event, participants all agreed that the topics shared and discussed at the workshop should be made accessible to a wider range of people. Thus, with the support of the Max Planck Society, the eBook was produced—offering its readers many videos from the workshop, sound art collages by Nathalie Singer, as well as impressive pieces of visual art by Olafur Eliasson.
The documentary Raising Compassion, produced by Tania Singer and Olafur Eliasson, shows a unique exchange between the very different participants of the workshop.
Brain Sets Prices With Emotional Value
You might be falling in love with that new car, but you probably wouldn’t pay as much for it if you could resist the feeling.
Researchers at Duke University who study how the brain values things — a field called neuroeconomics — have found that your feelings about something and the value you put on it are calculated similarly in a specific area of the brain.
The region is a small area right between the eyes at the front of the brain. It’s called the ventromedial prefrontal cortex, or vmPFC for short. Scott Huettel, director of Duke’s Center for Interdisciplinary Decision Science, said scientists studying emotion and neuroeconomics had independently singled out this area of the brain in their research, but neither group recognized that the other’s research was focused on it too.
Now, after a series of experiments in which subjects were asked to modify how they felt about something either positively or negatively, the Duke group is arguing that emotional and economic calculations are more closely related than brain scientists had realized. The study appears July 3 in the Journal of Neuroscience.
Earlier research by other groups had shown the vmPFC participates in calculating the value of rewards and that it is engaged by positive stimuli that aren’t really rewards, like a happy memory or a picture of a happy face. A separate line of studies had shown that this brain region also set values on little things like snacks.
The vmPFC handles value tradeoffs such as ‘is that product worth parting with my hard-earned money?’ “This says that your emotions would enter into that tradeoff,” Huettel said.
"The neuroscience fits with your intuitive understanding," said Amy Winecoff, a graduate student in psychology and neuroscience who led the research. "Emotions appear to be relying on the same value system."
In the Duke study, experimental subjects were first trained to do “reappraisal,” in which they could change their emotional response to a situation. “In reappraisal you reassess the meaning of an emotional stimulus, rather than trying to avoid the emotional stimulus or suppress your reaction to it,” Winecoff said.
While the subjects’ brains were being scanned using functional MRI, they were shown images of evocative scenes and faces. After each image the subjects were told to either let their feelings flow or to practice reappraisal to change their thoughts. Then they were asked to rate how positive or negative they felt.
In the case of “an unregulated positive affect” — letting the good feelings flow — the vmPFC was shown to be working harder, which the researchers say could be used to predict how much value a person is putting on something. But when the subjects dampened their emotional responses to positive images, the vmPFC activation diminished, as if the images were less valuable to the subjects.
"This changes our frame of reference for thinking about these things," Huettel said. He said advertisers have long been using emotional appeals to get people to value their products, "but they didn’t know why it worked."
Previous studies had focused only on reappraisal of negative emotions, but this time around the Duke scientists wanted to watch people reappraise both negative and positive responses. “We have kind of a skewed picture because this has only been done on the negative,” Winecoff said.
"It’s not the case that you never want to reappraise a positive emotion," said Huettel. But when buying a house or a car, it’s a good idea to dampen your infatuation down a bit, he added.
Babies can read each other’s moods
Although it may seem difficult for adults to understand what an infant is feeling, a new study from Brigham Young University finds that it’s so easy a baby could do it.
Psychology professor Ross Flom’s study, published in the academic journal Infancy, shows that infants can recognize each other’s emotions by five months of age. This study comes on the heels of other significant research by Flom on infants’ ability to understand the moods of dogs, monkeys and classical music.
“Newborns can’t verbalize to their mom or dad that they are hungry or tired, so the first way they communicate is through affect or emotion,” says Flom. “Thus it is not surprising that in early development, infants learn to discriminate changes in affect.”
Infants can match emotions in unfamiliar adults at seven months and in familiar adults at six months. In order to test infants’ perception of their peers’ emotions, Flom and his team of researchers tested a baby’s ability to match emotional infant vocalizations with a paired infant facial expression.
“We found that five-month-old infants can match their peers’ positive and negative vocalizations with the appropriate facial expression,” says Flom. “This is the first study to show a matching ability with an infant this young. They are exposed to affect in a peer’s voice and face, which is likely more familiar to them because it’s how they themselves convey or communicate positive and negative emotions.”
In the study, infants were seated in front of two monitors. One of the monitors displayed video of a happy, smiling baby while the other monitor displayed video of a second sad, frowning baby. When audio was played of a third happy baby, the infant participating in the study looked longer to the video of the baby with positive facial expressions. The infant also was able to match negative vocalizations with video of the sad frowning baby. The audio recordings were from a third baby and not in sync with the lip movements of the babies in either video.
“These findings add to our understanding of early infant development by reiterating the fact that babies are highly sensitive to and comprehend some level of emotion,” says Flom. “Babies learn more in their first 2 1/2 years of life than they do the rest of their lifespan, making it critical to examine how and what young infants learn and how this helps them learn other things.”
Flom co-authored the study of 40 infants from Utah and Florida with Professor Lorraine Bahrick from Florida International University.
Flom’s next step in studying infant perception is to run the experiments with a twist: test whether babies could do this at even younger ages if instead they were watching and hearing clips of themselves.
And while the talking twin babies in this popular YouTube clip are older, it’s still a lot of fun to watch them babble at each other.

Researchers Identify Emotions Based on Brain Activity
For the first time, scientists at Carnegie Mellon University have identified which emotion a person is experiencing based on brain activity.
The study, published in the June 19 issue of PLOS ONE, combines functional magnetic resonance imaging (fMRI) and machine learning to measure brain signals and accurately read emotions in individuals. Led by researchers in CMU’s Dietrich College of Humanities and Social Sciences, the findings illustrate how the brain categorizes feelings, giving researchers the first reliable process to analyze emotions. Until now, research on emotions has long been stymied by the lack of reliable methods to evaluate them, mostly because people are often reluctant to honestly report their feelings. Further complicating matters is that many emotional responses may not be consciously experienced.
Identifying emotions based on neural activity builds on previous discoveries by CMU’s Marcel Just and Tom M. Mitchell, which used similar techniques to create a computational model that identifies individuals’ thoughts of concrete objects, often dubbed “mind reading.”
“This research introduces a new method with potential to identify emotions without relying on people’s ability to self-report,” said Karim Kassam, assistant professor of social and decision sciences and lead author of the study. “It could be used to assess an individual’s emotional response to almost any kind of stimulus, for example, a flag, a brand name or a political candidate.”
One challenge for the research team was to find a way to repeatedly and reliably evoke different emotional states from the participants. Traditional approaches, such as showing subjects emotion-inducing film clips, would likely have been unsuccessful because the impact of film clips diminishes with repeated display. The researchers solved the problem by recruiting actors from CMU’s School of Drama.
“Our big breakthrough was my colleague Karim Kassam’s idea of testing actors, who are experienced at cycling through emotional states. We were fortunate, in that respect, that CMU has a superb drama school,” said George Loewenstein, the Herbert A. Simon University Professor of Economics and Psychology.
For the study, 10 actors were scanned at CMU’s Scientific Imaging & Brain Research Center while viewing the words of nine emotions: anger, disgust, envy, fear, happiness, lust, pride, sadness and shame. While inside the fMRI scanner, the actors were instructed to enter each of these emotional states multiple times, in random order.
Another challenge was to ensure that the technique was measuring emotions per se, and not the act of trying to induce an emotion in oneself. To meet this challenge, a second phase of the study presented participants with neutral and disgusting photos that they had not seen before. The computer model, constructed by statistically analyzing the fMRI activation patterns gathered for the 18 emotional words, had learned the emotion patterns from self-induced emotions. It was nevertheless able to correctly identify the emotional content of the photos being viewed using the brain activity of the viewers.
To identify emotions within the brain, the researchers first used the participants’ neural activation patterns in early scans to identify the emotions experienced by the same participants in later scans. The computer model achieved a rank accuracy of 0.84. Rank accuracy refers to the percentile rank of the correct emotion in an ordered list of the computer model guesses; random guessing would result in a rank accuracy of 0.50.
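The rank-accuracy metric described above is simple to compute: it is the (normalized) percentile position of the correct emotion in the model’s ordered list of guesses. The sketch below is illustrative only; the emotion scores are invented, not the study’s actual model outputs.

```python
def rank_accuracy(scores, correct):
    """Percentile rank of the correct label among the model's ordered guesses.

    scores: dict mapping each candidate emotion to a model score
            (higher = more likely); correct: the true emotion label.
    Returns 1.0 when the correct label is the top guess, 0.0 when it is
    ranked last; uniform random guessing averages 0.5.
    """
    ranked = sorted(scores, key=scores.get, reverse=True)
    r = ranked.index(correct)                       # 0-based rank of the true label
    return (len(ranked) - 1 - r) / (len(ranked) - 1)

# Invented scores over the study's nine emotion labels:
scores = {"anger": 0.20, "disgust": 0.90, "envy": 0.10, "fear": 0.30,
          "happiness": 0.40, "lust": 0.05, "pride": 0.15,
          "sadness": 0.25, "shame": 0.08}
print(rank_accuracy(scores, "disgust"))  # top guess -> 1.0
```

Averaging this quantity over many trials gives the 0.84 and 0.91 figures the press release reports; chance level is 0.50 regardless of the number of candidate emotions.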
Next, the team took the machine learning analysis of the self-induced emotions to guess which emotion the subjects were experiencing when they were exposed to the disgusting photographs. The computer model achieved a rank accuracy of 0.91. With nine emotions to choose from, the model listed disgust as the most likely emotion 60 percent of the time and as one of its top two guesses 80 percent of the time.
Finally, they applied machine learning analysis of neural activation patterns from all but one of the participants to predict the emotions experienced by the hold-out participant. This answers an important question: If we took a new individual, put them in the scanner and exposed them to an emotional stimulus, how accurately could we identify their emotional reaction? Here, the model achieved a rank accuracy of 0.71, once again well above the chance guessing level of 0.50.
“Despite manifest differences between people’s psychology, different people tend to neurally encode emotions in remarkably similar ways,” noted Amanda Markey, a graduate student in the Department of Social and Decision Sciences.
A surprising finding from the research was that almost equivalent accuracy levels could be achieved even when the computer model made use of activation patterns in only one of a number of different subsections of the human brain.
“This suggests that emotion signatures aren’t limited to specific brain regions, such as the amygdala, but produce characteristic patterns throughout a number of brain regions,” said Vladimir Cherkassky, senior research programmer in the Psychology Department.
The research team also found that while on average the model ranked the correct emotion highest among its guesses, it was best at identifying happiness and least accurate in identifying envy. It rarely confused positive and negative emotions, suggesting that these have distinct neural signatures. And, it was least likely to misidentify lust as any other emotion, suggesting that lust produces a pattern of neural activity that is distinct from all other emotional experiences.
Just, the D.O. Hebb University Professor of Psychology, director of the university’s Center for Cognitive Brain Imaging and a leading neuroscientist, explained, “We found that three main organizing factors underpinned the emotion neural signatures, namely the positive or negative valence of the emotion, its intensity — mild or strong, and its sociality — involvement or non-involvement of another person. This is how emotions are organized in the brain.”
In the future, the researchers plan to apply this new identification method to a number of challenging problems in emotion research, including identifying emotions that individuals are actively attempting to suppress and multiple emotions experienced simultaneously, such as the combination of joy and envy one might experience upon hearing about a friend’s good fortune.
Insomnia may cause dysfunction in emotional brain circuitry
A new study provides neurobiological evidence for dysfunction in the neural circuitry underlying emotion regulation in people with insomnia, which may have implications for the risk relationship between insomnia and depression.
“Insomnia has been consistently identified as a risk factor for depression,” said lead author Peter Franzen, PhD, an assistant professor of psychiatry at the University of Pittsburgh School of Medicine. “Alterations in the brain circuitry underlying emotion regulation may be involved in the pathway for depression, and these results suggest a mechanistic role for sleep disturbance in the development of psychiatric disorders.”
The study involved 14 individuals with chronic primary insomnia without other primary psychiatric disorders, as well as 30 good sleepers who served as a control group. Participants underwent an fMRI scan during an emotion regulation task in which they were shown negative or neutral pictures. They were asked to passively view the images or to decrease their emotional responses using cognitive reappraisal, a voluntary emotion regulation strategy in which one reinterprets the meaning of the picture in order to feel less negative.
Results show that in the primary insomnia group, amygdala activity was significantly higher during reappraisal than during passive viewing. Located in the temporal lobe of the brain, the amygdala plays an important role in emotional processing and regulation.
In analysis between groups, amygdala activity during reappraisal trials was significantly greater in the primary insomnia group compared with good sleepers. The two groups did not significantly differ when passively viewing negative pictures.
“Previous studies have demonstrated that successful emotion regulation using reappraisal decreases amygdala response in healthy individuals, yet we were surprised that activity was even higher during reappraisal of, versus passive viewing of, pictures with negative emotional content in this sample of individuals with primary insomnia,” said Franzen.
The research abstract was published recently in an online supplement of the journal SLEEP, and Franzen will present the findings Wednesday, June 5, in Baltimore, Md., at SLEEP 2013, the 27th annual meeting of the Associated Professional Sleep Societies LLC.
The American Academy of Sleep Medicine reports that about 10 to 15 percent of adults have an insomnia disorder with distress or daytime impairment. According to the National Institute of Mental Health, 6.7 percent of the U.S. adult population suffers from major depressive disorder. Both insomnia and depression are more common in women than in men.

Bach to the blues, our emotions match music to colors
Whether we’re listening to Bach or the blues, our brains are wired to make music-color connections depending on how the melodies make us feel, according to new research from the University of California, Berkeley. For instance, Mozart’s jaunty Flute Concerto No. 1 in G major is most often associated with bright yellow and orange, whereas his dour Requiem in D minor is more likely to be linked to dark, bluish gray.
Moreover, people in both the United States and Mexico linked the same pieces of classical orchestral music with the same colors. This suggests that humans share a common emotional palette – when it comes to music and color – that appears to be intuitive and can cross cultural barriers, UC Berkeley researchers said.
“The results were remarkably strong and consistent across individuals and cultures and clearly pointed to the powerful role that emotions play in how the human brain maps from hearing music to seeing colors,” said UC Berkeley vision scientist Stephen Palmer, lead author of a paper published this week in the journal Proceedings of the National Academy of Sciences.
Using a 37-color palette, the UC Berkeley study found that people tend to pair faster-paced music in a major key with lighter, more vivid, yellow colors, whereas slower-paced music in a minor key is more likely to be teamed up with darker, grayer, bluer colors.
“Surprisingly, we can predict with 95 percent accuracy how happy or sad the colors people pick will be based on how happy or sad the music is that they are listening to,” said Palmer, who will present these and related findings at the International Association of Colour conference at the University of Newcastle in the U.K. on July 8. At the conference, a color light show will accompany a performance by the Northern Sinfonia orchestra to demonstrate “the patterns aroused by music and color converging on the neural circuits that register emotion,” he said.
The findings may have implications for creative therapies, advertising and even music player gadgetry. For example, they could be used to create more emotionally engaging electronic music visualizers, computer software that generates animated imagery synchronized to the music being played. Right now, the colors and patterns appear to be randomly generated and do not take emotion into account, researchers said.
They may also provide insight into synesthesia, a neurological condition in which the stimulation of one perceptual pathway, such as hearing music, leads to automatic, involuntary experiences in a different perceptual pathway, such as seeing colors. An example of sound-to-color synesthesia was portrayed in the 2009 movie The Soloist, when cellist Nathaniel Ayers experiences a mesmerizing interplay of swirling colors while listening to the Los Angeles symphony. Artists such as Wassily Kandinsky and Paul Klee may have used music-to-color synesthesia in their creative endeavors.
Nearly 100 men and women participated in the UC Berkeley music-color study, of which half resided in the San Francisco Bay Area and the other half in Guadalajara, Mexico. In three experiments, they listened to 18 classical music pieces by composers Johann Sebastian Bach, Wolfgang Amadeus Mozart and Johannes Brahms that varied in tempo (slow, medium, fast) and in major versus minor keys.
In the first experiment, participants were asked to pick five of the 37 colors that best matched the music to which they were listening. The palette consisted of vivid, light, medium, and dark shades of red, orange, yellow, yellow-green, green, blue-green, blue, and purple.
Participants consistently picked bright, vivid, warm colors to go with upbeat music and dark, dull, cool colors to match the more tearful or somber pieces. Separately, they rated each piece of music on a scale of happy to sad, strong to weak, lively to dreary and angry to calm.
Two subsequent experiments studying music-to-face and face-to-color associations supported the researchers’ hypothesis that “common emotions are responsible for music-to-color associations,” said Karen Schloss, a postdoctoral researcher at UC Berkeley and co-author of the paper.
For example, the same pattern occurred when participants chose the facial expressions that “went best” with the music selections, Schloss said. Upbeat music in major keys was consistently paired with happy-looking faces while subdued music in minor keys was paired with sad-looking faces. Similarly, happy faces were paired with yellow and other bright colors and angry faces with dark red hues.
Next, Palmer and his research team plan to study participants in Turkey where traditional music employs a wider range of scales than just major and minor. “We know that in Mexico and the U.S. the responses are very similar,” he said. “But we don’t yet know about China or Turkey.”
To suppress or to explore? Emotional strategy may influence anxiety
When trouble approaches, what do you do? Run for the hills? Hide? Pretend it isn’t there? Or do you focus on the promise of rain in those looming dark clouds?
New research suggests that the way you regulate your emotions, in bad times and in good, can influence whether – or how much – you suffer from anxiety.
The study appears in the journal Emotion.
In a series of questionnaires, researchers asked 179 healthy men and women how they managed their emotions and how anxious they felt in various situations. The team analyzed the results to see if different emotional strategies were associated with more or less anxiety.
The study revealed that those who engage in an emotional regulation strategy called reappraisal tended to also have less social anxiety and less anxiety in general than those who avoid expressing their feelings. Reappraisal involves looking at a problem in a new way, said University of Illinois graduate student Nicole Llewellyn, who led the research with psychology professor Florin Dolcos, an affiliate of the Beckman Institute at Illinois.
"When something happens, you think about it in a more positive light, a glass half full instead of half empty," Llewellyn said. "You sort of reframe and reappraise what’s happened and think what are the positives about this? What are the ways I can look at this and think of it as a stimulating challenge rather than a problem?"
Study participants who regularly used this approach reported less severe anxiety than those who tended to suppress their emotions.
Anxiety disorders are a major public health problem in the U.S. According to the National Institute of Mental Health, roughly 18 percent of the U.S. adult population is afflicted with general or social anxiety that is so intense that it warrants a diagnosis.
"The World Health Organization predicts that by 2020, anxiety and depression – which tend to co-occur – will be among the most prevalent causes of disability worldwide, secondary only to cardiovascular disease," Dolcos said. "So it’s associated with big costs."
Not all anxiety is bad, however, he said. Low-level anxiety may help you maintain the kind of focus that gets things done. Suppressing or putting a lid on your emotions also can be a good strategy in a short-term situation, such as when your boss yells at you, Dolcos said. Similarly, an always-positive attitude can be dangerous, causing a person to ignore health problems, for example, or to engage in risky behavior.
Previous studies had found that people who were temperamentally inclined to focus on making good things happen were less likely to suffer from anxiety than those who focused on preventing bad things from happening, Llewellyn said. But she could find no earlier research that explained how this difference in focus translated to behaviors that people could change. The new study appears to explain the strategies that contribute to a person having more or less anxiety, she said.
"This is something you can change," she said. "You can’t do much to affect the genetic or environmental factors that contribute to anxiety. But you can change your emotion regulation strategies."
Men are traditionally thought to have more problems in understanding women than in understanding other men, though evidence supporting this assumption remains sparse. Recently, however, it has been shown that men’s problems in recognizing women’s emotions could be linked to difficulties in extracting the relevant information from the eye region, which remains one of the richest sources of social information for the attribution of mental states to others. To determine possible differences in the neural correlates underlying emotion recognition from female, as compared to male, eyes, a modified version of the Reading the Mind in the Eyes Test in combination with functional magnetic resonance imaging (fMRI) was applied to a sample of 22 participants. We found that men actually had twice as many problems in recognizing emotions from female as compared to male eyes, and that these problems were particularly associated with a lack of activation in limbic regions of the brain (including the hippocampus and the rostral anterior cingulate cortex). Moreover, men revealed heightened activation of the right amygdala to male stimuli regardless of condition (sex vs. emotion recognition). Thus, our findings highlight the function of the amygdala in the affective component of theory of mind (ToM) and in empathy, and provide further evidence that men are substantially less able to infer mental states expressed by women, which may be accompanied by sex-specific differences in amygdala activity.
New research shows how our bodies interact with our minds in response to fear and other emotions
New research has shown that the way our minds react to and process emotions such as fear can vary according to what is happening in other parts of our bodies.
In two different presentations today (Monday) at the British Neuroscience Association Festival of Neuroscience (BNA2013) in London, researchers have shown for the first time that the heart’s cycle affects the way we process fear, and that a part of the brain that responds to stimuli, such as touch, felt by other parts of the body also plays a role.
Dr Sarah Garfinkel, a postdoctoral fellow at the Brighton and Sussex Medical School (Brighton, UK), told a news briefing: “Cognitive neuroscience strives to understand how biological processes interact to create and influence the conscious mind. While neural activity in the brain is typically the focus of research, there is a growing appreciation that other bodily organs interact with brain function to shape and influence our perceptions, cognitions and emotions.
“We demonstrate for the first time that the way in which we process fear is different dependent on when we see fearful images in relation to our heart.”
Dr Garfinkel and her colleagues hooked up 20 healthy volunteers to heart monitors, which were linked to computers. Images of fearful faces were shown on the computers and the electrocardiography (ECG) monitors were able to communicate with the computers in order to time the presentation of the faces with specific points in the heart’s cycle.
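The core of such a setup is the timing logic: given R-peak times detected from the ECG, each face is scheduled either shortly after the R-wave (systole, heart contracting) or later in the cycle (diastole, heart relaxed). The sketch below is a minimal illustration; the offset values are invented placeholders, not the study’s actual timing parameters.

```python
def presentation_times(r_peaks, phase, systole_offset=0.3, diastole_offset=0.6):
    """Return stimulus onset times locked to the cardiac cycle.

    r_peaks: sorted times (in seconds) of detected ECG R-waves.
    phase:   'systole' for onsets soon after the R-wave, or 'diastole'
             for onsets later in the cycle.
    The offsets are illustrative assumptions, not the study's values.
    """
    offset = systole_offset if phase == "systole" else diastole_offset
    # round to milliseconds to avoid floating-point noise in the schedule
    return [round(t + offset, 3) for t in r_peaks]

peaks = [0.0, 0.8, 1.6, 2.4]                 # R-waves at ~75 bpm
print(presentation_times(peaks, "systole"))  # [0.3, 1.1, 1.9, 2.7]
```

In practice the ECG would be monitored in real time and each trial triggered on the fly from the most recent R-peak, but the phase-locking idea is the same.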
“Our results show that if we see a fearful face during systole (when the heart is pumping) then we judge this fearful face as more intense than if we see the very same fearful face during diastole (when the heart is relaxed). To look at neural activity underlying this effect, we performed this experiment in an MRI [magnetic resonance imaging] scanner and demonstrated that a part of the brain called the amygdala influences how our heart changes our perception of fear.
“From previous research, we know that if we present images very fast then we have trouble detecting them, but if an image is particularly emotional then it can ‘pop’ out and be seen. In a second experiment, we exploited our cardiac effect on emotion to show that our conscious experience is affected by our heart. We demonstrated that fearful faces are better detected at systole (when they are perceived as more fearful), relative to diastole. Thus our hearts can also affect what we see and what we don’t see – and can guide whether we see fear.
“Lastly, we have demonstrated that the degree to which our hearts can change the way we see and process fear is influenced by how anxious we are. The anxiety level of our individual subjects altered the extent their hearts could change the way they perceived emotional faces and also altered neural circuitry underlying heart modulation of emotion.”
Dr Garfinkel says that her findings might have the potential to help people who suffer from anxiety or other conditions such as post-traumatic stress disorder (PTSD).
“We have identified an important mechanism by which the heart and brain ‘speak’ to each other to change our emotions and reduce fear. We hope to explore the therapeutic implications in people with high anxiety. Anxiety disorders can be debilitating and are very prevalent in the UK and elsewhere. We hope that by increasing our understanding about how fear is processed and ways that it could be reduced, we may be able to develop more successful treatments for these people, and also for those, such as war veterans, who may be suffering from PTSD.
“In addition, there is a growing appreciation about how different forms of meditation can have therapeutic consequences. Work that integrates body, brain and mind to understand changes in emotion can help us understand how meditation and mindfulness practices can have calming effects.”
In a second presentation, Dr Alejandra Sel, a postdoctoral researcher in the Department of Psychology at City University (London, UK), investigated a part of the brain called the somatosensory cortex – the area that perceives bodily sensations, such as touch, pain, body temperature and the perception of the body’s place in space, and which is activated when we observe emotional expressions in the faces of other people.
“In order to understand other people’s emotions, we need to experience the same observed emotions in our body. Specifically, observing an emotional face, as opposed to a neutral face, is associated with increased activity in the somatosensory cortex, as if we were expressing and experiencing our own emotions. It is also known that people with damage to the somatosensory cortex find it difficult to recognise emotion in other people’s faces,” Dr Sel told the news briefing.
However, until now, it has not been clear whether activity in the somatosensory cortex was simply a by-product of the way we process visual information, or whether it reacts independently to emotions expressed in other people’s faces, actively contributing to how we perceive emotions in others.
In order to discover whether the somatosensory cortex contributes to the processing of emotion independently of any visual processes, Dr Sel and her colleagues tested two situations on volunteers, using electroencephalography (EEG) to measure the brain’s response. In the first, they showed participants either a fearful (emotional) face or a neutral face. In the second, they followed the face immediately with a small tap to an index finger or the left cheek.
Dr Sel said: “By tapping someone’s cheek or finger you can modify the ‘resting state’ of the somatosensory cortex, inducing changes in brain electrical activity in this area. These changes are measurable and observable with EEG, and this enables us to pinpoint the brain activity that is specifically related to the somatosensory cortex and its reaction to external stimuli.
“If the ‘resting state’ of the somatosensory cortex has greater electrical activity when a fearful face is shown than when a neutral face is shown, then the changes in the activity of the somatosensory cortex induced by the taps and measured by EEG will also be greater when observing fearful as opposed to neutral faces.
“We subtracted results of the first situation (face only) from the second situation (face and tap), and compared changes in the activity related with the tap in the somatosensory cortex when seeing emotional faces versus neutral faces. This way, we could observe responses of the somatosensory cortex to emotional faces independently of visual processes,” she explained.
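The subtraction logic Dr Sel describes can be sketched in a few lines. The arrays below are randomly generated stand-ins, not the study’s data, and the condition names are illustrative; the point is only the order of operations: tap-evoked activity is isolated per emotion by subtracting the face-only condition, and the two tap responses are then contrasted.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_samples = 40, 200  # trials x EEG samples per epoch

# Illustrative trial-averaged epochs (ERPs) for the four conditions
erp = {cond: rng.normal(size=(n_trials, n_samples)).mean(axis=0)
       for cond in ["fear_face", "fear_face_tap",
                    "neutral_face", "neutral_face_tap"]}

# Tap-evoked (somatosensory) response, with visual activity subtracted out
tap_fear = erp["fear_face_tap"] - erp["fear_face"]
tap_neutral = erp["neutral_face_tap"] - erp["neutral_face"]

# The contrast of interest: does the tap response differ by emotion?
emotion_effect = tap_fear - tap_neutral
print(emotion_effect.shape)
```

In the actual study this contrast would be computed per electrode and time point and then tested statistically; here a single simulated channel illustrates the double subtraction.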
The researchers found that there was enhanced activity in the somatosensory cortex in response to fearful faces in comparison to neutral faces, independent of any visual processes. Importantly, this activity was focused in the primary and secondary somatosensory areas; the primary area receives sensory information directly from the body, while the secondary area combines sensory information from the body with information related to body movement and other information, such as memories of previous sensory experiences.
“Our experimental approach allows us to isolate and show for the first time (as far as we are aware) changes in somatosensory activity when seeing emotional faces after taking away all visual information in the brain. We have shown the crucial role of the somatosensory cortex in the way our minds and bodies perceive human emotions. These findings can serve as a starting point for developing interventions tailored for people with problems in recognising others’ emotions, such as autistic children,” said Dr Sel.
The researchers now plan to investigate whether they get similar results when people are shown faces with other expressions such as happy or angry, and whether the timing of the physical stimulus, the tap to the finger or cheek, makes any difference. In this experiment, the tap occurred 105 milliseconds after a face was shown, and Dr Sel wonders about the effect of a longer time interval.
Human Emotion: We Report Our Feelings in 3-D
Like it or not, and despite the debate surrounding its merits, 3-D is the technology du jour for movie-making in Hollywood. It now turns out that even our brains use three dimensions to communicate emotions.
According to a new study published in Biological Psychiatry, the human report of emotion relies on three distinct systems: one system that directs attention to affective states (“I feel”); a second system that categorizes these states into words (“good”, “bad”, etc.); and a third system that rates the intensity of affective responses (“bad” or “awful”?).
Emotions are central to the human experience. Whether we are feeling happy, sad, afraid, or angry, we are often asked to identify and report on these feelings. This happens when friends ask us how we are doing, when we talk about professional or personal relationships, when we meditate, and so on. In fact, the very commonness and ease of reporting what we are feeling can lead us to overlook just how important such reports are - and how devastating the impairment of this ability may be for individuals with clinical disorders ranging from major depression to schizophrenia to autism spectrum disorders.
Progress in brain science has steadily been shedding light on the circuits and processes that underlie mood states. One of the leaders in this effort, Dr. Kevin Ochsner, Director of the Social Cognitive Neuroscience Lab at Columbia University, studies the neural bases of social, cognitive and affective processes. In this new study, he and his team set out to study the processes involved in constructing self-reports of emotion, rather than the effects of the self-reports or the emotional states themselves, for which there is already much research.
To accomplish this, they recruited healthy participants who underwent brain scans while completing an experimental task that generated a self-report of emotion. This effort allowed the researchers to examine the neural architecture underlying the emotional reports.
“We find that the seemingly simple ability is supported by three different kinds of brain systems: largely subcortical regions that trigger an initial affective response, parts of medial prefrontal cortex that focus our awareness on the response and help generate possible ways of describing what we are feeling, and a part of the lateral prefrontal cortex that helps pick the best words for the feelings at hand,” said Ochsner.
“These findings suggest that self-reports of emotion - while seemingly simple - are supported by a network of brain regions that together take us from an affecting event to the words that make our feelings known to ourselves and others,” he added. “As such, these results have important implications for understanding both the nature of everyday emotional life - and how the ability to understand and talk about our emotions can break down in clinical populations.”
Dr. John Krystal, Editor of Biological Psychiatry, said, “It is critical that we understand the mechanisms underlying the absorption in emotion, the valence of emotion, and the intensity of emotion. In the short run, appreciation of the distinct circuits mediating these dimensions of emotional experience helps us to understand how brain injury, stroke, and tumors produce different types of mood changes. In the long run, it may help us to better treat mood disorders.”