Neuroscience

Articles and news from the latest research reports.

Posts tagged emotion

91 notes

In Old Age, Lack of Emotion and Interest May Signal Your Brain Is Shrinking

Older people who have apathy but not depression may have smaller brain volumes than those without apathy, according to a new study published in the April 16, 2014, online issue of Neurology®, the medical journal of the American Academy of Neurology. Apathy is a lack of interest or emotion.

“Just as signs of memory loss may signal brain changes related to brain disease, apathy may indicate underlying changes,” said Lenore J. Launer, PhD, with the National Institute on Aging at the National Institutes of Health (NIH) in Bethesda, MD, and a member of the American Academy of Neurology. “Apathy symptoms are common in older people without dementia. And the fact that participants in our study had apathy without depression should turn our attention to how apathy alone could indicate brain disease.”

Launer’s team used brain volume as a measure of accelerated brain aging. Brain volume losses occur during normal aging, but in this study, larger amounts of brain volume loss could indicate brain diseases.

For the study, 4,354 people without dementia and with an average age of 76 underwent an MRI scan. They were also asked questions that measure apathy symptoms, which include lack of interest, lack of emotion, dropping activities and interests, preferring to stay at home and having a lack of energy.

The study found that people with two or more apathy symptoms had 1.4 percent smaller gray matter volume and 1.6 percent less white matter volume compared to those who had fewer than two symptoms of apathy. Excluding people with depression symptoms did not change the results.
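
The comparison above is a straightforward percent difference between group-mean volumes. A minimal sketch, using hypothetical volumes chosen only so the arithmetic reproduces the reported percentages (these are not the study's measurements):

```python
def percent_smaller(reference, observed):
    """Percent by which the observed volume is smaller than the reference."""
    return 100.0 * (reference - observed) / reference

# Hypothetical group-mean volumes in cm^3, illustrative only
gm_low_apathy, gm_high_apathy = 600.0, 591.6  # gray matter
wm_low_apathy, wm_high_apathy = 500.0, 492.0  # white matter

print(round(percent_smaller(gm_low_apathy, gm_high_apathy), 1))  # 1.4
print(round(percent_smaller(wm_low_apathy, wm_high_apathy), 1))  # 1.6
```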

Gray matter is where learning takes place and memories are stored in the brain. White matter acts as the communication cables that connect different parts of the brain.

“If these findings are confirmed, identifying people with apathy earlier may be one way to target an at-risk group,” Launer said.

Filed under apathy emotion aging gray matter white matter brain structure neuroimaging neuroscience science

459 notes

New Study Suggests a Better Way to Deal with Bad Memories

What’s one of your worst memories? How did it make you feel? According to psychologists, remembering the emotions felt during a negative personal experience, such as how sad you were or how embarrassed you felt, can lead to emotional distress, especially when you can’t stop thinking about it. 

When these negative memories creep up, thinking about the context of the memories, rather than how you felt, is a relatively easy and effective way to alleviate the negative effects of these memories, a new study suggests.

Researchers at the Beckman Institute at the University of Illinois, led by psychology professor Florin Dolcos of the Cognitive Neuroscience Group, studied the behavioral and neural mechanisms of focusing away from emotion during recollection of personal emotional memories, and found that thinking about the contextual elements of the memories significantly reduced their emotional impact.

“Sometimes we dwell on how sad, embarrassed, or hurt we felt during an event, and that makes us feel worse and worse. This is what happens in clinical depression—ruminating on the negative aspects of a memory,” Dolcos said. “But we found that instead of thinking about your emotions during a negative memory, looking away from the worst emotions and thinking about the context, like a friend who was there, what the weather was like, or anything else non-emotional that was part of the memory, will rather effortlessly take your mind away from the unwanted emotions associated with that memory. Once you immerse yourself in other details, your mind will wander to something else entirely, and you won’t be focused on the negative emotions as much.”

This simple strategy, the study suggests, is a promising alternative to other emotion-regulation strategies, like suppression or reappraisal. 

“Suppression is bottling up your emotions, trying to put them away in a box. This is a strategy that can be effective in the short term, but in the long run, it increases anxiety and depression,” explains Sanda Dolcos, co-author on the study and postdoctoral research associate at the Beckman Institute and in the Department of Psychology. 

“Another otherwise effective emotion regulation strategy, reappraisal, or looking at the situation differently to see the glass half full, can be cognitively demanding. The strategy of focusing on non-emotional contextual details of a memory, on the other hand, is as simple as shifting the focus in the mental movie of your memories and then letting your mind wander.”

Not only does this strategy allow for effective short-term emotion regulation, but it may also lessen the severity of a negative memory with prolonged use.

In the study, participants were asked to share their most emotional negative and positive memories, such as the birth of a child, winning an award, or failing an exam, explained Sanda Dolcos. Several weeks later participants were given cues that would trigger their memories while their brains were being scanned using magnetic resonance imaging (MRI). Before each memory cue, the participants were asked to remember each event by focusing on either the emotion surrounding the event or the context. For example, if the cue triggered a memory of a close friend’s funeral, thinking about the emotional context could consist of remembering your grief during the event. If you were asked to remember contextual elements, you might instead remember what outfit you wore or what you ate that day.

“Neurologically, we wanted to know what happened in the brain when people were using this simple emotion-regulation strategy to deal with negative memories or enhance the impact of positive memories,” explained Ekaterina Denkova, first author of the report. “One thing we found is that when participants were focused on the context of the event, brain regions involved in basic emotion processing were working together with emotion control regions in order to, in the end, reduce the emotional impact of these memories.” 

Using this strategy promotes healthy functioning not only by reducing the negative impact of remembering unwanted memories, but also by increasing the positive impact of cherished memories, Florin Dolcos said. 

In the future, the researchers hope to determine if this strategy is effective in lessening the severity of negative memories over the long term. They also hope to work with clinically depressed or anxious participants to see if this strategy is effective in alleviating these psychiatric conditions. 

These results were published in Social Cognitive and Affective Neuroscience.

(Source: beckman.illinois.edu)

Filed under suppression prefrontal cortex memories autobiographical memory emotion regulation emotion psychology neuroscience science

316 notes

Computers See Through Faked Expressions of Pain Better Than People

A joint study by researchers at the University of California, San Diego and the University of Toronto has found that a computer system distinguishes real from faked expressions of pain more accurately than people can.

The work, titled “Automatic Decoding of Deceptive Pain Expressions,” is published in the latest issue of Current Biology.

“The computer system managed to detect distinctive dynamic features of facial expressions that people missed,” said Marian Bartlett, research professor at UC San Diego’s Institute for Neural Computation and lead author of the study. “Human observers just aren’t very good at telling real from faked expressions of pain.”

Senior author Kang Lee, professor at the Dr. Eric Jackman Institute of Child Study at the University of Toronto, said, “Humans can simulate facial expressions and fake emotions well enough to deceive most observers. The computer’s pattern-recognition abilities prove better at telling whether pain is real or faked.”

The research team found that humans could not discriminate real from faked expressions of pain better than random chance – and, even after training, only improved accuracy to a modest 55 percent. The computer system attained 85 percent accuracy.

“In highly social species such as humans,” said Lee, “faces have evolved to convey rich information, including expressions of emotion and pain. And, because of the way our brains are built, people can simulate emotions they’re not actually experiencing – so successfully that they fool other people. The computer is much better at spotting the subtle differences between involuntary and voluntary facial movements.”

“By revealing the dynamics of facial action through machine vision systems,” said Bartlett, “our approach has the potential to elucidate ‘behavioral fingerprints’ of the neural-control systems involved in emotional signaling.”

The single most predictive feature of falsified expressions, the study shows, is the mouth, and how and when it opens. Fakers’ mouths open with less variation and too regularly.

“Further investigations,” said the researchers, “will explore whether over-regularity is a general feature of fake expressions.”

In addition to detecting pain malingering, the computer-vision system might be used to detect other real-world deceptive actions in the realms of homeland security, psychopathology, job screening, medicine, and law, said Bartlett.

“As with causes of pain, these scenarios also generate strong emotions, along with attempts to minimize, mask, and fake such emotions, which may involve ‘dual control’ of the face,” she said. “In addition, our computer-vision system can be applied to detect states in which the human face may provide important clues as to health, physiology, emotion, or thought, such as drivers’ expressions of sleepiness, students’ expressions of attention and comprehension of lectures, or responses to treatment of affective disorders.”
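
The mouth-variability finding lends itself to a toy illustration. The sketch below is not the study's actual computer-vision system; it assumes a single hypothetical feature (the spread of mouth-opening durations) and an illustrative threshold, mirroring the idea that faked expressions are too regular:

```python
import statistics

def classify_pain_expression(mouth_open_durations, variability_threshold=0.15):
    """Toy classifier inspired by the study's key finding: faked pain
    expressions show mouth openings with less variation and overly regular
    timing. Durations are in seconds; the threshold is illustrative."""
    if len(mouth_open_durations) < 2:
        return "insufficient data"
    spread = statistics.stdev(mouth_open_durations)
    return "faked" if spread < variability_threshold else "genuine"

# Irregular, variable openings -> consistent with genuine pain
print(classify_pain_expression([0.4, 1.1, 0.3, 0.9, 0.5]))    # genuine
# Nearly metronomic openings -> flagged as faked
print(classify_pain_expression([0.6, 0.62, 0.61, 0.6, 0.63]))  # faked
```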


Filed under pain emotion facial expressions computer-vision system psychology neuroscience science

147 notes

Musical brain-reading sheds light on neural processing of music

Finnish and Danish researchers have developed a new method that performs decoding, or brain-reading, during continuous listening to real music. Based on recorded brain responses, the method predicts how certain features related to tone color and rhythm of the music change over time, and recognizes which piece of music is being listened to. The method also allows pinpointing the areas in the brain that are most crucial for the processing of music. The study was published in the journal NeuroImage.

Using functional magnetic resonance imaging (fMRI), the research team at the Finnish Centre of Excellence in Interdisciplinary Music Research at the Universities of Jyväskylä and Helsinki, and the Center for Functionally Integrative Neuroscience at Aarhus University, Denmark, recorded the brain responses of participants while they listened to a 16-minute excerpt of the Beatles’ album Abbey Road. They then used computational algorithms to extract a collection of musical features from the recording. Next, they employed a collection of machine-learning methods to train a computer model that predicts how the features of the music change over time. Finally, they developed a classifier that predicts which part of the music the participant was listening to at each moment.

The researchers found that most of the musical features included in the study could be reliably predicted from the brain data. They also found that the piece being listened to could be predicted significantly better than chance. However, fairly large differences in prediction accuracy were found between participants. An interesting finding was that areas outside the auditory cortex, including motor, limbic, and frontal areas, had to be included in the models to obtain reliable predictions, thus providing evidence for the important role of these areas in the processing of musical features.
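
A common way to implement such a decoder is correlation-based matching: pick the music segment whose model-predicted response best correlates with the observed brain response. A minimal sketch with made-up feature vectors (the study's actual models, features, and segment labels are assumptions here):

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation between two equal-length vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def decode_segment(brain_response, predicted_responses):
    """Return the segment whose predicted response best matches the
    observed brain response."""
    return max(predicted_responses,
               key=lambda seg: pearson(brain_response, predicted_responses[seg]))

# Hypothetical model-predicted responses for three segments of the excerpt
predictions = {
    "segment_A": [0.1, 0.9, 0.4, 0.7],
    "segment_B": [0.8, 0.2, 0.6, 0.1],
    "segment_C": [0.3, 0.3, 0.9, 0.2],
}
observed = [0.2, 0.8, 0.5, 0.6]
print(decode_segment(observed, predictions))  # segment_A
```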

"We believe that decoding provides a method that complements other existing methods to obtain more reliable information about the complex processing of music in the brain", says Professor Petri Toiviainen from the University of Jyväskylä. "Our results provide additional evidence for the important involvement of emotional and motor areas in music processing."

(Source: jyu.fi)

Filed under auditory cortex neuroimaging music emotion neuroscience science

271 notes

Do Patients in a Vegetative State Recognize Loved Ones?

TAU researchers find unresponsive patients’ brains may recognize photographs of their family and friends

Patients in a vegetative state are awake, breathe on their own, and seem to go in and out of sleep. But they do not respond to what is happening around them and exhibit no signs of conscious awareness. With communication impossible, friends and family are left wondering if the patients even know they are there.

Now, using functional magnetic resonance imaging (fMRI), Dr. Haggai Sharon and Dr. Yotam Pasternak of Tel Aviv University’s Functional Brain Center and Sackler Faculty of Medicine and the Tel Aviv Sourasky Medical Center have shown that the brains of patients in a vegetative state emotionally react to photographs of people they know personally as though they recognize them.

"We showed that patients in a vegetative state can react differently to different stimuli in the environment depending on their emotional value," said Dr. Sharon. "It’s not a generic thing; it’s personal and autobiographical. We engaged the person, the individual, inside the patient."

The findings, published in PLOS ONE, deepen our understanding of the vegetative state and may offer hope for better care and the development of novel treatments. Researchers from TAU’s School of Psychological Sciences, Department of Neurology, and Sagol School of Neuroscience and the Loewenstein Hospital in Ra’anana contributed to the research.

Talking to the brain

For many years, patients in a vegetative state were believed to have no awareness of self or environment. But in recent years, doctors have made use of fMRI to examine brain activity in such patients. They have found that some patients in a vegetative state can perform complex cognitive tasks on command, like imagining a physical activity such as playing tennis, or, in one case, even answering yes-or-no questions. But these cases are rare and don’t provide any indication as to whether patients are having personal emotional experiences in such a state.

To gain insight into “what it feels like to be in a vegetative state,” the researchers worked with four patients in a persistent (defined as “month-long”) or permanent (persisting for more than three months) vegetative state. They showed them photographs of people they did and did not personally know, then gauged the patients’ reactions using fMRI, which measures blood flow in the brain to detect areas of neurological activity in real time. In response to all the photographs, a region specific to facial recognition was activated in the patients’ brains, indicating that their brains had correctly identified that they were looking at faces.

But in response to the photographs of close family members and friends, brain regions involved in emotional significance and autobiographical information were also activated in the patients’ brains. In other words, the patients reacted with activations of brain centers involved in processing emotion, as though they knew the people in the photographs. The results suggest patients in a vegetative state can register and categorize complex visual information and connect it to memories – a groundbreaking finding.
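
The two-stage logic described here (face-selective activation signals that a face was registered; additional emotion/autobiographical activation suggests the face was familiar) can be captured in a toy decision rule. The function below is purely illustrative and is not the researchers' analysis:

```python
def interpret_activation(face_area_active, emotion_area_active):
    """Toy interpretation rule mirroring the reported pattern:
    face-selective activation indicates a face was registered, and
    added activation in emotion/autobiographical regions suggests
    the face was personally familiar."""
    if not face_area_active:
        return "no face response"
    return "familiar face" if emotion_area_active else "unfamiliar face"

print(interpret_activation(True, True))    # familiar face
print(interpret_activation(True, False))   # unfamiliar face
print(interpret_activation(False, False))  # no face response
```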

The ghost in the machine

However, the researchers could not be sure if the patients were conscious of their emotions or just reacting spontaneously. So they then verbally asked the patients to imagine their parents’ faces. Surprisingly, one patient, a 60-year-old kindergarten teacher who was hit by a car while crossing the street, exhibited complex brain activity in the face- and emotion-specific brain regions, identical to brain activity seen in healthy people. The researchers say her response is the strongest evidence yet that vegetative-state patients can be “emotionally aware.” A second patient, a 23-year-old woman, exhibited activity just in the emotion-specific brain regions. (Significantly, both patients woke up within two months of the tests. They did not remember being in a vegetative state.)

"This experiment, a first of its kind, demonstrates that some vegetative patients may not only possess emotional awareness of the environment but also experience emotional awareness driven by internal processes, such as images," said Dr. Sharon.

Research focused on the “emotional awareness” of patients in a vegetative state is only a few years old. The researchers hope their work will eventually contribute to improved care and treatment. They have also begun working with patients in a minimally conscious state to better understand how regions of the brain interact in response to familiar cues. Emotions, they say, could help unlock the secrets of consciousness.

(Source: aftau.org)

Filed under vegetative state emotion neuroimaging brain activity facial recognition consciousness neuroscience science

192 notes

Playing computer games makes brains feel and think alike

Scientists have discovered that playing computer games can bring players’ emotional responses and brain activity into unison.

By measuring the activity of facial muscles and imaging the brain while gaming, the group found that people go through similar emotions and display matching brainwaves. The study by Helsinki Institute for Information Technology HIIT researchers is now published in PLOS ONE.

– It’s well known that people who communicate face-to-face will start to imitate each other. People adopt each other’s poses and gestures, much like contagious yawning. What is less known is that the very physiology of interacting people shows a type of mimicry – which we call synchrony or linkage, explains Michiel Sovijärvi-Spapé.

In the study, test participants played a computer game called Hedgewars, in which they manage their own team of animated hedgehogs and take turns shooting the opposing team with ballistic artillery. The goal is to destroy the opposing team’s hedgehogs. The research team varied the amount of competitiveness in the gaming situation: players teamed up against the computer, and they were also pitted directly against each other.

The players’ facial muscle reactions were measured with facial electromyography (fEMG), and their brainwaves were measured with electroencephalography (EEG).

– Replicating previous studies, we found linkage in the fEMG: two players showed both similar emotions and similar brainwaves at similar times. We further observed linkage in the brainwaves with EEG, says Sovijärvi-Spapé.

A striking discovery indicates that the more competitive the gaming gets, the more in sync the players’ emotional responses are. The test subjects reported their emotions themselves, and negative emotions were associated with the linkage effect.

– Although counterintuitive, the discovered effect increases as a game becomes more competitive. And the more competitive it gets, the more the players’ positive emotions begin to reflect each other. All the while their experiences of negative emotions increase.

The results present promising avenues for further study.

– Feeling others’ emotions could be particularly beneficial in competitive settings: the linkage may enable one to better anticipate the actions of opponents.

Another interpretation suggested by the group is that the physical linkage of emotion may work to compensate for a possibly faltering social bond while competing in a gaming setting.

– Since our participants were all friends before the game, we can speculate that the linkage is most prominent when a friendship is ‘threatened’ while competing against each other, ponders Sovijärvi-Spapé.


Filed under brain activity emotion emotional response brainwaves neuroimaging neuroscience science

85 notes

Snakes on the brain: Are primates hard-wired to see snakes?

Was the evolution of high-quality vision in our ancestors driven by the threat of snakes? Work by neuroscientists in Japan and Brazil supports the theory originally put forward by Lynne Isbell, professor of anthropology at the University of California, Davis.

In a paper published Oct. 28 in the journal Proceedings of the National Academy of Sciences, Isbell; Hisao Nishijo and Quan Van Le at Toyama University, Japan; Rafael Maior and Carlos Tomaz at the University of Brasilia, Brazil; and colleagues show that there are specific nerve cells in the brains of rhesus macaque monkeys that respond to images of snakes.

The snake-sensitive neurons were more numerous, and responded more strongly and rapidly, than other nerve cells that fired in response to images of macaque faces or hands, or to geometric shapes. Isbell said she was surprised that more neurons responded to snakes than to faces, given that primates are highly social animals.

"We’re finding results consistent with the idea that snakes have exerted strong selective pressure on primates," Isbell said.

Isbell originally published her hypothesis in 2006, following up with a book, “The Fruit, the Tree and the Serpent” (Harvard University Press, 2009) in which she argued that our primate ancestors evolved good, close-range vision primarily to spot and avoid dangerous snakes.

Modern mammals and snakes big enough to eat them evolved at about the same time, 100 million years ago. Venomous snakes are thought to have appeared about 60 million years ago — “ambush predators” that have shared the trees and grasslands with primates.

Nishijo’s laboratory studies the neural mechanisms responsible for emotion and fear in rhesus macaque monkeys, especially instinctive responses that occur without learning or memory. Previous researchers have used snakes to provoke fear in monkeys, he noted. When Nishijo heard of Isbell’s theory, he thought it might explain why monkeys are so afraid of snakes.

"The results show that the brain has special neural circuits to detect snakes, and this suggests that the neural circuits to detect snakes have been genetically encoded," Nishijo said.

The monkeys tested in the experiment were reared in a walled colony and neither had previously encountered a real snake.

"I don’t see another way to explain the sensitivity of these neurons to snakes except through an evolutionary path," Isbell said.

Isbell said she’s pleased to be able to collaborate with neuroscientists.

"I don’t do neuroscience and they don’t do evolution, but we can put our brains together and I think it brings a wider perspective to neuroscience and new insights for evolution," she said.

(Source: news.ucdavis.edu)

Filed under evolution emotion fear brain mapping neuroscience science

228 notes

Keep your friends close, but …
Counterintuitive findings from a new USC study show that the part of the brain that is associated with empathizing with the pain of others is activated more strongly by watching the suffering of hateful people as opposed to likable people.

While one might assume that we would empathize more with people we like, the study may indicate that the human brain focuses more on the need to monitor enemies closely, especially when they are suffering.

“When you watch an action movie and the bad guy appears to be defeated, the moment of his demise draws our focus intensely,” said Lisa Aziz-Zadeh of the Brain and Creativity Institute of the USC Dornsife College of Letters, Arts and Sciences. “We watch him closely to see whether he’s really down for the count because it’s critical for predicting his potential for retribution in the future.”

Aziz-Zadeh, who has a joint appointment with the USC Division of Occupational Science and Occupational Therapy, collaborated with lead author Glenn Fox, a PhD candidate at USC, and Mona Sobhani, a former USC graduate student who is now a postdoctoral researcher at Vanderbilt University, on a study that appears this month in Frontiers in Psychology.

The study examined activity in the so-called “pain matrix” of the brain, a network that includes the insula cortex, the anterior cingulate and the somatosensory cortices — regions known to activate when an individual watches another person suffer.

The pain matrix is thought to be related to empathy — allowing us to understand another’s pain. However, this study indicates that the pain matrix may be more involved in processing pain in general and not necessarily tied to empathic processing.

Participants — all of them white, male and Jewish — first watched videos of hateful, anti-Semitic individuals in pain and then other videos of tolerant, nonhateful individuals in pain. Their brains were scanned with functional magnetic resonance imaging (fMRI) to show activity levels in the pain matrix.

Surprisingly, the participants’ pain matrices were more activated by watching the anti-Semites suffer compared to the tolerant individuals.

“The results further revealed the brain’s flexibility in processing complex social situations,” Fox said. “The brain uses the complete context of the situation to mount an appropriate response. In this case, the brain’s response is likely tied to the relative increase in the need to attend to and understand the pain of the hateful person.”

A possible next step for the researchers will be to try to understand how regulating one’s emotional reaction to stimuli such as these alters the resulting patterns of brain activity.

Keep your friends close, but …

Counterintuitive findings from a new USC study show that the part of the brain that is associated with empathizing with the pain of others is activated more strongly by watching the suffering of hateful people as opposed to likable people.

While one might assume that we would empathize more with people we like, the study may indicate that the human brain focuses more greatly on the need to monitor enemies closely, especially when they are suffering.

“When you watch an action movie and the bad guy appears to be defeated, the moment of his demise draws our focus intensely,” said Lisa Aziz-Zadeh of the Brain and Creativity Institute of the USC Dornsife College of Letters, Arts and Sciences. “We watch him closely to see whether he’s really down for the count because it’s critical for predicting his potential for retribution in the future.”

Aziz-Zadeh, who has a joint appointment with the USC Division of Occupational Science and Occupational Therapy, collaborated on the study with lead author Glenn Fox, a PhD candidate at USC, and Mona Sobhani, a former USC graduate student who is now a postdoctoral researcher at Vanderbilt University. The study appears this month in Frontiers in Psychology.

The study examined activity in the so-called “pain matrix” of the brain, a network that includes the insula cortex, the anterior cingulate and the somatosensory cortices — regions known to activate when an individual watches another person suffer.

The pain matrix is thought to be related to empathy — allowing us to understand another’s pain. However, this study indicates that the pain matrix may be more involved in processing pain in general and not necessarily tied to empathic processing.

Participants — all of them white, male and Jewish — first watched videos of hateful, anti-Semitic individuals in pain and then other videos of tolerant, nonhateful individuals in pain. Their brains were scanned with functional magnetic resonance imaging (fMRI) to show activity levels in the pain matrix.

Surprisingly, the participants’ pain matrices were more activated by watching the anti-Semites suffer compared to the tolerant individuals.
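A finding like this rests on a within-participant comparison: each person’s average pain-matrix activation in one condition against the other, tested with a paired statistic. A minimal sketch with invented numbers (the values and variable names below are illustrative, not the study’s data or analysis pipeline):

```python
import math

# Hypothetical mean pain-matrix activation (arbitrary units), one value per
# participant per condition: watching a hateful vs. a tolerant person in pain.
hateful = [2.1, 1.8, 2.5, 2.0]
tolerant = [1.5, 1.6, 1.9, 1.4]

# Within-participant differences (hateful minus tolerant).
diffs = [h - t for h, t in zip(hateful, tolerant)]
n = len(diffs)
mean_diff = sum(diffs) / n

# Classic paired t statistic: mean difference divided by its standard error.
var = sum((d - mean_diff) ** 2 for d in diffs) / (n - 1)
t_stat = mean_diff / math.sqrt(var / n)

print(mean_diff, t_stat)  # a positive t supports stronger activation in the hateful condition
```

In practice such contrasts are computed voxel-wise with dedicated fMRI software; this sketch only shows the logic of the paired comparison.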

“The results further revealed the brain’s flexibility in processing complex social situations,” Fox said. “The brain uses the complete context of the situation to mount an appropriate response. In this case, the brain’s response is likely tied to the relative increase in the need to attend to and understand the pain of the hateful person.”

A possible next step for the researchers will be to try to understand how regulating one’s emotional reaction to stimuli such as these alters the resulting patterns of brain activity.

Filed under somatosensory cortex brain mapping neuroimaging emotion empathy neuroscience science

290 notes

I’m ok, you’re not ok
Egoism and narcissism appear to be on the rise in our society, while empathy is on the decline. And yet, the ability to put ourselves in other people’s shoes is extremely important for our coexistence. A research team headed by Tania Singer from the Max Planck Institute for Human Cognitive and Brain Sciences has discovered that our own feelings can distort our capacity for empathy. This emotionally driven egocentricity is recognised and corrected by the brain. When, however, the right supramarginal gyrus doesn’t function properly or when we have to make particularly quick decisions, our empathy is severely limited.
When assessing the world around us and our fellow humans, we use ourselves as a yardstick and tend to project our own emotional state onto others. While cognition research has already studied this phenomenon in detail, nothing is known about how it works on an emotional level. It was assumed that our own emotional state can distort our understanding of other people’s emotions, in particular if these are completely different to our own. But this emotional egocentricity had not been measured before now.
This is precisely what the Max Planck researchers have accomplished in a complex marathon of experiments and tests. They also discovered the area of the brain responsible for this function, which helps us to distinguish our own emotional state from that of other people. The area in question is the supramarginal gyrus, a convolution of the cerebral cortex located approximately at the junction of the parietal, temporal and frontal lobes. “This was unexpected, as we had the temporo-parietal junction in our sights. This is located more towards the front of the brain,” explains Claus Lamm, one of the publication’s authors.
On the empathy trail with toy slime and synthetic fur
Using a perception experiment, the researchers began by showing that our own feelings actually do influence our capacity for empathy, and that this egocentricity can also be measured. The participants, who worked in teams of two, were exposed to either pleasant or unpleasant simultaneous visual and tactile stimuli.
While participant 1, for example, could see a picture of maggots and feel slime with her hand, participant 2 saw a picture of a puppy and could feel soft, fleecy fur on her skin. “It was important to combine the two stimuli. Without the tactile stimulus, the participants would only have evaluated the situation ‘with their heads’ and their feelings would have been excluded,” explains Claus Lamm. The participants could also see the stimulus to which their team partners were exposed at the same time.
The two participants were then asked to evaluate either their own emotions or those of their partners. As long as both participants were exposed to the same type of positive or negative stimuli, they found it easy to assess their partner’s emotions. The participant who was confronted with a stinkbug could easily imagine how unpleasant the sight and feeling of a spider must be for her partner.
Differences only arose during the test runs in which one partner was confronted with pleasant stimuli and the other with unpleasant ones. Their capacity for empathy suddenly plummeted. The participants’ own emotions distorted their assessment of the other person’s feelings. The participants who were feeling good themselves assessed their partners’ negative experiences as less severe than they actually were. In contrast, those who had just had an unpleasant experience assessed their partners’ good experiences less positively.
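The distortion described above can be expressed as a simple signed error: how far a participant’s judgment of the partner deviates from the partner’s actual self-report. A minimal sketch with made-up ratings (the scale, numbers, and function name are hypothetical, not the study’s measure):

```python
# Hypothetical incongruent trials: the judging participant feels good (+),
# while the partner reports an unpleasant experience on a -5..+5 valence scale.
# Each pair is (participant's estimate of the partner, partner's own rating).
trials = [(-1.8, -3.0), (-2.0, -3.0), (-1.5, -2.5)]

def egocentricity_bias(trials):
    """Mean signed error: a positive value means judgments are pulled
    toward the judge's own (here, positive) emotional state."""
    return sum(judged - actual for judged, actual in trials) / len(trials)

bias = egocentricity_bias(trials)
print(round(bias, 2))  # > 0: the partner's negative experiences are underestimated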
Particularly quick decisions cause a decline in empathy
The researchers pinpointed the area of the brain responsible for this phenomenon with the help of functional magnetic resonance imaging, commonly known as brain scanning. The right supramarginal gyrus ensures that we can decouple our perception of ourselves from that of others. When neuronal activity in this region was disrupted during the task, the participants found it difficult not to project their own feelings onto others. Their assessments were also less accurate when they were forced to make particularly quick decisions.
Until now, social neuroscience models have assumed that we mainly draw on our own emotions as a reference for empathy. This works, however, only if we are in a neutral state or in the same state as our counterpart – otherwise, the brain must counteract and correct.

Filed under empathy emotion cerebral cortex supramarginal gyrus psychology neuroscience science

347 notes

Is the human brain capable of identifying a fake smile?
Since Leonardo Da Vinci painted the Mona Lisa, much has been said about what lies behind her smile. Now, Spanish researchers have discovered how far this attention-grabbing expression confuses our emotion recognition and makes us perceive a face as happy, even if it is not.
Human beings deduce others’ state of mind from their facial expressions. “Fear, anger, sadness, displeasure and surprise are quickly inferred in this way,” David Beltrán Guerrero, researcher at the University of La Laguna, explains to SINC. But some emotions are more difficult to perceive.
“There is a wide range of more ambiguous expressions, from which it is difficult to deduce the underlying emotional state. A typical example is the expression of happiness,” says Beltrán, who is part of a group of experts at the Canarian institution who have analyzed, in three scientific articles, the smile’s capacity to distort people’s innate deductive ability.
“The smile plays a key role in recognizing others’ happiness. But, as we know, we are not really happy every time we smile,” he adds. In some cases, a smile merely expresses politeness or affiliation. In others, it may even be a way of hiding negative feelings and incentives, such as dominance, sarcasm, nervousness or embarrassment.
To develop this line of research, the authors created faces comprising smiling mouths and eyes expressing non-happy emotions, and compared them with faces in which both mouths and eyes expressed the same type of emotional state.
The main objective was to discover how far the smile skews the recognition of ambiguous expressions, making us identify them with happiness even though they are accompanied by eyes which clearly express a different feeling.
The power of a smile
“The influence of the smile is highly dependent on the type of task given to participants and, therefore, on the type of activity we are involved in when we come across this type of expression,” Beltrán notes.
Thus when the task is purely perceptive – like the detection of facial features – the smile has a very strong influence, to the extent that differences between ambiguous expressions (happy mouth and non-happy eyes) and genuinely happy expressions (happy mouth and eyes) are not distinguished.
On the other hand, when the task involves categorizing expressions – that is, recognizing whether they are happy, sad or any other emotion – the influence of the smile weakens, although it remains substantial: 40% of the time, participants identify ambiguous expressions as genuinely happy.
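The categorization measure reduces to a simple tally: the fraction of ambiguous faces (smiling mouth, non-happy eyes) that participants label “happy”. A sketch with invented responses (this data is purely illustrative, not the study’s):

```python
# Hypothetical categorization responses for ten ambiguous faces
# (smiling mouth paired with non-happy eyes).
responses = ["happy", "sad", "happy", "angry", "happy",
             "sad", "neutral", "happy", "sad", "angry"]

# Confusion rate: how often the smile pulls the judgment to "happy".
confusion_rate = responses.count("happy") / len(responses)
print(confusion_rate)  # 0.4, echoing the roughly 40% reported above
```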
However, the influence of the smile disappears in emotional assessment, that is, when someone is asked to assess whether a facial expression is positive or negative: “A smile can cause us to interpret a non-happy expression as happy, except when we are involved in the emotional assessment of said expression,” he highlights.
A stimulus which is difficult to assess
According to the authors, the reason why a smile sometimes leads to the incorrect categorization of an expression is related to its high visual “salience” – its attention-grabbing capacity – and its almost exclusive association with the emotional state of happiness.
In a recent study, it was found that the smile dominates many of the initial stages of the brain processing of faces, to the extent that it prompts similar electrical activity in the brain for genuinely happy expressions and ambiguous expressions with smiles and non-happy eyes.
By measuring eye movements, it was observed that an ambiguous expression is confused and categorized as happy if the first gaze falls on the area of the smiling mouth, rather than the area of the eyes.
Curiously, however, the influence of the smile on these assessments is not the same for everyone. “Another study showed that people with social anxiety tend to confuse ambiguous expressions with genuinely happy expressions less frequently,” Beltrán concludes.

References: 
Manuel G. Calvo, Hipólito Marrero, David Beltrán. “When does the brain distinguish between genuine and ambiguous smiles? An ERP study”. Brain and Cognition 81 (2013) 237–246.
Manuel G. Calvo, Andrés Fernández-Martín, Lauri Nummenmaa. “Perceptual, categorical, and affective processing of ambiguous smiling facial expressions”. Cognition 125 (2012) 373–393.
Manuel G. Calvo, Aida Gutiérrez-García, Pedro Avero, Daniel Lundqvist. “Attentional Mechanisms in Judging Genuine and Fake Smiles: Eye-Movement Patterns”. Emotion, Vol. 13, No. 4 (2013), 792–802.

Filed under facial expressions smile emotion happiness psychology neuroscience science
