Posts tagged emotion

People forget where they have left their keys because their brain is wired to recall emotionally charged events and ignore the mundane, a study has found.
When we see or experience something emotional such as the birth of a child or a traumatic event, our brain interprets it more vividly and stores it with greater clarity. In contrast, everyday events are processed with only a minimal level of detail, explaining why we can remember things from our childhood but not what we ate for dinner 24 hours ago, researchers claim.
Rebecca Todd of the University of Toronto, who led the study, said: “We’ve discovered that we see things that are emotionally arousing with greater clarity than those that are more mundane. What’s more, we found that how vividly we perceive something in the first place predicts how vividly we will remember it later on … it is like the flash of a flashbulb that illuminates an event as it’s captured for memory.”
ScienceDaily (Aug. 23, 2012) — We use language every day to express our emotions, but can this language actually affect what and how we feel? Two new studies from Psychological Science, a journal of the Association for Psychological Science, explore the ways in which the interaction between language and emotion influences our well-being.
Putting Feelings into Words Can Help Us Cope with Scary Situations
Katharina Kircanski and colleagues at the University of California, Los Angeles investigated whether verbalizing a current emotional experience, even when that experience is negative, might be an effective method for treating people with spider phobias. In an exposure therapy study, participants were split into four experimental groups and instructed to approach a spider over several consecutive days.
One group was told to put their feelings into words by describing their negative emotions about approaching the spider. Another group was asked to ‘reappraise’ the situation by describing the spider using emotionally neutral words. A third group was told to talk about an unrelated topic (things in their home) and a fourth group received no intervention. Participants who put their negative feelings into words were most effective at lowering their levels of physiological arousal. They were also slightly more willing to approach the spider. The findings suggest that talking about your feelings — even if they’re negative — may help you to cope with a scary situation.
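As a rough illustration of the group comparison described above, a few lines of Python can contrast mean arousal across the four conditions. The group labels follow the article, but the readings and the `mean` helper are invented for illustration; this is not the study's data or analysis code.

```python
# Illustrative sketch (not the study's actual data or code): comparing
# mean physiological arousal across the four intervention groups.

def mean(xs):
    return sum(xs) / len(xs)

# Hypothetical skin-conductance readings (arbitrary units) per group
arousal = {
    "affect_labeling": [4.1, 3.8, 4.0, 3.6],   # put feelings into words
    "reappraisal":     [4.6, 4.4, 4.8, 4.5],   # neutral re-description
    "distraction":     [5.0, 4.9, 5.2, 4.8],   # unrelated topic
    "no_intervention": [5.3, 5.1, 5.4, 5.2],
}

group_means = {g: mean(vals) for g, vals in arousal.items()}
# With these made-up numbers, affect labeling shows the lowest arousal,
# mirroring the pattern the article reports.
best_group = min(group_means, key=group_means.get)
print(best_group)
```

In the real study the comparison would of course be made with proper inferential statistics across many participants; the dictionary-of-groups layout here is just the simplest way to show the shape of the data.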
Unlocking Past Emotion: The Verbs We Use Can Affect Mood and Happiness
Our memory for events is influenced by the language we use. When we talk about a past occurrence, we can describe it as ongoing (I was running) or already completed (I ran). To investigate whether using these different wordings might affect our mood and overall happiness, Will Hart of the University of Alabama conducted four experiments in which participants either recalled or experienced a positive, negative, or neutral event. They found that people who described a positive event with words that suggested it was ongoing felt more positive. And when they described a negative event in the same way, they felt more negative.
The authors conclude that one potential way to improve mood could be to talk about negative past events as something that already happened as opposed to something that was happening.
Source: Science Daily
Necomimi features NeuroSky’s brain-computer interface technology, which controls the motion of the cat ears. The latest product sporting the company’s brainwave-reading technology has a slightly more fun form factor – fluffy, wearable cat ears that move in response to the wearer’s emotional state.
Have you ever wondered why you can remember things from long ago as if they happened yesterday, yet sometimes can’t recall what you ate for dinner last night? According to a new study led by psychologists at the University of Toronto, it’s because how much something means to you actually influences how you see it as well as how vividly you can recall it later.
"We’ve discovered that we see things that are emotionally arousing with greater clarity than those that are more mundane," says Rebecca Todd, a postdoctoral fellow in U of T’s Department of Psychology and lead author of the study published recently in the Journal of Neuroscience. "Whether they’re positive — for example, a first kiss, the birth of a child, winning an award — or negative, such as traumatic events, breakups, or a painful and humiliating childhood moment that we all carry with us, the effect is the same."
"What’s more, we found that how vividly we perceive something in the first place predicts how vividly we will remember it later on," says Todd. "We call this ‘emotionally enhanced vividness’ and it is like the flash of a flashbulb that illuminates an event as it’s captured for memory."

Brain discovery sheds light on link between vision and emotion
Neuroscientists have discovered a new area of the brain that is uniquely specialised for peripheral vision and could be targeted in future treatments for panic disorders and Alzheimer’s disease.
In a study published today in the high-impact journal Current Biology, researchers led by Dr Hsin-Hao Yu and Professor Marcello Rosa from Monash University’s Department of Physiology found that a brain area, known as prostriata, is specialised in detecting fast-moving objects in peripheral vision.
This area, located in a primitive part of the cerebral cortex, has characteristics unlike any other visual area described before, including a “direct line” of communication to brain areas controlling emotion and quick reactions.
Dr Yu said the discovery, identified during the development of the Monash Vision Group’s bionic eye, funded through the ARC Research in Bionic Vision Science and Technology Initiative, could lead to new treatments for panic disorders such as agoraphobia (fear of open spaces) and may extend into other medical areas including Alzheimer’s treatment.
“The brain is the most complex organ in the human body and perhaps the most remarkable. These findings change how we think of the brain in terms of how visual information is processed,” Dr Yu said.
“This area is likely to be hyperactive in panic disorder, with agoraphobia. This knowledge could lead to treatment options for the hyperactivity, and therefore sensitivity to such disorders, particularly the fear of open spaces.
“Correlation with previous studies also shows that prostriata is one of the first areas affected in Alzheimer’s disease. This knowledge helps to explain spatial disorientation and the tendency to fall, which are among the earliest signs of a problem associated with Alzheimer’s.”
Professor Rosa said this area had ultra-fast responses to visual stimuli, simultaneously broadcasting information to brain areas that control attention as well as emotional and motor reactions. This challenges current conceptions of how the brain processes visual information.
“This suggests a specialised brain circuit through which stimuli in peripheral vision can be fast-tracked to command quickly coordinated physical and emotional responses,” Professor Rosa said.
July 11, 2012
What can explain extreme differences in altruism among individuals, from Ebenezer Scrooge to Mother Teresa? It may all come down to variation in the size and activity of a brain region involved in appreciating others’ perspectives, according to a study published in the July 12th issue of the journal Neuron. The findings also provide a neural explanation for why altruistic tendencies remain stable over time.

The junction (yellow) between the parietal and the temporal lobes, in which the relative proportion of gray matter is significantly positively correlated with the propensity for altruistic behavior. Credit: University of Zurich
"This is the first study to link both brain anatomy and brain activation to human altruism,” says senior study author Ernst Fehr of the University of Zurich. “The findings suggest that the development of altruism through appropriate training or social practices might occur through changes in the brain structure and the neural activations that we identified in our study.”
Individuals who excel at understanding others’ intents and beliefs are more altruistic than those who struggle at this task. The ability to understand others’ perspectives has previously been associated with activity in a brain region known as the temporoparietal junction (TPJ). Based on these past findings, Fehr and his team reasoned that the size and activation of the TPJ would relate to individual differences in altruism.
In the new study, subjects underwent a brain imaging scan and played a game in which they had to decide how to split money between themselves and anonymous partners. Subjects who made more generous decisions had a larger TPJ in the right hemisphere of the brain compared with subjects who made stingy decisions.
Moreover, activity in the TPJ reflected each subject’s specific cutoff value for the maximal cost the subject was willing to endure to increase the partner’s payoff. Activity in the TPJ was higher during hard decisions—when the personal cost of an altruistic act was just below the cutoff value—than during easy decisions associated with a very low or very high cost.
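The cost-versus-cutoff logic described here can be sketched in a few lines. The `accepts` and `difficulty` functions and all numbers below are hypothetical toys, not the study's model; they only mirror the reported pattern that decisions with a cost just below the cutoff are the hardest.

```python
# Toy sketch of the decision rule described above (hypothetical numbers):
# a subject accepts an altruistic act when its personal cost is below
# their individual cutoff, and decisions are "hard" near that cutoff.

def accepts(cost, cutoff):
    """Accept the altruistic act if its cost is below the subject's cutoff."""
    return cost < cutoff

def difficulty(cost, cutoff):
    """Toy difficulty score: highest when the cost lies near the cutoff."""
    return 1.0 / (1.0 + abs(cost - cutoff))

cutoff = 10.0                        # hypothetical maximal acceptable cost
easy_low, hard, easy_high = 1.0, 9.5, 25.0

assert accepts(easy_low, cutoff) and accepts(hard, cutoff)
assert not accepts(easy_high, cutoff)
# The hard decision (cost just below the cutoff) scores highest, echoing
# the higher TPJ activity reported for such trials.
assert difficulty(hard, cutoff) > max(difficulty(easy_low, cutoff),
                                      difficulty(easy_high, cutoff))
```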
"The structure of the TPJ strongly predicts an individual’s setpoint for altruistic behavior, while activity in this brain region predicts an individual’s acceptable cost for altruistic actions," says study author Yosuke Morishima of the University of Zurich. "We have elucidated the relationship between the hardware and software of human altruistic behavior."
Provided by Cell Press
Source: medicalxpress.com
June 18, 2012
Fear conditioning using sound and taste aversion, as applied to mice, has revealed interesting information on the basis of memory allocation.

Credit: Thinkstock
The European project ‘Cellular mechanisms underlying formation of the fear memory trace in the mouse amygdala’ (FEAR Memory TRACE) is investigating memory allocation and the recruitment of certain neurons to encode a memory. By studying conditioned fear memory in response to an auditory stimulus, the researchers have delved into pathological emotional states and the neural mechanisms involved in memory allocation, retrieval and extinction.
Prior research has revealed that the conditioned fear response in mice is located in a specific bundle of neurons in the amygdala. Modulation of memory allocation is driven by expression of the transcription factor cyclic adenosine 3’,5’-monophosphate response element binding protein (CREB) and possibly by neuronal excitability.
FEAR Memory TRACE focused on the electrophysiological properties of neurons encoding the same memory. The project also aimed to ascertain the biophysical mechanisms in the plasticity changes recorded in the specific set of neurons in the fear memory trace.
To record information on auditory fear conditioning and conditioned taste aversion, the scientists performed intra-amygdala surgery with viral vectors and ran electrophysiological experiments to measure neuronal excitability.
In neural control experiments, neurons were transfected with a virus carrying CREB tagged with green fluorescent protein together with the gene for channelrhodopsin-2; combined, these two elements allowed firing to be triggered in specific nerve cells. Molecular techniques included Western blot for protein detection, genotyping and viral DNA preparation.
Behavioural tests of long- and short-term memory in mice, involving fear conditioning and taste aversion, showed increased memory performance at the three-hour point but not the five-hour point. Intrinsic neuronal excitability in mice that received both the shock and the tone was increased at three hours, but not at five, compared with mice that received the tone alone.
As the project continues to its close in two years, the aim is to identify biophysical mechanisms involved in recruiting neurons that compete with each other for a specific memory. FEAR Memory TRACE will also develop computational models to assess the role of these mechanisms in memory performance.
Information on the biochemical processes underlying these neural mechanisms has wide application in many clinical situations, including for patients suffering memory loss, such as stroke victims. Manipulation of the fear response could be applied in the treatment of neuroses and phobias.
Provided by CORDIS
Source: medicalxpress.com
ScienceDaily (June 14, 2012) — Rockville, Md. — Scientists have found new evidence that people spot a face in a crowd more quickly when teeth are visible — whether smiling or grimacing — than when the mouth is closed, whatever the expression. The new findings, published in the Journal of Vision, counter the long-held “face-in-the-crowd” effect, which suggests that only angry-looking faces are detected more readily in a crowd.

Examples of stimuli — closed mouth and open mouth with visible teeth — presented in the experiment. (Credit: ARVO)
"The research concerned with the face-in-the-crowd effect essentially deals with the question of how we detect social signals of friendly or unfriendly intent in the human face," said author Gernot Horstmann, PhD, of the Center for Interdisciplinary Research and Department of Psychology at Bielefeld University, Germany. "Our results indicate that, contrary to previous assertions, detection of smiles or frowns is relatively slow in crowds of neutral faces, whereas toothy grins and snarls are quite easily detected."
In two studies, the researchers asked subjects to search for a happy or an angry face within a crowd of neutral faces, and measured the search speed. While the search was relatively slow when emotion was signaled with a closed mouth, the search speed doubled when emotion was signaled with an open mouth and visible teeth. This was the case for both happy and angry faces, with happy faces found somewhat faster than angry ones.
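A minimal sketch of such a reaction-time comparison, with made-up numbers chosen to mirror the reported pattern (the article gives relative speeds, not these values):

```python
# Toy reaction-time comparison for the visual search task described above.
# All numbers are invented for illustration; only the qualitative pattern
# (open mouth found ~2x faster, happy slightly faster than angry) follows
# the article.

def mean(xs):
    return sum(xs) / len(xs)

# Hypothetical reaction times (ms) per target expression and mouth state
rt_ms = {
    ("happy", "closed"): [1200, 1180, 1240],
    ("happy", "open"):   [580, 600, 610],
    ("angry", "closed"): [1260, 1230, 1250],
    ("angry", "open"):   [640, 620, 660],
}

means = {cond: mean(v) for cond, v in rt_ms.items()}

# Teeth-visible targets are found roughly twice as fast in this toy data.
speedup_happy = means[("happy", "closed")] / means[("happy", "open")]
speedup_angry = means[("angry", "closed")] / means[("angry", "open")]
```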
Horstmann and his colleagues conducted these experiments as a result of discrepancies in previous studies that investigated visual search for emotional faces. According to the research team, the inconsistent results over which of the two expressions is found faster — the happy face or the angry face — suggested that the emotional expression category could not be the only important factor determining the face-in-the-crowd effect.
The scientists believe this new study may explain the discrepancies. “This will probably inspire researchers to clarify whether emotion and, in particular, threat plays an additional, unique role in face detection,” said Horstmann.
Source: Science Daily
June 4, 2012
(Medical Xpress) — Ever been stuck in traffic when a feel-good song comes on the radio and suddenly your mood lightens?

Our emotions and feelings are typically associated with the right side of the brain. For example, processing the emotion in human facial expressions is done in the right hemisphere.
However, new Australian research is challenging the widely-held view that emotions and feelings are the domain of the right hemisphere only.
Dr. Sharpley Hsieh and colleagues from Neuroscience Research Australia (NeuRA) found that people with semantic dementia, a disease where parts of the left hemisphere are severely affected, have difficulty recognising emotion in music.
These findings have exciting implications for our understanding of how music, language and emotions are handled by the brain.
“It’s known that processing whether a face is happy or sad is impaired in people who lose key regions of the right hemisphere, as happens in people with Alzheimer’s and semantic dementia”, says Dr. Hsieh.
“What we have now learnt from looking at people with semantic dementia is that understanding emotions in music involves key parts of the other side of the brain as well”, she says.
“Ours is the first study from patients with dementia to show that language-based areas of the brain, primarily on the left, are important for extracting emotional meaning from music. Our findings suggest that the brain considers melodies and speech to be similar and that overlapping parts of the brain are required for both”, says Hsieh.
This paper is published in the journal Neuropsychologia.
How was this study done?
• People with Alzheimer’s disease lose episodic memory (‘What did I do yesterday?’); people with semantic dementia lose semantic memory (‘What is a zebra?’).
• Dr. Hsieh studied people with Alzheimer’s disease, semantic dementia and healthy people without either disease. Participants were played new pieces of music and had to indicate whether the song was happy, sad, peaceful or scary.
• Images were then taken of the patients’ brains using MRI so that diseased parts of the brain could be compared statistically to the answers provided in the musical test.
• Patients with Alzheimer’s and semantic dementia have problems deciding whether a human face looks happy or sad because the amygdala in the right hemisphere is diseased.
• Patients with semantic dementia have additional problems labelling whether a piece of music is happy or sad because the anterior temporal lobe in the left hemisphere is diseased.
Provided by Neuroscience Research Australia
Source: medicalxpress.com
ScienceDaily (May 28, 2012) — Do you smile when you’re frustrated? Most people think they don’t — but they actually do, a new study from MIT has found. What’s more, it turns out that computers programmed with the latest information from this research do a better job of differentiating smiles of delight and frustration than human observers do.

Can you tell which of these smiles is showing happiness? Or which one is the result of frustration? A computer system developed at MIT can. The answer: The smile on the right is the sign of frustration. (Credit: Images courtesy of Hoque et al.)
The research could pave the way for computers that better assess the emotional states of their users and respond accordingly. It could also help train those who have difficulty interpreting expressions, such as people with autism, to more accurately gauge the expressions they see.
"The goal is to help people with face-to-face communication," says Ehsan Hoque, a graduate student in the Affective Computing Group of MIT’s Media Lab who is lead author of a paper just published in the IEEE Transactions on Affective Computing. Hoque’s co-authors are Rosalind Picard, a professor of media arts and sciences, and Media Lab graduate student Daniel McDuff.
In experiments conducted at the Media Lab, people were first asked to act out expressions of delight or frustration, as webcams recorded their expressions. Then, they were either asked to fill out an online form designed to cause frustration or invited to watch a video designed to elicit a delighted response — also while being recorded.
When asked to feign frustration, Hoque says, 90 percent of subjects did not smile. But when presented with a task that caused genuine frustration — filling out a detailed online form, only to then find the information deleted after pressing the “submit” button — 90 percent of them did smile, he says. Still images showed little difference between these frustrated smiles and the delighted smiles elicited by a video of a cute baby, but video analysis showed that the progression of the two kinds of smiles was quite different: Often, the happy smiles built up gradually, while frustrated smiles appeared quickly but faded fast.
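The temporal signature described here (gradual build-up versus quick rise and fast fade) can be caricatured in a few lines. The thresholds, traces, and `classify_smile` function below are assumptions for illustration, not MIT's actual classifier.

```python
# Toy classifier based on the temporal cue described above: delighted
# smiles tend to build gradually, frustrated smiles rise fast and fade
# fast. Thresholds and traces are invented for illustration.

def classify_smile(intensities, fps=30):
    """Guess smile type from a per-frame intensity trace in [0, 1]."""
    peak = max(intensities)
    peak_i = intensities.index(peak)
    rise_s = peak_i / fps                   # seconds to reach peak
    faded = intensities[-1] < 0.5 * peak    # faded by end of clip?
    if rise_s < 1.0 and faded:
        return "frustrated"
    return "delighted"

# Hypothetical traces: a slow 2-second ramp vs. a quick rise and fast fade
slow_build = [i / 59 for i in range(60)]
quick_fade = [min(1.0, i / 10) for i in range(15)] + \
             [max(0.0, 1.0 - i / 10) for i in range(45)]

print(classify_smile(slow_build))   # delighted
print(classify_smile(quick_fade))   # frustrated
```

A real system would extract smile intensity from video frame by frame (as the MIT group did) before any such temporal features could be computed; the hand-written ramps here simply stand in for that signal.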
In such experiments, researchers usually rely on acted expressions of emotion, Hoque says, which may provide misleading results. “The acted data was much easier to classify accurately” than the real responses, he says. But when trying to interpret images of real responses, people performed no better than chance, assessing these correctly only about 50 percent of the time.
Understanding the subtleties that reveal underlying emotions is a major goal of this research, Hoque says. “People with autism are taught that a smile means someone is happy,” he says, but research shows that it’s not that simple.
While people may not know exactly what cues they are responding to, timing does have a lot to do with how people interpret expressions, he says. For example, former British prime minister Gordon Brown was widely seen as having a phony smile, largely because of the unnatural timing of his grin, Hoque says. Similarly, a campaign commercial for former presidential candidate Herman Cain featured a smile that developed so slowly — it took nine seconds to appear — that it was widely parodied, including a spoof by comedian Stephen Colbert. “Getting the timing right is very crucial if you want to be perceived as sincere and genuine with your smiles,” Hoque says.
Jeffrey Cohn, a professor of psychology at the University of Pittsburgh who was not involved in this research, says this work “breaks new ground with its focus on frustration, a fundamental human experience. While pain researchers have identified smiling in the context of expressions of pain, the MIT group may be the first to implicate smiles in expressions of negative emotion.”
Cohn adds, “This is very exciting work in computational behavioral science that integrates psychology, computer vision, speech processing and machine learning to generate new knowledge … with clinical implications.” He says this “is an important reminder that not all smiles are positive. There has been a tendency to ‘read’ enjoyment whenever smiles are found. For human-computer interaction, among other fields and applications, a more nuanced view is needed.”
In addition to providing training for people who have difficulty with expressions, the findings may be of interest to marketers, Hoque says. “Just because a customer is smiling, that doesn’t necessarily mean they’re satisfied,” he says. And knowing the difference could be important in gauging how best to respond to the customer, he says: “The underlying meaning behind the smile is crucial.”
The analysis could also be useful in creating computers that respond in ways appropriate to the moods of their users. One goal of the Affective Computing Group’s research is to “make a computer that’s more intelligent and respectful,” Hoque says.
Source: Science Daily