Posts tagged psychology
Research has shown that multitasking impairs performance across a variety of tasks. Emergency room nurses who are interrupted multiple times while treating a patient are more likely to make medication errors. Driving while speaking on a mobile phone significantly increases the probability of an automobile accident. At the same time, however, experienced golfers putt better when distracted than when focusing on their performance. Distractions resulting from the presence of other people can also increase an individual’s performance. Why?
Addressing the Contradictions
In a forthcoming issue of Psychological Science, one of the world’s top-ranked empirical journals in psychology, a team of researchers from the University of Basel helps to clarify these apparent contradictions. Lead author Janina Hoffmann, a Ph.D. student in Economic Psychology, and her co-authors Dr. Bettina von Helversen and Prof. Dr. Jörg Rieskamp, find that the type of judgment strategy that an individual employs strongly conditions how the “cognitive load” induced by multitasking affects performance. Higher cognitive load can actually improve performance when the task can be best completed using a less demanding, similarity-based strategy that informs judgments by retrieving past instances from memory.
The study is based on two experiments conducted at the University of Basel. The first exposed 90 participants to varying cognitive loads while they solved a judgment task whose solution was best achieved with a similarity-based strategy (predicting how many cartoon characters another cartoon character could catch). Most participants switched to a similarity-based strategy and produced more accurate judgments. The second experiment then exposed 60 participants to a linear task suited to rule-based rather than similarity-based strategies; participants who nevertheless employed a similarity-based strategy made poorer judgments. The experiments were conducted with financial support from the Swiss National Science Foundation.
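The contrast between the two strategy types can be made concrete with a toy model. This is a hypothetical sketch: the cue values, weights, and similarity function below are illustrative choices, not the study's actual task or model.

```python
import math

# Hypothetical stored instances: (cue values, observed outcome)
exemplars = [((1, 0, 1), 4), ((0, 1, 1), 7), ((1, 1, 0), 5)]

def rule_based(cues, weights=(2, 3, 1), intercept=1):
    """Rule-based strategy: a weighted linear combination of cues."""
    return intercept + sum(w * c for w, c in zip(weights, cues))

def similarity_based(cues):
    """Similarity-based strategy: retrieve past instances from memory
    and average their outcomes, weighted by similarity to the probe."""
    def sim(a, b):
        # Exponential decay with city-block distance (Shepard-style)
        return math.exp(-sum(abs(x - y) for x, y in zip(a, b)))
    sims = [sim(cues, e) for e, _ in exemplars]
    return sum(s * out for s, (_, out) in zip(sims, exemplars)) / sum(sims)

probe = (1, 1, 1)
print(rule_based(probe))        # linear-rule prediction
print(similarity_based(probe))  # exemplar-retrieval prediction
```

The rule-based judge must weigh and integrate cues on each trial, while the similarity-based judge only retrieves and averages stored cases, which is why the latter is the less cognitively demanding strategy under load.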
Cognitive load does not per se lead to worse performance; rather, depending on strategy choice, it can lead to better performance. The researchers believe it is important to decipher the cognitive strategies that people choose under given levels of cognitive load. Hoffmann says: “A better understanding of these cognitive strategies may permit future studies to predict the precise circumstances under which people can solve a problem particularly well.”
Human intelligence cannot be explained by the size of the brain’s frontal lobes, say researchers.
Research into the comparative size of the frontal lobes in humans and other species has determined that they are not - as previously thought - disproportionately enlarged relative to other areas of the brain, according to the most accurate and conclusive study of this area of the brain.
It concludes that the size of our frontal lobes cannot solely account for humans’ superior cognitive abilities.
The study by Durham and Reading universities suggests that supposedly more ‘primitive’ areas, such as the cerebellum, were equally important in the expansion of the human brain. These areas may therefore play unexpectedly important roles in human cognition and its disorders, such as autism and dyslexia, say the researchers.
The study is published in the Proceedings of the National Academy of Sciences (PNAS) today.
The frontal lobes are an area in the brain of mammals located at the front of each cerebral hemisphere, and are thought to be critical for advanced intelligence.
Lead author Professor Robert Barton from the Department of Anthropology at Durham University, said: “Probably the most widespread assumption about how the human brain evolved is that size increase was concentrated in the frontal lobes.
“It has been thought that frontal lobe expansion was particularly crucial to the development of modern human behaviour, thought and language, and that it is our bulging frontal lobes that truly make us human. We show that this is untrue: human frontal lobes are exactly the size expected for a non-human brain scaled up to human size.
“This means that areas traditionally considered to be more primitive were just as important during our evolution. These other areas should now get more attention. In fact there is already some evidence that damage to the cerebellum, for example, is a factor in disorders such as autism and dyslexia.”
The scientists argue that many of our high-level abilities are carried out by more extensive brain networks linking many different areas of the brain. They suggest it may be the structure of these extended networks more than the size of any isolated brain region that is critical for cognitive functioning.
Previously, various studies have tried to establish whether humans’ frontal lobes are disproportionately enlarged compared to their size in other primates such as apes and monkeys. They have produced a confused picture, with different methods and measurements leading to inconsistent findings.
The Durham and Reading researchers, funded by The Leverhulme Trust, analysed data sets from previous animal and human studies using phylogenetic, or ‘evolutionary family tree’, methods, and found consistent results across all their data. They used a new method to look at the speed with which evolutionary change occurred, concluding that the frontal lobes did not evolve especially fast along the human lineage after it split from the chimpanzee lineage.
Name: Tommy McHugh
Disorder: Sudden artistic output following brain damage
“I was sitting on the toilet. I suddenly felt an explosion in the left side of my head and ended up on the floor. I think the only thing that kept me conscious was that I didn’t want to be found with my pants down. Then the other side of my head went bang! I woke up in hospital and looked out of the window to see the tree was sprouting numbers. 3, 6, 9. Then I started talking in rhyme…”
Ten days after having a subarachnoid haemorrhage – a stroke caused by bleeding in and around the brain – Tommy McHugh, an ex-con who’d been in his fair share of scraps, became a new man, with a personality that nobody recognised.
When he was a young man, Tommy did time in prison. But after his stroke at age 51, everything changed. “I could taste the femininity inside of myself,” he said. “My head was full of rhymes and images and pictures.”
Not only did he feel a sudden urge to write poetry, but he also began to paint and draw obsessively for up to 19 hours a day. He was never artistic before – in fact, he joked that he’d never even been in an art gallery “except to maybe steal something”.
Desperate to find out what was going on, Tommy wrote to several neuroscientists and ended up working closely with Alice Flaherty at Harvard Medical School and Mark Lythgoe at University College London.
Flaherty says the haemorrhage sent blood squirting around the brain surface, affecting a lot of areas. It left Tommy unusually emotional and unable to hurt anyone, “like Zen monks sweeping steps before they walk,” says Flaherty. “Everything strikes him as beautiful and cosmically meaningful.”
Scanning Tommy’s brain was impossible after an operation to treat the stroke damage left him with a piece of metal in his head. Instead, Lythgoe performed a neuropsychological evaluation. Tommy’s IQ was in the normal range. However, he showed verbal disinhibition – he tended to talk a lot – and had difficulty with tests that required him to switch between different cognitive tasks. All of which suggested problems with the frontal lobes.
The frontal lobes play a vital role in abstract thought and creativity. They are constantly bombarded with raw sensory data from the world around us, most of which is deemed irrelevant by the brain and screened from conscious awareness. Blocking this inhibition using magnetic pulses can make people more creative, even unleashing savant-like skills.
“That’s what Tommy’s mind does all the time,” says Lythgoe. Everything he heard and saw triggered a stream of associations that he found difficult to stop. Tommy saw it as having a brain that shows him “endless, endless corridors”. He said his paintings represented a snapshot of a millisecond in his brain.
“I’ll paint three or six or nine pictures at a time. I see those numbers in my head all the time. Canvases became too costly, so I started painting the ceilings and the wallpaper and the floor. I can’t stop painting and sculpting. Give me a mountain and I’ll turn it into a profile. If you give me a bare tree I’ll change it, so when spring comes all the leaves will create the face, the mouth, the lips. Without hurting the tree.”
Offering advice for others with brain damage, he said that people who have had strokes need to learn not to think of themselves as ill, with the dangers of depression that can bring. “Some repairs to the brain are constructive, some are negative. One has to learn to develop one’s damaged brain, adapt and start to live again. You can either sit on your bum or look in the mirror and say ‘I’m alive’.”
He wouldn’t even have wanted his old mind back: “The most wonderful thing that happened to Tommy McHugh,” he laughed, “is having a stroke while doing a poo.”
He wouldn’t have changed a thing. “My two strokes have given me 11 years of a magnificent adventure that nobody could have expected.”
Tommy McHugh passed away on 19 September 2012, having spoken to New Scientist several times that year. Samples of his artwork can be viewed on his website.
When trouble approaches, what do you do? Run for the hills? Hide? Pretend it isn’t there? Or do you focus on the promise of rain in those looming dark clouds?
New research suggests that the way you regulate your emotions, in bad times and in good, can influence whether – or how much – you suffer from anxiety.
The study appears in the journal Emotion.
In a series of questionnaires, researchers asked 179 healthy men and women how they managed their emotions and how anxious they felt in various situations. The team analyzed the results to see if different emotional strategies were associated with more or less anxiety.
The study revealed that those who engage in an emotional regulation strategy called reappraisal tended to also have less social anxiety and less anxiety in general than those who avoid expressing their feelings. Reappraisal involves looking at a problem in a new way, said University of Illinois graduate student Nicole Llewellyn, who led the research with psychology professor Florin Dolcos, an affiliate of the Beckman Institute at Illinois.
“When something happens, you think about it in a more positive light, a glass half full instead of half empty,” Llewellyn said. “You sort of reframe and reappraise what’s happened and think what are the positives about this? What are the ways I can look at this and think of it as a stimulating challenge rather than a problem?”
Study participants who regularly used this approach reported less severe anxiety than those who tended to suppress their emotions.
Anxiety disorders are a major public health problem in the U.S. According to the National Institute of Mental Health, roughly 18 percent of the U.S. adult population is afflicted with general or social anxiety that is so intense that it warrants a diagnosis.
“The World Health Organization predicts that by 2020, anxiety and depression – which tend to co-occur – will be among the most prevalent causes of disability worldwide, second only to cardiovascular disease,” Dolcos said. “So it’s associated with big costs.”
Not all anxiety is bad, however, he said. Low-level anxiety may help you maintain the kind of focus that gets things done. Suppressing or putting a lid on your emotions also can be a good strategy in a short-term situation, such as when your boss yells at you, Dolcos said. Similarly, an always-positive attitude can be dangerous, causing a person to ignore health problems, for example, or to engage in risky behavior.
Previous studies had found that people who were temperamentally inclined to focus on making good things happen were less likely to suffer from anxiety than those who focused on preventing bad things from happening, Llewellyn said. But she could find no earlier research that explained how this difference in focus translated to behaviors that people could change. The new study appears to explain the strategies that contribute to a person having more or less anxiety, she said.
“This is something you can change,” she said. “You can’t do much to affect the genetic or environmental factors that contribute to anxiety. But you can change your emotion regulation strategies.”
I first met Henry Molaison more than half a century ago, during the spring of my third year in graduate school. I have tried to resurrect the details of my interactions with him that week, but human memory does not allow such excursions. The explicit minutiae of unique episodes fade as time passes, making it impossible for us to vividly re-experience the details of events in the distant past. What I do know is that I was very excited to have the opportunity to study such a rare case as Henry, and I had spent months preparing. Looking back at the results of all the tests he did that week, it was clear even then that the consequences of the operation carried out on him in 1957 – an experimental procedure to cure his epilepsy – had been catastrophic. Henry was left in a permanent state of amnesia, unable to retain any new information.
At the time of Henry’s operation, little was known about how memory processes worked. The extensive damage to the inner part of the temporal lobes on both sides of Henry’s brain made him a vital case study for memory researchers then and now. As the years passed, his fame grew and eventually spread to countries outside North America – and all that time Henry was stuck in the same moment. From time to time, I would tell him how important and well known he was, and he would smile sheepishly, as the praise was already slipping out of his consciousness. In his lifetime he was known as HM; only after his death, in 2008, was his identity revealed to the world.
The pain sensations of others can be felt by some people, just by witnessing their agony, according to new research.
A Monash University study into the phenomenon known as somatic contagion found that almost one in three people can feel pain when they see others experience it. It identified two groups prone to this response: those who acquire it following trauma, injury (such as amputation) or chronic pain, and those with the condition present from birth, known as the congenital variant.
Presenting her findings at the Australian and New Zealand College of Anaesthetists’ annual scientific meeting in Melbourne earlier this week, Dr Melita Giummarra, from the School of Psychology and Psychiatry, said in some cases people suffered severe painful sensations in response to another person’s pain.
“My research is now beginning to differentiate between at least these two unique profiles of somatic contagion,” Dr Giummarra said.
“While the congenital variant appears to involve a blurring of the boundary between self and other, with heightened empathy, acquired somatic contagion involves reduced empathic concern for others, but increased personal distress.
“This suggests that the pain triggered corresponds to a focus on their own pain experience rather than that of others.”
Most people experience emotional discomfort when they witness pain in another person and neuroimaging studies have shown that this is linked to activation in the parts of the brain that are also involved in the personal experience of pain.
Dr Giummarra said for some people the pain they ‘absorb’ mirrors the location and site of the pain in another they are witnessing and is generally localised.
“We know that the same regions of the brain are activated for these groups of people as when they experience their own pain – first in emotional regions, but then there is also sensory activation. It is vicarious – it literally triggers their pain,” Dr Giummarra said.
Dr Giummarra has developed a new tool to characterise the reactions people have to pain in others that is also sensitive to somatic contagion – the Empathy for Pain Scale.
In the absence of any real progress in defining neuronal codes for the brain, the simple idea of the grandmother cell continues to percolate through the scientific and popular literature. Many researchers have reported marked increases in the firing rate of otherwise quiet or idling neurons in response to very specific stimuli, such as a picture of grandma. If these experiments are taken at face value, we must accept that grandmother cells, at least in some form, exist. Last December, Asim Roy from Arizona State revived discussion of this topic with a paper in Frontiers in Cognitive Science. He has just released a follow-up paper in the same journal where he seeks to extend the idea of the grandmother cell into a more general concept cell principle. A further implication of his paper is that such localist neurons should not be rare in the brain, but rather a commonly found feature.
The concept cell derives from an expanding body of research showing that some neurons respond not just to a constellation of stimulus features within a given sensory modality, but also to invariant ideas. For example, researchers have previously reported finding an “Oprah Winfrey” concept cell that could be excited not just by visual percepts of Oprah, but also by her name, and even the sound of her name. Roy’s new paper suggests that concept cells would have meaning by themselves, in contrast to neurons in a distributed model, which would represent ideas only as a pattern of activity across a network.
The concept cell theory has been dismissed by many researchers, but it represents a valid extremum on the continuum of ways neural networks can be structured. As such, a theory like this needs to be disproven rather than ignored. Better still than being disproven, a more detailed theory would be welcome. One possible interpretation that reconciles concept cells with distributed network models is simply to have distributed networks of concept cells. When fishing down through the cortex along any given electrode penetration path, it is quite possible that many quiescent concept cells lie all around that, for whatever reason, are not activated at that moment or are otherwise hidden from the experimenter. Interpreting a cell as a lone concept cell rather than as a participant in a distributed network might just reflect insufficient sampling of the relevant network. In that case, the larger reality would be that the two viewpoints are just different interpretations of the same underlying phenomenon.
To get around objections that the idea space is practically infinite while the number of cells that might represent it is finite, Roy notes that concept cells need not be limited to a single concept. At this point, it might be productive to proceed by imagining how concept cells might emerge in a network. For example, would a baby already have grandmother cells? Most would probably argue they don’t. A newborn has never seen its grandmother, and although he or she may have some built-in structural hierarchy, that hierarchy has yet to be flashed with very many unique or salient icons. It therefore might be reasonable to assume neurons start out in some kind of distributed mode, but represent little other than perhaps what they experienced in the womb.
When young kids first take up little league baseball or soccer, they generally attempt (at least in the beginning) to maximize their fun, such that everyone in the field goes after every ball no matter where it is hit or kicked. Similarly, in the newly hatched brain, a neuron may quickly learn that spiking at every perturbation that comes its way becomes exhausting. Furthermore, making synaptic partners indiscriminately must in some way be disadvantageous to the neuron. Competitive mechanisms appear to be in place that link neuron activity and growth to rewards not yet fully defined at the molecular level. Such neural Darwinism might simply be the struggle for access to nutrients from the vasculature, like glucose and oxygen, and to dispose of metabolites, like transmitter byproducts. These processes might be enhanced by making the right synaptic partners residing on coveted real estate, and by spiking most often at the right time to greatest effect.
As the young athletes learn to adopt more predictive strategies of play, their movements are directed to where the ball is going to be rather than where it is at any given moment. In the extreme, this imperative crystallizes the field into variously named positions with uniquely defined roles and skill sets. Similarly in the brain, the emergence of concept cells could develop over time as a fundamental byproduct of the need to adopt the most energy efficient representations of sensory inputs that map to motor outputs. Included in these sensorimotor hand-offs would be inputs from the body itself, and other expressive or physiologic outputs constrained by the structure of the organism. There are no immediate indications that these transitional representatives in the brain need correspond to real concepts built upon possible activities that can occur in the environment, but there is also no reason why that cannot be the case.
Within the human medial temporal lobe (MTL), up to 40% of the neurons found in some studies have been classified as concept cells. The classification criteria and recorded activity patterns warrant closer inspection before drawing sweeping conclusions, but some immediate observations can be made. For example, the maximum activation reported was a 300-fold increase in spike rate. The background spike rate of a cortical neuron tends to be low, perhaps approaching zero in many cases, so an absolute maximum spike rate would be a better indicator. We might simply assume a spontaneous background rate of 1 Hz for such a cell, and 300 Hz for its instantaneous response to an optimal stimulus. We can also ask a theoretical question: under what conditions does it make sense, from an energetic perspective, for cells within a given network to respond at these relatively fantastic rates to certain rare concepts, while for most others not at all?
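The point about fold changes versus absolute rates can be illustrated with simple arithmetic. The 1 Hz baseline and 300 Hz peak here are the assumed values from the text above, not measured data:

```python
baseline_hz = 1.0    # assumed spontaneous background rate
response_hz = 300.0  # assumed peak response to an optimal stimulus

fold_increase = response_hz / baseline_hz
print(fold_increase)  # 300.0

# With a near-zero baseline, the fold change explodes even for a
# modest absolute response, which is why absolute spike rate is
# the more informative number:
print(75.0 / 0.25)  # also a 300-fold increase, at only 75 Hz peak
```

Two cells with identical "300-fold" responses can thus differ fourfold in the absolute rates, and energy costs, involved.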
Part of the answer may depend on how hard it is for cells to fire at incrementally fast rates, and also how numerous and far away their targets are. Another important consideration is whether the cells can afford to fire at elevated rates on a continued basis without incurring significant damage to themselves. One can even speculate whether there might exist optimal frequencies where possible resonant flow of ions, or overlap of electrical and pressure pulse waves may afford more efficient spiking when high spike rates are called for. In contrast to the cortex, the retinal ganglion cells which comprise the optic nerve tend to fire continuously at relatively high spontaneous rates. Excitatory inputs to retinal ganglion cells result in an increased firing rate while inhibitory inputs result in a depressed rate of firing.
Having a high spontaneous rate gives the retina maximal flexibility and sensitivity, and the retina is one place where energy expenditure is probably not the major constraint. Another way to look at these cells is that, since they cannot fire negative spikes, they can effectively double their bandwidth by adopting an elevated spontaneous rate in the absence of a stimulus. It is a strategy similar to one often used in electronics for analog-to-digital signal conversion, where bipolar signal sources may not be readily available, and for small-signal amplification where rail-to-rail power supplies may otherwise be inconvenient.
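The analog-electronics analogy can be sketched as a toy offset encoding. This is purely illustrative; the baseline and maximum rates are made-up numbers, not retinal measurements:

```python
def encode_unipolar(signal, baseline=100.0, max_rate=200.0):
    """Encode a bipolar signal (-1..+1) as a non-negative 'firing rate'
    by sitting at an elevated baseline: excitation pushes the rate up,
    inhibition pushes it down, and both directions stay representable
    even though a rate can never go negative."""
    rate = baseline + signal * (max_rate - baseline)
    return max(0.0, rate)

print(encode_unipolar(0.0))   # 100.0  spontaneous baseline (no stimulus)
print(encode_unipolar(1.0))   # 200.0  maximal excitation
print(encode_unipolar(-1.0))  # 0.0    full inhibition
```

A cell with zero spontaneous rate could signal only the excitatory half of this range; the elevated baseline is what buys the doubled bandwidth.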
In reality, the spontaneous rates of retinal ganglion cells are probably not fully one-half of their maximal rates, but considerably less. A key feature of an adaptive system like this is the built-in ability to adjust spontaneous rates across the network according to attention, arousal, and stimulus conditions. This optimizes sensitivity under the dual constraints of the energy available and the need to eliminate toxic byproducts of using that energy. Whether a neuron can run itself to death by exhaustion, as a racehorse occasionally might, or whether natural feedback mechanisms would generally prevent this, is unknown. At some point in going inward from the sensory level to the higher cortical areas of the brain, information flow (at least from the retina) transitions to a sparser, lower-spontaneous-rate environment. At what level, or time, concept cells might begin to appear is only beginning to be unraveled.
Much of the brain can be viewed hierarchically, but there is almost always significant feedback at, across, and among levels. In proceeding hierarchically from sensory to association areas, there seems to be significant convergence from temporal lobe association areas to the hippocampus. The output of the hippocampus then converges, along with other significant pathways from the brain and brainstem, on to particular regions of the interconnected hypothalamus. Ultimately this convergence culminates at specific cells in certain nuclei that convert the electrical currency of the brain into dollops of potent chemical secretions which are active at nanomolar concentrations in the blood.
In the extreme, we could imagine the ultimate concept cells as those few kingpins in certain hypothalamic nuclei controlling things like growth hormone or sex steroid release. These electoral cells spritz appropriately according to both their many far-flung advisors, and to local consensus to control the time and magnitude of each release. Similarly in the deep layers of the motor cortex, the large Betz cells appear to make disproportionately large contributions to motor command to the spinal cord.
Finding these variously incarnated kingpin cells is a major goal in building successful brain-computer interfaces (BCIs), particularly when the number of electrodes is limited. Generally, one does not want to risk stimulating these cells to death, or approaching them too closely when trying to hear what they might say. Increasingly, in human experiments, the methods section of the eventual published paper includes statements like, “the subject was then told to focus their thoughts on the target (particular movement).” While that is no doubt a very powerful experimental technique, at this point in time it is also quite vague. Fleshing out exactly what happens when we “focus our thoughts” is perhaps one of the most important research questions of our day.
Visionary study
Age may dim our eyes, but our brains make sure aspects of the rich world of colour experience defy the passing of time, a UK scientist has found.
It’s well known that our colour vision declines with age. Gradual yellowing of the lenses cuts out light in the blue range of the spectrum, while colour-sensing cone receptors on our retinas slowly lose sensitivity.
“Our ability to discriminate small colour differences declines as we age, there is no doubt about that,” says neuroscientist Sophie Wuerger from the Department of Psychological Sciences, University of Liverpool.
But she has found our brains apparently compensate for at least some of these physical frailties. Her results are published online this week in the journal PLoS One.
Wuerger explored the colour perception of 185 people aged between 18 and 75 years with normal colour vision, an unusually large and diverse group for a study of this kind.
First, she used well-known data on how the lens changes with age to predict the light signal that would be sent to the brain by the volunteers’ retinas.
She then asked the participants to undertake a variety of tests that required them to select patches of colour representing pure red, green, yellow, or blue, under different lighting conditions.
The idea was to compare the predicted physiological changes in the eye with the participants’ actual experience of colours.
“That’s the surprising bit. If you look just at the lens, it should introduce significant colour changes in older people, but we observed that … most of the time we have a very constant perception and it doesn’t change with age,” says Wuerger.
The only age-related effects detected in the study were small changes that became apparent for green hues viewed under daylight.
In other words, although the colour signal being sent from the eye was changing significantly with age, the perception of colour was almost constant regardless of how old the study subject was.
This suggests that somewhere between the retina and the conscious perception of colour, the brain must recalibrate itself, she says.
“Something must be happening to change neural connections to maintain constant colour appearance,” Wuerger says.
Exactly how this happens was not part of this study, but Wuerger offers one possible explanation.
“You could think our brain might be using some external standard like the blue sky or sunlight as a reference. There are things in the environment that don’t change and we could use them to recalibrate our visual system.”
One useful clue about the mechanisms involved came from the fact that age did not affect all aspects of the visual system equally. While 18 year olds and 75 year olds were equally good at picking pure red or green and so on, older people were less able to distinguish between subtly different colours, particularly in the bluish range.
Because the recalibration doesn’t affect all our colour vision abilities, Wuerger concludes the adjustment isn’t likely to be taking place in the retina.
“I think that suggests that it must be happening later in the visual processing pathway, closer to the brain. We don’t have any proof of that but the experiments taken together suggest it’s … a kind of plasticity in the adult brain.”
The next question might be why the brain performs this recalibration. What benefit is there in ensuring our perception of colours remains constant? For now, answering that question requires entering the realm of speculation.
Perhaps it has to do with a need to communicate colours effectively when describing objects, Wuerger ventures. “After all, to communicate colour meaningfully,” she says with a chuckle, “we all need to be - so to speak - on the same wavelength.”
Children of parents who were addicted to drugs or alcohol are more likely to be depressed in adulthood, according to a new study by University of Toronto researchers.
“These findings underscore the intergenerational consequences of drug and alcohol addiction and reinforce the need to develop interventions that support healthy childhood development,” said the study’s lead author, Esme Fuller-Thomson, professor and Sandra Rotman Endowed Chair in the University of Toronto’s Factor-Inwentash Faculty of Social Work and the Department of Family and Community Medicine.
In a paper published online in the journal Psychiatry Research this month, investigators examined the association between parental addictions and adult depression in a representative sample of 6,268 adults, drawn from the 2005 Canadian Community Health Survey.
Of these respondents, 312 had had a major depressive episode within the year preceding the survey, and 877 reported that, while they were under the age of 18 and still living at home, at least one parent drank or used drugs “so often that it caused problems for the family.”
Results indicate that individuals whose parents were addicted to drugs or alcohol are more likely to develop depression than their peers. After adjusting for age, sex and race, parental addictions were associated with more than twice the odds of adult depression, says Fuller-Thomson.
“Even after adjusting for factors ranging from childhood maltreatment and parental unemployment to adult health behaviours including smoking and alcohol consumption, we found that parental addictions were associated with 69 per cent higher odds of depression in adulthood,” explains Fuller-Thomson. The study was co-authored with four graduate students at the University of Toronto: Robyn Katz, Vi Phan, Jessica Liddycoat and Sarah Brennenstuhl.
This study could not determine the cause of the relationship between parental addictions and adult depression. Co-author Robyn Katz suggests that “It is possible that the prolonged and inescapable strain of parental addictions may permanently alter the way these children’s bodies react to stress throughout their life.
“One important avenue for future research is to investigate potential dysfunctions in cortisol production – the hormone that prepares us for ‘fight or flight’ – which may influence the later development of depression.”
“As an important first step, children who experience toxic stress at home can be greatly helped by the stable involvement of caring adults, including grandparents, teachers, coaches, neighbours and social workers,” said Fuller-Thomson. “Although more research is needed to determine if access to a responsive and loving adult decreases the likelihood of adult depression among children exposed to parental addictions, we do know that these caring relationships promote healthy development and buffer stress.”
Different brain areas are activated when we choose to suppress an emotion, compared to when we are instructed to inhibit an emotion, according to a new study from the UCL Institute of Cognitive Neuroscience and Ghent University.
In this study, published in Brain Structure and Function, the researchers scanned the brains of healthy participants and found that a key brain system was activated when participants themselves chose to suppress an emotion, a system the team had previously linked to deciding to inhibit movement.
“This result shows that emotional self-control involves a quite different brain system from simply being told how to respond emotionally,” said lead author Dr Simone Kuhn (Ghent University).
In most previous studies, participants were instructed to feel or inhibit an emotional response. However, in everyday life we are rarely told to suppress our emotions, and usually have to decide ourselves whether to feel or control our emotions.
In this new study the researchers showed fifteen healthy women unpleasant or frightening pictures. The participants were given a choice: to feel the emotion elicited by the image, or to inhibit it by distancing themselves through an act of self-control.
The researchers used functional magnetic resonance imaging (fMRI) to scan the brains of the participants. They compared this brain activity to another experiment where the participants were instructed to feel or inhibit their emotions, rather than choose for themselves.
Different parts of the brain were activated in the two situations. When participants decided for themselves to inhibit negative emotions, the scientists found activation in the dorso-medial prefrontal area of the brain. They had previously linked this brain area to deciding to inhibit movement.
In contrast, when participants were instructed by the experimenter to inhibit the emotion, a second, more lateral area was activated.
“We think controlling one’s emotions and controlling one’s behaviour involve overlapping mechanisms,” said Dr Kuhn.
“We should distinguish between voluntary and instructed control of emotions, in the same way as we can distinguish between making up our own mind about what to do, versus following instructions.”
Regulating emotions is part of our daily life, and is important for our mental health. For example, many people have to conquer fear of speaking in public, while some professionals such as health-care workers and firemen have to maintain an emotional distance from unpleasant or distressing scenes that occur in their jobs.
Professor Patrick Haggard (UCL Institute of Cognitive Neuroscience), co-author of the paper, said the brain mechanism identified in this study could be a potential target for therapies.
“The ability to manage one’s own emotions is affected in many mental health conditions, so identifying this mechanism opens interesting possibilities for future research.
“Most studies of emotion processing in the brain simply assume that people passively receive emotional stimuli, and automatically feel the corresponding emotion. In contrast, the area we have identified may contribute to some individuals’ ability to rise above particular emotional situations.
“This kind of self-control mechanism may have positive aspects, for example making people less vulnerable to excessive emotion. But altered function of this brain area could also potentially lead to difficulties in responding appropriately to emotional situations.”