Neuroscience

Articles and news from the latest research reports.

Distinguishing Brain From Mind
In coming years, neuroscience will answer questions we don’t even yet know to ask. Sometimes, though, focus on the brain is misleading. 
From the recent announcement of President Obama’s BRAIN Initiative to the Technicolor brain scans (“This is your brain on God/love/envy etc”) on magazine covers all around, neuroscience has captured the public imagination like never before.
Understanding the brain is of course essential to developing treatments for devastating illnesses like schizophrenia and Parkinson’s. More abstract but no less compelling, the functioning of the brain is intimately tied to our sense of self, our identity, our memories and aspirations. But the excitement to explore the brain has spawned a new fixation that my colleague Scott Lilienfeld and I call neurocentrism — the view that human behavior can be best explained by looking solely or primarily at the brain.
Sometimes the neural level of explanation is appropriate. When scientists develop diagnostic tests or medications for, say, Alzheimer’s disease, they investigate the hallmarks of the condition: amyloid plaques that disrupt communication between neurons, and neurofibrillary tangles that degrade the neurons themselves.
Other times, a neural explanation can lead us astray. In my own field of addiction psychiatry, neurocentrism is ascendant — and not for the better. Thanks to heavy promotion by the National Institute on Drug Abuse, part of the National Institutes of Health, addiction has been labeled a “brain disease.”
The logic for this designation, as explained by former NIDA director Alan I. Leshner, is that “addiction is tied to changes in brain structure and function.” True enough, repeated use of drugs such as heroin, cocaine, and alcohol alters the neural circuits that mediate the experience of pleasure as well as motivation, memory, inhibition, and planning — modifications that we can often see on brain scans.
The critical question, though, is whether this neural disruption proves that the addict’s behavior is involuntary and that he is incapable of self-control. It does not.
Take the case of actor Robert Downey, Jr., whose name was once synonymous with celebrity addiction. He said, “It’s like I have a loaded gun in my mouth and my finger’s on the trigger, and I like the taste of gunmetal.” Downey went through episodes of rehabilitation and then relapse, but ultimately decided, while in the throes of “brain disease,” to change his life.
The neurocentric model leaves the addicted person (Downey, in this case) in the shadows. Yet to treat addicts and guide policy, it is important to understand how addicts think. It is the minds of addicts that contain the stories of how addiction happens, why they continue to use, and, if they decide to stop, how they manage. The answers can’t be divined from an examination of their brains, no matter how sophisticated the probe.
It is only natural that advances in knowledge about the brain make us think more mechanistically about ourselves. But in one venue in particular, the courtroom, this bias can be a prescription for confusion. The brain-based defense (“Look at this fMRI scan, your Honor. My client’s brain made him do it.”) is now commonplace in capital defenses. The problem with these claims is that, with rare exception, neuroscientists cannot yet translate aberrant brain function into the legal requirements for criminal responsibility — intent, rational capacity, and self-control.
What we know about many criminals is that they did not control themselves. That is very different from being unable to do so. To date, brain science cannot allow us to distinguish between these alternatives. What’s more, even abnormal-looking brains have owners who are otherwise quite normal.
Looking to the future, some neuroscientists envision a dramatic transformation of criminal law. David Eagleman of the Baylor College of Medicine’s Initiative on Neuroscience and Law hopes that “we may someday find that many types of bad behavior have a basic biological explanation [and] eventually think about bad decision making in the same way we think about any physical process, such as diabetes or lung disease.”
But is this the correct conclusion to draw from neuroscience? If every troublesome behavior is eventually traced to correlates of brain activity that we can detect and visualize, will we be able to excuse it on a don’t-blame-me-blame-my-brain theory? Will no one ever be judged responsible?
Eagleman’s way of thinking represents what law professor Stephen Morse calls the “psycho-legal error,” our powerful temptation to equate cause with excuse. Morse notes that the law excuses criminal behavior only when a causal factor produces an impairment so severe that it deprives the defendant of his or her rationality. Bad genes, bad parents, or even bad stars are not an excuse.
Finally, what are the implications of brain science for morality? Although we generally think of ourselves as free agents who make choices, a number of prominent scholars claim that we are mistaken. "Our growing knowledge about the brain makes the notions of volition, culpability, and, ultimately, the very premise of the criminal justice system, deeply suspect," contends biologist Robert Sapolsky.
To be sure, everyone agrees that people can be held accountable only if they have freedom of choice. But there is a longstanding debate about the kind of freedom that is necessary. Some contend that we can be held accountable as long as we are able to engage in conscious deliberation, follow rules, and generally control ourselves.
Others, like Sapolsky, disagree, insisting that our deliberations and decisions do not make us free because they are dictated by neuronal circumstances. They say that, as we come to understand the mechanical workings of our brains, we’ll be compelled to adopt a strictly utilitarian model of justice in which criminals are “punished” solely as a way to change their behavior, not because they truly deserve blame.
Although it’s cloaked in neuroscientific garb, this free-will question remains one of the great conceptual impasses of all time, far beyond the capacity of brain science to resolve. Unless, that is, investigators can show something truly spectacular: that people are not conscious beings whose actions flow from reasons and who are responsive to reason. True, we do not exert as much conscious control over our actions as we think we do. Every student of the mind, beginning most notably with William James and Sigmund Freud, knows this. But it doesn’t mean we are powerless.
The study of the brain is said to be the final scientific frontier. Will we lose sight of the mind, though, in the age of brain science? While the scans are dazzling and the technology an unqualified marvel, we can always keep our bearings by remembering that the brain and the mind are two different frameworks.
The neurobiological domain is one of brains and physical causes, the mechanisms behind our thoughts and emotions. The psychological domain, the realm of the mind, is one of people — their desires, intentions, ideals, and anxieties. Both are essential to a full understanding of why we act as we do.

Filed under brain psychology neuroscience science

The man who needs to paralyse himself

"I have attempted to break my back, but I missed. I need to be paraplegic, paralysed from the waist down."

Sean O’Connor is a very rational man. But he also tried, unsuccessfully, to sever his spine, and still feels a need to be paralysed.

Sean has body integrity identity disorder (BIID), which causes him to feel that his limbs just don’t belong to his body.

Sean’s legs function correctly and he has full sensation in them, but they feel disconnected from him. “I don’t hate my limbs – they just feel wrong,” he says. “I’m aware that they are as nature designed them to be, but there is an intense discomfort at being able to feel my legs and move them.”

The cause of his disorder has yet to be pinpointed, but it almost certainly stems from a problem in the early development of his brain. “My earliest memories of feeling I should be paralysed go back to when I was 4 or 5 years old,” says Sean.

The first case of BIID was reported in the 18th century, when a French surgeon was held at gunpoint by an Englishman who demanded that one of his legs be removed. The surgeon, against his will, performed the operation. Later, he received a handsome payment from the Englishman, with an accompanying letter of thanks for removing “a limb which put an invincible obstacle to my happiness” (Experimental Brain Research).

We now think that there are at least two forms of BIID. In one, people wish that part of their body were paralysed. Another form causes people to want to have a limb removed. BIID doesn’t have to affect limbs either – there have been anecdotal accounts of people wishing they were blind or deaf.

DIY operations

There are many reported cases of people with BIID attempting to break their back, like Sean, or perform a DIY operation to alleviate their discomfort. Some even pay for surgeons to amputate their healthy limbs. Now the first study of this desperate form of treatment, by Peter Brugger at the University of Zurich, Switzerland, and colleagues, suggests that chopping off a healthy limb “cures” people of this form of BIID. Brugger says they interviewed about 20 people with BIID, many of whom have had an illegal amputation. All said they were satisfied with the outcome.

But the findings, so far unpublished, are tentative and do not justify such a treatment, says Brugger. “We don’t have enough scientific evidence to propose amputation or paralysis. Before we have an understanding of something, we can’t think of developing a treatment.”

Brugger disagrees with the suggestion that the disorder is psychological. “The neurological side of the data is too convincing,” he says. “Why would a vague desire to be handicapped show itself as a precise need to be amputated two centimetres above the knee, for example? I certainly think it’s more a representational deficit in the brain in all cases, than a psychological need for attention.”

The parietal lobe, situated at the top of the brain, is almost certainly involved. It is here that a complex set of brain networks enable us to attach a sense of self to our limbs. In 2011, V. S. Ramachandran, at the University of California, San Diego, and his colleagues examined the brain activity of four people with BIID.

Confusion in the brain

They found significantly reduced activation in the right superior parietal lobe when researchers touched the part of the leg that people wanted amputated, compared with when they touched the part people wanted to keep. The researchers say that this area of the brain is key to creating a “coherent sense of having a body” (Journal of Neurology, Neurosurgery and Psychiatry).

The brain hates to be confused, says Ramachandran. So when people with BIID feel the sensation of touch, they can’t incorporate this message into the regions of the brain that identify the limb as being part of themselves. In an attempt to remove the confusion, it seems the brain rejects the limb altogether.

Brugger hypothesises that some people are born with a relative weakness in the brain networks that enable us to accept all our limbs as our own. This is usually naturally corrected as they grow up, he says, but in some people the sight of an amputee at a very young age may have reinforced the alterations in the brain. About half of people with BIID – itself a condition so rare there aren’t proper estimates of its prevalence – recall having a fascination or close relationship with an amputee while they were a child.

Would Sean contemplate having his limbs amputated? “I would, if it was available,” he says, “but there are no surgeons currently offering the treatment openly.”

"But I am who and what I am in part because of having BIID and my lived experiences. Take away BIID, and I will be a different person. Not necessarily better, nor worse, but different. But the idea of making all my pain go away? It’s definitely appealing."

Filed under body integrity identity disorder limb amputation paralysis parietal lobe psychology neuroscience science

Microbleeding in Brain May Be Behind Senior Moments
People may grow wiser with age, but they don’t grow smarter. Many of our mental abilities decline after midlife, and now researchers say that they’ve fingered a culprit. A study presented here last week at the annual meeting of the Association for Psychological Science points to microbleeding in the brain caused by stiffening arteries. The finding may lead to new therapies to combat senior moments.
This isn’t the first time that microbleeds have been suspected as a cause of cognitive decline. “We have known [about them] for some time thanks to neuroimaging studies,” says Matthew Pase, a psychology Ph.D. student at Swinburne University of Technology in Melbourne, Australia. The brains of older people are sometimes peppered with dark splotches where blood vessels have burst and created tiny dead zones of tissue. How important these microbleeds are to cognitive decline, and what causes them, have remained open questions, however.
Pase wondered if high blood pressure might be behind the microbleeds. The brain is a very blood-hungry organ, he notes. “It accounts for only 2% of the body weight yet receives 15% of the cardiac output and consumes 20% of the body’s oxygen expenditure.” Rather than getting the oxygen in pulses, the brain needs a smooth, continuous supply. So the aorta, the largest blood vessel branching off the heart, smooths out blood pressure before it reaches the brain by absorbing the pressure with its flexible walls. But as people age, the aorta stiffens. That translates to higher pressure on the brain, especially during stress. The pulse of blood can be strong enough to burst vessels in the brain, resulting in microbleeds.
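
The buffering role of a compliant aorta described above is the classic two-element Windkessel picture, in which arterial compliance C smooths pulsatile inflow against a peripheral resistance R. The sketch below is a toy model with made-up parameter values, not anything from Pase’s study; it simply shows that lowering compliance (a stiffer aorta) lets a larger pressure pulse through:

```python
import math

def pulse_pressure(compliance, resistance=1.0, heart_rate=1.0,
                   stroke_flow=100.0, dt=0.001, beats=20):
    """Two-element Windkessel: C * dP/dt = Q_in(t) - P / R.
    Returns the peak-to-trough pressure swing over the last beat."""
    period = 1.0 / heart_rate
    n_steps = int(beats * period / dt)
    p, t, trace = 80.0, 0.0, []          # arbitrary starting pressure
    for i in range(n_steps):
        phase = (t % period) / period
        # inflow only during "systole" (first 30% of each cycle)
        q = stroke_flow * math.sin(math.pi * phase / 0.3) if phase < 0.3 else 0.0
        p += dt * (q - p / resistance) / compliance   # forward Euler step
        if i >= n_steps - int(period / dt):           # record the final beat
            trace.append(p)
        t += dt
    return max(trace) - min(trace)

print(pulse_pressure(compliance=2.0))   # compliant ("young") aorta
print(pulse_pressure(compliance=0.5))   # stiff ("aged") aorta: larger swing
```

Reducing the compliance enlarges the pressure swing that reaches downstream vessels, which is exactly the mechanism the microbleed hypothesis invokes.
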
A stumbling block has been accurately measuring the blood pressure that the brain experiences. The hand-pumped armband devices commonly used in doctors’ offices measure only the local pressure of blood in the arm, known as the brachial pressure. To calculate aorta stiffness, the “central blood pressure” in the aorta is needed. A technique for measuring central blood pressure, called applanation tonometry (AT), was developed in the late 1990s. It works by comparing the pressure wave of blood from the heart with the reflected pressure wave from the vessels farthest from the heart — aorta stiffness is calculated from the difference between the two. Fast, painless AT devices have appeared on the market.
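
One common waveform-derived stiffness measure of this kind is the augmentation index: the reflected wave’s contribution to pulse pressure. The sketch below is a deliberately simplified two-landmark version (clinical AT devices analyse the full waveform), and the pressures are illustrative numbers only:

```python
def augmentation_index(p_forward_peak, p_late_systolic_peak, p_diastolic):
    """Simplified augmentation index (AIx), in percent.

    AIx = augmentation pressure / pulse pressure.  With stiff arteries
    the reflected wave returns early, so the late-systolic peak exceeds
    the forward-wave shoulder and AIx is positive.
    """
    systolic = max(p_forward_peak, p_late_systolic_peak)
    pulse_pressure = systolic - p_diastolic
    augmentation = p_late_systolic_peak - p_forward_peak
    return 100.0 * augmentation / pulse_pressure

print(augmentation_index(110.0, 120.0, 80.0))  # stiff arteries: 25.0
print(augmentation_index(115.0, 105.0, 75.0))  # compliant arteries: -25.0
```
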
To see if central blood pressure and aorta stiffening are related to cognitive abilities, Pase and colleagues recruited 493 people in Melbourne, 20 to 82 years old. They made traditional blood pressure measurements and also used AT to measure central blood pressure and estimate aorta stiffness. They also measured their subjects’ cognitive abilities with a standard battery of computer tests.
Central blood pressure and aorta stiffness alone were sensitive predictors of cognitive abilities, Pase reported at the meeting. The higher the central pressure and aorta stiffness, the worse people tended to perform on tests of visual processing and memory. The traditional measures of blood pressure in the arm correlated with scores on only one test of visual processing.
To prove that aorta stiffening causes microbleeds, the researchers will need to repeat the experiment on the same people over the course of several years, using neuroimaging as well to establish that aorta stiffening leads to both microbleeding and cognitive decline. Pase notes that other causes of microbleeding have been proposed, such as weakening of blood vessels in the brain.
"This work is so important because the problem is so pervasive," says Earl Hunt, a veteran intelligence researcher at the University of Washington, Seattle, who was not involved in the work. The individual effects of these microbleeds are probably too small to measure. "But even a trifling difference multiplied a million times is big," he says. Pase’s collaborator at Swinburne, Con Stough, is now leading a study of how to prevent microbleeding through dietary supplements. He proposes that the elasticity of the aorta could be preserved by providing fatty acids or antioxidants that help maintain its structure. The results are expected in 2015.

Filed under brain microbleeding cognitive decline blood vessels blood pressure psychology neuroscience science

Are men better than women at acoustic size judgments?

Scientists from the University of Sussex have revealed that men are significantly better than women at using speech ‘formants’ to judge the apparent body size of a sound’s source. Formants are important phonetic elements of human speech that are also used by mammals to assess the body size of potential mates and rivals. This research is the first to indicate that formant perception may have evolved through sexual selection.

Dr. Benjamin D. Charlton and his team tested 18 males and 37 females, aged between 17 and 20 years. Participants heard 60 unique stimulus pairs with different formants, representing two different animals, and their task was to decide which one sounded ‘larger’. Researchers tested the ability of listeners to detect small differences in apparent size across a wide range of formants which encompassed the range of the human speaking voice.

Speech formants, which give us our particular vowel sounds, are based on the length of the vocal tract, and thus relate directly to body size. But whereas men appear to use formants to judge the physical dominance of potential rivals, formants are not consistently found to predict how women rate the attractiveness of men’s voices. Women have been found to be more reliant on voice pitch rather than formants when rating how attractive they find a male voice.
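
The physics behind that formant–size link is straightforward: a vocal tract behaves roughly like a uniform tube closed at the glottis and open at the lips, so its resonances (the formants) scale inversely with tube length. A minimal sketch under that uniform-tube assumption, with illustrative tract lengths:

```python
def formants(tract_length_cm, n=3, speed_of_sound=35000.0):
    # Quarter-wave resonances of a tube closed at one end and open at
    # the other: F_k = (2k - 1) * c / (4 * L), in Hz (lengths in cm).
    return [(2 * k - 1) * speed_of_sound / (4 * tract_length_cm)
            for k in range(1, n + 1)]

print(formants(17.5))  # longer (adult-male-like) tract: [500.0, 1500.0, 2500.0]
print(formants(14.5))  # shorter tract: every formant shifts upward
```

A longer tract yields uniformly lower formants, which is why listeners can read apparent body size off formant spacing.
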

The researchers conclude that the sex differences they report could be either innate or acquired or both. Hence, while they are compatible with the hypothesis that males rely on size assessment more than females, they do not conclusively demonstrate that these abilities arose through sexual selection. For example, it is possible that males learn this skill because this information is more important to them during their everyday social interactions. There may also be key differences across cultures, particularly in societies where gender roles differ markedly. Thus, they look forward to future studies examining the effects of training and personality as well as social and cultural factors.

(Source: royalsociety.org)

Filed under sex differences formants formant perception speech vowel sounds psychology neuroscience science

Avatar therapy helps silence voices in schizophrenia
An avatar system that enables people with schizophrenia to control the voice of their hallucinations is being developed by researchers at UCL with support from the Wellcome Trust.
The computer-based system could provide quick and effective therapy that is far more successful than current pharmaceutical treatments, helping to reduce the frequency and severity of episodes of schizophrenia.
In an early pilot of this approach involving 16 patients and up to seven 30-minute sessions of therapy, almost all of the patients reported an improvement in the frequency and severity of the voices that they hear. Three of the patients stopped hearing voices completely after experiencing 16, 13 and 3.5 years of auditory hallucinations, respectively. The avatar does not address the patients’ delusions directly, but the study found that they do improve as an overall effect of the therapy.
The team has now received a £1.3 million Translation Award from the Wellcome Trust to refine the system and conduct a larger scale, randomised study to evaluate this novel approach to schizophrenia therapy which will be conducted at King’s College London Institute of Psychiatry.
The first stage in the therapy is for the patient to create a computer-based avatar, by choosing the face and voice of the entity they believe is talking to them. The system then synchronises the avatar’s lips with its speech, enabling a therapist to speak to the patient through the avatar in real time. The therapist encourages the patient to oppose the voice and gradually teaches them to take control of their hallucinations.
Julian Leff, Emeritus Professor in UCL Mental Health Sciences, developed the therapy and is leading the project. He said: “Even though patients interact with the avatar as though it was a real person, because they have created it, they know that it cannot harm them, as opposed to the voices, which often threaten to kill or harm them and their family. As a result the therapy helps patients gain the confidence and courage to confront the avatar, and their persecutor.
“We record every therapy session on MP3, so that the patient essentially has a therapist in their pocket which they can listen to at any time when harassed by the voices. We’ve found that this helps them to recognise that the voices originate within their own mind and reinforces their control over the hallucinations.”
The larger-scale study will begin enrolling the first patients in early July. The team are currently training the therapists and research staff to deliver the avatar therapy and finalising the study set-up. The first results of this larger study are expected towards the end of 2015.
Professor Thomas Craig of King’s College London Institute of Psychiatry, who will lead the larger trial, said: “Auditory hallucinations are a very distressing experience that can be extremely difficult to treat successfully, blighting patients’ lives for many years. I am delighted to be leading the group that will carry out a rigorous randomised study of this intriguing new therapy with 142 people who have experienced distressing voices for many years.
“The beauty of the therapy is its simplicity and brevity. Most other psychological therapies for these conditions are costly and take many months to deliver. If we show that this treatment is effective, we expect it could be widely available in the UK within just a couple of years as the basic technology is well developed and many mental health professionals already have the basic therapy skills that are needed to deliver it.”
Schizophrenia affects around 1 in 100 people worldwide, the most common symptoms being delusions (false beliefs) and auditory hallucinations (hearing voices). The illness often has a devastating effect, making it impossible to work and to sustain social relationships. Even with the most effective anti-psychotic medication, around one in four people with schizophrenia continue to suffer from persecutory auditory hallucinations, severely impairing their ability to concentrate.
Current guidelines from the National Institute for Health and Care Excellence (NICE) recommend that schizophrenia is treated using a combination of medication and talking therapies, such as cognitive behavioural therapy. However, fewer than one in ten patients with schizophrenia in the UK have access to this kind of psychological therapy.
Ted Bianco, Director of Technology Transfer and Acting Director of the Wellcome Trust, said: “At a time when many companies have become wary about investing in drug discovery for mental health, we are delighted to be able to facilitate the evaluation of an alternative approach to treatment based on the fusion of a talking therapy with computer-assisted ‘training’.
“In addition to the attraction that the intervention is not reliant on development of a new medication, the approach has the benefit of being directly testable in patients. Should the results of the trial prove encouraging, we expect there may be further applications of the basic strategy worth exploring in other areas of mental health.”

Filed under avatar therapy schizophrenia auditory hallucinations psychology neuroscience science

105 notes

Healthy lifestyle choices mean fewer memory complaints

Research has shown that healthy behaviors are associated with a lower risk of Alzheimer’s disease and dementia, but less is known about the potential link between positive lifestyle choices and milder memory complaints, especially those that occur earlier in life and could be the first indicators of later problems.


To examine the impact of these lifestyle choices on memory throughout adult life, UCLA researchers and the Gallup organization collaborated on a nationwide poll of more than 18,500 individuals between the ages of 18 and 99. Respondents were surveyed about both their memory and their health behaviors, including whether they smoked, how much they exercised and how healthy their diet was.

As the researchers expected, healthy eating, not smoking and exercising regularly were related to better self-perceived memory abilities for most adult groups. Reports of memory problems also increased with age. However, there were a few surprises.

Older adults (age 60–99) were more likely to report engaging in healthy behaviors than middle-aged (40–59) and younger adults (18–39), a finding that runs counter to the stereotype that aging is a time of dependence and decline. In addition, a higher-than-expected percentage of younger adults complained about their memory.

"These findings reinforce the importance of educating young and middle-aged individuals to take greater responsibility for their health — including memory — by practicing positive lifestyle behaviors earlier in life," said the study’s first author, Dr. Gary Small, director of the UCLA Longevity Center and a professor of psychiatry and biobehavioral sciences at the Semel Institute for Neuroscience and Human Behavior at UCLA who holds the Parlow–Solomon Chair on Aging.

Published in the June issue of International Psychogeriatrics, the study may also provide a baseline for the future study of memory complaints in a wide range of adult age groups.

For the survey, Gallup pollsters conducted land-line and cell phone interviews with 18,552 adults in the U.S. The inclusion of cell phone–only households and Spanish-language interviews helped capture a representative 90 percent of the U.S. population, the researchers said.

"We found that the more healthy lifestyle behaviors were practiced, the less likely one was to complain about memory issues," said senior author Fernando Torres-Gil, a professor at UCLA’s Luskin School of Public Affairs and associate director of the UCLA Longevity Center.

In particular, the study found that respondents across all age groups who engaged in just one healthy behavior were 21 percent less likely to report memory problems than those who didn’t engage in any healthy behaviors. Those with two positive behaviors were 45 percent less likely to report problems, those with three were 75 percent less likely, and those with more than three were 111 percent less likely.

Interestingly, the poll found that healthy behaviors were more common among older adults than the other two age groups. Seventy percent of older adults engaged in at least one healthy behavior, compared with 61 percent of middle-aged individuals and 58 percent of younger respondents.

In addition, only 12 percent of older adults smoked, compared with 25 percent of young adults and 24 percent of middle-aged adults, and a higher percentage of older adults reported eating healthy the day before being interviewed (80 percent) and eating five or more daily servings of fruits and vegetables during the previous week (64 percent).

According to the researchers, older adults may participate in more healthy behaviors because they feel the consequences of unhealthy living and take the advice of their doctors to adopt healthier lifestyles. Or there simply could be fewer older adults with bad habits, since they may not live as long.

While 26 percent of older adults and 22 percent of middle-aged respondents reported memory issues, it was surprising to find that 14 percent of the younger group complained about their memory too, the researchers said.

"Memory issues were to be expected in the middle-aged and older groups, but not in younger people," Small said. "A better understanding and recognition of mild memory symptoms earlier in life may have the potential to help all ages."

Small said that, generally, memory issues in younger people may be different from those that plague older generations. Stress may play more of a role. He also noted that the ubiquity of technology — including the Internet, texting and wireless devices that can result in constant multi-tasking, especially with younger people — may impact attention span, making it harder to focus and remember.

Small noted that further study and polling may help tease out such memory-complaint differences. Either way, he said, the survey reinforces the importance, for all ages, of adopting a healthy lifestyle to help limit and forestall age-related cognitive decline and neurodegeneration.

The Gallup poll used in the study took place between December 2011 and January 2012 and was part of the Gallup–Healthways Well-Being Index, which includes health- and lifestyle-related polling questions. The five questions asked were: (1) Do you smoke? (2) Did you eat healthy all day yesterday? (3) In the last seven days, on how many days did you have five or more servings of vegetables and fruits? (4) In the last seven days, on how many days did you exercise for 30 minutes or more? (5) Do you have any problems with your memory? 
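As a rough sketch, the first four questions could be tallied into the healthy-behavior count that the study relates to memory complaints. The cutoffs below (how many days count as "regular") are assumptions for illustration, not the study's published scoring:

```python
def healthy_behavior_count(smokes: bool,
                           ate_healthy_yesterday: bool,
                           days_five_servings: int,
                           days_exercised_30min: int) -> int:
    """Tally positive lifestyle behaviors from the first four poll
    questions. Thresholds are illustrative assumptions only."""
    count = 0
    if not smokes:
        count += 1
    if ate_healthy_yesterday:
        count += 1
    if days_five_servings >= 5:      # assumed cutoff for "regular"
        count += 1
    if days_exercised_30min >= 3:    # assumed cutoff for "regular"
        count += 1
    return count

# Example respondent: non-smoker who ate well yesterday,
# six days of fruit/veg, four days of exercise.
score = healthy_behavior_count(smokes=False, ate_healthy_yesterday=True,
                               days_five_servings=6, days_exercised_30min=4)
# score == 4
```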

(Source: newsroom.ucla.edu)

Filed under memory adults lifestyle choices memory problems poll psychology neuroscience science

176 notes

Neuroscientists Discover New Phase of Synaptic Development
Breakthrough Could Lead to Better Understanding of Learning and Memory
Students preparing for final exams might want to wait before pulling an all-night cram session — at least as far as their neurons are concerned. Carnegie Mellon University neuroscientists have discovered a new intermediate phase in neuronal development during which repeated exposure to a stimulus shrinks synapses. The findings are published in the May 8 issue of the Journal of Neuroscience.
It’s well known that synapses in the brain, the connections between neurons and other cells that allow for the transmission of information, grow when they’re exposed to a stimulus. New research from the lab of Carnegie Mellon Associate Professor of Biological Sciences Alison L. Barth has shown that in the short term, synapses get even stronger than previously thought, but then quickly go through a transitional phase where they weaken.
"When you think of learning, you think that it’s cumulative. We thought that synapses started small and then got bigger and bigger. This isn’t the case," said Barth, who also is a member of the joint Carnegie Mellon/University of Pittsburgh Center for the Neural Basis of Cognition. "Based on our data, it seems like synapses that have recently been strengthened are peculiarly vulnerable — more stimulation can actually wipe out the effects of learning.
"Psychologists know that for long-lasting memory, spaced training — like studying for your classes after every lecture, all semester long — is superior to cramming all night before the exam," Barth said. "This study shows why. Right after plasticity, synapses are almost fragile — more training during this labile phase is actually counterproductive."
Previous research from Barth’s lab established the biochemical mechanisms responsible for the strengthening of synapses in the neocortex, the part of the brain responsible for thought and language, but measured the synapses only after 24 hours. In the current study, postdoctoral researcher Jing A. Wen investigated how the synapses developed throughout the first 24 hours of exposure to a stimulus, using a specialized transgenic mouse model created by Barth. The mouse senses its surroundings using only one whisker, a sensory imbalance that increases plasticity in the brain. Since each whisker is linked to a specific area of the cortex, researchers can easily track neuronal changes.
Wen found that during this first day of learning, synapses go through three distinct phases. In the initiation phase, synaptic plasticity is spurred on by NMDA receptors, and over the next 12 hours or so the synapses get stronger and stronger. As the stimulus is repeated, the NMDA receptors change their function and start to weaken the synapses in what the researchers have called the labile phase. After a few hours of weakening, another receptor, mGluR5, initiates a stabilization phase during which the synapses maintain their residual strength.
Furthermore, the researchers found that they could maintain the super-activated state found at the beginning of the labile phase by stopping the stimulus altogether or by injecting a glutamate receptor antagonist drug at an optimal time point. The findings are analogous to those seen in many psychological studies that use spaced training to improve memory.
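The three phases can be caricatured as a toy piecewise curve of synaptic strength over the first day of continued stimulation. All numbers here are invented for illustration and are not from the paper:

```python
def toy_synaptic_strength(hours: float) -> float:
    """Illustrative sketch of the three phases described: strengthening
    for roughly the first 12 hours, weakening during the labile phase,
    then stabilization at a residual strength. Shapes and values are
    invented, not measured."""
    if hours < 12:
        # initiation phase: strength climbs from baseline 1.0 to a peak
        return 1.0 + hours / 12.0
    if hours < 18:
        # labile phase: continued stimulation weakens the synapse
        return 2.0 - (hours - 12) / 6.0 * 0.7
    # stabilization phase: residual strength maintained
    return 1.3

# e.g. strength rises to 2.0 by hour 12, then settles at 1.3 after hour 18
```

The curve makes the press release's point concrete: training past the peak (hour 12 in this caricature) pushes the synapse into the labile decline rather than strengthening it further.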
"While synaptic changes can be long lasting, we’ve found that in this initial period there are a number of different things we could play with," Barth said. "The discovery of this labile phase suggests there are ways to control learning through the manipulation of the biochemical pathways that maintain memory."

Filed under neuronal development synapses neocortex plasticity learning psychology neuroscience science

366 notes

Picking Up a Second Language Is Predicted by Ability to Learn Patterns
Some people seem to pick up a second language with relative ease, while others have a much more difficult time. Now, a new study suggests that learning to understand and read a second language may be driven, at least in part, by our ability to pick up on statistical regularities.
The study is published in Psychological Science, a journal of the Association for Psychological Science.
Some research suggests that learning a second language draws on capacities that are language-specific, while other research suggests that it reflects a more general capacity for learning patterns. According to psychological scientist and lead researcher Ram Frost of Hebrew University, the data from the new study clearly point to the latter:
“These new results suggest that learning a second language is determined to a large extent by an individual ability that is not at all linguistic,” says Frost.
In the study, Frost and colleagues used three different tasks to measure how well American students in an overseas program picked up on the structure of words and sounds in Hebrew. The students were tested once in the first semester and again in the second semester.
The students also completed a task that measured their ability to pick up on statistical patterns in visual stimuli. The participants watched a stream of complex shapes that were presented one at a time. Unbeknownst to the participants, the 24 shapes were organized into 8 triplets — the order of the triplets was randomized, though the shapes within each triplet always appeared in the same sequence. After viewing the stream of shapes, the students were tested to see whether they implicitly picked up the statistical regularities of the shape sequences.
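The triplet structure described above can be sketched in code. This is a minimal, hypothetical reconstruction of the stimulus stream, assuming arbitrary shape labels and a repetition count not given in the article:

```python
import random

def make_stream(n_triplets: int = 8, reps: int = 10, seed: int = 0) -> list:
    """Build a shape stream like the one described: 24 shapes grouped
    into 8 triplets with a fixed internal order, with the order of the
    triplets themselves randomized on each pass."""
    rng = random.Random(seed)
    shapes = [f"shape{i}" for i in range(n_triplets * 3)]
    triplets = [shapes[i * 3:(i + 1) * 3] for i in range(n_triplets)]
    stream = []
    for _ in range(reps):
        order = triplets[:]
        rng.shuffle(order)       # triplet order is randomized
        for t in order:
            stream.extend(t)     # shapes within a triplet never reorder
    return stream

stream = make_stream()
# Whenever the first shape of a triplet appears, the other two always
# follow it immediately — the statistical regularity participants can
# pick up implicitly.
```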
The data revealed a strong association between statistical learning and language learning: Students who were high performers on the shapes task tended to pick up the most Hebrew over the two semesters.
“It’s surprising that a short 15-minute test involving the perception of visual shapes could predict to such a large extent which of the students who came to study Hebrew would finish the year with a better grasp of the language,” says Frost.
According to the researchers, establishing a link between second language acquisition and a general capacity for statistical learning may have broad implications.
“This finding points to the possibility that a unified and universal principle of statistical learning can quantitatively explain a wide range of cognitive processes across domains, whether they are linguistic or nonlinguistic,” they conclude.

Filed under bilingualism learning patterns individual differences language learning language acquisition psychology neuroscience science

136 notes

Art appreciation is measurable
Is it your own innate taste or what you have been taught that decides if you like a work of art? Both, according to an Australian-Norwegian research team.
Have you experienced seeing a painting or a play that has left you with no feelings whatsoever, whilst a friend thought it was beautiful and meaningful? Experts have argued for years about the feasibility of researching art appreciation, and what should be taken into consideration.
Neuroscientists believe that biological processes that take place in the brain decide whether one likes a work of art or not. Historians and philosophers say that this is far too narrow a viewpoint. They believe that what you know about the artist’s intentions, when the work was created, and other external factors, also affect how you experience a work of art.
Building bridges
A new model that combines the historical and the psychological approaches has been developed.
“We think that both traditions are just as important, although incomplete. We want to show that they complement each other,” says Rolf Reber, Professor of Psychology at the University of Bergen. Together with Nicolas Bullot, Doctor of Philosophy at Macquarie University in Australia, he has developed a new model to help us understand art appreciation. The results have been published in ‘Behavioral and Brain Sciences’ and are commented on by 27 scientists from different disciplines.
Neuroscientists often measure brain activity to find out how much a test subject likes a work of art, without investigating whether he or she actually understands the work. This is insufficient, as artistic understanding also affects assessment, says Reber.
Eye-opening experience
“We know from earlier research that a painting that is difficult, yet possible, to interpret is felt to be more meaningful than a painting that one looks at and understands immediately. The painter Eugène Delacroix made use of this fact to depict war. Joseph Mallord William Turner did the same in ‘Snow storm’. When you have to struggle to understand, you can have an eye-opening experience, which the brain appreciates,” explains Reber.
He hopes that other scientists will use the Australian-Norwegian model.
“By measuring brain activity, interviewing test subjects about thoughts and reactions, and charting their artistic knowledge, it’s possible to gain new and exciting insight into what makes people appreciate good works of art. The model can be used for visual art, music, theatre and literature,” says Reber.

Art appreciation is measureable

Is it your own innate taste or what you have been taught that decides if you like a work of art? Both, according to an Australian-Norwegian research team.

Have you experienced seeing a painting or a play that has left you with no feelings whatsoever, whilst a friend thought it was beautiful and meaningful? Experts have argued for years about the feasibility of researching art appreciation, and what should be taken into consideration.

Neuroscientists believe that biological processes that take place in the brain decide whether one likes a work of art or not. Historians and philosophers say that this is far too narrow a viewpoint. They believe that what you know about the artist’s intentions, when the work was created, and other external factors, also affect how you experience a work of art.

Building bridges
A new model that combines both the historical and the psychological approach has been developed.

  • We think that both traditions are just as important, although incomplete. We want to show that they complement each other, says Rolf Reber, Professor of Psychology at the University of Bergen. Together with Nicolas Bullot, Doctor of Philosophy at the Macquarie University in Australia, he has developed a new model to help us understand art appreciation. The results have been published in ‘Behavioral and Brain Sciences and are commented on by 27 scientists from different disciplines.
  • Neuroscientists often measure brain activity to find out how much a testee likes a work of art, without investigating whether he or she actually understands the work. This is insufficient, as artistic understanding also affects assessment, says Reber.

Eye-opening experience
- We know from earlier research that a painting that is difficult – yet possible – to interpret, is felt to be more meaningful than a painting that one looks at and understands immediately. The painter, Eugène Delacroix, made use of this fact to depict war. Joseph Mallord William Turner did the same in ‘Snow storm’. When you have to struggle to understand, you can have an eye-opening experience, which the brain appreciates, explains Reber.

He hopes that other scientists will use the Australian-Norwegian model.
- By measuring brain activity, interviewing test subjects about their thoughts and reactions, and charting their artistic knowledge, it’s possible to gain new and exciting insight into what makes people appreciate good works of art. The model can be used for visual art, music, theatre and literature, says Reber.

Filed under brain brain activity art appreciation art psychology neuroscience science

234 notes

Changing gut bacteria through diet affects brain function
UCLA researchers now have the first evidence that bacteria ingested in food can affect brain function in humans. In an early proof-of-concept study of healthy women, they found that women who regularly consumed beneficial bacteria known as probiotics through yogurt showed altered brain function, both while in a resting state and in response to an emotion-recognition task.
The study, conducted by scientists with UCLA’s Gail and Gerald Oppenheimer Family Center for Neurobiology of Stress and the Ahmanson–Lovelace Brain Mapping Center at UCLA, appears in the current online edition of the peer-reviewed journal Gastroenterology.
The discovery that changing the bacterial environment, or microbiota, in the gut can affect the brain carries significant implications for future research that could point the way toward dietary or drug interventions to improve brain function, the researchers said.
"Many of us have a container of yogurt in our refrigerator that we may eat for enjoyment, for calcium or because we think it might help our health in other ways," said Dr. Kirsten Tillisch, an associate professor of medicine at UCLA’s David Geffen School of Medicine and lead author of the study. "Our findings indicate that some of the contents of yogurt may actually change the way our brain responds to the environment. When we consider the implications of this work, the old sayings ‘you are what you eat’ and ‘gut feelings’ take on new meaning."
Researchers have known that the brain sends signals to the gut, which is why stress and other emotions can contribute to gastrointestinal symptoms. This study shows what has been suspected but until now had been proved only in animal studies: that signals travel the opposite way as well.
"Time and time again, we hear from patients that they never felt depressed or anxious until they started experiencing problems with their gut," Tillisch said. "Our study shows that the gut–brain connection is a two-way street."
The small study involved 36 women between the ages of 18 and 55. Researchers divided the women into three groups: one group ate a specific yogurt containing a mix of several probiotics — bacteria thought to have a positive effect on the intestines — twice a day for four weeks; another group consumed a dairy product that looked and tasted like the yogurt but contained no probiotics; and a third group ate no product at all.
Functional magnetic resonance imaging (fMRI) scans conducted both before and after the four-week study period looked at the women’s brains in a state of rest and in response to an emotion-recognition task in which they viewed a series of pictures of people with angry or frightened faces and matched them to other faces showing the same emotions. This task, designed to measure the engagement of affective and cognitive brain regions in response to a visual stimulus, was chosen because previous research in animals had linked changes in gut flora to changes in affective behaviors.
The researchers found that, compared with the women who didn’t consume the probiotic yogurt, those who did showed a decrease in activity in both the insula — which processes and integrates internal body sensations, like those from the gut — and the somatosensory cortex during the emotional reactivity task.
Further, in response to the task, these women had a decrease in the engagement of a widespread network in the brain that includes emotion-, cognition- and sensory-related areas. The women in the other two groups showed a stable or increased activity in this network.
During the resting brain scan, the women consuming probiotics showed greater connectivity between a key brainstem region known as the periaqueductal grey and cognition-associated areas of the prefrontal cortex. The women who ate no product at all, on the other hand, showed greater connectivity of the periaqueductal grey to emotion- and sensation-related regions, while the group consuming the non-probiotic dairy product showed results in between.
The researchers were surprised to find that the brain effects could be seen in many areas, including those involved in sensory processing and not merely those associated with emotion, Tillisch said.
The knowledge that signals are sent from the intestine to the brain and that they can be modulated by a dietary change is likely to lead to an expansion of research aimed at finding new strategies to prevent or treat digestive, mental and neurological disorders, said Dr. Emeran Mayer, a professor of medicine, physiology and psychiatry at the David Geffen School of Medicine at UCLA and the study’s senior author.
"There are studies showing that what we eat can alter the composition and products of the gut flora — in particular, that people with high-vegetable, fiber-based diets have a different composition of their microbiota, or gut environment, than people who eat the more typical Western diet that is high in fat and carbohydrates," Mayer said. "Now we know that this has an effect not only on the metabolism but also affects brain function."
The UCLA researchers are seeking to pinpoint particular chemicals produced by gut bacteria that may be triggering the signals to the brain. They also plan to study whether people with gastrointestinal symptoms such as bloating, abdominal pain and altered bowel movements have improvements in their digestive symptoms which correlate with changes in brain response.
Meanwhile, Mayer notes that other researchers are studying the potential benefits of certain probiotics in yogurts on mood symptoms such as anxiety. He said that other nutritional strategies may also be found to be beneficial.
By demonstrating the brain effects of probiotics, the study also raises the question of whether repeated courses of antibiotics can affect the brain, as some have speculated. Antibiotics are used extensively in neonatal intensive care units and in childhood respiratory tract infections, and such suppression of the normal microbiota may have long-term consequences on brain development.
Finally, as the complexity of the gut flora and its effect on the brain is better understood, researchers may find ways to manipulate the intestinal contents to treat chronic pain conditions or other brain-related diseases, including, potentially, Parkinson’s disease, Alzheimer’s disease and autism.
Answers will be easier to come by in the near future as the declining cost of profiling a person’s microbiota renders such tests more routine, Mayer said.


Filed under brain function brain activity emotion probiotic bacteria prefrontal cortex neuroimaging psychology neuroscience science
