Neuroscience

Articles and news from the latest research reports.

Posts tagged brain

135 notes

The memories of near death experiences (NDE): more real than reality?

University of Liège researchers have shown that the physiological mechanisms triggered during NDE lead to memories that are even more vivid than memories of real events that have taken place in a person's life, not just more vivid than memories of imagined events. These surprising results, obtained using an original method which now requires further investigation, are published in PLOS ONE.

Seeing a bright light, going through a tunnel, having the feeling of ending up in another ‘reality’ or leaving one’s own body are very well known features of the complex phenomena known as ‘near-death experiences’ (NDE), reported in particular by people who have come close to death. Products of the mind? Psychological defence mechanisms? Hallucinations? These phenomena have been widely documented in the media and have generated numerous beliefs and theories of every kind. From a scientific point of view, these experiences are all the more difficult to understand in that they arise in chaotic conditions, which make studying them in real time almost impossible. The University of Liège’s researchers have therefore tried a different approach.

Working together, researchers at the Coma Science Group (directed by Steven Laureys) and the University of Liège’s cognitive psychology research unit (Professor Serge Brédart and Hedwige Dehon) looked into memories of NDE with the following hypothesis: if these memories were pure products of the imagination, their phenomenological characteristics (e.g. sensory, self-referential and emotional details) should be closer to those of memories of imagined events. Conversely, if NDE are experienced in a way similar to reality, their characteristics should be closer to those of memories of real events.

The researchers compared the responses provided by three groups of patients, each of which had survived a coma in different circumstances, and a group of healthy volunteers. They studied memories of NDE alongside memories of real and imagined events, using a questionnaire that evaluated the phenomenological characteristics of the memories. The results were surprising: not only were the memories of NDE unlike those of imagined events, but the phenomenological characteristics typical of memories of real events (e.g. sensory details) were even more numerous in the memories of NDE than in the memories of real events themselves.
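The shape of this comparison can be sketched with invented numbers (illustrative only, not the study's data or its actual questionnaire): count the phenomenological characteristics reported per participant for each memory type, then compare group means.

```python
# Toy sketch: per-participant counts of phenomenological characteristics
# (sensory, self-referential, emotional, ...) for each memory type.
# All numbers are invented for illustration.
from statistics import mean, stdev
from math import sqrt

imagined = [12, 14, 11, 13, 15, 12, 14, 13]
real     = [18, 20, 19, 17, 21, 18, 20, 19]
nde      = [24, 26, 23, 25, 27, 24, 26, 25]

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / sqrt(va / len(a) + vb / len(b))

print(f"imagined: {mean(imagined):.1f}  real: {mean(real):.1f}  NDE: {mean(nde):.1f}")
print(f"t (NDE vs real): {welch_t(nde, real):.2f}")
```

With data like this, NDE memories carry more reported detail than real-event memories, which in turn carry more than imagined ones, mirroring the ordering the study describes.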

The brain, in conditions conducive to such phenomena, is prey to chaos: physiological and pharmacological mechanisms are disturbed, exacerbated or, conversely, diminished. Some studies have put forward a physiological explanation for certain components of NDE, such as out-of-body experiences, which could be explained by dysfunctions of the temporo-parietal lobe. In this context, the study published in PLOS ONE suggests that these same mechanisms could also ‘create’ a perception of reality, which the individual would then process as coming from the exterior. In a way, their brain is lying to them, as in a hallucination. Because these events are particularly surprising and especially important from an emotional and personal perspective, the conditions are ripe for the memory of the event to be extremely detailed, precise and durable.

Numerous studies have looked into the physiological mechanisms of NDE and the production of these phenomena by the brain but, taken separately, these accounts are incapable of explaining the experiences in their entirety. The study published in PLOS ONE does not claim to offer a single explanation for NDE, but it contributes to lines of study which treat psychological phenomena as factors associated with, and not contradictory to, physiological phenomena.

Filed under near death experiences memory perception brain psychology neuroscience science

136 notes

MRI shows brain abnormalities in migraine patients

A new study suggests that migraines are related to brain abnormalities present at birth and others that develop over time. The research is published online in the journal Radiology.

Migraines are intense, throbbing headaches, sometimes accompanied by nausea, vomiting and sensitivity to light. Some patients experience auras, a change in visual or sensory function that precedes or occurs during the migraine. More than 300 million people suffer from migraines worldwide, according to the World Health Organization.

Previous research on migraine patients has shown atrophy of cortical regions in the brain related to pain processing, possibly due to chronic stimulation of those areas. Cortical refers to the cortex, or outer layer of the brain.

Much of that research has relied on voxel-based morphometry, which provides estimates of the brain’s cortical volume. In the new study, Italian researchers used a different approach: a surface-based MRI method to measure cortical thickness.

"For the first time, we assessed cortical thickness and surface area abnormalities in patients with migraine, which are two components of cortical volume that provide different and complementary pieces of information," said Massimo Filippi, M.D., director of the Neuroimaging Research Unit at the University Ospedale San Raffaele and professor of neurology at the University Vita-Salute’s San Raffaele Scientific Institute in Milan. "Indeed, cortical surface area increases dramatically during late fetal development as a consequence of cortical folding, while cortical thickness changes dynamically throughout the entire life span as a consequence of development and disease."

Dr. Filippi and colleagues used magnetic resonance imaging (MRI) to acquire T2-weighted and 3-D T1-weighted brain images from 63 migraine patients and 18 healthy controls. Using dedicated software and statistical analysis, they estimated cortical thickness and surface area and correlated these measures with the patients’ clinical and radiologic characteristics.

Compared to controls, migraine patients showed reduced cortical thickness and surface area in regions related to pain processing. There was only minimal anatomical overlap of cortical thickness and cortical surface area abnormalities, with cortical surface area abnormalities being more pronounced and distributed than cortical thickness abnormalities. The presence of aura and white matter hyperintensities—areas of high intensity on MRI that appear to be more common in people with migraine—was related to the regional distribution of cortical thickness and surface area abnormalities, but not to disease duration and attack frequency.
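The minimal-overlap point can be made concrete with a toy sketch (the region names and values below are invented, not the study's data): flag each region as abnormal on a measure when the patient mean falls below the control mean by some threshold, then intersect the two sets.

```python
# Hypothetical per-region group means: (cortical thickness in mm,
# cortical surface area in mm^2). All values invented for illustration.
controls = {"insula": (2.9, 610), "ACC": (2.7, 540), "S1": (2.4, 580), "V1": (2.1, 520)}
patients = {"insula": (2.6, 560), "ACC": (2.7, 490), "S1": (2.2, 580), "V1": (2.1, 515)}

def abnormal(measure_index, threshold):
    """Regions where patients fall below controls by more than `threshold`."""
    return {r for r in controls
            if controls[r][measure_index] - patients[r][measure_index] > threshold}

thin  = abnormal(0, 0.1)   # reduced cortical thickness
small = abnormal(1, 20)    # reduced cortical surface area
print("thickness abnormalities:", sorted(thin))
print("surface area abnormalities:", sorted(small))
print("overlap:", sorted(thin & small))
```

In this made-up example each measure flags two regions but only one region is flagged by both, which is the sense in which the two measures provide complementary information.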

"The most important finding of our study was that cortical abnormalities that occur in patients with migraine are a result of the balance between an intrinsic predisposition, as suggested by cortical surface area modification, and disease-related processes, as indicated by cortical thickness abnormalities," Dr. Filippi said. "Accurate measurements of cortical abnormalities could help characterize migraine patients better and improve understanding of the pathophysiological processes underlying the condition."

Additional research is needed to fully understand the meaning of cortical abnormalities in the pain processing areas of migraine patients, according to Dr. Filippi.

"Whether the abnormalities are a consequence of the repetition of migraine attacks or represent an anatomical signature that predisposes to the development of the disease is still debated," he said. "In my opinion, they might contribute to make migraine patients more susceptible to pain and to an abnormal processing of painful conditions and stimuli."

The researchers are conducting a longitudinal study of the patient group to see if their cortical abnormalities are stable or tend to worsen over the course of the disease. They are also studying the effects of treatments on the observed modifications of cortical folding and looking at pediatric patients with migraine to assess whether the abnormalities represent a biomarker of the disease.

(Source: eurekalert.org)

Filed under brain migraines cortex cortical abnormalities neuroimaging neuroscience science

135 notes

Brain scans predict which criminals are more likely to reoffend

In a twist that evokes the dystopian science fiction of writer Philip K. Dick, neuroscientists have found a way to predict from brain scans whether convicted felons are likely to commit crimes again. Convicts showing low activity in a brain region associated with decision-making and action are more likely to be arrested again, and sooner.

Kent Kiehl, a neuroscientist at the non-profit Mind Research Network in Albuquerque, New Mexico, and his collaborators studied a group of 96 male prisoners just before their release. The researchers used functional magnetic resonance imaging (fMRI) to scan the prisoners’ brains during computer tasks in which subjects had to make quick decisions and inhibit impulsive reactions.

The scans focused on activity in a section of the anterior cingulate cortex (ACC), a small region in the front of the brain involved in motor control and executive functioning. The researchers then followed the ex-convicts for four years to see how they fared.

Among the subjects of the study, men who had lower ACC activity during the quick-decision tasks were more likely to be arrested again after getting out of prison, even after the researchers accounted for other risk factors such as age, drug and alcohol abuse and psychopathic traits. Men who were in the lower half of the ACC activity ranking had a 2.6-fold higher rate of rearrest for all crimes and a 4.3-fold higher rate for nonviolent crimes. The results are published in the Proceedings of the National Academy of Sciences.
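As a back-of-envelope illustration of what a 2.6-fold rate difference means (the counts below are invented; the study itself also adjusted for other risk factors and tracked time to rearrest):

```python
# Hypothetical cohort split at the median of ACC activity.
# Counts are invented purely to illustrate how a rate ratio is formed.
low_acc  = {"n": 48, "rearrested": 26}
high_acc = {"n": 48, "rearrested": 10}

def rate(group):
    """Fraction of the group rearrested during follow-up."""
    return group["rearrested"] / group["n"]

rate_ratio = rate(low_acc) / rate(high_acc)
print(f"rearrest rate ratio (low vs high ACC): {rate_ratio:.1f}")
```

A ratio like this says nothing about any individual, which is one reason the authors caution against high-stakes use of the marker.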

There is growing interest in using neuroimaging to predict specific behaviour, says Tor Wager, a neuroscientist at the University of Colorado in Boulder. He says that studies such as this one, which tie brain imaging to concrete clinical outcomes, “provide a new and so far very promising way” to find patterns of brain activity that have broader implications for society.

But the authors themselves stress that much more work is needed to prove that the technique is reliable and consistent, and that it is likely to flag only the truly high-risk felons and leave the low-risk ones alone. “This isn’t ready for prime time,” says Kiehl.

Wager adds that the part of the ACC examined in this study “is one of the most frequently activated areas in the human brain across all kinds of tasks and psychological states”. Low ACC activity could have a variety of causes — impulsivity, caffeine use, vascular health, low motivation or better neural efficiency — and not all of these are necessarily related to criminal behaviour.

Crime prediction was the subject of Dick’s 1956 short story “The Minority Report” (adapted for the silver screen by Steven Spielberg in 2002), which highlighted the thorny ethics of arresting people for crimes they had yet to commit.

Brain scans are of course a far cry from the clairvoyants featured in that science-fiction story. But even if the science turns out to be reliable, the legal and social implications remain to be explored, the authors warn. Perhaps the most appropriate use for neurobiological markers would be for helping to make low-stakes decisions, such as which rehabilitation treatment to assign a prisoner, rather than high-stakes ones such as sentencing or releasing on parole.

“A treatment of [these clinical neuroimaging studies] that is either too glibly enthusiastic or over-critical,” Wager says, “will be damaging for this emerging science in the long run.”

Filed under brain brain activity brain scans neuroimaging anterior cingulate cortex neuroscience science

186 notes

New mechanism for long-term memory formation discovered

UC Irvine neurobiologists have found a novel molecular mechanism that helps trigger the formation of long-term memory. The researchers believe the discovery of this mechanism adds another piece to the puzzle in the ongoing effort to uncover the mysteries of memory and, potentially, certain intellectual disabilities.

In a study led by Marcelo Wood of UC Irvine’s Center for the Neurobiology of Learning & Memory, the team investigated the role of this mechanism, centered on a gene designated Baf53b, in long-term memory formation. The protein encoded by Baf53b is one of several that make up a molecular complex called nBAF.

Mutations in the proteins of the nBAF complex have been linked to several intellectual disorders, including Coffin-Siris syndrome, Nicolaides-Baraitser syndrome and sporadic autism. One of the key questions the researchers addressed is how mutations in components of the nBAF complex lead to cognitive impairments.

In their study, Wood and his colleagues used mice bred with mutations in Baf53b. While this genetic modification did not affect the mice’s ability to learn, it did notably inhibit long-term memories from forming and severely impaired synaptic function.

“These findings present a whole new way to look at how long-term memories form,” said Wood, associate professor of neurobiology & behavior. “They also provide a mechanism by which mutations in the proteins of the nBAF complex may underlie the development of intellectual disability disorders characterized by significant cognitive impairments.”

How does this mechanism regulate the gene expression required for long-term memory formation? Most genes are tightly packaged in a chromatin structure; chromatin is what compacts DNA so that it fits inside the nucleus of a cell. That compaction represses gene expression. Baf53b and the rest of the nBAF complex physically open the chromatin structure so that specific genes required for long-term memory formation are turned on. The mutated forms of Baf53b did not allow this necessary gene expression.

“The results from this study reveal a powerful new mechanism that increases our understanding of how genes are regulated for memory formation,” Wood said. “Our next step is to identify the key genes the nBAF complex regulates. With that information, we can begin to understand what can go wrong in intellectual disability disorders, which paves a path toward possible therapeutics.”

Findings appear online today in Nature Neuroscience.

Filed under brain memory formation LTM genes mutations cognitive impairment neuroscience psychology science

73 notes

Parkinson’s drug helps older people to make decisions

A drug widely used to treat Parkinson’s Disease can help to reverse age-related impairments in decision making in some older people, a study from researchers at the Wellcome Trust Centre for Neuroimaging has shown.

The study, published today in the journal Nature Neuroscience, also describes changes in the patterns of brain activity of adults in their seventies that help to explain why they are worse at making decisions than younger people.

Poorer decision-making is a natural part of the ageing process that stems from a decline in our brains’ ability to learn from our experiences. Part of the decision-making process involves learning to predict the likelihood of getting a reward from the choices that we make.

An area of the brain called the nucleus accumbens is responsible for interpreting the difference between the reward we expect from a decision and the reward actually received. These so-called ‘prediction errors’, signalled by the brain chemical dopamine, help us to learn from our actions and modify our behaviour to make better choices the next time.
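The update rule behind such prediction errors can be sketched in a few lines (a Rescorla-Wagner style update; the learning rate and reward sequence below are invented for illustration):

```python
# Minimal sketch of reward prediction-error learning.
# alpha (learning rate) and the reward sequence are made up.
alpha = 0.3
value = 0.0                               # current reward expectation
for reward in [1, 1, 0, 1]:
    prediction_error = reward - value     # dopamine-like teaching signal
    value += alpha * prediction_error     # expectation shifts toward the outcome
    print(f"reward={reward}  PE={prediction_error:+.3f}  new value={value:.3f}")
```

The expectation drifts toward the average payoff, and the prediction error shrinks as outcomes become well predicted; a dampened dopamine signal would slow exactly this kind of learning.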

Dr Rumana Chowdhury, who led the study at the Wellcome Trust Centre for Neuroimaging at UCL, said: “We know that dopamine decline is part of the normal aging process so we wanted to see whether it had any effect on reward-based decision making. We found that when we treated older people who were particularly bad at making decisions with a drug that increases dopamine in the brain, their ability to learn from rewards improved to a level comparable to somebody in their twenties and enabled them to make better decisions.”

The team used a combination of behavioural testing and brain imaging techniques to investigate the decision-making process in 32 healthy volunteers in their early seventies compared with 22 volunteers in their mid-twenties. Older participants were tested on and off L-DOPA, a drug that increases levels of dopamine in the brain. L-DOPA, more commonly known as levodopa, is widely used in the clinic to treat Parkinson’s disease.

The participants were asked to complete a behavioural learning task called the two-armed bandit, which mimics the decisions that gamblers make while playing slot machines. Players were shown two images and had to choose the one they thought would give them the bigger reward. Their performance before and after drug treatment was assessed by the amount of money they won in the task.
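A toy simulation of such a two-armed bandit task might look like this; the payoff probabilities, learning rate and choice rule are all assumptions, not parameters from the study:

```python
# Toy two-armed bandit: the learner tracks a value per image and
# mostly picks the image with the higher learned value.
import random

random.seed(0)
reward_prob = [0.8, 0.2]   # hidden payoff probability of each image (invented)
values = [0.0, 0.0]        # learned reward expectations
alpha = 0.2                # learning rate (invented)
winnings = 0

for trial in range(200):
    # epsilon-greedy: explore 10% of the time, otherwise exploit
    if random.random() < 0.1:
        choice = random.randrange(2)
    else:
        choice = values.index(max(values))
    reward = 1 if random.random() < reward_prob[choice] else 0
    values[choice] += alpha * (reward - values[choice])   # prediction-error update
    winnings += reward

print(f"learned values: {values[0]:.2f}, {values[1]:.2f}; total winnings: {winnings}")
```

A learner that updates well from prediction errors ends up preferring the richer image and wins more, which is essentially how task performance indexed reward learning here.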

"The older volunteers who were less able to predict the likelihood of a reward from their decisions, and so performed worst in the task, showed a significant improvement following drug treatment," Dr Chowdhury explains.

The team then used functional magnetic resonance imaging (fMRI) to look at brain activity in the participants as they played the game, and measured connections between areas of the brain involved in reward prediction using a technique called diffusion tensor imaging (DTI).

The findings reveal that the older adults who performed best in the gambling game before drug treatment had greater integrity of their dopamine pathways. Older adults who performed poorly before drug treatment were not able to adequately signal reward expectation in the brain – this was corrected by L-DOPA and their performance improved on the drug.

Dr John Williams, Head of Neuroscience and Mental Health at the Wellcome Trust, said: “This careful investigation into the subtle cognitive changes that take place as we age offers important insights into what may happen at both a functional and anatomical level in older people who have problems with making decisions. That the team were able to reverse these changes by manipulating dopamine levels offers the hope of therapeutic approaches that could allow older people to function more effectively in the wider community.”

(Source: eurekalert.org)

Filed under brain brain activity parkinson's disease nucleus accumbens aging neuroimaging neuroscience science

93 notes

Alterations in brain activity in children at risk of schizophrenia predate onset of symptoms
Research from the University of North Carolina has shown that children at risk of developing schizophrenia have brains that function differently from those of children not at risk.

Brain scans of children who have parents or siblings with the illness reveal neural circuitry that is hyperactivated, or stressed, by tasks that peers with no family history of the illness seem to handle with ease.

Because these differences in brain functioning appear before neuropsychiatric symptoms such as trouble focusing, paranoid beliefs, or hallucinations, the scientists believe that the finding could point to early warning signs or “vulnerability markers” for schizophrenia.

“The downside is saying that anyone with a first degree relative with schizophrenia is doomed. Instead, we want to use our findings to identify those individuals with differences in brain function that indicate they are particularly vulnerable, so we can intervene to minimize that risk,” said senior study author Aysenil Belger, PhD, associate professor of psychiatry at the UNC School of Medicine.

The UNC study, published online on March 6, 2013, in the journal Psychiatry Research: Neuroimaging, is one of the first to look for alterations in brain activity associated with mental illness in individuals as young as nine years of age.

Individuals who have a first-degree family member with schizophrenia have an 8-fold to 12-fold increased risk of developing the disease. However, there is no way of knowing for certain who will develop it until symptoms arise and a diagnosis is reached. Some of the earliest signs of schizophrenia are a decline in verbal memory, IQ, and other mental functions, which researchers believe stem from an inefficiency in cortical processing, the brain’s waning ability to tackle complex tasks.

In this study, Belger and her colleagues sought to identify what, if any, functional changes occur in the brains of adolescents at high risk of developing schizophrenia. The team performed functional magnetic resonance imaging (fMRI) on 42 children and adolescents aged 9 to 18, half of whom had relatives with schizophrenia and half of whom did not. Study participants each spent an hour and a half playing a game in which they had to identify a specific image, a simple circle, in a lineup of emotionally evocative images, such as cute or scary animals. At the same time, the MRI machine scanned for changes in brain activity associated with each target-detection task.

Belger found that the circuitry involved in emotion and higher-order decision making was hyperactivated in individuals with a family history of schizophrenia, suggesting that the task was stressing these areas of the brain in the study subjects.

“This finding shows that these regions are not activating normally,” she says. “We think that this hyperactivation eventually damages these specific areas in the brain to the point that they become hypoactivated in patients, meaning that when the brain is asked to go into high gear it no longer can.”

Belger is currently exploring what kind of role stress plays in the changing mental capacity of adolescents at high risk of developing schizophrenia. Though only a fraction of these individuals will be diagnosed with schizophrenia, Belger thinks it is important to pinpoint the most vulnerable people early to explore interventions that may stave off the mental illness.

“It may be as simple as understanding that people are different in how they cope with stress,” says Belger. “Teaching strategies to handle stress could make these individuals less vulnerable to not just schizophrenia but also other neuropsychiatric disorders.”

Alterations in brain activity in children at risk of schizophrenia predate onset of symptoms

Research from the University of North Carolina has shown that children at risk of developing schizophrenia have brains that function differently than those not at risk.

Brain scans of children who have parents or siblings with the illness reveal a neural circuitry that is hyperactivated or stressed by tasks that peers with no family history of the illness seem to handle with ease.

Because these differences in brain functioning appear before neuropsychiatric symptoms such as trouble focusing, paranoid beliefs, or hallucinations, the scientists believe that the finding could point to early warning signs or “vulnerability markers” for schizophrenia.

“The downside is saying that anyone with a first degree relative with schizophrenia is doomed. Instead, we want to use our findings to identify those individuals with differences in brain function that indicate they are particularly vulnerable, so we can intervene to minimize that risk,” said senior study author Aysenil Belger, PhD, associate professor of psychiatry at the UNC School of Medicine.

The UNC study, published online on March 6, 2013, in the journal Psychiatry Research: Neuroimaging, is one of the first to look for alterations in brain activity associated with mental illness in individuals as young as nine years of age.

Individuals who have a first-degree family member with schizophrenia have an 8- to 12-fold increased risk of developing the disease. However, there is no way of knowing for certain who will develop schizophrenia until symptoms arise and a diagnosis is reached. Some of the earliest signs of schizophrenia are a decline in verbal memory, IQ, and other mental functions, which researchers believe stem from inefficiency in cortical processing – the brain’s waning ability to tackle complex tasks.

In this study, Belger and her colleagues sought to identify what, if any, functional changes occur in the brains of adolescents at high risk of developing schizophrenia. She performed functional magnetic resonance imaging (fMRI) on 42 children and adolescents ages 9 to 18, half of whom had relatives with schizophrenia and half of whom did not. Study participants each spent an hour and a half playing a game in which they had to identify a specific image – a simple circle – out of a lineup of emotionally evocative images, such as cute or scary animals. At the same time, the MRI machine scanned for changes in brain activity associated with each target detection task.
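The study’s exact stimulus timing and image set aren’t given here, but the basic structure of such a target-detection paradigm – rare circle targets embedded among emotionally evocative distractors – can be sketched in a few lines (all parameters below are hypothetical, not taken from the study):

```python
import random

def build_trial_sequence(n_trials=100, target_prob=0.2, seed=42):
    """Build a pseudo-random trial list for a target-detection task:
    rare 'circle' targets embedded among emotionally evocative distractors."""
    rng = random.Random(seed)
    distractors = ["cute_animal", "scary_animal", "neutral_object"]
    trials = []
    for _ in range(n_trials):
        if rng.random() < target_prob:
            trials.append({"stimulus": "circle", "is_target": True})
        else:
            trials.append({"stimulus": rng.choice(distractors), "is_target": False})
    return trials

trials = build_trial_sequence()
n_targets = sum(t["is_target"] for t in trials)
```

In an event-related fMRI analysis, the onset times of the target trials would then serve as regressors for estimating target-detection-related activity.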

Belger found that the circuitry involved in emotion and higher-order decision making was hyperactivated in individuals with a family history of schizophrenia, suggesting that the task was stressing these areas of the brain in the study subjects.

“This finding shows that these regions are not activating normally,” she says. “We think that this hyperactivation eventually damages these specific areas in the brain to the point that they become hypoactivated in patients, meaning that when the brain is asked to go into high gear it no longer can.”

Belger is currently exploring what kind of role stress plays in the changing mental capacity of adolescents at high risk of developing schizophrenia. Though only a fraction of these individuals will be diagnosed with schizophrenia, Belger thinks it is important to pinpoint the most vulnerable people early to explore interventions that may stave off the mental illness.

“It may be as simple as understanding that people are different in how they cope with stress,” says Belger. “Teaching strategies to handle stress could make these individuals less vulnerable to not just schizophrenia but also other neuropsychiatric disorders.”

Filed under schizophrenia neuroimaging genetics fMRI brain neuroscience science

109 notes

Human brain research made easier by database
Researchers will be able to access samples from more than 7,000 donated human brains to help study major brain diseases, thanks to a new online database launched by the Medical Research Council (MRC) today.
The UK Brain Banks Network database speeds up access to donated brain samples held across 10 brain banks in the UK and allows researchers studying multiple sclerosis, Alzheimer’s, Parkinson’s and a range of other neurodegenerative and developmental diseases to track down human tissue samples for their work.
Thanks to a unique collaboration between the MRC and five leading charities, the database will help scientists from academia and industry investigate the underlying causes of major brain diseases and understand how they take hold in our bodies.
Although scientists can model diseases in the lab, to fully understand dementia and other brain-related disorders they need to study human brain tissue. A lot of research relies on donated brain tissue stored in brain banks across the UK. Until today, researchers had to apply to each brain bank in turn to find out if they held the samples they needed and to find the ‘control’ samples (donated brains free from disease) for comparison – a long, drawn-out process. Now samples can be found with the click of a button from one source.
Professor James Ironside, Director of the MRC UK Brain Banks Network, said:

“The database is the result of four years of painstaking planning and data analysis by very dedicated people. It will enable quick and easy access for researchers who are already working on neurological or psychiatric disease (perhaps in animal models or cells) and would like to translate their findings into human tissue and is very useful for those who are planning a grant application. The brain banks have already been given ethical approval, cutting out the need for researchers to go through a separate ethics application.

We must remember that vital research would not be possible without the generosity of those individuals who donate their brains to medical research. We’re working hard to make sure that the access for researchers studying brain samples is much easier. The next step is to improve the systems for those wishing to donate their brain to medical research.”

Five leading charities helped to supply data for the database: the MS Society, Parkinson’s UK, Alzheimer’s Society, Alzheimer’s Research UK and Autistica.
For more information about the database visit: http://www.mrc.ac.uk/brainbanksnetwork


Filed under brain brain diseases brain tissue brain donation psychiatric diseases neuroscience science

61 notes

Researchers Link Gulf War Illness to Physical Changes in Brain Fibers that Process Pain

Researchers at Georgetown University Medical Center (GUMC) have found what they say is evidence that veterans who suffer from “Gulf War Illness” have physical changes in their brains not seen in unaffected individuals. Brain scans of 31 veterans with the illness, compared to 20 control subjects, revealed anomalies in the bundles of nerve fibers that connect brain areas involved in the processing and perception of pain and fatigue.

The discovery, published online March 20 in PLOS ONE, could provide insight into the mysterious medical symptoms reported by more than one-fourth of the 697,000 veterans deployed to the 1990-1991 Persian Gulf War, the researchers say. These symptoms, termed Gulf War Illness, range from mild to debilitating and can include widespread pain, fatigue, and headache, as well as cognitive and gastrointestinal dysfunctions.

Although these veterans were exposed to nerve agents, pesticides and herbicides, among other toxic chemicals, no one has definitively linked any single exposure or underlying mechanism to Gulf War Illness, according to the scientists.

This is the first study to show that veterans, compared to unaffected subjects, have significant axonal damage. Bundles of axons, which form the brain’s white matter, are akin to telephone wires that carry nerve impulses between different parts of the gray matter in the brain. The researchers found that damage to the right inferior fronto-occipital fasciculus was significantly correlated with the severity of pain, fatigue, and tenderness.

“This tract of axons links cortical gray matter regions involved in fatigue, pain, emotional and reward processing.  This bundle also supports activity in the ventral attention network, which searches for unexpected signals in the surrounding environment that may be inappropriately interpreted as causing pain or being dangerous. Altered function in this tract may explain the increased vigilance and distractibility observed in veterans.” says lead author Rakib Rayhan, MS, a researcher in the lab of the study’s senior investigator, James Baraniuk, MD, a professor of medicine at GUMC.

In this Department of Defense-funded study, the research team used a form of magnetic resonance imaging (MRI) called diffusion tensor imaging. This imaging method examines patterns of water diffusion in the brain to look for changes in the integrity of white matter that are not visible on regular MRI scans. “This provides a completely new perspective on Gulf War Illness,” says Baraniuk. “While we can’t exactly tell how this tract is affected at the molecular level — the scans tell us these axons are not working in a normal fashion.”
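Diffusion tensor imaging typically quantifies white-matter integrity with measures such as fractional anisotropy (FA), computed from the three eigenvalues of the diffusion tensor at each voxel. The study’s own analysis pipeline isn’t described here, but the standard FA formula is a small calculation:

```python
import math

def fractional_anisotropy(l1, l2, l3):
    """Fractional anisotropy (FA) of a diffusion tensor from its three
    eigenvalues: 0 for perfectly isotropic diffusion, approaching 1 when
    diffusion is constrained along a single axis (as in intact axon bundles)."""
    mean = (l1 + l2 + l3) / 3.0
    num = (l1 - mean) ** 2 + (l2 - mean) ** 2 + (l3 - mean) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    if den == 0:
        return 0.0
    return math.sqrt(1.5 * num / den)

fractional_anisotropy(1.0, 1.0, 1.0)  # isotropic diffusion: 0.0
fractional_anisotropy(1.7, 0.2, 0.2)  # strongly directional, axon-like: ~0.87
```

Reduced FA in a tract such as the right inferior fronto-occipital fasciculus is the kind of change this method can detect even when conventional MRI looks normal.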

Although preliminary, “the changes appear distinct from multiple sclerosis, major depression, Alzheimer’s disease and other neurodegenerative diseases,” says Rayhan. “These novel findings are really exciting because they provide validation for many veterans who have long said that no one believes them.”

The results must be replicated, say the authors, but for the first time a potential biomarker for Gulf War Illness may be on the horizon, as well as a possible target for therapy aimed at regenerating these neurons.

“Pain and fatigue are perceptions, just like other sensory input, and Gulf War Illness could be due to extensive damage to the structures that facilitate them,” says Rayhan. “Some of the veterans we studied feel pain when doing something as simple as putting on a shirt. Now we have something to tell them about why their lives have been so greatly affected.”

(Source: explore.georgetown.edu)

Filed under gulf war illness brain nerve fibers white matter veterans neuroscience science

78 notes

How two brain areas interact to trigger divergent emotional behaviors
New research from the University of North Carolina School of Medicine for the first time explains exactly how two brain regions interact to promote emotionally motivated behaviors associated with anxiety and reward.
The findings could lead to new mental health therapies for disorders such as addiction, anxiety, and depression. A report of the research was published online by the journal, Nature, on March 20, 2013.
Located deep in the brain’s temporal lobe, the almond-shaped amygdala contains tightly packed clusters of brain cells that are important for processing memory and emotion. When animals or people are in stressful situations, neurons in an extended portion of the amygdala called the bed nucleus of the stria terminalis, or BNST, become hyperactive.
But, almost paradoxically, neurons in the BNST, which modulate fear and anxiety, reach into a portion of the midbrain that’s involved in behavioral responses to reward, the ventral tegmental area, or VTA.
“For many years it’s been known that dopamine neurons in the VTA are involved in reward processing and motivation. For example, they’re activated during exposure to drugs of abuse and naturally rewarding experiences,” says study senior author Garret Stuber, PhD, assistant professor in the departments of Psychiatry and Cell Biology and Physiology, and the UNC Neuroscience Center.  “On the one hand, you have this area of the brain – the BNST – that’s associated with aversion and anxiety, but it’s in direct communication with a brain reward center. We wanted to figure out exactly how these two brain regions interact to promote different types of behavioral responses related to anxiety and reward.”
In the past, researchers have tried to get a glimpse into the inner workings of the brain using electrical stimulation or drugs, but those techniques couldn’t quickly and specifically change only one type of cell or one type of connection. But optogenetics, a technique that emerged about seven years ago, can.
In the technique, scientists transfer light-sensitive proteins called “opsins” – derived from algae or bacteria that need light to grow – into the mammalian brain cells they wish to study. Then they shine laser beams onto the genetically manipulated brain cells, either exciting or blocking their activity with millisecond precision.
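The millisecond precision described above comes from delivering brief, precisely timed light pulses to the opsin-expressing cells. As a minimal illustration (hypothetical parameters, not the study’s actual protocol), here is how the onset and offset times of a typical pulse train might be generated:

```python
def pulse_train(frequency_hz=20.0, pulse_width_ms=5.0, duration_s=1.0):
    """Onset/offset times (ms) for a laser pulse train — e.g. 20 Hz,
    5 ms pulses for 1 s — of the kind used to drive opsin-expressing cells."""
    period_ms = 1000.0 / frequency_hz
    assert pulse_width_ms < period_ms, "pulse must fit within one period"
    pulses = []
    t = 0.0
    while t + pulse_width_ms <= duration_s * 1000.0:
        pulses.append((t, t + pulse_width_ms))
        t += period_ms
    return pulses

train = pulse_train()
len(train)  # 20 pulses in one second at 20 Hz
```

In practice such a schedule would be handed to a hardware pulse generator; varying the frequency lets experimenters mimic the natural firing patterns of the targeted neurons.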


Filed under brain brain cells ventral tegmental area temporal lobe amygdala behavioral responses neuroscience science

64 notes

Brain Mapping Reveals Neurological Basis of Decision-Making in Rats
Scientists at UC San Francisco have discovered how memory recall is linked to decision-making in rats, showing that measurable activity in one part of the brain occurs when rats in a maze are playing out memories that help them decide which way to turn. The more they play out these memories, the more likely they are to find their way correctly to the end of the maze.
In their study, reported this week in the journal Neuron, the UCSF researchers implanted electrodes into a region of the rat brain known as the hippocampus, which is already known to play a key role in the formation and recall of memory. This same region is active when animals are learning, and it is damaged in people who have Alzheimer’s and post-traumatic stress disorder.
The study showed that when the rats paused before an upcoming choice, sometimes the hippocampus was more active and sometimes it was less active. When it was more active it did a better job of recalling memories of places the animal could go next, and the animal was more likely to go to the right place.
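The summary above doesn’t spell out the paper’s analysis methods, but a standard way to read out which places a rat’s hippocampus is “playing out” is Bayesian decoding of position from place-cell spike counts. A toy sketch, with made-up tuning curves and cell names:

```python
import math

def decode_position(spike_counts, tuning, dt=0.25):
    """Toy Bayesian decoder: given spike counts from place cells in a time
    window dt (seconds) and each cell's expected firing rate (Hz) at each
    position, return the position with the highest Poisson log-likelihood
    (flat prior over positions)."""
    n_positions = len(next(iter(tuning.values())))
    best_pos, best_ll = None, -math.inf
    for pos in range(n_positions):
        ll = 0.0
        for cell, count in spike_counts.items():
            rate = max(tuning[cell][pos] * dt, 1e-9)  # expected spike count here
            # Poisson log-likelihood, dropping the factorial term
            # (constant across positions, so it doesn't affect the argmax)
            ll += count * math.log(rate) - rate
        if ll > best_ll:
            best_pos, best_ll = pos, ll
    return best_pos

# Two cells with place fields at opposite ends of a 3-position track
tuning = {"cellA": [20.0, 5.0, 0.5], "cellB": [0.5, 5.0, 20.0]}
decode_position({"cellA": 5, "cellB": 0}, tuning)  # cellA firing -> position 0
```

When the hippocampus is more active during a pause, decoders like this recover cleaner sequences of upcoming locations, which is the kind of signal linked to better choices in the maze.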
“We know that considering possibilities is important for decision-making, but we haven’t really known how this happens in the brain,” said neuroscientist Loren Frank, PhD, who led the research. Frank is an associate professor of physiology and a member of the UCSF Center for Integrative Neuroscience at UCSF.
The work builds upon several years of investigations in Frank’s laboratory that have shown how activity in the hippocampus is a fundamental constituent of memory retrieval. Their recent work shows that this activity is not just about remembering the past – it is also important for thinking about the future. When the brain does a better job of thinking about future possibilities, it makes better decisions.
Next, the team wants to tease out why sometimes the hippocampus does not do a good job of playing out future options. Problems with memory and decision-making are central to age-related cognitive decline, and a deeper understanding of how this works could pave the way for interventions that make the brain work better.


Filed under brain memory cognitive decline hippocampus decision-making neuroscience science
