Neuroscience

Articles and news from the latest research reports.

113 notes

Sugar Linked to Memory Problems in Adolescent Rats

Studying rats as model subjects, scientists found that adolescents were at an increased risk of suffering negative health effects from sugar-sweetened beverage consumption.

Adolescent rats that freely consumed large quantities of liquid solutions containing sugar or high-fructose corn syrup (HFCS) in concentrations comparable to popular sugar-sweetened beverages experienced memory problems and brain inflammation, and became pre-diabetic, according to a new study from USC. Neither adult rats fed the sugary drinks nor adolescent rats that did not consume sugar had the same issues.

“The brain is especially vulnerable to dietary influences during critical periods of development, like adolescence,” said Scott Kanoski, corresponding author of the study and an assistant professor at the USC Dornsife College of Letters, Arts and Sciences.

Kanoski collaborated with USC’s Ted Hsu, Vaibhav Konanur, Lilly Taing, Ryan Usui, Brandon Kayser, and Michael Goran. Their study, which tested a total of 76 rats, was published online by the journal Hippocampus on Sept. 23.

About 35 to 40 percent of the rats’ caloric intake was from sugar or HFCS. For comparison, added sugars make up about 17 percent of the total caloric intake of teens in the U.S. on average, according to the CDC.

The rats were then tested in mazes that probe their spatial memory ability. Adolescent rats that had consumed the sugary beverages, particularly HFCS, performed worse on the test than any other group – which may be the result of the neuroinflammation detected in the hippocampus, Kanoski said.

The hippocampus is a part of the temporal lobe located deep within the brain that controls memory formation. People with Alzheimer’s Disease and other dementias often suffer damage to the hippocampus.

“Consuming a diet high in added sugars not only can lead to weight gain and metabolic disturbances, but can also negatively impact our neural functioning and cognitive ability,” Kanoski said. Next, Kanoski and his team plan to see how different monosaccharides (simple sugars) and HFCS affect the brain.

(Source: pressroom.usc.edu)

Filed under hippocampus memory sugar cognitive function adolescence neuroscience science

144 notes

How female fruit flies know when to say ‘yes’

A fundamental question in neurobiology is how animals, including humans, make decisions. A new study publishing in the open access journal PLOS Biology on October 7 reveals how fruit fly females make a very important decision: to either accept or reject male courtship. This decision appears to be generated by a very small number of excitatory neurons that use acetylcholine as their neurotransmitter located in three brain regions. This study provides the framework to understand how decisions are generated and suggests that a decision is reached because that option is literally the most exciting.


Filed under fruit flies decision making courtship neurons acetylcholine neuroscience science

124 notes

Working memory hinders learning in schizophrenia

A new study pinpoints working memory as a source of learning difficulties in people with schizophrenia.

Working memory is known to be affected in the millions of people — about 1 percent of the population — who have schizophrenia, but it has been unclear whether that has a specific role in making learning more difficult, said Anne Collins, a postdoctoral researcher at Brown University and lead author of the study.

“We really tend to think of learning as a unitary, single process, but really it is not,” said Collins, who in 2012 along with co-author Michael Frank, associate professor of cognitive, linguistic, and psychological sciences, developed an experimental task and a computational model of cognition that can distinguish the contributions of working memory and reinforcement in the learning process. “We thought we could try to disentangle that here and see if the impairment was in both aspects, or only one of them.”

In the new study, published in the Journal of Neuroscience, cognitive scientists Collins and Frank applied these methods in collaboration with schizophrenia experts James Waltz and James Gold of the University of Maryland to measure the separate contributions of working memory and reinforcement to learning. They found that only working memory was a source of impairment.

Learning about learning’s components

To find that out, they marshaled 49 volunteers with schizophrenia and an otherwise comparable set of 36 people without the condition to participate in the specially designed learning task. In each round, participants were shown a set of images and then were asked to push one of three buttons when they saw each image. With each button push they were told whether they had hit the correct button for that image. Over time, through trial and error, participants could learn which picture called for which button. With perfect memory, one wouldn’t need to see an image more than three times to learn the right button to push when it appeared.

The task explicitly involves employing the brain’s systems for working memory (keeping each image–button association in mind) and for reinforcement learning (wanting to repeat an action that led to the feedback of “correct” and to avoid one that produced “incorrect”). But in different rounds while the degree of reinforcement remained the same, the experimenters varied the number of images in the sets the volunteers saw, from two to six. What varied, therefore, was the degree to which working memory was taxed.

What the researchers found was that for both people with schizophrenia and for controls, the larger the image set size, the more trials it took to learn to press the correct button consistently for each image and the longer it took to react to each stimulus. People with schizophrenia generally performed worse on the task than healthy controls.

Those results show that as the task involved more images, it became harder to do – a matter of working memory, since the capacity to maintain information explicitly in memory is limited – but that alone did not prove that working memory was a source of learning problems for people with schizophrenia. They could also have been doing worse because they made slower use of the reinforcement feedback.

To determine that, the researchers used their computational models of how learning occurs in the brain to fit the experimental data. They asked what parameters in the models needed to vary to accurately predict the behavior they measured in people with and without schizophrenia.

That analysis revealed that varying parameters of working memory, such as capacity, but not parameters of reinforcement learning, accounted best for differences in behavior between the groups.

“With model-fitting techniques, I can look quantitatively, trial by trial, and see that the model predicts subjects’ choices,” Collins said. “The same model explains both the healthy group and the patient group, but with differences in parameters.”

That confirmed that working memory uniquely affected learning in people with schizophrenia, while reinforcement learning mechanisms did not, Collins said.

The study suggests that working memory could be a more important target than reinforcement learning among researchers and clinicians hoping to help improve learning for people with schizophrenia, Collins said.

Among mentally healthy people as well, the study illustrates that the different components of learning can be understood individually, even as they all interact in the brain to make learning happen.

“More broadly, it brings attention to the fact that we need to consider learning as a multiactor kind of behavior that can’t be just summarized by a single system,” Collins said. “It’s important to design tasks that can separate them out so we can extract different sources of variance and correctly match them to different neural systems.”

(Image: Shutterstock)

Filed under schizophrenia working memory learning reinforcement learning neuroscience science

359 notes

Anorexia/bulimia: A bacterial protein implicated

Eating disorders (ED) such as anorexia nervosa, bulimia, and binge eating disorder affect approximately 5-10% of the general population, but the biological mechanisms involved are unknown. Researchers at Inserm Unit 1073, “Nutrition, inflammation and dysfunction of the gut-brain axis” (Inserm/University of Rouen) have demonstrated the involvement of a protein produced by some intestinal bacteria that may be the source of these disorders. Antibodies produced by the body against this protein also react with the main satiety hormone, which is similar in structure. According to the researchers, it may ultimately be possible to correct this mechanism that causes variations in food intake.

These results are published in the journal Translational Psychiatry, in the online issue of 7 October 2014.

Anorexia nervosa, bulimia and binge eating disorder are all eating disorders (ED). If the less well defined and atypical forms are included, ED affect 15-20% of the population, particularly adolescents and young adults. Despite various psychiatric, genetic and neurobiological studies, the molecular mechanism responsible for these disorders remains mysterious. The common characteristic of the different forms of ED is dysregulation of food intake, which is decreased or increased, depending on the situation.

Sergueï Fetissov’s team in Inserm Joint Research Unit 1073, “Nutrition, inflammation and dysfunction of the gut-brain axis” (Inserm/University of Rouen), led by Pierre Déchelotte, studies the relationships between the gut and the brain that might explain this dysregulation.

The mimic of the satiety hormone

In this new study, the researchers identified a protein that happens to mimic the satiety hormone (melanotropin). This protein, ClpB, is produced by certain bacteria, such as Escherichia coli, that are naturally present in the intestinal flora. When the protein is present, the body produces antibodies against it. Because of the hormone’s structural homology to ClpB, these antibodies also bind to the satiety hormone and thereby modify its satietogenic effect: the sensation of satiety is either reached (anorexia) or not reached (bulimia or overeating). Moreover, the bacterial protein itself seems to have anorexigenic properties.

Variations in food intake in the presence of the bacterial protein

To obtain these results, the researchers modified the composition of the intestinal flora of mice to study their immunological and behavioural response. In the first group of mice, which received mutant E. coli that do not produce ClpB, neither food intake nor the level of antibodies against melanotropin changed. In contrast, both antibody levels and food intake did vary in the second group of animals, which received ClpB-producing E. coli.

The likely involvement of this bacterial protein in disordered eating behaviour in humans was established by analysing data from 60 patients.

The standardised “Eating Disorders Inventory-2” scale was used to diagnose these patients and evaluate the severity of their disorders, based on a questionnaire regarding their behaviour and emotions (wish to lose weight, bulimia, maturity fears, etc.). Plasma levels of antibodies to ClpB and melanotropin were higher in these patients. Furthermore, their immunological response was associated with whether the eating disorder developed in the direction of anorexia or of bulimia.

These data thus confirm the involvement of the bacterial protein in the regulation of appetite, and open up new perspectives for the diagnosis and specific treatment of eating disorders.

Correcting the action of the protein mimicking the satiety hormone

“We are presently working to develop a blood test based on detection of the bacterial protein ClpB. If we are successful in this, we will be able to establish specific and individualised treatments for eating disorders,” say Pierre Déchelotte and Sergueï Fetissov, authors of this study.

At the same time, the researchers are using mice to study how to correct the action of the bacterial protein in order to prevent the dysregulation of food intake that it generates. “According to our initial observations, it would indeed be possible to neutralise this bacterial protein using specific antibodies, without affecting the satiety hormone,” they conclude.

Filed under eating disorders ClpB melanocortin anorexia bulimia neuroscience science

179 notes

Gaming vs. reading: Do they benefit teenagers with cognition or school performance?

Children are increasingly drawn to electronic media in their play. With video games, phones and the internet in abundance, this article in Educational Psychology examines whether such leisure activity affects children’s cognition or academic performance, or whether it would be more beneficial to read.

After a busy day, children do need downtime to rest and relax. Increasingly, kids’ leisure time is spent gaming, but does it detract from homework, or would kids be better off reading a book? Earlier research shows that in some cases interactive gaming can have positive effects on cognition by promoting memory, attention and reasoning. Other speed-oriented games have been shown to improve perception and motor skills, so should gaming for relaxation be encouraged? Lieury et al. investigated whether the type of leisure activity produces a ‘transfer effect’ that influences learning processes and thus improves student performance at school. With an emphasis on gaming and reading, they linked patterns of leisure activity with performance in phonology, reading and comprehension, maths, long-term memory and reasoning. Fascinatingly, gaming, previously thought to improve fluid intelligence, showed little or no positive correlation with performance, whilst reading did, particularly in memory and comprehension. It seems, then, despite the lack of a causal link, that reading may be more likely to enhance academic performance.

Should we assume that time spent gaming and away from homework is harmful to students? A further comparison of reading and gaming with the most frequent leisure activities showed no negative patterns; interestingly, resting, as well as reading, had a favourable effect on performance. So frequent leisure activity is not necessarily harmful to progress, or always at the expense of homework, and can even be enriching. The authors conclude: “we think that video games are mainly recreational activities and the cognitive stimulation provided is very different from school learning. On the contrary, the results of this survey fully justify the educational role of parents and teachers in promoting reading.”

(Image: Shutterstock)

Filed under cognitive performance reading gaming video games psychology neuroscience science

162 notes

Why is educational achievement heritable?

New research, led by King’s College London finds that the high heritability of exam grades reflects many genetically influenced traits such as personality, behaviour problems, and self-efficacy and not just intelligence.

The study, published today in the Proceedings of the National Academy of Sciences (PNAS), looked at 13,306 twins at age 16 who were part of the Medical Research Council (MRC) funded UK Twins Early Development Study (TEDS). The twins were assessed on a range of cognitive and non-cognitive measures, and the researchers had access to their GCSE (General Certificate of Secondary Education) scores.

In total, 83 scales were condensed into nine domains: intelligence, self-efficacy (confidence in one’s own academic ability), personality, well-being, home environment, school environment, health, parent-reported behaviour problems and child reported behaviour problems.

Identical twins share 100% of their genes, and non-identical twins (like any other siblings) share, on average, 50% of the genes that vary between people. Twin pairs also share the same environment (family, schools, teachers, etc.). By comparing identical and non-identical twins, the researchers were able to estimate the relative contributions of genetic and environmental factors: if, overall, identical twins are more similar on a particular trait than non-identical twins, that extra similarity is attributed to their greater genetic resemblance rather than to the environment.
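The arithmetic behind this comparison is often summarized with Falconer's formulas, which turn the two twin correlations into rough variance components. Below is a minimal sketch; the twin correlations used are hypothetical, chosen only so that the heritability estimate matches the study's reported 62%.

```python
def falconer(r_mz, r_dz):
    """Rough ACE variance decomposition from twin correlations.

    r_mz: correlation between identical (monozygotic) twin pairs
    r_dz: correlation between non-identical (dizygotic) twin pairs
    Returns (a2, c2, e2): additive-genetic (heritability),
    shared-environment, and non-shared-environment components.
    """
    a2 = 2 * (r_mz - r_dz)   # genes: twice the MZ-DZ correlation gap
    c2 = 2 * r_dz - r_mz     # shared environment
    e2 = 1 - r_mz            # non-shared environment (plus measurement error)
    return a2, c2, e2

# Hypothetical twin correlations for an exam-score trait:
a2, c2, e2 = falconer(r_mz=0.75, r_dz=0.44)
```

With these illustrative correlations the heritability estimate a2 comes out at 0.62, and the three components sum to 1, matching the interpretation that genes, shared environment, and non-shared environment partition the trait variance.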

Eva Krapohl, joint first author of the study, from the MRC Social, Genetic and Developmental Psychiatry (SGDP) Centre at the Institute of Psychiatry, Psychology & Neuroscience (IoPPN) at King’s, says: “Previous work has already established that educational achievement is heritable. In this study, we wanted to find out why that is. What our study shows is that the heritability of educational achievement is much more than just intelligence – it is the combination of many traits which are all heritable to different extents.

“It is important to point out that heritability does not mean that anything is set in stone. It simply means that children differ in how easy and enjoyable they find learning and that much of these differences are influenced by genetics.”

The researchers found that the heritability of GCSE scores was 62%.  Individual traits were between 35% and 58% heritable, with intelligence being the most highly heritable. Together, the nine domains accounted for 75% of the heritability of GCSE scores.

Heritability is a population statistic which does not provide any information at an individual level. It describes the extent to which differences between children can be ascribed to DNA differences, on average, in a particular population at a particular time. 

(Source: kcl.ac.uk)

Filed under heritability educational achievement intelligence genetics neuroscience science

131 notes

MRI Technique Detects Evidence of Cognitive Decline Before Symptoms Appear

A magnetic resonance imaging (MRI) technique can detect signs of cognitive decline in the brain even before symptoms appear, according to a new study published online in the journal Radiology. The technique has the potential to serve as a biomarker in very early diagnosis of preclinical dementia.

The World Health Organization estimates that dementia affects more than 35 million people worldwide, a number expected to more than double by 2030. Problems in the brain related to dementia, such as reduced blood flow, might be present for years but are not evident because of cognitive reserve, a phenomenon where other parts of the brain compensate for deficits in one area. Early detection of cognitive decline is critical, because treatments for Alzheimer’s disease, the most common type of dementia, are most effective in this early phase.

Researchers recently studied arterial spin labeling (ASL), a promising MRI technique that doesn’t require injection of a contrast agent. ASL measures brain perfusion, or penetration of blood into the tissue.
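Turning the ASL difference signal into a perfusion number is usually done with a single-compartment kinetic model. The sketch below follows the commonly used single-delay formula for (pseudo-)continuous ASL; the parameter values (labeling efficiency, blood T1, partition coefficient) are typical textbook assumptions, not values taken from this study.

```python
import math

def asl_cbf(si_control, si_label, si_pd,
            pld=1.8, tau=1.8, t1_blood=1.65,
            alpha=0.85, lam=0.9):
    """Single-compartment perfusion estimate from an ASL difference signal.

    si_control, si_label: mean signal in control and label images
    si_pd: proton-density (M0) reference signal
    pld: post-labeling delay (s); tau: label duration (s)
    t1_blood: longitudinal relaxation time of arterial blood (s)
    alpha: labeling efficiency; lam: blood-tissue partition coefficient (ml/g)
    Returns cerebral blood flow in ml / 100 g / min.
    """
    dm = si_control - si_label  # perfusion-weighted difference signal
    num = 6000.0 * lam * dm * math.exp(pld / t1_blood)
    den = 2.0 * alpha * t1_blood * si_pd * (1.0 - math.exp(-tau / t1_blood))
    return num / den

# A difference signal around 1% of M0 yields a gray-matter-like flow value:
cbf = asl_cbf(si_control=1010.0, si_label=1000.0, si_pd=1000.0)
```

The key point for this study is that the difference signal, and hence the estimated flow, scales with tissue perfusion, so a regional drop in the posterior cingulate cortex shows up directly in maps computed this way.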
"ASL MRI is simple to perform, doesn’t require special equipment and only adds a few minutes to the exam," said study author Sven Haller, M.D., from the University of Geneva in Geneva, Switzerland.

The study group included 148 healthy elderly participants and 65 people with mild cognitive impairment (MCI). The participants underwent brain MRI and a neuropsychological assessment, a common battery of tests used to determine cognitive ability.

Of the 148 healthy individuals, 75 remained stable, while 73 deteriorated cognitively at 18-month clinical follow-up. Those who deteriorated had shown reduced perfusion at their baseline ASL MRI exams, particularly in the posterior cingulate cortex, an area in the middle of the brain that is associated with the default mode network, the neural network that is active when the brain is not concentrating on a specific task. Declines in this network are seen in MCI patients and are more pronounced in those with Alzheimer’s disease.

The pattern of reduced perfusion in the brains of healthy individuals who went on to develop cognitive deficits was similar to that of patients with MCI.

"There is a known close link between neural activity and brain perfusion in the posterior cingulate cortex," Dr. Haller said. "Less perfusion indicates decreased neural activity."

The results suggest that individuals with decreased perfusion detected with ASL MRI may temporarily maintain their cognitive status through the mobilization of their cognitive reserve, but will eventually develop subtle cognitive deficits.

Previous research done with positron emission tomography (PET), the current gold standard for brain metabolism imaging, found that patients with Alzheimer’s disease had reduced metabolism in the same area of the brain where the perfusion abnormalities were found using ASL MRI. This points to a close link between brain metabolism and perfusion, according to Dr. Haller.

ASL MRI has potential as a standalone test or as an adjunct to PET for dementia screening, Dr. Haller said. While PET can identify markers of Alzheimer’s disease in the brain and cerebrospinal fluid, it exposes the patient to radiation. ASL does not expose the patient to radiation and is easy to perform in routine clinical settings.

"ASL might replace the classic yet unspecific fluorodeoxyglucose (FDG) PET that measures brain metabolism. Instead, PET could be done with the new and specific amyloid PET tracers," Dr. Haller said.

The results also support a role for ASL MRI as an alternative to neuropsychological testing.

The researchers plan to perform follow-up studies on the patient group to learn more about ASL and long-term cognitive changes.

MRI Technique Detects Evidence of Cognitive Decline Before Symptoms Appear

A magnetic resonance imaging (MRI) technique can detect signs of cognitive decline in the brain even before symptoms appear, according to a new study published online in the journal Radiology. The technique has the potential to serve as a biomarker in very early diagnosis of preclinical dementia.

The World Health Organization estimates that dementia affects more than 35 million people worldwide, a number expected to more than double by 2030. Problems in the brain related to dementia, such as reduced blood flow, might be present for years but are not evident because of cognitive reserve, a phenomenon where other parts of the brain compensate for deficits in one area. Early detection of cognitive decline is critical, because treatments for Alzheimer’s disease, the most common type of dementia, are most effective in this early phase.

Researchers recently studied arterial spin labeling (ASL), a promising MRI technique that doesn’t require injection of a contrast agent. ASL measures brain perfusion, or penetration of blood into the tissue.

"ASL MRI is simple to perform, doesn’t require special equipment and only adds a few minutes to the exam," said study author Sven Haller, M.D., from the University of Geneva in Geneva, Switzerland.

The study group included 148 healthy elderly participants and 65 people with mild cognitive impairment (MCI). The participants underwent brain MRI and a neuropsychological assessment, a common battery of tests used to determine cognitive ability.

Of the 148 healthy individuals, 75 remained cognitively stable, while 73 had deteriorated cognitively at the 18-month clinical follow-up. Those who deteriorated had shown reduced perfusion at their baseline ASL MRI exams, particularly in the posterior cingulate cortex, an area in the middle of the brain that is associated with the default mode network, the neural network that is active when the brain is not concentrating on a specific task. Declines in this network are seen in MCI patients and are more pronounced in those with Alzheimer’s disease.
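
The cohort split is close to even; a quick back-of-the-envelope check (plain Python, with the counts taken directly from the article) makes the proportion explicit:

```python
# Cohort figures from the article: 148 healthy elderly participants,
# of whom 75 remained cognitively stable and 73 deteriorated by the
# 18-month follow-up.
healthy_total = 148
stable = 75
declined = 73

assert stable + declined == healthy_total  # the two groups cover the cohort

decline_rate = declined / healthy_total
print(f"Share that declined: {decline_rate:.1%}")  # Share that declined: 49.3%
```

Nearly half of the healthy baseline group declined, which is what gives the baseline perfusion differences their predictive interest.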

The pattern of reduced perfusion in the brains of healthy individuals who went on to develop cognitive deficits was similar to that of patients with MCI.

"There is a known close link between neural activity and brain perfusion in the posterior cingulate cortex," Dr. Haller said. "Less perfusion indicates decreased neural activity."

The results suggest that individuals with decreased perfusion detected with ASL MRI may temporarily maintain their cognitive status through the mobilization of their cognitive reserve, but will eventually develop subtle cognitive deficits.

Previous research done with positron emission tomography (PET), the current gold standard for brain metabolism imaging, found that patients with Alzheimer’s disease had reduced metabolism in the same area of the brain where the perfusion abnormalities were found using ASL MRI. This points to a close link between brain metabolism and perfusion, according to Dr. Haller.

ASL MRI has potential as a standalone test or as an adjunct to PET for dementia screening, Dr. Haller said. While PET can identify markers of Alzheimer’s disease in the brain and cerebrospinal fluid, it exposes the patient to radiation. ASL does not expose the patient to radiation and is easy to perform in routine clinical settings.

"ASL might replace the classic yet unspecific fluorodeoxyglucose PET that measures brain metabolism. Instead, PET could be done with the new and specific amyloid PET tracers," Dr. Haller said.

The results also support a role for ASL MRI as an alternative to neuropsychological testing.

The researchers plan to perform follow-up studies on the patient group to learn more about ASL and long-term cognitive changes.

Filed under dementia cognitive decline arterial spin labeling MRI neuroimaging neuroscience science

1,921 notes

Results of the World’s Largest Medical Study of the Human Mind and Consciousness at the Time of Death published

The results of a four-year international study of 2060 cardiac arrest cases across 15 hospitals have been published and are now available on ScienceDirect. The study concludes:

  • The themes relating to the experience of death appear far broader than what has been understood so far, or what has been described as so called near-death experiences.
  • In some cases of cardiac arrest, memories of visual awareness compatible with so called out-of-body experiences may correspond with actual events.
  • A higher proportion of people may have vivid death experiences, but do not recall them due to the effects of brain injury or sedative drugs on memory circuits.
  • Widely used yet scientifically imprecise terms such as near-death and out-of-body experiences may not be sufficient to describe the actual experience of death. Future studies should focus on cardiac arrest, which is biologically synonymous with death, rather than ill-defined medical states sometimes referred to as ‘near-death’.
  • The recalled experience surrounding death merits a genuine investigation without prejudice.

Recollections in relation to death, so-called out-of-body experiences (OBEs) or near-death experiences (NDEs), are frequently discussed phenomena that have often been considered hallucinatory or illusory in nature; however, objective studies of these experiences are limited.

In 2008, a large-scale study involving 2060 patients from 15 hospitals in the United Kingdom, United States and Austria was launched. The AWARE (AWAreness during REsuscitation) study, sponsored by the University of Southampton in the UK, examined the broad range of mental experiences in relation to death. Researchers also tested the validity of conscious experiences using objective markers for the first time in a large study to determine whether claims of awareness compatible with out-of-body experiences correspond with real or hallucinatory events.

Results of the study have been published in the journal Resuscitation and are now available online.

Dr Sam Parnia, Assistant Professor of Critical Care Medicine and Director of Resuscitation Research at The State University of New York at Stony Brook, USA, and the study’s lead author, explained: “Contrary to perception, death is not a specific moment but a potentially reversible process that occurs after any severe illness or accident causes the heart, lungs and brain to cease functioning. If attempts are made to reverse this process, it is referred to as ‘cardiac arrest’; however, if these attempts do not succeed it is called ‘death’. In this study we wanted to go beyond the emotionally charged yet poorly defined term of NDEs to explore objectively what happens when we die.”

Thirty-nine per cent of patients who survived cardiac arrest and were able to undergo structured interviews described a perception of awareness, but interestingly did not have any explicit recall of events.

“This suggests more people may have mental activity initially but then lose their memories after recovery, either due to the effects of brain injury or sedative drugs on memory recall,” explained Dr Parnia, who was an Honorary Research Fellow at the University of Southampton when he started the AWARE study.

Among those who reported a perception of awareness and completed further interviews, 46 per cent experienced a broad range of mental recollections in relation to death that were not compatible with the commonly used term NDE. These included fearful and persecutory experiences. Only 9 per cent had experiences compatible with NDEs, and 2 per cent exhibited full awareness compatible with OBEs, with explicit recall of ‘seeing’ and ‘hearing’ events.
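
To keep the nested percentages straight, here is a small sketch (plain Python). The percentages are the article's; the interview cohort size of 100 is a hypothetical round number used only to turn those percentages into counts:

```python
# Reported shares among survivors who described a perception of awareness
# and completed further interviews. A hypothetical cohort of 100 is used
# purely to illustrate the article's percentages as head counts.
interviewed = 100
breakdown = {
    "broad recollections, not NDE-compatible": 0.46,
    "experiences compatible with NDEs": 0.09,
    "full awareness compatible with OBEs": 0.02,
}

for label, share in breakdown.items():
    print(f"{label}: about {round(share * interviewed)} of {interviewed}")
```

Note that the categories do not sum to 100 per cent: the remainder of interviewees fell outside these three groupings.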

One case was validated and timed using auditory stimuli during cardiac arrest. Dr Parnia concluded: “This is significant, since it has often been assumed that experiences in relation to death are likely hallucinations or illusions, occurring either before the heart stops or after the heart has been successfully restarted, but not an experience corresponding with ‘real’ events when the heart isn’t beating. In this case, consciousness and awareness appeared to occur during a three-minute period when there was no heartbeat. This is paradoxical, since the brain typically ceases functioning within 20-30 seconds of the heart stopping and doesn’t resume again until the heart has been restarted. Furthermore, the detailed recollections of visual awareness in this case were consistent with verified events.

“Thus, while it was not possible to absolutely prove the reality or meaning of patients’ experiences and claims of awareness (due to the very low incidence, 2 per cent, of explicit recall of visual awareness or so-called OBEs), it was impossible to disclaim them either, and more work is needed in this area. Clearly, the recalled experience surrounding death now merits further genuine investigation without prejudice.”

Further studies are also needed to explore whether awareness (explicit or implicit) may lead to long term adverse psychological outcomes including post-traumatic stress disorder.

Dr Jerry Nolan, Editor-in-Chief of Resuscitation, stated: “The AWARE study researchers are to be congratulated on the completion of a fascinating study that will open the door to more extensive research into what happens when we die.”

Filed under consciousness near-death experience out-of-body experience death neuroscience science

84 notes

(Image caption: Neurons (blue) which have absorbed exosomes (green) have increased levels of the enzyme catalase (red), which helps protect them against peroxides. Credit: Institute of Molecular Cell Biology)

Vesicles influence the function of nerve cells

Tiny vesicles that transmit protective substances to nerve cells apparently play an important role in the functioning of neurons. As cell biologists at Johannes Gutenberg University Mainz (JGU) have discovered, nerve cells can enlist the aid of mini-vesicles from neighboring glial cells to defend themselves against stress and other potentially detrimental factors. These vesicles, called exosomes, appear to stimulate the neurons on various levels: they influence electrical stimulus conduction, biochemical signal transfer, and gene regulation. Exosomes are thus multifunctional signal emitters that can have a significant effect in the brain.

The researchers in Mainz already observed in a previous study that oligodendrocytes release exosomes on exposure to neuronal stimuli. These exosomes are absorbed by the neurons and improve neuronal stress tolerance. Oligodendrocytes are a type of glial cell and they form an insulating myelin sheath around the axons of neurons. The exosomes transport protective proteins such as heat shock proteins, glycolytic enzymes, and enzymes that reduce oxidative stress from one cell type to another, but also transmit genetic information in the form of ribonucleic acids.

"As we have now discovered in cell cultures, exosomes seem to have a whole range of functions," explained Dr. Eva-Maria Krämer-Albers. By means of their transmission activity, these small vesicles not only promote electrical activity in the nerve cells but also influence them on the biochemical and gene-regulatory level. "The extent of activities of the exosomes is impressive," added Krämer-Albers. The researchers hope that an understanding of these processes will contribute to the development of new strategies for the treatment of neuronal diseases. Their next aim is to uncover how the vesicles actually function in the brains of living organisms.

Filed under nerve cells exosomes oligodendrocytes glial cells signal transduction neuroscience science

152 notes

How Rabies “Hijacks” Neurons to Attack the Brain

Rabies causes acute inflammation of the brain, producing psychosis and violent aggression. The virus, which paralyzes the body’s internal organs, is always deadly for those unable to obtain vaccines in time. Some 55,000 people die from rabies every year.

For the first time, Tel Aviv University scientists have discovered the exact mechanism this killer virus uses to efficiently enter the central nervous system, where it erupts in a toxic explosion of symptoms. The study, published in PLOS Pathogens, was conducted by Dr. Eran Perlson and Shani Gluska of TAU’s Sackler Faculty of Medicine and Sagol School of Neuroscience, in collaboration with the Friedrich Loeffler Institute in Germany.

"Rabies not only hijacks the nervous system’s machinery, it also manipulates that machinery to move faster," said Dr. Perlson. "We have shown that rabies enters a neuron in the peripheral nervous system by binding to a nerve growth factor receptor, responsible for the health of neurons, called p75. The difference is that its transport is very fast, even faster than that of its endogenous ligands, the small molecules that travel regularly along the neuron and keep the neuron healthy."

Faster than a speeding train

To track the rabies virus in the nervous system, the researchers grew mouse sensory neurons in an observation chamber and used live cell imaging to follow the path taken by the virus particles. The researchers “saw” the virus hijack the “train” transporting cell components along a neuron and drive it straight into the spinal cord. Once in the spinal cord, the virus caught the first available train to the brain, where it wrought havoc before speeding through the rest of the body, shutting it down organ by organ.

Nerve cells, or neurons, outside the central nervous system are highly asymmetric. A long protrusion called an axon extends from the cell body to another nerve cell or organ along a specific transmission route. In addition to rapid transmission of electric impulses, axons also transport molecular materials over these distances.

"Axonal transport is a delicate and crucial process for neuronal survival, and when disrupted it can lead to neurodegenerative diseases," said Dr. Perlson. "Understanding how an organism such as rabies manipulates this machinery may help us in the future to either restore the process or even to manipulate it to our own therapeutic needs."

Hijacking the hijacker

"A tempting premise is to use this same machinery to introduce drugs or genes into the nervous system," Dr. Perlson added. By shedding light on how the virus hijacks the transport system in nerve cells to reach its target organ with maximal speed and efficiency, the researchers hope their findings will allow scientists to control the neuronal transport machinery to treat rabies and other neurodegenerative diseases.

Disruptions of the neuron train system also contribute to neurodegenerative diseases, like Alzheimer’s disease, Parkinson’s disease, and amyotrophic lateral sclerosis (ALS). According to Dr. Perlson, “An improved understanding of how the neuron train works could lead to new treatments for these disorders as well.”

Filed under rabies nervous system p75 neurons axonal transport RABV neuroscience science
