Neuroscience

Articles and news from the latest research reports.

Pig brain models provide insights into human cognitive development
A mutual curiosity about patterns of growth and development in pig brains has brought two University of Illinois research groups together. Animal scientists Rod Johnson and Ryan Dilger have developed a model of the pig brain that they plan to use to answer important questions about human brain development.
“It is important to characterize the normal brain growth trajectory from the neonatal period to sexual maturity,” said Johnson.
“Until we know how the brain grows, we don’t know what is going to change,” added Dilger.
In cooperation with the Beckman Institute, they performed MRI scans on the brains of 16 piglets, starting at the age of 2 weeks, then at 4 weeks, and then at 4-week intervals up to 24 weeks.
“We have world-class people at the Beckman Institute who are pushing and developing the next generation of neuroimaging technology, so we’re able to connect with them and take advantage of their expertise,” said Johnson.
Matt Conrad, a student in Johnson’s lab, used three-dimensional visualization software on over 200 images to manually segment each brain region on three planes. The software assembled the information into a three-dimensional image of the pig brain, which was then used to determine the volumes of the different structures.
When the piglets were at Beckman for their imaging sessions, Dilger performed other tests, including diffusion tensor imaging (DTI), which shows how neural tracts develop, allowing exploration of the brain’s complexity and of how neural connections form. It was also possible to measure neurochemicals in the brain, including creatine and acetylcholine, providing a unique insight into brain metabolism.
The end result of this work is what they call the deformable pig brain atlas.
“We are taking 16 pigs and averaging them, so it’s more representative of all pigs,” said Dilger. “You can then apply it to any individual pig to see how it’s different.”
“It’s called a deformable brain atlas because the software takes information from an individual and deforms it until it fits the template, and then you know how much it had to be deformed to fit,” Johnson explained. “So from that, you can tell whether a brain region is larger or smaller compared to the average.”
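The comparison Johnson describes can be illustrated with a toy calculation: once an individual brain has been deformed onto the template, each region’s volume can be compared against the 16-pig average to see whether it is larger or smaller. The region names and volumes below are hypothetical examples for illustration, not data from the study.

```python
# Toy illustration of atlas-based volume comparison.
# Region names and volumes (mm^3) are hypothetical, not study data.

template = {              # average volumes across the 16-pig template
    "hippocampus": 1200.0,
    "cerebellum": 9800.0,
    "cortex": 52000.0,
}

individual = {            # volumes segmented from one pig's MRI
    "hippocampus": 1350.0,
    "cerebellum": 9300.0,
    "cortex": 52500.0,
}

for region, avg in template.items():
    # Percent difference from the template average for this region
    pct = 100.0 * (individual[region] - avg) / avg
    label = "larger" if pct > 0 else "smaller"
    print(f"{region}: {pct:+.1f}% ({label} than average)")
```

In the real atlas the "how much it had to be deformed" measure comes from the registration itself, but the per-region readout reduces to this kind of signed percentage difference.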
Johnson and Dilger said that the goal is to develop a tool for pigs that is equivalent to what is available for the mouse brain and make it publicly available. But they don’t want to stop with tool development.
“We want to use this to address important questions,” Johnson said.
One research direction being pursued in Johnson’s lab is to induce viral pneumonia in piglets at the point in the post-natal period when the brain is undergoing massive growth to see how it alters brain growth and development. They are also looking at effects of prenatal infections in the mother to see if that alters the trajectory of normal brain growth in the offspring. The risk for behavioral disorders and reduced stress resilience is increased by pre- and post-natal infection, but the developmental origins are poorly understood.
Dilger’s group is interested in the effects of early-life nutrition on the brain. They are looking at the effects of specific fatty acids as primary structural components of the human brain and cerebral cortex, and at choline, a nutrient that is important for DNA production and normal functioning of neurons.
“Choline deficiency has been tied to cognitive deficits in the mouse and human, and we’re developing a pig model to study the direct effects choline deficiency has on brain structure and function,” Dilger said. “Many women of child-bearing age may not be receiving enough choline in their diets, and recent evidence suggests this may ultimately affect learning and memory ability in their children. Luckily, choline can be found in common foods, especially eggs and meat products, including bacon.”

Filed under piglets diffusion tensor imaging brain atlas brain development MRI neuroscience science

Normal prion protein regulates iron metabolism

An iron imbalance caused by prion proteins collecting in the brain is a likely cause of cell death in Creutzfeldt-Jakob disease (CJD), researchers at Case Western Reserve University School of Medicine have found.

The breakthrough follows discoveries that certain proteins found in the brains of Alzheimer’s and Parkinson’s patients also regulate iron. The results suggest that neurotoxicity caused by a form of iron called redox-active iron may be a common feature of all three neurodegenerative diseases, the researchers say.

Further, the role of the normal prion protein, known as PrPC, in iron metabolism may provide a target for strategies to maintain iron balance and reduce iron-induced neurotoxicity in patients suffering from CJD, a rare degenerative disease for which no cure yet exists.

The researchers report that lack of PrPC hampers iron uptake and storage; these and further findings appear in the online edition of the Journal of Alzheimer’s Disease.

"There are many skeptics who think iron is a bystander or end-product of neuronal death and has no role to play in neurodegenerative conditions," said Neena Singh, a professor of pathology and neurology at Case Western Reserve and the paper’s senior author. "We’re not saying that iron imbalance is the only cause, but failure to maintain stable levels of iron in the brain appears to contribute significantly to neuronal death."

Prions are misfolded forms of PrPC that are infectious and act as the disease-causing agents of CJD. PrPC is the normal form, present in all tissues including the brain. The scientists show that PrPC acts as a ferrireductase; that is, it helps convert oxidized iron to a form that can be taken up and used by cells.

In their investigation, mice lacking PrPC were iron-deficient. Supplementing their diets with excess inorganic iron restored normal iron levels in the body; when the supplements stopped, the mice returned to being iron-deficient.

Examination of iron metabolism pathways showed that the lack of PrPC impaired iron uptake and storage, and alternate mechanisms of iron uptake failed to compensate for the deficiency.

Cells have a tight regulatory system for iron uptake, storage and release. PrPC is an essential element in this process, and its aggregation in CJD possibly results in an environment of iron imbalance that is damaging to neuronal cells, Singh explained.

It is likely that as CJD progresses and PrPC forms insoluble aggregates, loss of ferrireductase function combined with sequestration of iron in prion aggregates leads to iron insufficiency in diseased brains, creating a potentially toxic environment, as reported earlier by this group and featured in Nature’s Journal Club.

Recently, members of the Singh research team also helped to identify a highly accurate test to confirm the presence of CJD in living patients. They found that iron imbalance in the brain is reflected as a specific change in the levels of iron-management proteins other than PrPC in the cerebrospinal fluid. The fluid can be tapped to diagnose the disease with 88.9 percent accuracy, the researchers reported online in the journal Antioxidants & Redox Signaling last month.

Singh’s team is now investigating how the prion protein functions to convert oxidized iron to a usable form. They are also evaluating the role of the prion protein in brain iron metabolism, and whether the iron imbalance observed in cases of CJD, Alzheimer’s disease and Parkinson’s disease is reflected in the cerebrospinal fluid. A specific change in the fluid could provide a disease-specific diagnostic test for these disorders.

(Source: eurekalert.org)

Filed under Creutzfeldt-Jakob disease neurodegenerative diseases iron prion proteins brain medicine science

New MRI method fingerprints tissues and diseases
A new method of magnetic resonance imaging (MRI) could routinely spot specific cancers, multiple sclerosis, heart disease and other maladies early, when they’re most treatable, researchers at Case Western Reserve University and University Hospitals (UH) Case Medical Center suggest in the journal Nature.
Each body tissue and disease has a unique fingerprint that can be used to quickly diagnose problems, the scientists say.
By using new MRI technologies to scan for different physical properties simultaneously, the team differentiated white matter from gray matter from cerebrospinal fluid in the brain in about 12 seconds, with the promise of doing this much faster in the near future.
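The "fingerprint" matching at the heart of the method can be sketched as a dictionary search: each candidate tissue has a simulated signal evolution, and the measured signal is assigned to whichever entry it matches best under a normalized inner product. The signal curves below are made-up stand-ins, not actual MR fingerprinting dictionary entries.

```python
import numpy as np

# Hypothetical signal evolutions for three tissue classes (invented decay
# curves standing in for a simulated fingerprinting dictionary).
t = np.linspace(0, 1, 50)
dictionary = {
    "white matter":        np.exp(-3.0 * t),
    "gray matter":         np.exp(-1.5 * t),
    "cerebrospinal fluid": np.exp(-0.3 * t),
}

def match_fingerprint(measured, dictionary):
    """Return the tissue whose dictionary entry has the highest cosine
    similarity (normalized inner product) with the measured signal."""
    best, best_score = None, -np.inf
    m = measured / np.linalg.norm(measured)
    for tissue, entry in dictionary.items():
        score = np.dot(m, entry / np.linalg.norm(entry))
        if score > best_score:
            best, best_score = tissue, score
    return best

# A slightly noisy measurement that should match the gray-matter entry.
rng = np.random.default_rng(0)
measured = np.exp(-1.5 * t) + 0.01 * rng.standard_normal(t.size)
print(match_fingerprint(measured, dictionary))
```

The real method compares far richer signal evolutions acquired under deliberately varied scan parameters, but the per-voxel classification step reduces to this kind of best-match lookup.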
The technology has the potential to make an MRI scan standard procedure in annual check-ups, the authors believe. A full-body scan lasting just minutes would provide far more information and require no radiologist to interpret the data, making diagnostics cheap compared with today’s scans, they contend.
"The overall goal is to specifically identify individual tissues and diseases, to hopefully see things and quantify things before they become a problem," said Mark Griswold, a radiology professor at Case Western Reserve School of Medicine and UH Case Medical Center. "But to try to get there, we’ve had to give up everything we knew about the MRI and start over."
Griswold has been working on this goal with Case Western Reserve’s Vikas Gulani, MD, an assistant professor of radiology, and Nicole Seiberlich, assistant professor of biomedical engineering, for a decade. During the last three years, they developed the technology and proved the concept with graduate student Dan Ma; Kecheng Liu, PhD, collaborations manager from Siemens Medical Solutions Inc.; Jeffrey L. Sunshine, MD, professor of radiology and a radiologist at UH Case Medical Center; and Jeffrey L. Duerk, dean of Case School of Engineering and professor of biomedical engineering.
(Image: Rex Features)

Filed under MRI white matter cerebrospinal fluid body tissue body scan medicine science

Neuron Loss in Schizophrenia and Depression Could Be Prevented With an Antioxidant

Gamma-aminobutyric acid (GABA) deficits have been implicated in schizophrenia and depression. In schizophrenia, deficits have been particularly well-described for a subtype of GABA neuron, the parvalbumin fast-spiking interneurons. The activity of these neurons is critical for proper cognitive and emotional functioning.

It now appears that parvalbumin neurons are particularly vulnerable to oxidative stress, a factor that may arise commonly during development, particularly in the context of psychiatric disorders such as schizophrenia or bipolar disorder, where compromised mitochondrial function plays a role. Parvalbumin neurons may be protected from this effect by N-acetylcysteine, also known as Mucomyst, a medication commonly prescribed to protect the liver against the toxic effects of acetaminophen (Tylenol) overdose, reports a new study in the current issue of Biological Psychiatry.

Dr. Kim Do and collaborators, from the Center for Psychiatric Neurosciences of Lausanne University in Switzerland, have worked many years on the hypothesis that one of the causes of schizophrenia is related to vulnerability genes/factors leading to oxidative stress. These oxidative stresses can be due to infections, inflammations, traumas or psychosocial stress occurring during typical brain development, meaning that at-risk subjects are particularly exposed during childhood and adolescence, but not once they reach adulthood.

Their study was performed with mice deficient in glutathione, a molecule essential for cellular protection against oxidation, leaving their neurons more exposed to the deleterious effects of oxidative stress. Under those conditions, they found that the parvalbumin neurons were impaired in the brains of mice that were stressed when young, and these impairments persisted throughout the animals’ lives. Interestingly, the same stresses applied to adult mice had no effect on their parvalbumin neurons.

Most strikingly, mice treated with the antioxidant N-acetylcysteine, from before birth and onwards, were fully protected against these negative consequences on parvalbumin neurons.

“These data highlight the need to develop novel therapeutic approaches based on antioxidant compounds such as N-acetylcysteine, which could be used preventively in young at-risk subjects,” said Do. “To give an antioxidant from childhood on to carriers of a genetic vulnerability for schizophrenia could reduce the risk of emergence of the disease.”

“This study raises the possibility that GABA neuronal deficits in psychiatric disorder may be preventable using a drug, N-acetylcysteine, which is quite safe to administer to humans,” added Dr. John Krystal, Editor of Biological Psychiatry.

(Source: elsevier.com)

Filed under brain brain development neurons schizophrenia depression GABA neuroscience science

Enhancing Cognition with Video Games: A Multiple Game Training Study
Background
Previous evidence points to a causal link between playing action video games and enhanced cognition and perception. However, the benefits of playing other types of video games are under-investigated. We examined whether playing non-action games also improves cognition. Hence, we compared transfer effects of an action game and several non-action game types that placed different cognitive demands on players.
Methodology/Principal Findings
We instructed 5 groups of non-gamer participants to play one game each on a mobile device (iPhone/iPod Touch) for one hour a day/five days a week over four weeks (20 hours). Games included action, spatial memory, match-3, hidden-object, and an agent-based life simulation. Participants performed four behavioral tasks before and after video game training to assess for transfer effects. Tasks included an attentional blink task, a spatial memory and visual search dual task, a visual filter memory task to assess for multiple object tracking and cognitive control, as well as a complex verbal span task. Action game playing eliminated attentional blink and improved cognitive control and multiple-object tracking. Match-3, spatial memory and hidden object games improved visual search performance while the latter two also improved spatial working memory. Complex verbal span improved after match-3 and action game training.
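The pre/post transfer design described above reduces, per task and per training group, to a paired comparison: each participant contributes a score before and after training, and the mean within-participant difference estimates the training effect. The scores below are invented numbers for illustration, not data from the study.

```python
# Toy paired pre/post comparison for a transfer-effect design.
# Scores are invented examples, not data from the study.
from statistics import mean, stdev

pre  = [62, 58, 71, 65, 60, 68, 63, 59]   # task accuracy before training (%)
post = [68, 61, 75, 70, 66, 71, 67, 64]   # same participants after training

# Within-participant differences are what a paired analysis operates on.
diffs = [b - a for a, b in zip(pre, post)]
effect = mean(diffs)
d = effect / stdev(diffs)   # standardized paired effect size (Cohen's d)

print(f"mean improvement: {effect:.2f} points, d = {d:.2f}")
```

Pairing each participant with themselves removes between-person variability from the comparison, which is why pre/post training designs use it rather than comparing group averages directly.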
Conclusion/Significance
Cognitive improvements were not limited to action game training alone and different games enhanced different aspects of cognition. We conclude that training specific cognitive abilities frequently in a video game improves performance in tasks that share common underlying demands. Overall, these results suggest that many video game-related cognitive improvements may not be due to training of general broad cognitive systems such as executive attentional control, but instead due to frequent utilization of specific cognitive processes during game play. Thus, many video game training related improvements to cognition may be attributed to near-transfer effects.

Filed under video games cognition perception memory peripheral vision psychology neuroscience science

Punishment can enhance performance
The stick can work just as well as the carrot in improving our performance, a team of academics at The University of Nottingham has found.
A study led by researchers from the University’s School of Psychology, published recently in the Journal of Neuroscience, has shown that punishment can act as a performance enhancer in a similar way to monetary reward.
Dr Marios Philiastides, who led the work, said: “This work reveals important new information about how the brain functions that could lead to new methods of diagnosing neural development disorders such as autism, ADHD and personality disorders, where decision-making processes have been shown to be compromised.” 
The Nottingham study examined how the efficiency with which we make decisions based on ambiguous sensory information, such as visual or auditory input, is affected by the potential for, and severity of, anticipated punishment.
Imposing penalties
To investigate this, they asked participants in the study to perform a simple perceptual task — asking them to judge whether a blurred shape behind a rainy window is a person or something else.
They punished incorrect decisions by imposing monetary penalties. At the same time, they measured the participants’ brain activity in response to different amounts of monetary punishment. Brain activity was recorded non-invasively using an EEG machine, which detects and amplifies brain signals at the surface of the scalp through a set of small electrodes embedded in a cap, similar to a swimming cap, fitted on the participant’s head.
They found that participants’ performance increased systematically as the amount of punishment increased, suggesting that punishment acts as a performance enhancer in a similar way to monetary reward.
At the neural level, the academics identified multiple, distinct brain activations induced by punishment, distributed throughout different areas of the brain. Crucially, the timing of these activations confirmed that punishment does not influence the way in which the brain processes the sensory evidence itself, but does affect the later, decision-making stage of the process at which that sensory information is interpreted.
Incentive-based motivation
Finally, they showed that those participants who showed the greatest improvements in performance also showed the biggest changes in brain activity. This is a key finding as it provides a potential route to study differences between individuals and their personality traits in order to characterise why some may respond better to reward and punishment than others.
A more thorough understanding of the influence of punishment on decision-making and how we make choices could lead to useful information on how to use incentive-based motivation to encourage certain behaviour.
The paper, Temporal Characteristics of the Influence of Punishment on Perceptual Decision Making in the Human Brain, is available online via the Journal of Neuroscience.

Filed under punishment neurodevelopmental disorders performance decision making brain activity EEG psychology neuroscience science

Breaking down the Parkinson’s pathway
The key hallmark of Parkinson’s disease is a slowdown of movement caused by a cutoff in the supply of dopamine to the brain region responsible for coordinating movement. While scientists have understood this general process for many years, the exact details of how this happens are still murky.
“We know the neurotransmitter, we know roughly the pathways in the brain that are being affected, but when you come right down to it and ask what exactly is the sequence of events that occurs in the brain, that gets a little tougher,” says Ann Graybiel, an MIT Institute Professor and member of MIT’s McGovern Institute for Brain Research.
A new study from Graybiel’s lab offers insight into some of the precise impairments caused by the loss of dopamine in brain cells affected by Parkinson’s disease. The findings, which appear in the March 12 online edition of the Journal of Neuroscience, could help researchers not only better understand the disease, but also develop more targeted treatments.
The neurons responsible for coordinating movement are located in a part of the brain called the striatum, which receives information from two major sources — the neocortex and a tiny region known as the substantia nigra. The cortex relays sensory information as well as plans for future action, while the substantia nigra sends dopamine that helps to coordinate all of the cortical input.
“This dopamine somehow modulates the circuit interactions in such a way that we don’t move too much, we don’t move too little, we don’t move too fast or too slow, and we don’t get overly repetitive in the movements that we make. We’re just right,” Graybiel says.
Parkinson’s disease develops when the neurons connecting the substantia nigra to the striatum die, cutting off a critical dopamine source; in a process that is not entirely understood, too little dopamine translates to difficulty initiating movement. Most Parkinson’s patients receive L-dopa, which can substitute for the lost dopamine. However, the effects usually wear off after five to 10 years, and complications appear.

Filed under brain parkinson's disease dopamine neurotransmitters substantia nigra interneurons neuroscience science

59 notes

Researchers find age-related changes in how autism affects the brain

Newly released findings from Bradley Hospital, published in the Journal of the American Academy of Child & Adolescent Psychiatry, show that autism spectrum disorders (ASD) affect the brain activity of children and adults differently.

In the study, titled “Developmental Meta-Analysis of the Functional Neural Correlates of Autism Spectrum Disorders,” Daniel Dickstein, M.D., FAAP, director of the Pediatric Mood, Imaging and Neurodevelopment Program at Bradley Hospital, found that autism-related changes in brain activity continue into adulthood.

"Our study was innovative because we used a new technique to directly compare the brain activity in children with autism versus adults with autism," said Dickstein. "We found that brain activity changes associated with autism do not just happen in childhood, and then stop. Instead, our study suggests that they continue to develop, as we found brain activity differences in children with autism compared to adults with autism. This is the first study to show that."

This technique, a meta-analysis (a study that pools the results of many pre-existing studies), gave the researchers a statistically powerful way to look for differences between children and adults with autism.
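
As a rough illustration of the idea behind pooling studies, here is a minimal inverse-variance (fixed-effect) meta-analysis sketch in Python. The effect sizes and variances below are made up for illustration, and this generic weighted-average approach is not the specific neuroimaging meta-analysis method the authors used.

```python
# Minimal sketch of inverse-variance (fixed-effect) meta-analysis:
# each study contributes an effect size weighted by the inverse of its
# variance, so larger, more precise studies count for more.
import math

def pooled_effect(studies):
    """studies: list of (effect_size, variance) tuples."""
    weights = [1.0 / var for _, var in studies]
    effect = sum(w * es for (es, _), w in zip(studies, weights)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))  # standard error of the pooled estimate
    return effect, se

# Hypothetical activation differences (children minus adults) from three studies
studies = [(0.45, 0.04), (0.30, 0.09), (0.55, 0.02)]
effect, se = pooled_effect(studies)
print(f"pooled effect = {effect:.3f}, 95% CI = "
      f"[{effect - 1.96 * se:.3f}, {effect + 1.96 * se:.3f}]")
```

The design choice to weight by inverse variance means a single large, precise study can dominate several small, noisy ones, which is exactly why a meta-analysis can detect group differences that individual studies miss.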

Dickstein conducted the research through Bradley Hospital’s PediMIND Program. Started in 2007, this program seeks to identify biological and behavioral markers—scans and tests—that will ultimately improve how children and adolescents are diagnosed and treated for psychiatric conditions. Using special computer games and brain scans, including magnetic resonance imaging (MRI), Dickstein hopes to one day make the diagnosis and treatment of autism and other disorders more specific and more effective.

Among autism’s most disabling symptoms is a disruption in social skills, so it is noteworthy that this study found significantly less brain activity in children with autism than in adults with autism during social tasks, such as looking at faces. This was true in brain regions including the right hippocampus and superior temporal gyrus—two brain regions associated with memory and other functions.

Dickstein noted, “Brain changes in the hippocampus in children with autism have been found in studies using other types of brain scan, suggesting that this might be an important target for brain-based treatments, including both therapy and medication that might improve how this brain area works.”

Rowland Barrett, Ph.D., chief psychologist at Bradley Hospital and chief-of-service for The Center for Autism and Developmental Disabilities, was also part of the team leading the study.

"Autism spectrum disorders, including autistic disorder, Asperger’s disorder, and pervasive developmental disorder not otherwise specified (PDD-NOS), are among the most common and impairing psychiatric conditions affecting children and adolescents today," said Barrett. "If we can identify the shift in the parts of the brain that autism affects as we age, then we can better target treatments for patients with ASD."

(Source: eurekalert.org)

Filed under ASD autism brain activity MRI hippocampus superior temporal gyrus neuroscience science

123 notes

Drug Treatment Corrects Autism Symptoms in Mouse Model

ucsdhealthsciences:

Autism results from abnormal cell communication. Testing a new theory, researchers at the University of California, San Diego School of Medicine have used a newly discovered function of an old drug to restore cell communications in a mouse model of autism, reversing symptoms of the devastating disorder.

The findings are published in the March 13, 2013 issue of the journal PLOS ONE.

“Our (cell danger) theory suggests that autism happens because cells get stuck in a defensive metabolic mode and fail to talk to each other normally, which can interfere with brain development and function,” said Robert Naviaux, MD, PhD, professor of medicine and co-director of the Mitochondrial and Metabolic Disease Center at UC San Diego. “We used a class of drugs that has been around for almost a century to treat other diseases to block the ‘danger’ signal in a mouse model, allowing cells to return to normal metabolism and restore cell communication.”

“Of course, correcting abnormalities in a mouse is a long way from a cure for humans,” said Naviaux, “but we are encouraged enough to test this approach in a small clinical trial of children with autism spectrum disorder in the coming year. This trial is still in the early stages of development. We think this approach – called antipurinergic therapy or APT – offers a fresh and exciting new path that could lead to development of a new class of drugs to treat autism.”

Autism spectrum disorders (ASDs) are complex disorders defined by abnormalities in the development of language, social and repetitive behaviors. Hundreds of different genetic and environmental factors are known to confer risk. In this study, nearly a dozen UC San Diego scientists from different disciplines collaborated to find a unifying mechanism that explains autism. Their work is the result of one of just three international “Trailblazer” awards given by the group Autism Speaks in 2011.

Describing a completely new theory for the origin and treatment of autism using APT, Naviaux and colleagues introduce the concept that a large majority of both genetic and environmental causes for autism act by producing a sustained cell danger response – the metabolic state underlying innate immunity and inflammation.

“When cells are exposed to classical forms of dangers, such as a virus, infection or toxic environmental substance, a defense mechanism is activated,” Naviaux explained.  “This results in changes to metabolism and gene expression, and reduces the communication between neighboring cells. Simply put, when cells stop talking to each other, children stop talking.”

Since mitochondria – the so-called “power plants” of the cell – play a central role in both infectious and non-infectious cellular stress, innate immunity and inflammation, Naviaux and colleagues searched for a signaling system in the body that was both linked to mitochondria and critical for innate immunity.  They found it in extracellular nucleotides like adenosine triphosphate (ATP) and other mitokines – signaling molecules made by distressed mitochondria. These mitokines have separate metabolic functions outside of the cell where they bind to and regulate receptors present on every cell of the body.  Fifteen types of purinergic receptors are known to be stimulated by these extracellular nucleotides, and the receptors are known to control a broad range of biological characteristics with relevance to autism.

The researchers tested suramin – a well-known inhibitor of purinergic signaling used medically for the treatment of African sleeping sickness since shortly after it was synthesized in 1916 – in mice. They found that this APT mediator corrected autism-like symptoms in the animal model, even if the treatment was started well after the onset of symptoms. The drug corrected 17 types of abnormalities, normalizing brain synapse structure, cell-to-cell signaling, social behavior, motor coordination and mitochondrial metabolism.

“The striking effectiveness shown in this study using APT to ‘reprogram’ the cell danger response and reduce inflammation showcases an opportunity to develop a completely new class of anti-inflammatory drugs to treat autism and several other disorders,” Naviaux said. 

208 notes

Scary Faces Terrify Woman with Unusual Condition
When the 67-year-old woman came to the hospital, she was deeply afraid of two things — the visions of odd-looking faces that appeared hovering before her, and that the hallucinations might mean she was losing her mind.
But this retired teacher wasn’t going crazy, and laboratory tests also ruled out two common culprits of hallucinations — infection and drug interactions.
"She was absolutely terrified by what she was seeing," said Dr. Bharat Kumar, an internal medicine resident at the University of Kentucky who treated the woman. In fact, the patient and her family were so concerned in the days before she came to the hospital, they asked a priest about performing an exorcism, Kumar said.
The woman drew a picture of what she saw. The faces had large teeth, eyes and ears, and a horizontally elongated shape, like a football.
That peculiar shape and the fact that the patient recognized that she was hallucinating (rather than believing the visions to be real) provided two important clues in making a diagnosis, Kumar said. He determined that the woman had a condition called Charles Bonnet syndrome.
Patients with the syndrome may see small people and animals, bright moving shapes or distorted faces. These hallucinations are purely visual; no sounds accompany them.
In the woman’s case, the condition developed because she had macular degeneration. Tissue within the retinas of her eyes was deteriorating, and her ability to see was declining.
Charles Bonnet syndrome results from the absence of such sensory input to the brain. “When it expects sensory input and receives nothing, it often creates its own input,” Kumar explained.
The brain isn’t a sophisticated computer that processes information objectively and efficiently, he said. “It’s more of a wibbly-wobbly, messy-guessy ball of goo.”
There is no treatment for the condition, but in many cases the hallucinations stop happening as the brain becomes used to vision loss. Patients who are very frightened might be given anti-psychotic medications, but these drugs have serious side effects and aren’t appropriate for everyone.
The woman was grateful to receive her diagnosis and learn that she was not losing her mind, Kumar said. When he followed up with her three months later, she was still having the hallucinations, but they were happening less often.
A 2010 study showed that 10 to 40 percent of elderly patients with visual impairments may have Charles Bonnet syndrome.
Kumar had never before seen a patient with the condition, although he noted that it may occur more commonly than it is diagnosed. “Patients are often hesitant to say that they see things because they are afraid that they will be called crazy,” he said.
The case report was published online Feb. 25 in the journal Age and Ageing.

Filed under visual impairment macular degeneration hallucinations Charles Bonnet syndrome neuroscience science