Neuroscience

Articles and news from the latest research reports.

Posts tagged neuroscience

89 notes

Binge Eating Curbed by Deep Brain Stimulation in Animal Model

Deep brain stimulation (DBS) in a precise region of the brain appears to reduce caloric intake and prompt weight loss in obese animal models, according to a new study led by researchers at the University of Pennsylvania. The study, reported in the Journal of Neuroscience, reinforces the involvement of dopamine deficits in increasing obesity-related behaviors such as binge eating, and demonstrates that DBS can reverse this response via activation of the dopamine type-2 receptor.

"Based on this research, DBS may provide therapeutic relief to binge eating, a behavior commonly seen in obese humans, and frequently unresponsive to other approaches," said senior author Tracy L. Bale, PhD, associate professor of neuroscience in Penn’s School of Veterinary Medicine’s Department of Animal Biology and in the Perelman School of Medicine’s Department of Psychiatry. DBS is currently used to reduce tremors in Parkinson’s disease and is under investigation as a therapy for major depression and obsessive-compulsive disorder.

Nearly 50 percent of obese people binge eat, uncontrollably consuming palatable, highly caloric food within a short period of time. In this study, researchers targeted the nucleus accumbens, a small structure in the brain's reward center known to be involved in addictive behaviors. Mice receiving the stimulation ate significantly less of the high-fat food than mice not receiving DBS. Following stimulation, mice did not compensate for the lost calories by eating more. However, on days when the device was turned off, binge eating resumed.

Researchers also tested the long-term effects of DBS on obese mice that had been given unlimited access to high-fat food. During four days of continuous stimulation, the obese mice consumed fewer calories and, importantly, their body weight dropped. These mice also showed improvement in their glucose sensitivity, suggestive of a reversal of type 2 diabetes.

“These results are our best evidence yet that targeting the nucleus accumbens with DBS may be able to modify specific feeding behaviors linked to body weight changes and obesity,” Bale added.

“Once replicated in human clinical trials, DBS could rapidly become a treatment for people with obesity due to the extensive groundwork already established in other disease areas,” said lead author Casey Halpern, MD, resident in the Department of Neurosurgery of the Perelman School of Medicine at the University of Pennsylvania.

(Source: uphs.upenn.edu)

Filed under deep brain stimulation caloric intake obesity animal model binge eating neuroscience science

46 notes

ALS trial shows novel therapy is safe

An investigational treatment for an inherited form of Lou Gehrig’s disease has passed an early phase clinical trial for safety, researchers at Washington University School of Medicine in St. Louis and Massachusetts General Hospital report.

The researchers have shown that the therapy produced no serious side effects in patients with the disease, also known as amyotrophic lateral sclerosis (ALS). The phase 1 trial’s results, available online in Lancet Neurology, also demonstrate that the drug was successfully introduced into the central nervous system.

The treatment uses a technique that shuts off the mutated gene that causes the disease. This approach had never been tested against a condition that damages nerve cells in the brain and spinal cord.

“These results let us move forward in the development of this treatment and also suggest that it’s time to think about applying this same approach to other mutated genes that cause central nervous system disorders,” says lead author Timothy Miller, MD, PhD, assistant professor of neurology at Washington University. “These could include some forms of Alzheimer’s disease, Parkinson’s disease, Huntington’s disease and other conditions.”

ALS destroys nerves that control muscles, gradually leading to paralysis and death. The sole FDA-approved medication for the disease, riluzole, has only a marginal effect.

Most cases of ALS are sporadic, but about 10 percent are linked to inherited mutations. Scientists have identified changes in 10 genes that can cause ALS and are still looking for others.

The study focused on a form of ALS caused by mutations in a gene called SOD1; this form accounts for 2 percent of all ALS cases. Researchers have found more than 100 mutations in the SOD1 gene that cause ALS.

“At the molecular level, these mutations affect the properties of the SOD1 protein in a variety of ways, but they all lead to ALS,” says Miller, who is director of the Christopher Wells Hobler Lab for ALS Research at the Hope Center for Neurological Disorders at Washington University.

Rather than try to understand how each mutation causes ALS, Miller and his colleagues focused on blocking production of the SOD1 protein using a technique called antisense therapy.

To make a protein, cells have to copy the protein-building instructions from the gene. Antisense therapy blocks the cell from using these copies, allowing researchers to selectively silence individual genes.

“Antisense therapy has been considered and tested for a variety of disorders over the past several decades,” Miller says. “For example, the FDA recently approved an antisense therapy called Kynamro for familial hypercholesterolemia, an inherited condition that increases cholesterol levels in the blood.”

Miller and colleagues at the University of California-San Diego devised an antisense drug for SOD1 and successfully tested it in an animal model of the disease.

Merit Cudkowicz, MD, chief of neurology at Massachusetts General Hospital, was co-PI of the phase I clinical safety trial described in the new paper. Clinicians at Barnes-Jewish Hospital, Massachusetts General Hospital, Johns Hopkins Hospital and the Methodist Neurological Institute in Houston gave antisense therapy or a placebo to 21 patients with SOD1-related ALS. Treatment consisted of spinal infusions that lasted 11 hours.

The scientists found no significant difference between side effects in the control and treatment groups. Headache and back pain, both of which are often associated with spinal infusion, were among the most common side effects.

Immediately after the injections, the researchers took spinal fluid samples. This let them confirm the antisense drug was circulating in the spinal fluid of patients who received the treatment.

To treat SOD1-related ALS in the upcoming phase II trial, researchers will need to increase the dosage of the antisense drug. As the dose rises, they will watch to ensure that the therapy does not cause harmful inflammation or other side effects as it lowers SOD1 protein levels.

“All the information that we have so far suggests lowering SOD1 will be safe,” Miller says. “In fact, completely disabling SOD1 in mice seems to have little to no effect. We think it will be OK in patients, but we won’t know for sure until we’ve conducted further trials.”

The therapy may one day be helpful in the more common, noninherited forms of ALS, some of which may be linked to problems with the SOD1 protein.

“Before we can consider using this same therapy for sporadic ALS, we need more evidence that SOD1 is a major contributor to these forms of the disorder,” Miller says. 

The trial was conducted with support from ISIS Pharmaceuticals, which co-owns a patent on the SOD1 antisense drug.

Filed under ALS Lou Gehrig's disease nervous system sod1 gene nerve cells therapy neuroscience science

82 notes

Epigenetic changes shed light on biological mechanism of autism

Scientists from King’s College London have identified patterns of epigenetic changes involved in autism spectrum disorder (ASD) by studying genetically identical twins who differ in autism traits.

The study, published in Molecular Psychiatry, is the largest of its kind and may shed light on the biological mechanism by which environmental influences regulate the activity of certain genes and in turn contribute to the development of ASD and related behaviour traits.

ASD affects approximately 1 in 100 people in the UK and involves a spectrum of disorders which manifest themselves differently in different people. People with ASD have varying levels of impairment across three common areas: deficits in social interactions and understanding, repetitive behaviour and interests, and impairments in language and communication development.

Evidence from twin studies shows there is a strong genetic component to ASD and previous studies suggest that genes that direct brain development may be involved in the disorder. In approximately 70% of cases, when one identical twin has ASD, so does the other. However, in 30% of cases, identical twins differ for ASD. Because identical twins share the same genetic code, this suggests non-genetic, or epigenetic, factors may be involved.

Epigenetic changes affect the expression or activity of genes without changing the underlying DNA sequence – they are believed to be one mechanism by which the environment can interact with the genome. Importantly, epigenetic changes are potentially reversible and may therefore provide targets for the development of new therapies.

The researchers studied an epigenetic mechanism called DNA methylation. DNA methylation acts to block the genetic sequences that drive gene expression, silencing gene activity. They examined DNA methylation at over 27,000 sites across the genome using samples taken from 50 identical twin pairs (100 individuals) from the UK Medical Research Council (MRC) funded Twins Early Development Study (TEDS): 34 pairs who differed for ASD or autism related behaviour traits, 5 pairs where both twins have ASD, and 11 healthy twin pairs.

Dr Chloe Wong, first author of the study from King’s College London’s Institute of Psychiatry, says: “We’ve identified distinctive patterns of DNA methylation associated with both autism diagnosis and related behaviour traits, and increasing severity of symptoms. Our findings give us an insight into the biological mechanism mediating the interaction between gene and environment in autism spectrum disorder.”

DNA methylation at some genetic sites was consistently altered for all individuals with ASD, and differences at other sites were specific to certain symptom groups. The number of DNA methylation sites across the genome was also linked to the severity of autism symptoms suggesting a quantitative relationship between the two. Additionally, some of the differences in DNA methylation markers were located in genetic regions that previous research has associated with early brain development and ASD.

Professor Jonathan Mill, lead author of the paper from King’s College London’s Institute of Psychiatry and the University of Exeter, says: “Research into the intersection between genetic and environmental influences is crucial because risky environmental conditions can sometimes be avoided or changed. Epigenetic changes are potentially reversible, so our next step is to embark on larger studies to see whether we can identify key epigenetic changes common to the majority of people with autism to help us develop possible therapeutic interventions.”

Dr Alycia Halladay, Senior Director of Environmental and Clinical Sciences from Autism Speaks who funded the research, says: “This is the first large-scale study to take a whole genome approach to studying epigenetic influences in twins who are genetically identical but have different symptoms. These findings open the door to future discoveries in the role of epigenetics – in addition to genetics – in the development of autism symptoms.”

(Source: kcl.ac.uk)

Filed under autism ASD monozygotic twins genes epigenetics neuroscience science

125 notes

Brain biology tied to social reorientation during entry to adolescence

A specific region of the brain is in play when children consider their identity and social status as they transition into adolescence — that often-turbulent time of reaching puberty and entering middle school, says a University of Oregon psychologist.

In a study of 27 neurologically typical children who underwent functional magnetic resonance imaging (fMRI) at ages 10 and 13, activity in the brain’s ventromedial prefrontal cortex increased dramatically when the subjects responded to questions about how they view themselves.

The findings, published in the April 24 issue of the Journal of Neuroscience, confirm previous findings that specific brain networks support self-evaluations in the growing brain, but, more importantly, provide evidence that basic biology may well drive some of these changes, says Jennifer H. Pfeifer, professor of psychology and director of the psychology department’s Developmental Social Neuroscience Lab.

"This is a longitudinal fMRI study, which is still relatively uncommon," Pfeifer said. "It suggests a link between neural responses during self-evaluative processing in the social domain, and pubertal development. This provides a rare piece of empirical evidence in humans, rather than animal models, that supports the common theory that adolescents are biologically driven to go through a social reorientation."

Participants were scanned for about seven minutes at each visit. They responded to a series of attributes tied to social or academic domains — social ones such as “I am popular” or “I wish I had more friends” and academic ones such as “I like to read just for fun” or “Writing is so boring.” Social and academic evaluations were made about both the self and a familiar fictional character, Harry Potter.

In previous research, Pfeifer had found that a more dorsal region of the medial prefrontal cortex was more responsive in 10-year-old children during self-evaluations, when they were compared to adults. The new study, she said, provides a more detailed picture of how the brain supports self-development by looking at change within individuals.

The fMRI analyses found it was primarily the social self-evaluations that triggered significant increases over time in blood-oxygen levels, which fMRI detects, in the ventral medial prefrontal cortex. Additionally, these increases were strongest in children who experienced the most pubertal development over the three-year study period, for both girls and boys. Increases during academic self-evaluations were at best marginal. Whole-brain analyses found no other areas of the brain had significant increases or decreases in activity related to pubertal development.

"Neural changes in the social domain were more robust," Pfeifer said. "Increased responses in this one region of the brain from age 10 to 13 were very evident in social self-evaluations, but not academic ones. This pattern is consistent with the enormous importance that most children entering adolescence place on their peer relationships and social status, compared to the relatively diminished value often associated with academics during this transition."

In youth with autism spectrum disorders, this specialized response in ventral medial prefrontal cortex is missing, she added, citing a paper she co-authored in the February 2013 issue of the Journal of Autism and Developmental Disorders and a complementary study led by Michael V. Lombardo, University of Cambridge, in the February 2010 issue of the journal Brain. The absence of this typical effect, Pfeifer said, might be related to the challenges these individuals often face in both self-understanding and social relations.

"Dr. Pfeifer’s research examining self-evaluations during adolescence adds significantly to the intricate puzzle of this turbulent age period," said Kimberly Andrews Espy, vice president for research and innovation and dean of the graduate school. "Researchers at the University of Oregon are piecing together how both biology and the environment dynamically and interactively support healthy social development."

Filed under brain brain activity prefrontal cortex fMRI self-evaluation adolescence neuroscience science

39 notes

Large animal models of Huntington’s disease offer new and promising research options

Scientific progress in Huntington’s disease (HD) relies upon the availability of appropriate animal models that enable insights into the disease’s genetics and/or pathophysiology. Large animal models, such as domesticated farm animals, offer some distinct advantages over rodent models, including a larger brain that is amenable to imaging and intracerebral therapy, longer lifespan, and a more human-like neuro-architecture. Three articles in the latest issue of the Journal of Huntington’s Disease discuss the potential benefits of using large animal models in HD research and the implications for the development of gene therapy.

A review by Morton and Howland explores the advantages and drawbacks of small and large animal models of HD. In the same issue, Baxa et al. highlight the development of a transgenic minipig HD model that expresses a human mutant huntingtin (HTT) fragment throughout the central nervous system (CNS) and peripheral tissues and manifests neurochemical and reproductive changes with age. In another report, Van der Bom et al. describe a technique employing CT and MRI that allows precise intracerebral application of therapeutics in transgenic HD sheep.

Huntington’s disease (HD) is an inherited progressive neurological disorder for which there is presently no effective treatment. It is caused by a single dominant gene mutation, an expanded CAG repeat in the HTT gene, leading to expression of mutant HTT protein. Expression of mutant HTT causes changes in cellular functions, which ultimately result in uncontrollable movements, progressive psychiatric difficulties, and loss of mental abilities.

The search for new large animal models of HD arises from the recognition that there are some practical limitations of rodent and other small animal models. Because neurodegenerative diseases like HD progress over a lifetime, a rodent’s short life span excludes the possibility of studying long-term changes. There are also important anatomic differences between the brains of humans and rodents that become especially relevant when studying HD, including the lack of a gyrencephalic (convoluted) cortex and differences in the structure and cellular characteristics of the basal ganglia compared to humans. Not only does a rodent’s small brain often preclude the use of advanced neuroimaging techniques, it is also not clear how intracerebral application of trophic factors, transplant therapies, and gene therapies in small animals might translate to the much larger human brain.

"Importantly, the brains of large animals can be studied using sensitive measures that should be highly translatable to the human condition, including MRI and PET imaging, EEG, and electrophysiology, as well as behavioral tests looking at motor and cognitive function," says Professor Jenny Morton, PhD, of the Department of Physiology, Development and Neuroscience at the University of Cambridge. "Moving to larger-brained animal models after promising results are obtained in rodents is a logical, and possibly necessary, step to optimize delivery and biodistribution, validate on-target mechanism of action, and assess safety profiles," says Professor Morton.

"Strategies directed against the huntingtin gene in the brain are an important part of CHDI’s therapeutic portfolio", says David Howland, PhD, Director of Model Systems at CHDI. "Translating preclinical results for gene-based therapies from rodent models to larger-brained models of HD is an important step along the path toward clinical testing."

Significant advances have been made in the creation and characterization of HD models in nonhuman primates (NHP). “The relevance to human biology of NHP models in Huntington’s disease holds great potential value for preclinical research and development, but we need to fully consider the substantial issues of cost, long-term housing of affected animals, access of the models to HD investigators, and ethical concerns with modeling in these species,” says Dr Howland. “CHDI has invested in efforts to expand modeling in large animals to include sheep and minipigs to work around some of these concerns about NHP models.”

Large domesticated farm animals offer some distinct advantages as models of HD. Sheep, for example, are domesticated, docile, live outdoors, are easy to care for, and are relatively economical to maintain. A sheep’s brain is about the same size as a large primate’s, is gyrencephalic, and the basal ganglia that degenerate in HD are anatomically similar to those in humans. Sheep live long enough that the time available for studying progressive neurological diseases such as HD is much greater than is possible in rodents. HD transgenic sheep express HTT protein in the brain and show abnormal HD-associated neurochemical changes. These HD sheep have been subject to advanced genomic techniques and, because they carry a human transgene that is expressed at both the mRNA and protein levels, they are seen as suitable for testing gene therapy-based reagents directed against human HTT. A further advantage, says Professor Morton, is that “although sheep have a reputation for being stupid, this is probably undeserved: they have very good memories and are capable of learning and remembering new tasks.”

In order to advance the use of the HD sheep model, I.M.J. van der Bom, PhD, from the Department of Radiology at the University of Massachusetts, and colleagues developed a multi-modal technique using skull markings seen with CT imaging and brain anatomy from MR imaging to allow more precise placement of intracerebral cannulae into the sheep brain. The technique offers the ability to directly image micro-cannula placement to ensure accurate targeting of the therapeutic injection in the brain. With this technique, the authors hope to study the safety, spread, and neuronal uptake of adeno-associated virus (AAV)-based therapeutics.

"Pigs, and mainly minipigs, represent a viable model for preclinical drug trials and long-term safety studies," says Jan Motlik, DVM, PhD, DSc, from the Laboratory of Cell Regeneration and Plasticity of the Institute of Animal Physiology and Genetics in Libechov, Czech Republic. Advantages include the minipig's large brain size and long lifespan. Genetic advances have been made, including defining the porcine genome, which revealed a 96% similarity between the porcine and human huntingtin genes. In addition to well-established methods for pig husbandry, minipigs are economical to house and have body systems very similar to those of humans.

In the report by Baxa et al., a new HD minipig model using lentiviral infection of porcine embryos is described. The authors report that they successfully developed a heterozygote transgenic HD minipig that expresses a human mutant HTT fragment throughout the CNS and peripheral tissues through 4 successive generations. The model produces viable offspring, with a total neonatal mortality rate of 17%. The authors reported that one affected HD minipig showed a decline, beginning at 16 months, in the neuronal phosphoprotein DARPP-32 in the neostriatum, the brain region most affected by HD. A loss of fertility, possibly HD related, was also found.

(Source: news.bio-medicine.org)

Filed under huntington's disease animal model huntingtin genetics neuroscience science

52 notes

'Clean' your memory to pick a winner

Predicting the winner of a sporting event with accuracy close to that of a statistical computer programme could be possible with proper training, according to researchers.

In a study published today, experiment participants who had been trained on statistically idealised data vastly improved their ability to predict the outcome of a baseball game.

In normal situations, the brain selects a limited number of memories to use as evidence to guide decisions. As real-world events do not always have the most likely outcome, retrieved memories can provide misleading information at the time of a decision.

Now, researchers at UCL and the University of Montreal have found a way to train the brain to accurately predict the outcome of an event, for example a baseball game, by giving subjects idealised scenarios that always conform to statistical probability.

Dr Bradley Love (UCL Department of Cognition, Perception and Brain Sciences), lead author of the study, said: “Providing people with idealized situations, as opposed to actual outcomes, ‘cleans’ their memory and provides a stock of good quality evidence for the brain to use.”

In the study, published in Proceedings of the National Academy of Sciences, researchers programmed computers to use all available statistics to form a decision, making them more likely to predict the correct outcome. By using all data from previous sports leagues, the computer’s predictions always reflected the most likely outcome.

Next, researchers ‘trained’ the brains of participants by giving them scenarios whose outcomes they had to predict. Two groups of subjects, one given actual outcomes and one given ideal outcomes, were trained and then tested to compare their progress.

The scenarios consisted of games between two Major League baseball teams. Participants had to predict which team would win and were told if their prediction was correct. Those in the ‘actual’ group were told the true outcome of the game, while those in the ‘ideal’ group were given fictional results.

Prior to participants’ predictions, the teams had been ranked in order based on their number of wins. For the ideal group, researchers changed the results of the matches so that the highest-ranking team always won regardless of the true outcome. This created ideal outcomes for the subjects, as the best team always won, which of course does not happen in reality.

Participants were then tested by being asked to predict the outcomes of the remaining matches in the league, without feedback on their performance. Even though the ‘ideal’ group had been given incorrect data during training, they were significantly better at predicting the winner.

Dr Love explained: “Unlike machine systems, people’s decisions are messy because they rely on whatever memories are retrieved by chance. One consequence is that people perform better when the training situation is idealised – a useful fiction that fits our cognitive limitations.”

Participants’ prediction abilities were compared to computer models that were either optimised for prediction or modelled on human brains. After ideal-outcome training, the ‘ideal’ subjects had greatly enhanced their skills and were comparable with the optimised model when predicting baseball game outcomes.

The authors suggest that idealised real-world situations could be used to train professionals who rely on the ability to analyse and classify information. Doctors making diagnoses from X-rays, financial analysts and even those wanting to predict the weather could all benefit from the research.
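
As a toy illustration of the training manipulation (a hypothetical sketch, not the authors' actual model or data), one can simulate a league of teams with latent strengths, train a simple tally-based learner on either noisy "actual" outcomes or idealised outcomes in which the higher-ranked team always wins, and compare how well each learner recovers the most likely result:

```python
import random

random.seed(7)

# Hypothetical setup: 10 teams with latent strengths; rank follows strength.
strengths = {f"team{i}": 0.30 + 0.04 * i for i in range(10)}
teams = sorted(strengths, key=strengths.get)  # weakest -> strongest

def play(a, b):
    """A noisy real game: the stronger team usually, but not always, wins."""
    p_a = strengths[a] / (strengths[a] + strengths[b])
    return a if random.random() < p_a else b

def idealised_outcome(a, b):
    """The 'ideal' training label: the higher-ranked team always wins."""
    return a if strengths[a] > strengths[b] else b

def train(label_fn, n_games=2000):
    """Tally the wins a learner would remember from n_games training games."""
    wins = {t: 0 for t in teams}
    for _ in range(n_games):
        a, b = random.sample(teams, 2)
        wins[label_fn(a, b)] += 1
    return wins

def accuracy(wins, n_tests=2000):
    """Score predictions against the statistically most likely outcome."""
    correct = 0
    for _ in range(n_tests):
        a, b = random.sample(teams, 2)
        guess = a if wins[a] > wins[b] else b
        if guess == idealised_outcome(a, b):
            correct += 1
    return correct / n_tests

actual = train(play)                # trained on noisy, real outcomes
ideal = train(idealised_outcome)    # trained on idealised outcomes

print(f"actual-trained accuracy:    {accuracy(actual):.2f}")
print(f"idealised-trained accuracy: {accuracy(ideal):.2f}")
```

Because idealised labels are perfectly consistent with the ranking, the tallies they produce order the teams cleanly, while tallies built from noisy outcomes can swap closely matched teams; this is the intuition behind the "cleaned memory" result.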

Filed under brain statistical probability decision-making prediction psychology neuroscience science

55 notes

New light shed on early stage Alzheimer’s disease

The disrupted metabolism of sugar, fat and calcium is part of the process that causes the death of neurons in Alzheimer’s disease. Researchers from Karolinska Institutet in Sweden have now shown, for the first time, how important parts of the nerve cell that are involved in the cell’s energy metabolism operate in the early stages of the disease. These somewhat surprising results shed new light on how neuronal metabolism relates to the development of the disease.

In the Alzheimer’s disease brain, plaques consisting of so-called amyloid-beta peptide (Aβ) accumulate. It is also well known that the nerve cells of patients with Alzheimer’s disease have problems metabolising substances such as glucose and calcium, and that these disorders are associated with cell death. The metabolism of these substances is the job of the cell’s mitochondria, which serve as the cell’s power plants and supply it with energy.

However, for the mitochondria to do this, they need good contact with another part of the cell called the endoplasmic reticulum (ER). The specialised region of the ER that is in contact with mitochondria is called the MAM region. Earlier studies on yeast and other cell types have shown that the deactivation of certain proteins in the MAM region disrupts the contact points between the mitochondria and the ER, preventing the delivery of energy to the cell and causing cell death.

Now for the first time, researchers at Karolinska Institutet have studied the MAM region in nerve cells, and examined the interaction between the mitochondria and the ER in early stage Alzheimer’s disease. Although at this point in the development of the disease Aβ has not formed large, lumpy plaques, symptoms still appear, implying that Aβ that has not yet formed plaque is toxic to neurons.

The team’s results are slightly surprising. When nerve cells are exposed to low doses of Aβ, it leads to an increase in the number of contact points between the mitochondria and the ER, causing more calcium to be transferred from the ER to the mitochondria. The resulting over-accumulation of calcium is toxic to the mitochondria and affects their ability to supply energy to the nerve cell.

“It’s urgent that we find out what causes neuronal death if we’re to develop molecules that check the disease,” says Maria Ankarcrona, docent and researcher at the Department of Neurobiology, Care Sciences and Society, and the Alzheimer’s Disease Research Centre of Karolinska Institutet. “In the long run we might be able to produce a drug that can arrest the progress of the disease at a stage when the patient is still able to manage their daily lives. If we can extend that period by a number of years, we’d have made great gains. Today there are no drugs that affect the actual disease process.”

The researchers conducted their studies on mice bred to develop symptoms of Alzheimer’s disease. They also studied nerve cells from deceased Alzheimer’s patients and neurons cultivated in the laboratory.

(Source: alphagalileo.org)

Filed under alzheimer's disease nerve cells endoplasmic reticulum energy metabolism mitochondria neuroscience science

74 notes

Red Light Increases Alertness During “Post-Lunch Dip”

Acute or chronic sleep deprivation resulting in increased feelings of fatigue is one of the leading causes of workplace incidents and related injuries. More incidents and performance failures, such as automobile accidents, occur in the mid-afternoon hours known as the “post-lunch dip.” The post-lunch dip typically occurs from 2-4 p.m., or about 16-18 hours after an individual’s bedtime the previous night.

A new study from the Lighting Research Center (LRC) at Rensselaer Polytechnic Institute shows that exposure to certain wavelengths and levels of light has the potential to increase alertness during the post-lunch dip. The research was a collaboration between Mariana Figueiro, LRC Light and Health Program director and associate professor at Rensselaer, and LRC doctoral student Levent Sahin. Results of the study, titled “Alerting effects of short-wavelength (blue) and long-wavelength (red) lights in the afternoon,” were recently published in the journal Physiology & Behavior.

The collaboration between Figueiro and Sahin lays the groundwork for the possible use of tailored light exposures as a non-pharmacological intervention to increase alertness during the daytime. Figueiro has previously conducted studies showing that light has the potential to increase alertness at night: exposure to more than 2500 lux of white light at night increases performance, elevates core body temperature, and increases heart rate.

In most studies to date, the alerting effects of light have been linked to its ability to suppress melatonin. However, results from another study led by Figueiro demonstrate that acute melatonin suppression is not needed for light to affect alertness during the nighttime. That study showed that both short-wavelength (blue) and long-wavelength (red) lights increased measures of alertness, but only short-wavelength light suppressed melatonin. Melatonin levels are typically lower during the daytime and higher at night.

Figueiro and Sahin hypothesized that if light can affect alertness via pathways other than melatonin suppression, then certain wavelengths and levels of light might also increase alertness during the middle of the afternoon, close to the post-lunch dip hours.

During the study conducted at the LRC, participants experienced two experimental lighting conditions in addition to darkness. Long-wavelength “red” light (λmax = 630 nanometers) and short-wavelength “blue” light (λmax = 470 nanometers) were delivered to the corneas of each participant by arrays of light-emitting diodes (LEDs) placed in 60 × 60 × 60 cm light boxes. Participant alertness was measured by electroencephalogram (EEG) and by subjective sleepiness ratings (KSS scale).

The team found that, compared to remaining in darkness, exposure to red light in the middle of the afternoon significantly reduces power in the alpha, alpha-theta, and theta ranges. Because high power in these frequency ranges has been associated with sleepiness, these results suggest that red light positively affects measures of alertness not only at night, but also during the day. Red light also seemed to be a more potent stimulus than blue light for modulating brain activities associated with daytime alertness, although the researchers did not find any significant differences in measures of alertness after exposure to red and blue lights. This suggests that blue light, especially at higher levels, could still increase alertness in the afternoon. It appears that melatonin suppression is not needed for light to have an impact on objective measures of alertness.

“Our study suggests that photoreceptors other than the intrinsically photosensitive retinal ganglion cells respond to light for the arousal system,” said Figueiro. “Future research should look into the spectral sensitivity of alertness and how it changes over the course of 24 hours.”

Sahin, who has more than 10 years of experience in railway engineering, was interested in the study from a transportation safety perspective and in what the results could mean for the transportation industry. “Safety is a prerequisite and one of the most important quality indicators in the transportation industry,” said Sahin. “Our recent findings provided the scientifically valid underpinnings in approaching fatigue related safety problems in 24 hour transportation operations.”

From the present results, it is not possible to determine the underlying mechanisms contributing to light-induced changes in alertness, because the optical radiation incident on the retina has multiple effects on brain activity through parallel neural pathways. According to Figueiro, that is an area she would like to explore in future research.
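
The study's objective alertness measure, EEG power in the alpha and theta bands, can be sketched numerically. The snippet below is a minimal illustration on a synthetic signal (the sampling rate, band edges and signal composition are assumptions for demonstration, not values from the paper); it estimates band power by integrating a simple periodogram:

```python
import numpy as np

np.random.seed(0)

fs = 256                      # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)  # 10 seconds of data

# Synthetic "EEG": a strong 10 Hz alpha component, a weaker 6 Hz theta
# component, and white noise.
signal = (1.0 * np.sin(2 * np.pi * 10 * t)
          + 0.5 * np.sin(2 * np.pi * 6 * t)
          + 0.2 * np.random.randn(t.size))

def band_power(x, fs, lo, hi):
    """Sum periodogram power between lo (inclusive) and hi (exclusive) Hz."""
    freqs = np.fft.rfftfreq(x.size, d=1 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / (fs * x.size)
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].sum() * (freqs[1] - freqs[0])

alpha = band_power(signal, fs, 8, 12)   # alpha band, 8-12 Hz
theta = band_power(signal, fs, 4, 8)    # theta band, 4-8 Hz
print(f"alpha power: {alpha:.4f}, theta power: {theta:.4f}")
```

In the study's framework, lower power in these bands after a light exposure would be read as reduced sleepiness; in practice EEG band power is usually estimated with windowed, averaged methods (e.g. Welch's method) rather than a single raw periodogram.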

Filed under alertness sleepiness sleep deprivation melatonin post-lunch dip wavelength lights fatigue neuroscience psychology science

555 notes

Scientists Find Antibody that Transforms Bone Marrow Stem Cells Directly into Brain Cells

In a serendipitous discovery, scientists at The Scripps Research Institute (TSRI) have found a way to turn bone marrow stem cells directly into brain cells.

Current techniques for turning patients’ marrow cells into cells of some other desired type are relatively cumbersome, risky and effectively confined to the lab dish. The new finding points to the possibility of simpler and safer techniques. Cell therapies derived from patients’ own cells are widely expected to be useful in treating spinal cord injuries, strokes and other conditions throughout the body, with little or no risk of immune rejection.

“These results highlight the potential of antibodies as versatile manipulators of cellular functions,” said Richard A. Lerner, the Lita Annenberg Hazen Professor of Immunochemistry and institute professor in the Department of Cell and Molecular Biology at TSRI, and principal investigator for the new study. “This is a far cry from the way antibodies used to be thought of—as molecules that were selected simply for binding and not function.”

The researchers discovered the method, reported in the online Early Edition of the Proceedings of the National Academy of Sciences the week of April 22, 2013, while looking for lab-grown antibodies that can activate a growth-stimulating receptor on marrow cells. One antibody turned out to activate the receptor in a way that induces marrow stem cells—which normally develop into white blood cells—to become neural progenitor cells, a type of almost-mature brain cell.

Nature’s Toolkit

Natural antibodies are large, Y-shaped proteins produced by immune cells. Collectively, they are diverse enough to recognize about 100 billion distinct shapes on viruses, bacteria and other targets. Since the 1980s, molecular biologists have known how to produce antibodies in cell cultures in the laboratory. That has allowed them to start using this vast, target-gripping toolkit to make scientific probes, as well as diagnostics and therapies for cancer, arthritis, transplant rejection, viral infections and other diseases.

In the late 1980s, Lerner and his TSRI colleagues helped invent the first techniques for generating large “libraries” of distinct antibodies and swiftly determining which of these could bind to a desired target. The anti-inflammatory antibody Humira®, now one of the world’s top-selling drugs, was discovered with the benefit of this technology.

Last year, in a study spearheaded by TSRI Research Associate Hongkai Zhang, Lerner’s laboratory devised a new antibody-discovery technique—in which antibodies are produced in mammalian cells along with receptors or other target molecules of interest. The technique enables researchers to determine rapidly not just which antibodies in a library bind to a given receptor, for example, but also which ones activate the receptor and thereby alter cell function.

Lab Dish in a Cell

For the new study, Lerner laboratory Research Associate Jia Xie and colleagues modified the new technique so that antibody proteins produced in a given cell are physically anchored to the cell’s outer membrane, near its target receptors. “Confining an antibody’s activity to the cell in which it is produced effectively allows us to use larger antibody libraries and to screen these antibodies more quickly for a specific activity,” said Xie. With the improved technique, scientists can sift through a library of tens of millions of antibodies in a few days.

In an early test, Xie used the new method to screen for antibodies that could activate the GCSF receptor, a growth-factor receptor found on bone marrow cells and other cell types. GCSF-mimicking drugs were among the first biotech bestsellers because of their ability to stimulate white blood cell growth—which counteracts the marrow-suppressing side effect of cancer chemotherapy.

The team soon isolated one antibody type or “clone” that could activate the GCSF receptor and stimulate growth in test cells. The researchers then tested an unanchored, soluble version of this antibody on cultures of bone marrow stem cells from human volunteers. Whereas the GCSF protein, as expected, stimulated such stem cells to proliferate and start maturing towards adult white blood cells, the GCSF-mimicking antibody had a markedly different effect.

“The cells proliferated, but also started becoming long and thin and attaching to the bottom of the dish,” remembered Xie.

To Lerner, the cells were reminiscent of neural progenitor cells—which further tests for neural cell markers confirmed they were.

A New Direction

Changing cells of marrow lineage into cells of neural lineage—a direct identity switch termed “transdifferentiation”—just by activating a single receptor is a noteworthy achievement. Scientists do have methods for turning marrow stem cells into other adult cell types, but these methods typically require a radical and risky deprogramming of marrow cells to an embryonic-like stem-cell state, followed by a complex series of molecular nudges toward a given adult cell fate. Relatively few laboratories have reported direct transdifferentiation techniques.

“As far as I know, no one has ever achieved transdifferentiation by using a single protein—a protein that potentially could be used as a therapeutic,” said Lerner.

Current cell-therapy methods typically assume that a patient’s cells will be harvested, then reprogrammed and multiplied in a lab dish before being re-introduced into the patient. In principle, according to Lerner, an antibody such as the one they have discovered could be injected directly into the bloodstream of a sick patient. From the bloodstream it would find its way to the marrow, and, for example, convert some marrow stem cells into neural progenitor cells. “Those neural progenitors would infiltrate the brain, find areas of damage and help repair them,” he said.

While the researchers still aren’t sure why the new antibody has such an odd effect on the GCSF receptor, they suspect it binds the receptor for longer than the natural GCSF protein can achieve, and this lengthier interaction alters the receptor’s signaling pattern. Drug-development researchers are increasingly recognizing that subtle differences in the way a cell-surface receptor is bound and activated can result in very different biological effects. That adds complexity to their task, but in principle expands the scope of what they can achieve. “If you can use the same receptor in different ways, then the potential of the genome is bigger,” said Lerner.

Scientists Find Antibody that Transforms Bone Marrow Stem Cells Directly into Brain Cells

In a serendipitous discovery, scientists at The Scripps Research Institute (TSRI) have found a way to turn bone marrow stem cells directly into brain cells.

Current techniques for turning patients’ marrow cells into cells of some other desired type are relatively cumbersome, risky and effectively confined to the lab dish. The new finding points to the possibility of simpler and safer techniques. Cell therapies derived from patients’ own cells are widely expected to be useful in treating spinal cord injuries, strokes and other conditions throughout the body, with little or no risk of immune rejection.

“These results highlight the potential of antibodies as versatile manipulators of cellular functions,” said Richard A. Lerner, the Lita Annenberg Hazen Professor of Immunochemistry and institute professor in the Department of Cell and Molecular Biology at TSRI, and principal investigator for the new study. “This is a far cry from the way antibodies used to be thought of—as molecules that were selected simply for binding and not function.”

The researchers discovered the method, reported in the online Early Edition of the Proceedings of the National Academy of Sciences the week of April 22, 2013, while looking for lab-grown antibodies that can activate a growth-stimulating receptor on marrow cells. One antibody turned out to activate the receptor in a way that induces marrow stem cells—which normally develop into white blood cells—to become neural progenitor cells, a type of almost-mature brain cell.

Nature’s Toolkit

Natural antibodies are large, Y-shaped proteins produced by immune cells. Collectively, they are diverse enough to recognize about 100 billion distinct shapes on viruses, bacteria and other targets. Since the 1980s, molecular biologists have known how to produce antibodies in cell cultures in the laboratory. That has allowed them to start using this vast, target-gripping toolkit to make scientific probes, as well as diagnostics and therapies for cancer, arthritis, transplant rejection, viral infections and other diseases.

In the late 1980s, Lerner and his TSRI colleagues helped invent the first techniques for generating large “libraries” of distinct antibodies and swiftly determining which of these could bind to a desired target. The anti-inflammatory antibody Humira®, now one of the world’s top-selling drugs, was discovered with the benefit of this technology.

Last year, in a study spearheaded by TSRI Research Associate Hongkai Zhang, Lerner’s laboratory devised a new antibody-discovery technique—in which antibodies are produced in mammalian cells along with receptors or other target molecules of interest. The technique enables researchers to determine rapidly not just which antibodies in a library bind to a given receptor, for example, but also which ones activate the receptor and thereby alter cell function.

Lab Dish in a Cell

For the new study, Lerner laboratory Research Associate Jia Xie and colleagues modified the new technique so that antibody proteins produced in a given cell are physically anchored to the cell’s outer membrane, near its target receptors. “Confining an antibody’s activity to the cell in which it is produced effectively allows us to use larger antibody libraries and to screen these antibodies more quickly for a specific activity,” said Xie. With the improved technique, scientists can sift through a library of tens of millions of antibodies in a few days.

In an early test, Xie used the new method to screen for antibodies that could activate the GCSF receptor, a growth-factor receptor found on bone marrow cells and other cell types. GCSF-mimicking drugs were among the first biotech bestsellers because of their ability to stimulate white blood cell growth—which counteracts the marrow-suppressing side effect of cancer chemotherapy.

The team soon isolated one antibody type or “clone” that could activate the GCSF receptor and stimulate growth in test cells. The researchers then tested an unanchored, soluble version of this antibody on cultures of bone marrow stem cells from human volunteers. Whereas the GCSF protein, as expected, stimulated such stem cells to proliferate and start maturing towards adult white blood cells, the GCSF-mimicking antibody had a markedly different effect.

“The cells proliferated, but also started becoming long and thin and attaching to the bottom of the dish,” remembered Xie.

To Lerner, the cells were reminiscent of neural progenitor cells—which further tests for neural cell markers confirmed they were.

A New Direction

Changing cells of marrow lineage into cells of neural lineage—a direct identity switch termed “transdifferentiation”—just by activating a single receptor is a noteworthy achievement. Scientists do have methods for turning marrow stem cells into other adult cell types, but these methods typically require a radical and risky deprogramming of marrow cells to an embryonic-like stem-cell state, followed by a complex series of molecular nudges toward a given adult cell fate. Relatively few laboratories have reported direct transdifferentiation techniques.

“As far as I know, no one has ever achieved transdifferentiation by using a single protein—a protein that potentially could be used as a therapeutic,” said Lerner.

Current cell-therapy methods typically assume that a patient’s cells will be harvested, then reprogrammed and multiplied in a lab dish before being re-introduced into the patient. In principle, according to Lerner, an antibody such as the one they have discovered could be injected directly into the bloodstream of a sick patient. From the bloodstream it would find its way to the marrow, and, for example, convert some marrow stem cells into neural progenitor cells. “Those neural progenitors would infiltrate the brain, find areas of damage and help repair them,” he said.

While the researchers still aren’t sure why the new antibody has such an odd effect on the GCSF receptor, they suspect it binds the receptor for longer than the natural GCSF protein does, and that this lengthier interaction alters the receptor’s signaling pattern. Drug-development researchers are increasingly recognizing that subtle differences in the way a cell-surface receptor is bound and activated can result in very different biological effects. That adds complexity to their task, but in principle expands the scope of what they can achieve. “If you can use the same receptor in different ways, then the potential of the genome is bigger,” said Lerner.

Filed under stem cells brain cells marrow cells antibodies brain drug development neuroscience science

88 notes

Putting the brakes on Parkinson’s
The earliest signs of Parkinson’s disease can be deceptively mild. The first thing that movie star Michael J. Fox noticed was twitching of the little finger of his left hand. For years, he made light of the apparently harmless tic. But such tremors typically spread, while muscles stiffen up and directed movements take longer to carry out. Research groups led by Armin Giese of LMU Munich and Christian Griesinger at the Max Planck Institute for Biophysical Chemistry in Göttingen have developed a chemical compound that slows down the onset and progression of Parkinson’s disease in mice. The scientists hope that this approach will give them a way to treat the cause of Parkinson’s and so arrest its progress.
The disease usually becomes manifest between the ages of 50 and 60, and results from the loss of dopamine-producing nerve cells in the substantia nigra, which is part of the midbrain. Under the microscope, the affected cells are seen to contain insoluble precipitates made up of a protein called alpha-synuclein. As an early step in the pathological cascade, this protein forms so-called oligomers, tiny aggregates consisting of small numbers of alpha-synuclein molecules, which are apparently highly neurotoxic. By the time the first overt symptoms appear in humans, more than half of the vulnerable cells have already been lost. Many researchers therefore focus on developing methods for early diagnosis of the condition. However, current therapies only alleviate symptoms, so the research teams led by Armin Giese and Christian Griesinger set out to address the underlying cause of nerve-cell death.
Together, the scientists have developed a substance which, in mouse models of the disease, reduces the rate of growth of the protein deposits and delays nerve cell degeneration to an unprecedented degree. As a consequence, mice treated with this agent remain disease-free for longer than non-medicated controls. “The most striking feature of the new compound is that it is the first that directly targets oligomers and interferes with their formation,” explains Christian Griesinger, head of the Department of NMR-based Structural Biology and Director at the Max Planck Institute for Biophysical Chemistry. The discovery is the result of years of hard work. “Combining skills from a range of disciplines has been the key to our success. Biologists, chemists, clinicians, physicists, and veterinarians have all contributed to the development of the therapeutic compound,” adds Armin Giese, who leads a research group at LMU’s Center for Neuropathology and Prion Research.
Giese and his colleagues systematically tested 20,000 candidate substances for the ability to block formation of the protein deposits that are typical for the disease. The screen made use of an extremely sensitive laser-based assay developed by Giese years ago when he was working together with Nobel Laureate Manfred Eigen at the Max Planck Institute for Biophysical Chemistry in Göttingen. Some interesting lead compounds identified during the very first phase of the screening program served as a starting point for further optimization. Ultimately, one substance proved to be particularly active. Andrei Leonov, a chemist in Griesinger’s team, finally succeeded in synthesizing a pharmaceutically promising derivative. This is well tolerated at dosage levels with significant therapeutic effects, can be administered with food, and penetrates the blood-brain barrier, reaching high levels in the brain. The two teams have already applied for a patent on the compound, which they called Anle138b – an abbreviation of Andrei Leonov’s first name and surname.
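The logic of such a screen can be sketched in a few lines: measure each candidate's residual aggregation relative to an untreated control, keep those below a cutoff, and rank the survivors. The compound names and activity values below are invented; this is not the Giese/Griesinger assay itself, only an illustration of the thresholding-and-ranking step.

```python
def rank_hits(assay, threshold=0.5):
    """Keep compounds whose residual aggregation (fraction of the
    untreated control, lower is better) falls below `threshold`,
    ranked best-first."""
    hits = {name: frac for name, frac in assay.items() if frac < threshold}
    return sorted(hits, key=hits.get)

# Residual aggregation for a few made-up candidates out of a large
# screen (1.0 = no inhibition at all).
assay = {"cmpd-0041": 0.92, "cmpd-1138": 0.12,
         "cmpd-5077": 0.48, "cmpd-9003": 1.05}
print(rank_hits(assay))  # → ['cmpd-1138', 'cmpd-5077']
```

In a real screen the activity values would come from an aggregation assay such as the laser-based one described above; only the ranking logic is shown here.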
A complex series of experiments has provided encouraging indications that Anle138b could also be of therapeutic use in humans. These tests involved not only biochemical and structural investigations of Anle138b’s mode of action but also employed several animal models of Parkinson’s which are under study in Munich and in laboratories of the Excellence Cluster “Nanoscale Microscopy and Molecular Physiology of the Brain” in Göttingen. Mice exposed to Anle138b were found to display better motor coordination than their untreated siblings. “We use a kind of fitness test to evaluate muscle coordination,” Giese explains. “The mice are placed on a rotating rod and we measure how long the animals can keep their balance.”
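The rotarod readout Giese describes reduces to comparing latencies, the seconds until a mouse falls off the rotating rod, between treated and control groups. The numbers below are invented for illustration; only the arithmetic reflects the test.

```python
def mean(xs):
    """Average latency to fall, in seconds."""
    return sum(xs) / len(xs)

# Hypothetical rotarod latencies (s): longer balance indicates better
# motor coordination.
treated = [182, 204, 175, 198, 189]
control = [121, 98, 134, 110, 102]

print(f"treated: {mean(treated):.1f} s, control: {mean(control):.1f} s")
```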
Generally speaking, the earlier the onset of treatment, the longer the animals remained disease-free. What’s more, the beneficial effects of Anle138b are not restricted to animals with Parkinson’s disease. “Creutzfeldt-Jakob disease is caused by toxic aggregates of the prion protein,” Griesinger points out. “And here too, Anle138b effectively inhibits clumping and significantly increases survival times.” These findings hint that Anle138b might also prevent the formation of insoluble deposits formed by other proteins, such as the tau protein that is associated with Alzheimer’s disease. Further experiments will address this issue. Anle138b will therefore be a useful research tool in medicine, as it will enable scientists to study the process of oligomer formation in the test-tube and to determine how their assembly is inhibited. The researchers hope ultimately to gain new insights into the mechanisms by which neurodegenerative disorders develop.
The drugs so far available for treatment of Parkinson’s disease only control its symptoms by enhancing the function of the surviving nerve cells in the substantia nigra. “With Anle138b, we may have the first representative of a new class of neuroprotective agents allowing us to retard or even halt the progression of conditions such as Parkinson’s or Creutzfeldt-Jakob disease,” Griesinger says. However, he warns that the findings in mice cannot immediately be applied to humans. The next step will be to carry out toxicity tests in non-rodent species. Only if these are successful will clinical trials in patients become a realistic possibility. As clinician Giese emphasizes: “To successfully establish a novel therapeutic agent for treatment of real patients is a laborious task that requires a lot of work as well as serendipity.”
Full article

Filed under parkinson's disease substantia nigra alpha-synuclein animal model neuroscience science
