Neuroscience

Articles and news from the latest research reports.

Posts tagged gene mutation

169 notes

Genes discovered linking circadian clock with eating schedule
For most people, the urge to eat a meal or snack comes at a few, predictable times during the waking part of the day. But for those with a rare syndrome, hunger comes at unwanted hours, interrupts sleep and causes overeating.
Now, Salk scientists have discovered a pair of genes that normally keeps eating schedules in sync with daily sleep rhythms, and, when mutated, may play a role in so-called night eating syndrome. In mice with mutations in one of the genes, eating patterns are shifted, leading to unusual mealtimes and weight gain. The results were published in Cell Reports today.
"We really never expected that we would be able to decouple the sleep-wake cycle and the eating cycle, especially with a simple mutation," says senior study author Satchidananda Panda, an associate professor in Salk’s Regulatory Biology Laboratory. "It opens up a whole lot of future questions about how these cycles are regulated."
More than a decade ago, researchers discovered that individuals with an inherited sleep disorder often carry a particular mutation in a protein called PER2. The mutation sits in a region of the protein that can be phosphorylated—modified by the attachment of a phosphate group, which changes the protein’s function. Humans have three PER, or period, genes, all thought to play a role in the daily circadian clock and all containing the same phosphorylation site.
The Salk scientists joined forces with a Chinese team led by Ying Xu of Nanjing University to test whether mutations in the equivalent region of PER1 would have the same effect as the PER2 mutations that caused the sleep disorder. They bred mice to lack the mouse period genes and added in a human PER1 or PER2 gene with a mutation in the phosphorylation site. As expected, mice with mutated PER2 had sleep defects, dozing off earlier than usual. The same wasn’t true, however, for PER1 mutations.
"In the mice without PER1, there was no obvious defect in their sleep-wake cycles," says Panda. "Instead, when we looked at their metabolism, we suddenly saw drastic changes."
Mice with the PER1 phosphorylation defects ate earlier than other mice—causing them to wake up and snack before their sleep cycle was over—and ate more food throughout their normal waking period. When the researchers looked at the molecular details of the PER1 protein, they found that the mutated PER1 led to lower protein levels during the sleeping period, higher levels during the waking period, and a faster degradation of protein whenever it was produced by cells.
Panda and his colleagues hypothesize that normally, PER1 and PER2 are kept synchronized since they have identical phosphorylation sites—they are turned on and off at the same times, keeping sleep and eating cycles aligned. But a mutation in one of the genes could break this link, and cause off-cycle eating or sleeping.
"For a long time, people discounted night eating syndrome as not real," says Panda. "These results in mice suggest that there could actually be a genetic basis for the syndrome." The researchers haven’t yet tested, however, whether humans with night eating syndrome carry mutations in PER1.
When Panda and Xu’s team restricted access to food, providing it only at the mice’s normal mealtimes, they found that even mice with the PER1 mutation could maintain a normal weight. Over a 10-week follow-up, these mice—carrying the PER1 mutation but given timed access to food—showed no differences from control animals. This tells the researchers that the weight gain caused by the PER1 mutation stems entirely from mistimed meals, not from other metabolic defects.
Next, they hope to study exactly how PER1 controls appetite and eating behavior—whether its molecular actions work through the liver, fat cells, brain or other organs.

Filed under night eating syndrome circadian rhythms overeating gene mutation PER sleep neuroscience science

87 notes

(Image caption: The images show an early developmental stage of normal (top row) and BRCA1-deficient brains (bottom row). The imaged embryos show abundant cell proliferation (red, first column) in both normal and BRCA1-deficient brains at this stage. However, brains lacking BRCA1 exhibit high levels of cellular suicide (green, second column). The third column shows an overlay of the other columns. Credit: Courtesy of the Salk Institute for Biological Studies)
Scientists reveal potential link between brain development and breast cancer gene
Scientists at the Salk Institute have uncovered details of a surprising—and crucial—link between brain development and a gene whose mutation is tied to breast and ovarian cancer. Aside from clarifying the neurological damage seen in a small percentage of people susceptible to breast cancers, the new work also helps to better explain the evolution of the brain.
The research, published this month in PNAS, shows that the gene known as BRCA1 has a significant role in creating healthy brains in mice and may provide a hint as to why some women genetically prone to breast cancer experience brain seizures.
"Previously, people associated mutations or deletions of BRCA1 with breast and ovarian cancer," says Inder Verma, a professor in Salk’s Laboratory of Genetics and American Cancer Society Professor of Molecular Biology. "Our paper goes beyond this link to explain the protective mechanism of BRCA1 in the brain."
Through a three-lab collaboration at the Salk Institute, which began over a water-cooler conversation between adjacent lab researchers 10 years ago, the work has culminated in dramatic findings. The team found that eliminating BRCA1 in neural stem cells had profound effects: large swaths of brain were simply missing; the cortex, which typically has six layers, developed only two very rudimentary layers; the cerebellum, which is normally made up of many folds and creases, was almost completely smooth; and the olfactory bulb, which processes odor information, was severely disorganized and poorly developed. Neurons were dying rapidly shortly after forming, and those that survived were often defective. In mouse models, this resulted in impaired balance, motor skills, and other core functions.
How exactly was the absence of BRCA1 leading to such a neural catastrophe? In a previous paper, the team showed that without the protein coded by the BRCA1 gene, DNA is not packaged properly, becoming fragile and more likely to break during DNA replication. In this new paper, the researchers reveal more about that mechanism, showing that without the protective ability of BRCA1, breaks in the DNA strands go unfixed, prompting the molecule ATM kinase to activate a cellular “suicide” pathway involving a protein called p53. This pathway helps to halt the replication of damaged cells and is important in cancer research.
"BRCA1 acts by conferring stability to the DNA and preventing it from breaking," says Carlos G. Perez-Garcia, a Salk researcher in the Molecular Neurobiology Lab. "BRCA1 is important for all healthy cells."
When the researchers eliminated both BRCA1 and p53, they found that neurons grew at a normal rate but remained disorganized, with cells pointed in the wrong direction.
"In this scenario, we recover a lot of neurons but there’s still a lot of abnormalities, such as cells that are sideways and pointed the wrong direction," says Gerald Pao, a Salk researcher who, along with Quan Zhu and Perez-Garcia, is a primary contributor to the paper.
This observation led the team to propose that BRCA1 has an additional role in helping neurons orient themselves: the gene acts on the centromere of DNA—essentially an anchor point for the chromosome arms that is essential in cell replication—to tell the new cell in which direction to grow, providing guidance as the brain’s organized layers develop.
"It is remarkable that BRCA1 has such a significant effect on the brain, especially size. This work leads us to a better understanding of how to protect neurons," says Verma, who is also the Irwin and Joan Jacobs Chair in Exemplary Life Science. Because BRCA1 seems to regulate the centromere, studying the gene will help scientists to understand how mammalian brains have evolved over time.
"Now we have an explanation for why some patients with breast cancer also experienced brain seizures," adds Pao. This knowledge could potentially help identify breast cancer–susceptible patients predisposed to seizures and provide appropriate treatments.

Filed under brain development breast cancer BRCA1 brain seizures gene mutation neuroscience science

108 notes

Huntington’s disease: Study discovers potassium boost improves walking in mouse model
Tweaking a specific cell type’s ability to absorb potassium in the brain improved walking and prolonged survival in a mouse model of Huntington’s disease, reports a UCLA study published March 30 in the online edition of Nature Neuroscience. The discovery could point to new drug targets for treating the devastating disease, which strikes one in every 20,000 Americans.
Huntington’s disease is passed from parent to child through a mutation in the huntingtin gene. By killing brain cells called neurons, the progressive disorder gradually deprives patients of their ability to walk, speak, swallow, breathe and think clearly. No cure exists, and patients with aggressive cases can die in as little as 10 years.
The laboratories of Baljit Khakh, a professor of physiology and neurobiology, and Michael Sofroniew, a professor of neurobiology, teamed up at the David Geffen School of Medicine at UCLA to unravel the role played in Huntington’s by astrocytes—large, star-shaped cells found in the brain and spinal cord.
Read more

Filed under huntington's disease astrocytes huntingtin neurons animal model gene mutation neuroscience science

131 notes

Genetic mutation increases risk of Parkinson’s disease from pesticides
A team of researchers has brought new clarity to the picture of how gene-environment interactions can kill the nerve cells that make dopamine, the neurotransmitter that sends messages to the part of the brain controlling movement and coordination. Their discoveries, described in a paper published online in Cell today, include the identification of a molecule that protects neurons from pesticide damage.
"For the first time, we have used human stem cells derived from Parkinson’s disease patients to show that a genetic mutation combined with exposure to pesticides creates a ‘double hit’ scenario, producing free radicals in neurons that disable specific molecular pathways that cause nerve-cell death," said Stuart Lipton, M.D., Ph.D., professor and director of Sanford-Burnham Medical Research Institute’s Del E. Webb Center for Neuroscience, Aging, and Stem Cell Research and senior author of the study.
Until now, the link between pesticides and Parkinson’s disease was based mainly on animal studies and epidemiological research that demonstrated an increased risk of disease among farmers, rural populations, and others exposed to agricultural chemicals.
In the new study, Lipton, along with Rajesh Ambasudhan, Ph.D., research assistant professor in the Del E. Webb Center, and Rudolf Jaenisch, M.D., founding member of Whitehead Institute for Biomedical Research and professor of biology at the Massachusetts Institute of Technology, used skin cells from Parkinson’s patients that had a mutation in the gene encoding a protein called alpha-synuclein. Alpha-synuclein is the primary protein found in Lewy bodies—protein clumps that are the pathological hallmark of Parkinson’s disease.
Using patient skin cells, the researchers created human induced pluripotent stem cells (hiPSCs) containing the mutation, and then “corrected” the alpha-synuclein mutation in other cells. Next, they reprogrammed all of these cells to become the specific type of nerve cell that is damaged in Parkinson’s disease, called A9 dopamine-containing neurons—thus creating two sets of neurons—identical in every respect except for the alpha-synuclein mutation.
"Exposing both normal and mutant neurons to pesticides—including paraquat, maneb, and rotenone—created excessive free radicals in cells with the mutation, causing damage to dopamine-containing neurons that led to cell death," said Frank Soldner, M.D., research scientist in Jaenisch’s lab and co-author of the study.
"In fact, we observed the detrimental effects of these pesticides with short exposures to doses well below EPA-accepted levels," said Scott Ryan, Ph.D., researcher in the Del E. Webb Center and lead author of the paper.
Having access to neurons that were genetically matched except for a single mutation simplified the interpretation of the genetic contribution to pesticide-induced neuronal death. In this case, the researchers were able to pinpoint how cells with the mutation, when exposed to pesticides, disrupt a key mitochondrial pathway—called MEF2C-PGC1alpha—that normally protects dopamine-containing neurons. The free radicals attacked the MEF2C protein, disabling a pathway that would otherwise have protected the nerve cells from the pesticides.
"Once we understood the pathway and the molecules that were altered by the pesticides, we used high-throughput screening to identify molecules that could inhibit the effect of free radicals on the pathway," said Lipton. "One molecule we identified was isoxazole, which protected mutant neurons from cell death induced by the tested pesticides. Since several FDA-approved drugs contain derivatives of isoxazole, our findings may have potential clinical implications for repurposing these drugs to treat Parkinson’s."
While the study clearly shows the relationship between a mutation, the environment, and the damage done to dopamine-containing neurons, it does not exclude other mutations and pathways from being important as well. The team plans to explore additional molecular mechanisms that demonstrate how genes and the environment interact to contribute to Parkinson’s and other neurodegenerative diseases, such as Alzheimer’s and ALS.
"In the future, we anticipate using the knowledge of mutations that predispose an individual to these diseases in order to predict who should avoid a particular environmental exposure. Moreover, we will be able to screen for patients who may benefit from a specific therapy that can prevent, treat, or possibly cure these diseases," Lipton said.

Filed under parkinson's disease pesticides dopamine neurons gene mutation stem cells alpha-synuclein neuroscience science

95 notes

Rare disease yields clues about broader brain pathology

Alexander disease is a devastating brain disease that almost nobody has heard of — unless someone in the family is afflicted with it. Alexander disease strikes young or old, and in children destroys white matter in the front of the brain. Many patients, especially those with early onset, have significant intellectual disabilities.

(Image: A mutant gene that causes the deadly Alexander disease creates an overgrowth of the protein GFAP in mouse brain cells called astrocytes (right) compared to normal brain cells (left))

Regardless of the age when it begins, Alexander disease is always fatal. It typically results from mutations in a gene known as GFAP (glial fibrillary acidic protein), leading to the formation of fibrous clumps of protein inside brain cells called astrocytes.

Classically, astrocytes and other glial cells were considered “helpers” that nourish and protect the neurons that do the actual communication. But in recent years, it’s become clear that glial cells are much more than passive bystanders, and may be active culprits in many neurological diseases.

Now, in a report in the Journal of Neuroscience, researchers at UW-Madison show that Alexander disease also affects neurons, and in a way that impacts several measures of learning and memory.

Mice were engineered to contain the same mutation in GFAP that is found in human patients. Their astrocytes spontaneously increased production of GFAP, the same response found after many types of injury or disease in the brain. In Alexander disease, the result is an increase in mutant GFAP that is “toxic to the cell, and unfortunately astrocytes respond by making more GFAP,” says first author Tracy Hagemann, an associate scientist with the university’s Waisman Center.

While GFAP is usually found in astrocytes, it also occurs in neural stem cells, a population of cells that persist in some areas of the brain to continually spawn new neurons throughout adulthood. In the mouse versions of Alexander disease, neural stem cells are present, but they fail to develop into neurons, Hagemann says. “Think of a garden where your green beans never sprouted. Was it too cold for them to sprout, or was there another problem? Something similar is happening with these neural stem cells. They are present, but inert, and we’re not sure why.”

The shortage of new neurons could explain why the mice with excess GFAP failed a test that required them to remember the location of a submerged platform in a tub of water.

The report is “the first to suggest that the problems in Alexander disease extend beyond just the white matter and astrocytes, and may provide a clue to the problems with learning and memory that are such prominent features in the human disease,” says lab leader Albee Messing, a professor of comparative biosciences in the UW School of Veterinary Medicine.

One immediate question that the team will try to answer is whether the same defect in stem cells can be found in autopsy samples stored over many years to allow just this kind of investigation.

Still to be clarified is whether the mutation affects the neural stem cells directly, or whether it acts through other astrocytes that are nearby. “We do know that the astrocytes become activated with this GFAP mutation,” Hagemann says. “That activation — a kind of inflammation — could be making the environment hostile to young neurons. Or the mutation could be changing the neural stem cells themselves in some other way.

"Medicine advances by teasing things apart," says Hagemann. "A single mutation can work in different ways — through different chains of cause and effect leading to different symptoms of a disease. In this case it’s like the old question of nature versus nurture. Was the stem cell born bad — was it genetically doomed? Or were the reactive astrocytes in the neighborhood a toxic influence? Or both? This is an important question for Alexander disease and other brain deteriorating disorders, especially with the current focus on stem cells as a source for new neurons and therapy."

Already, the Waisman group is screening drugs that might slow GFAP production. Eventually, Hagemann says, the work may illuminate the role of astrocyte dysfunction in other neural diseases featuring aggregates of misfolded proteins, including ALS, Parkinson’s, and Alzheimer’s disease.

(Source: news.wisc.edu)

Filed under alexander disease astrocytes gene mutation glial cells GFAP neuroscience science

272 notes

Girl who feels no pain could inspire new painkillers

A girl who does not feel physical pain has helped researchers identify a gene mutation that disrupts pain perception. The discovery may spur the development of new painkillers that will block pain signals in the same way.


People with congenital analgesia cannot feel physical pain and often injure themselves as a result – they might badly scald their skin, for example, through being unaware that they are touching something hot.

By comparing the gene sequence of a girl with the disorder against those of her parents, who do not have it, Ingo Kurth at Jena University Hospital in Germany and his colleagues identified a mutation in a gene called SCN11A.

This gene controls the development of channels on pain-sensing neurons. Sodium ions travel through these channels, creating electrical nerve impulses that are sent to the brain, which registers pain.

Blocked signals

Overactivity in the mutated version of SCN11A prevents the build-up of the charge that the neurons need to transmit an electrical impulse, numbing the body to pain. “The outcome is blocked transmission of pain signals,” says Kurth.

To confirm their findings, the team inserted a mutated version of SCN11A into mice and tested their ability to perceive pain. They found that 11 per cent of the mice with the modified gene developed injuries similar to those seen in people with congenital analgesia, such as bone fractures and skin wounds. They also tested a control group of mice with the normal SCN11A gene, none of which developed such injuries.

The altered mice also took 2.5 times longer on average than the control group to react to the “tail flick” pain test, which measures how long it takes for mice to flick their tails when exposed to a hot light beam. “What became clear from our experiments is that although there are similarities between mice and men with the mutation, the degree of pain insensitivity is more prominent in humans,” says Kurth.

The team has now begun the search for drugs that block the SCN11A channel. “It would require drugs that selectively block this but not other sodium channels, which is far from simple,” says Kurth.

Completely unexpected

"This is a cracking paper, and great science," says Geoffrey Woods of the University of Cambridge, whose team discovered in 2006 that mutations in another, closely related ion channel gene can cause insensitivity to pain. "It’s completely unexpected and not what people had been looking for," he says.

Woods says that there are three ion channels, called SCN9A, 10A and 11A, on pain-sensing neurons. People experience no pain when either of the first two doesn’t work, and agonising pain when they’re overactive. “With this new gene, it’s the opposite: when it’s overactive, they feel no pain. So maybe it’s some kind of gatekeeper that stops neurons from firing too often, but cancels pain signals completely when it’s overactive,” he says. “If you could get a drug that made SCN11A overactive, it should be a fantastic analgesic.”

“It’s fascinating that SCN11A appears to work the other way, and that could really advance our knowledge of the role of sodium channels in pain perception, which is a very hot topic,” says Jeffrey Mogil at McGill University in Canada, who was not involved in the new study.

(Source: newscientist.com)

Filed under pain pain perception gene mutation congenital analgesia ion channels neuroscience science

52 notes

Inner-Ear Disorders May Cause Hyperactivity

Behavioral abnormalities are traditionally thought to originate in the brain. But a new study by researchers at Albert Einstein College of Medicine of Yeshiva University has found that inner-ear dysfunction can directly cause neurological changes that increase hyperactivity. The study, conducted in mice, also implicated two brain proteins in this process, providing potential targets for intervention. The findings were published today in the online edition of Science.

For years, scientists have observed that many children and adolescents with severe inner-ear disorders – particularly disorders affecting both hearing and balance – also have behavioral problems, such as hyperactivity. Until now, no one had been able to determine whether the ear disorders and behavioral problems are actually linked.

"Our study provides the first evidence that a sensory impairment, such as inner-ear dysfunction, can induce specific molecular changes in the brain that cause maladaptive behaviors traditionally considered to originate exclusively in the brain," said study leader Jean M. Hébert, Ph.D., professor in the Dominick P. Purpura Department of Neuroscience and of genetics at Einstein.

The inner ear consists of two structures: the cochlea (responsible for hearing) and the vestibular system (responsible for balance). Inner-ear disorders are typically caused by genetic defects but can also result from infection or injury.

The idea for the study arose when Michelle W. Antoine, a Ph.D. student at Einstein at the time, noticed that some mice in Dr. Hébert’s laboratory were unusually active – in a state of near-continual movement, chasing their tails in a circular pattern. Further investigation revealed that the mice had severe cochlear and vestibular defects and were profoundly deaf. “We then realized that these mice provided a good opportunity to study the relationship between inner-ear dysfunction and behavior,” said Dr. Hébert.

The researchers established that the animals’ inner-ear problems were due to a mutation in a gene called Slc12a2, which mediates the transport of sodium, potassium, and chloride ions in various tissues, including the inner ear and central nervous system (CNS). The gene is also found in humans.

To determine whether the gene mutation was linked to the animals’ hyperactivity, the researchers took healthy mice and selectively deleted Slc12a2 from the inner ear, from various parts of the brain that control movement, or from the entire CNS. “To our surprise, it was only when we deleted the gene from the inner ear that we observed increased locomotor activity,” said Dr. Hébert.

The researchers hypothesized that inner-ear defects cause abnormal functioning of the striatum, a central brain area that controls movement. Tests revealed increased levels of two proteins involved in a signaling pathway that controls the action of neurotransmitters: pERK (phosphorylated extracellular signal-regulated kinase) and pCREB (phospho-cAMP response-element binding protein), which is further down the signaling pathway from pERK. Increases in levels of the two proteins were seen only in the striatum and not in other forebrain regions.

To discover whether increased pERK levels caused the abnormal increase in locomotor activity, Slc12a2-deficient mice were given injections of SL327, a pERK inhibitor. Administering SL327 restored locomotor activity to normal, without affecting activity levels in controls. The SL327 injections did not affect grooming, suggesting that increased pERK in the striatum selectively elevates locomotor activity and not general activity. According to the researchers, the findings suggest that hyperactivity in children with inner-ear disorders might be controllable with medications that directly or indirectly inhibit the pERK pathway in the striatum.

"Our study also raises the intriguing possibility that other sensory impairments not associated with inner-ear defects could cause or contribute to psychiatric or motor disorders that are now considered exclusively of cerebral origin," said Dr. Hébert. "This is an area that has not been well studied."

Filed under hyperactivity inner-ear disorders gene mutation striatum neuroscience science

54 notes

Advance in tuberous sclerosis brain science

By manipulating the timing of disease-causing mutations in the brains of developing mice, Brown University researchers have found that early genetic deletions in the thalamus may play an important role in the course and severity of the developmental disease tuberous sclerosis complex. Findings appear in the journal Neuron.

Doctors often diagnose tuberous sclerosis complex (TSC) based on the abnormal growths the genetic disease causes in organs around the body. Those overt anatomical structures, however, belie the microscopic and mysterious neurological differences behind the disease’s troublesome behavioral symptoms: autism, intellectual disabilities, and seizures. In a new study in mice, Brown University researchers highlight a role for a brain region called the thalamus and show that the timing of gene mutation during thalamus development makes a huge difference in the severity of the disease.

TSC can arise in humans and mice alike when both alleles (the one from mom and the one from dad) of the TSC1 gene are deleted. One bad copy is often inherited, and the other accumulates a mutation some time during embryonic development. This happens to one in 6,000 people.

“We don’t know when during development the mutations are occurring in the patients,” said Elizabeth Normand, a Brown neuroscience graduate student and lead author of the paper in the journal Neuron. “That’s why we chose to look at the timing. It can give us some insight into the role of genes during embryonic development.”

Normand and adviser Mark Zervas, assistant professor of biology, not only wanted to assess the timing but also to probe the role the thalamus might have in contributing to the neurological symptoms of the disease. To do both, their team genetically engineered a clever mouse model in which they could, with a dose of the drug tamoxifen, delete both alleles exclusively in thalamus neurons at the developmental stage of their choosing.

Their interest in the thalamus comes from its role in forging strong but intricate links to the cortex, which is where most other TSC researchers have focused. As for timing, they triggered the allele deletions on day 12 of gestation in some mice and on day 18 (just before birth) in others. Still other mice were left healthy as experimental controls.

Significant symptoms

Overall, the researchers found they could indeed generate TSC-like behavioral symptoms in the mice, such as seizures, by deleting TSC1 alleles in developing cells of the thalamus. They also found that the timing of the deletion mattered tremendously to the extent of the disease in the brain, the degree of abnormality, and the severity of TSC-like symptoms.

The mice whose alleles were deleted on embryonic day 12 fared much worse behaviorally than the mice whose alleles were deleted on embryonic day 18.

At two months of age, the mice with the embryonic day 12 deletion exhibited excessive self-grooming to the point where they experienced lesions. Among those mice, 10 of 11 experienced seizures at an average rate of more than three per hour.

The mice with the embryonic day 18 deletion, on the other hand, fared better, without any over-grooming. By eight months of age, however, four of the 17 mice did exhibit rare seizures.

These behavioral differences traced to differences in the way the mice’s brains became wired. A comparison of brain tissue from adult mice — some of which had the early TSC1 deletions and some of which didn’t — revealed differences in the connections between the thalamus and the cortex and in the electrical and physical properties of thalamus cells.

“We’re building off the core idea of the thalamus playing an important role in brain function and showing that if you disrupt the way that the thalamic neurons develop that you can get some of these behavioral consequences such as overgrooming or seizures,” said Zervas, who is affiliated with the Brown Institute for Brain Science.

The extent of mutant neurons was much greater in the mice with the embryonic day 12 mutations than in those with the day 18 mutations. In the embryonic day 12 mice, for example, the deletion disrupted the growth-regulating “mTOR” pathway in 70 percent of neurons, versus only 29 percent of neurons in the embryonic day 18 mice. The disruptions also occurred in more areas of the thalamus in the day 12 mice than in the day 18 mice. The overactivity of mTOR in TSC is what produces the unusual growths around the body, though these new findings indicate additional roles for the mTOR pathway in brain development and function, Zervas said.

In future work, the team plans to study the effects of deleting the TSC1 allele at other days during development as well as to understand whether there is a threshold of mutant neurons with mTOR disruption at which TSC-like symptoms begin to emerge.

Filed under embryonic development gene mutation animal model tuberous sclerosis complex neuroscience science

52 notes

Science surprise: Toxic protein made in unusual way may explain brain disorder

A bizarre twist on the usual way proteins are made may explain mysterious symptoms in the grandparents of some children with mental disabilities.

The discovery, made by a team of scientists at the University of Michigan Medical School, may lead to better treatments for older adults with a recently discovered genetic condition.

The condition, called Fragile X-associated Tremor Ataxia Syndrome (FXTAS), causes shakiness and balance problems and is often misdiagnosed as Parkinson’s disease. The grandchildren of people with the disease have a separate disorder called Fragile X syndrome, caused by problems in the same gene. The new discovery may also help shine light on that disease, though indirectly.

In a new paper published in the journal Neuron, the U-M-led team presents evidence that a toxic protein they’ve named FMRpolyG contributes to the death of nerve cells in FXTAS – and that this protein is made in a very unusual way.

Normally, DNA is transcribed into RNA, and then a part of the RNA is translated into a protein that performs its function in cells. Where this translation process starts on the RNA is usually determined by a specific sequence called a start codon.

The gene mutation that causes FXTAS is a repeated DNA sequence that is made into RNA but normally is not made into protein because it lacks a start codon. However, the investigators discovered that when this repeat expands, it can trigger protein production by a recently discovered mechanism known as repeat-associated non-AUG (RAN) translation.
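As a toy illustration (a simplified sketch, not code or data from the study), the start-codon rule can be shown in a few lines: ribosomes normally begin translating only at an AUG codon, and an expanded CGG-repeat RNA contains no AUG at all. The sequences below are hypothetical examples.

```python
# Toy sketch of canonical translation initiation: the ribosome scans the
# RNA for the first AUG start codon before any protein is made.

def find_start_codon(rna: str):
    """Return the index of the first AUG codon, or None if absent."""
    for i in range(len(rna) - 2):
        if rna[i:i + 3] == "AUG":
            return i
    return None

normal_mrna = "GGCAUGGCUUAA"   # hypothetical message containing an AUG
repeat_rna = "CGG" * 10        # expanded CGG repeat: no AUG anywhere

print(find_start_codon(normal_mrna))  # 3 -> canonical translation can begin
print(find_start_codon(repeat_rna))   # None -> no canonical start, yet RAN
                                      # translation still yields FMRpolyG
```

The surprise of RAN translation is precisely that protein is produced from the second kind of sequence, where this canonical scan finds nothing.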

Corresponding author Peter Todd, M.D., Ph.D., notes that this unusual translation process appears to stem from a long chain of repeated DNA “letters” found in the genes of both grandparents and kids with Fragile X mutations. Todd is the Bucky and Patti Harris Professor in the U-M Department of Neurology.

"Essentially, we’ve found that a sequence of DNA which shouldn’t be made into protein is being made into protein – and that this causes a toxicity in nerve cells," he explains. "We believe that the protein forms aggregates, and that this is a major contributor to toxicity and symptoms in FXTAS."

The U-M group went on to show how this RAN translation occurs in FXTAS and demonstrated that blocking it prevents the repeat mutation from being toxic, suggesting a new target for future treatments.

Fragile X tremor/ataxia syndrome, or FXTAS, was only discovered a decade ago. It may affect as many as one in every 3,000 men and one in 20,000 women, who have a repeat mutation in the gene known as FMR1. However, these patients don’t usually develop symptoms until late middle age, allowing them to pass the mutation on to their daughters, who can then have children in whom the DNA repeat has grown much longer. In those children, especially in boys, it can cause severe intellectual disability and autism-like symptoms as the FMR1 gene shuts down and none of the normal protein is produced.

In fact, says Todd, it’s often only after a child is diagnosed with Fragile X syndrome through genetic testing that their grandfather or grandmother finds out that their own symptoms stem from FXTAS. Doctors in U-M’s Neurogenetics clinic for adults, and the Pediatric Genetics Clinic at U-M’s C.S. Mott Children’s Hospital, routinely work together to address the needs of Fragile X families.

"We have some treatments for the symptoms that FXTAS patients have, but we do not yet have a cure," says Todd, who regularly sees patients with FXTAS and related disorders. "Better treatments are needed – and this new discovery might help lead to novel strategies for clearing away or preventing the buildup of this toxic protein."

In addition, he says, the discovery that Fragile X ataxia results in part from RAN translation could have significance both for other diseases like amyotrophic lateral sclerosis (ALS, also called Lou Gehrig’s disease) and certain forms of dementia that are caused by DNA repeats. It can also aid our understanding of basic biology. “This may represent a new way in which translational initiation events occur, and may have importance beyond this one disease,” he notes. Further research on how RAN translation occurs, and why, is needed.

The idea that proteins can be created without a “start site” flies in the face of what most students of biology have learned in the last century. “In biology, we’re finding that the rules we once thought were hard and fast have some wiggle room,” Todd says.

(Source: eurekalert.org)

Filed under fragile x syndrome toxic protein nerve cells gene mutation DNA sequence neuroscience science

70 notes

Researchers Discover New Clues About How Amyotrophic Lateral Sclerosis (ALS) Develops

Johns Hopkins scientists say they have evidence from animal studies that a type of central nervous system cell other than motor neurons plays a fundamental role in the development of amyotrophic lateral sclerosis (ALS), a fatal degenerative disease. The discovery holds promise, they say, for identifying new targets for interrupting the disease’s progress.

In a study described online in Nature Neuroscience, the researchers found that, in mice bred with a gene mutation that causes human ALS, dramatic changes occurred in oligodendrocytes — cells that create insulation for the nerves of the central nervous system — long before the first physical symptoms of the disease appeared. Oligodendrocytes located near motor neurons — cells that govern movement — died off at very high rates, and new ones regenerated in their place were inferior and unhealthy.

The researchers also found, to their surprise, that suppressing an ALS-causing gene in oligodendrocytes of mice bred with the disease — while still allowing the gene to remain in the motor neurons — profoundly delayed the onset of ALS. It also prolonged survival of these mice by more than three months, a long time in the life span of a mouse. These observations suggest that oligodendrocytes play a very significant role in the early stage of the disease.

“The abnormalities in oligodendrocytes appear to be having a negative impact on the survival of motor neurons,” says Dwight E. Bergles, Ph.D., a co-author and a professor of neuroscience at the Johns Hopkins University School of Medicine. “The motor neurons seem to be dependent on healthy oligodendrocytes for survival, something we didn’t appreciate before.”

“These findings teach us that cells we never thought had a role in ALS not only are involved but also clearly contribute to the onset of the disease,” says co-author Jeffrey D. Rothstein, M.D., Ph.D., a professor of neurology at Johns Hopkins and director of the Johns Hopkins Medicine Brain Science Institute.

Scientists have long believed that oligodendrocytes functioned only as structural elements of the central nervous system. They wrap around nerves, making up the myelin sheath that provides the “insulation” that allows nerve signals to be transmitted rapidly and efficiently. However, Rothstein and others recently discovered that oligodendrocytes also deliver essential nutrients to neurons, and that most neurons need this support to survive.

In 2010, the Johns Hopkins team of Bergles and Rothstein published a paper describing an unexpected, massive proliferation of oligodendrocyte progenitor cells in the motor neuron regions of the spinal cord in mice with ALS, and showing that these progenitors were being mobilized to make new oligodendrocytes. The researchers believed that these cells were multiplying because of an injury to oligodendrocytes, but they weren’t sure what was happening. In the new study, using a genetic method to track the fate of oligodendrocytes, the researchers found that oligodendrocytes present in young mice with ALS were dying off at an increasing rate as the disease advanced. Moreover, the development of the newly formed oligodendrocytes stalled, leaving them unable to provide motor neurons with a needed source of cell nutrients.

To determine whether the changes to the oligodendrocytes were just a side effect of the death of motor neurons, the scientists used a poison to kill motor neurons in the ALS mice and found no response from the progenitors, suggesting, says Rothstein, that it is the mutant ALS gene that is damaging oligodendrocytes directly.

Meanwhile, in separate experiments, the researchers found similar changes in samples of tissues from the brains of 35 people who died of ALS. Rothstein says it may be possible to see those changes early on in the disease and use MRI technology to follow progression.

“If our research is confirmed, perhaps we can start looking at ALS patients in a different way, looking for damage to oligodendrocytes as a marker for disease progression,” Rothstein says. “This could not only lead to new treatment targets but also help us to monitor whether the treatments we offer are actually working.”

ALS, also known as Lou Gehrig’s disease, named for the Yankee baseball great who died from it, affects nerve cells in the brain and spinal cord that control voluntary muscle movement. The nerve cells waste away or die, and can no longer send messages to muscles, eventually leading to muscle weakening, twitching and an inability to move the arms, legs and body. Onset is typically around age 50 and death often occurs within three to five years of diagnosis. Some 10 percent of cases are hereditary.

There is no cure for ALS and there is only one FDA-approved drug treatment, which has just a small effect in slowing disease progression and increasing survival.

Even though myelin loss has not previously been thought to occur in the gray matter, a region in the brain where neurons process information, the researchers in the new study found in ALS patients a significant loss of myelin in one of every three samples of human tissue taken from the brain’s gray matter, suggesting that the oligodendrocytes were abnormal. It isn’t clear if the oligodendrocytes that form this myelin in the gray matter play a different role than in white matter — the region in the brain where signals are relayed.

The findings further suggest that clues to the treatment of other diseases long believed to be focused in the brain’s gray matter — such as Alzheimer’s disease, Huntington’s disease and Parkinson’s disease — may be informed by studies of diseases of the white matter, such as multiple sclerosis (MS). Bergles says ALS and MS researchers never really thought their diseases had much in common before.

Oligodendrocytes have been under intense scrutiny in MS, Bergles says. In MS, the disease over time can transform from a remitting-relapsing form — in which myelin is attacked but then is regenerated when existing progenitors create new oligodendrocytes to re-form myelin — to a more chronic stage in which oligodendrocytes are no longer regenerated. MS researchers are working to identify new ways to induce the creation of new oligodendrocytes and improve their survival. “It’s possible that we may be able to dovetail with some of the same therapeutics to slow the progression of ALS,” Bergles says.

(Source: newswise.com)

Filed under ALS Lou Gehrig's disease motor neurons oligodendrocytes CNS gene mutation neuroscience science
