Neuroscience

Articles and news from the latest research reports.

201 notes

After death, twin brains show similar patterns of neuropathologic changes

Despite widespread use of a single term, Alzheimer’s disease is actually a diverse collection of diseases, symptoms and pathological changes. What’s happening in the brain often varies widely from patient to patient, and a trigger for one person may be harmless in another.

In a unique study, an international team of researchers led by USC psychologist Margaret Gatz compared the brains of twins where one or both died of Alzheimer’s disease. They found that many of the twin pairs not only had similar progressions of Alzheimer’s disease and dementia prior to death, but they also had similar combinations of pathologies, meaning two or more unconnected areas of damage to the brain.

The paper is part of Gatz’s landmark body of work on aging and cognition with the Swedish Twin Registry, a large cohort study of more than 14,000 Swedish twins, now over the age of 65. Across nearly 30 years, Gatz’s work with twins — including genetically identical pairs — has shifted the study of Alzheimer’s disease to include the entire lifespan, including the effects of developmental exposure, periodontal disease, mental health, obesity and diabetes on later-life Alzheimer’s risk.

The current paper provides more evidence that there may not be a single smoking-gun cause of Alzheimer’s, but rather a range of potential causes to which we may be susceptible largely depending on our genetics. It appears in the current issue of the journal Brain Pathology.

“We try to make inferences based on tests and diagnoses, but we have to assume that what we’re seeing is a manifestation of what’s going on in these twins’ brains,” said Gatz, professor of psychology, gerontology and preventive medicine at USC Dornsife College. “For this reason, we wanted to compare the brains of twins directly, to ask whether identical twins’ brains are actually more identical.”

The researchers had the rare opportunity to directly autopsy the brains of seven pairs of twins in which both members died after receiving diagnostic evaluations over many years, including a pair of identical twins who were both diagnosed with Alzheimer’s and died within a year of one another at the age of 98.

“There may be risk factors that start to accumulate but don’t lead to a clinical diagnosis,” explained lead author Diego Iacono of the Karolinska Institute in Sweden and the Biomedical Research Institute. “We found that the presence of Alzheimer’s disease doesn’t preclude the presence of other damage. Looking at co-pathologies in twin pairs may present new areas for research aside from the typical factors.”

For example, while there’s wide consensus among experts about the course of Alzheimer’s disease and the presence of amyloid plaques and tangles in the brain, what sets the process in motion is less clear, as is the role of lesions, Lewy bodies and vascular or ventricular damage, which are more often associated with other types of dementia, such as that seen in Parkinson’s disease.

“Identical twins tended to have similar combinations of pathologies. We looked not just at the hallmark indicators of Alzheimer’s, but at all the other damage in the brain. Across the whole array of neuropathological changes, the identical twins appeared to have more similar pathologies,” Gatz said. “This is fascinating: it’s not just a key pathology related to the twins’ diagnoses but the combination of things happening in their brains. We’re going to keep looking for what these combinations are.”

(Image: Getty)

Filed under alzheimer's disease dementia monozygotic twins neurodegeneration neuroscience science

176 notes

Researchers Find Inherited Pathway of Risk for Schizophrenia

Schizophrenia is one of the most disabling of all psychiatric illnesses. Sadly, it is not uncommon and it strikes early in life.

Many studies have looked into causes and potential interventions, and it has long been known that genetic factors play a role in determining the risk of developing schizophrenia. However, recent work has shown that there will be no simple answers as to why some people get schizophrenia: no single gene or small number of genes explains much of the risk for illness. Instead, future studies must focus on larger numbers of interacting genes.

In a new paper published in PLOS ONE, researchers led by Bruce Cohen of Harvard Medical School and McLean Hospital report promising evidence on what one of those important groups of genes may be.

Previous studies of schizophrenia have shown abnormalities in the brain’s white matter—its wiring and insulation—but these studies could not definitively separate inherited from environmental causes. For this study, researchers used previously discovered anomalies to select likely assortments of genes that, as a group, might be highly determinative of the risk for schizophrenia. The choice of genes was based on convergent results of past studies conducted locally and around the world, and included genes that control the insulation of the nerve cells in the brain.

The results of this study strongly suggest that the abnormalities of wiring and insulation are substantially determined by genes.

“There is abundant evidence from our center and from other laboratories that this insulation is compromised in schizophrenia,” said Cohen, HMS Robertson-Steele Professor of Psychiatry and director of the Shervert Frazier Research Institute at McLean Hospital. “Based on this lead, we tested whether the genes required for the activities of the cells that make this insulation (oligodendrocytes) were associated with schizophrenia. In a primary analysis, followed by three separate means of confirmatory analysis, we found strong evidence that genes for oligodendrocytes, as a group, were indeed associated with schizophrenia.”
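The group-level test Cohen describes can be illustrated with a toy over-representation calculation. To be clear, the hypergeometric model and all the numbers below are assumptions made for this sketch; the paper's actual primary and confirmatory analyses are not described in this summary.

```python
# Illustrative sketch of a gene-set association test: are genes from a
# candidate set (here, hypothetically, oligodendrocyte genes) found among
# disease-associated hits more often than chance would predict?
from math import comb

def hypergeom_tail(k: int, K: int, n: int, N: int) -> float:
    """P(X >= k) when drawing n genes from N total, K of which are in the set."""
    total = sum(comb(K, i) * comb(N - K, n - i) for i in range(k, min(K, n) + 1))
    return total / comb(N, n)

# Toy numbers (not from the study): 20,000 genes total, 300 in the candidate
# set, 100 disease-associated hits, 12 of which fall in the set.
# Chance alone would predict about 100 * 300 / 20000 = 1.5 hits in the set.
p = hypergeom_tail(k=12, K=300, n=100, N=20_000)
print(f"enrichment p = {p:.2e}")  # a very small p suggests group-level association
```

A small tail probability here is the sense in which a gene group "as a whole" can be associated with a disease even when no single member gene carries much signal.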

The findings suggest a concrete reason why insulation is disrupted in the brain in schizophrenia. This disruption in turn may explain why thinking is altered in schizophrenia: Nerve cells are unable to pass exact messages if they lack proper insulation.

Further, the findings show that the abnormality in insulation is at least in part genetically determined, rather than solely due to environmental factors such as years of treatment, different life activities or exposure to toxins.

Finally, the results identify a specific cell-level abnormality, in oligodendrocytes, in schizophrenia.

Similar findings, using different techniques, were recently reported by an independent group of investigators, working separately but contemporaneously with the authors of this study.

“Knowing that one of the pathways of risk for schizophrenia is in this set of genes and in these cells may help identify who is at risk and in what way they are at risk,” said Cohen. “The cells themselves will next be studied to define the problem and seek methods to prevent or reverse it. Thus, the findings can point us towards new ways to reduce the risk and burden of schizophrenia.”

Additional researchers from HMS, Harvard School of Public Health, McLean Hospital, Massachusetts General Hospital, The Broad Institute of MIT and Harvard, and the Cardiff University School of Medicine in Wales contributed to the study.

(Source: hms.harvard.edu)

Filed under oligodendrocytes schizophrenia white matter genes neuroscience science

171 notes

Scientists Uncover Trigger for Most Common Form of Intellectual Disability and Autism

A new study led by Weill Cornell Medical College scientists shows that the most common genetic form of mental retardation and autism occurs because of a mechanism that shuts off the gene associated with the disease. The findings, published today in Science, also show that a drug that blocks this silencing mechanism can prevent fragile X syndrome — suggesting similar therapy is possible for 20 other diseases that range from mental retardation to multisystem failure.

(Image caption: A key brain signaling protein, seen here in green, that is normally lost in Fragile X syndrome neurons is restored by an experimental drug. Image: Dilek Colak)

Fragile X syndrome occurs mostly in boys, causing intellectual disability as well as telltale physical, behavioral and emotional traits. While researchers have known for more than two decades that the culprit behind the disease is an unusual mutation characterized by the excess repetition of a particular segment of the genetic code, they weren’t sure why the presence of a large number of these repetitions — 200 or more — sets the disease process in motion.

Using stem cells from donated human embryos that tested positive for fragile X syndrome, the scientists discovered that early on in fetal development, messenger RNA — a template for protein production — begins sticking itself onto the fragile X syndrome gene’s DNA. This binding appears to gum up the gene, making it inactive and unable to produce a protein crucial to the transmission of signals between brain cells.

"Until 11 weeks of gestation, the fragile X syndrome gene is active — it produces its messenger RNA and protein normally. Then, all of a sudden it turns off, and stays off for the rest of the patient’s lifetime, causing fragile X syndrome. But scientists have not understood why this gene gets shut off," says senior author Dr. Samie Jaffrey, a professor of pharmacology at Weill Cornell Medical College. "We discovered that the messenger RNA can jam up one strand of the gene’s DNA, shutting down the gene — which was not known before.

"This is new biology — an interaction between the RNA and the DNA of the fragile X syndrome gene causes disease," Dr. Jaffrey says. "We are coming to understand that RNAs are powerful molecules that can regulate gene expression, but this mechanism is completely novel — and very exciting."

The malfunction occurs suddenly — before the end of the first trimester in humans and after 50 days in laboratory embryonic stem cells. At that point, the messenger RNA produced by the fragile X syndrome gene makes what the researchers call an RNA-DNA duplex — a particular arrangement of molecules in which the messenger RNA is stuck onto its DNA complement. (DNA consists of two complementary strands of the genetic code responsible for human development and function. The four bases of the genetic code (A, C, G and T) pair with specific complements. In the case of fragile X syndrome, the repeat sequence in question is CGG. Therefore, the RNA binds to its GCC complement on one strand of DNA.)

The RNA-DNA duplex then shuts down production of the fragile X syndrome gene, causing the loss of a protein needed for communication between brain cells. The gene then remains inactive for life. A normal fragile X gene — one with fewer than 200 CGG repeats — stays active in a person without the disorder, and produces the necessary protein. However, the mutant fragile X gene contains more than 200 CGG repeats, resulting in fragile X syndrome. Fragile X occurs in about 1 in 4,000 males and 1 in 8,000 females.
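The repeat logic described above is simple enough to sketch in a few lines of code. This is an illustrative toy, not anything from the study: it shows how a CGG repeat pairs with its GCC complement on the template DNA strand, and how the roughly 200-repeat threshold separates a typical allele from a full mutation.

```python
# Toy model of CGG repeat expansion in fragile X syndrome.
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement(seq: str) -> str:
    """Return the base-paired complement of a DNA sequence."""
    return "".join(COMPLEMENT[base] for base in seq)

def fragile_x_status(n_repeats: int, threshold: int = 200) -> str:
    """Classify an allele by CGG repeat count, per the ~200-repeat threshold."""
    return "full mutation" if n_repeats >= threshold else "typical"

repeat_unit = "CGG"
print(complement(repeat_unit))   # → GCC, the strand the messenger RNA can pair with
print(fragile_x_status(250))     # → full mutation
print(fragile_x_status(30))      # → typical
```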

"Because the fragile X syndrome mutation is a repeat sequence, it is very easy for just a small portion of this sequence in the messenger RNA to find a matching repeat sequence on the DNA," Dr. Jaffrey says. "This is a unique feature of repeat sequences. When there are 200 or more repeats, the RNA-DNA interaction locks into place."

Hope for treatment — and other disorders

Dr. Jaffrey and his team, which includes researchers from The Scripps Research Institute in Florida and Albert Einstein College of Medicine in the Bronx, sought to find out why the disease is switched on when the CGG repeat is present in 200 to as many as 1,000 copies.

"Utilizing traditional ways to solve this puzzle has been impossible," he says. "Human fragile X syndrome genes introduced into mice and cells in the laboratory never turn off, no matter how many CGG repeats the genes have."

So the scientists turned to human embryonic stem cells. Co-authors Dr. Zev Rosenwaks, director and physician-in-chief of the Ronald O. Perelman and Claudia Cohen Center for Reproductive Medicine and director of the Stem Cell Derivation Laboratory of Weill Cornell Medical College, and Dr. Nikica Zaninovic, assistant professor of reproductive medicine, generated stem cell lines from donated embryos that tested positive for fragile X syndrome. “These stem cells were critical to the success of this research, because they alone allowed us to mimic what happens to the fragile X gene during embryonic development,” says Dr. Dilek Colak, a postdoctoral scientist in Dr. Jaffrey’s laboratory and the first author of the study.

The stem cells were coaxed to become brain neurons, and at about 50 days they had differentiated to a stage comparable to an embryo’s brain at 11-plus weeks of gestation, the point at which the fragile X syndrome gene is switched off.

The researchers then used a drug developed by co-author Dr. Matthew Disney of the Scripps Research Institute that binds to CGG repeats in the fragile X gene’s RNA, applying it both before and after the 50-day switch. Strikingly, the gene never stopped producing its beneficial protein.

That suggests a potential prevention or treatment strategy for fragile X syndrome, Dr. Jaffrey says. “If a pregnant woman is told that her fetus carries the genetic mutation causing fragile X syndrome, we could potentially intervene and give the drug during gestation. This may delay or prevent the silencing of the fragile X gene, which could potentially significantly improve the outcome of these patients,” he says.

The researchers are now looking for similar RNA-DNA duplexes in other trinucleotide repeat diseases, including Huntington’s disease (a degenerative brain disease), myotonic dystrophy 1 and 2 (a multisystem progressive disease), Friedreich’s ataxia (a progressive nervous system disorder), Jacobsen syndrome (an intellectual disorder), and familial amyotrophic lateral sclerosis (a motor neuron disease), among others.

"This completely new mechanism by which RNAs can direct gene silencing may be involved in a lot of other diseases," Dr. Jaffrey says. "Our hope is that we can find drugs that interfere with this new type of disease process."

(Source: weill.cornell.edu)

Filed under fragile x syndrome autism genetics mental retardation intellectual disability neuroscience science

120 notes

Study first to offer detailed map of mouse’s cerebral cortex

The mammalian cerebral cortex, long thought to be a dense single interrelated tangle of neural networks, actually has a “logical” underlying organizational principle, according to a study appearing in the journal Cell.

Researchers have identified eight distinct neural subnetworks that together form the connectivity infrastructure of the mammalian cortex — the part of the brain involved in higher-order functions such as cognition, emotion and consciousness.

“This study is the first comprehensive mapping of the most developed region of the mammalian brain: the cerebral cortex. The cortex is highly complex and made up of many densely interconnected structures, but when you strip it down, it is organized into a small number of subnetworks,” said senior author Hongwei Dong of the USC Institute for Neuroimaging and Informatics (INI).

The cerebral cortex is the outermost layer of neural tissue in the brain and is one of the most extensively studied brain structures in the field of neuroscience. However, before this study, its underlying organizational principle was still largely unclear.

“Think about it: The brain is built for logic, so its organization must be logical. The brain’s architectural organization is arranged such that all of its substructures most efficiently work in conjunction to produce appropriate behaviors,” said Dong, associate professor of neurology at the Keck School of Medicine of USC. “We want to find the code to how the brain is structurally organized.”

The study is also a reminder that while there is more data than ever, the quality and reliability of information still matters. In contrast to past patchwork attempts, Dong and his team undertook an effort to directly develop a whole-brain mouse atlas of brain pathways. Across the cortex, they injected fluorescent molecules. These molecules were then transported along the brain’s “cellular highways” — the neuronal pathways — and meticulously tracked using a high-resolution microscope.

The uniformity and completeness of the scientists’ effort across the entire cortex produced a searchable image database of cortical connections, which the researchers are making open-access and publicly available.

It also allowed them to reliably see patterns: the seemingly inscrutable mass of connections in the cerebral cortex is highly organized, consisting of eight distinct subnetworks that are relatively segregated.

“The systematic and comprehensive manner in which the data were collected lent itself to a detailed analysis through which these subnetworks emerged,” explained co-lead author Houri Hintiryan of the USC Laboratory of Neuro Imaging.

So that scientists around the world may continue to look for fundamental structural insights, the full, interactive imaging dataset is viewable at Mouse Connectome Project, providing a resource for researchers interested in studying the anatomy and function of cortical networks throughout the brain.

“It really is quite tedious,” Dong said of collecting the data, “and labor-intensive, and it requires highly specialized skills and technology. But think of the Human Genome Project and how much it accelerated the process of discovery and the whole field when infrastructures existed for people to share and compare. That was our motivation.”

How these subnetworks interact will provide a crucial baseline from which to better understand diseases of “disconnection” such as autism and Alzheimer’s disease, in which the manifestations of symptoms are potentially a result of disordered or damaged connections.

The researchers’ map of the mouse cerebral cortex can be compared to data on disease-affected brains, brains in development and genetic information. It will also offer necessary context for humans, who behaved just like other mammals only a few thousand years ago and who still share most underlying basic behavioral characteristics such as hunger and pain.

“The fundamental logic of mammalian brains is the same, particularly when it comes to basic behaviors such as eating, sleeping and social behaviors,” said Dong, who noted that similar studies in humans have thus far not gotten to the cellular level. “There are lots of organizing principles to brain structures that we are just beginning to understand.”

The researchers identified the brain subnetworks based on their high degree of interconnectivity — though relatively independent, several structures provide communication routes through which the subnetworks interact. Combined with behavioral data from past research and information about subcortical targets, these interconnections imply remarkable functional significance for the subnetworks.
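The idea that subnetworks emerge from connectivity data can be sketched with a toy example: given regions and their connections, group together regions that are wired to each other but not to the rest. The region names and connections below are invented for illustration and are vastly simpler than the study's actual tracing data.

```python
# Toy version of "subnetworks from connectivity": find the connected
# components of an undirected graph of brain regions.
def subnetworks(regions, connected):
    """Group regions into connected components via depth-first search."""
    groups, seen = [], set()
    for start in regions:
        if start in seen:
            continue
        group, stack = [], [start]
        while stack:
            region = stack.pop()
            if region in seen:
                continue
            seen.add(region)
            group.append(region)
            stack.extend(nbr for nbr in connected.get(region, ()) if nbr not in seen)
        groups.append(sorted(group))
    return groups

# Invented regions: two segregated clusters, loosely echoing the medial
# (sensory-integration) vs. lateral (internal-state) groupings described here.
edges = {
    "visual": ["auditory"], "auditory": ["visual", "somatic"], "somatic": ["auditory"],
    "gustatory": ["visceral"], "visceral": ["gustatory"],
}
groups = subnetworks(list(edges), edges)
print(groups)  # two groups fall out of the connectivity alone
```

The study's analysis of dense tracer data is of course far richer than component-finding, but the principle is the same: segregated groupings are recovered from the pattern of connections rather than imposed in advance.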

Four of the eight identified subnetworks in the mouse cortex relate to sensation and movement of the body — what the researchers dub somatic sensorimotor. In particular, the researchers identified separate subnetworks for movements in the face, upper limbs, lower limbs and trunk, and whiskers. Together, these networks facilitate motor behaviors such as eating and drinking, reaching and grabbing, locomotion and exploration of the environment.

Two other subnetworks are composed of structures located along the midline of the cerebral cortex. These medial subnetworks seem devoted to the integration of visual, auditory and somatic sensory information, according to the study. Several other structures located along the side of the brain form two lateral subnetworks, one of which potentially serves to regulate the internal status of the body (e.g., taste, hunger, visceral information) and the other as a “mega-integration” subnetwork that allows the interaction of information from nearly the entire cortex.

Filed under cerebral cortex brain mapping neural networks neuroimaging neurons neuroscience science

516 notes

Study ties father’s age at childbearing to higher rates of psychiatric, academic problems in kids
An Indiana University study in collaboration with medical researchers from Karolinska Institute in Stockholm has found that advancing paternal age at childbearing can lead to higher rates of psychiatric and academic problems in offspring than previously estimated.
Examining an immense data set — everyone born in Sweden from 1973 until 2001 — the researchers documented a compelling association between advancing paternal age at childbearing and numerous psychiatric disorders and educational problems in their children, including autism, ADHD, bipolar disorder, schizophrenia, suicide attempts and substance abuse problems. Academic problems included failing grades, low educational attainment and low IQ scores.
Among the findings: When compared to a child born to a 24-year-old father, a child born to a 45-year-old father is 3.5 times more likely to have autism, 13 times more likely to have ADHD, two times more likely to have a psychotic disorder, 25 times more likely to have bipolar disorder and 2.5 times more likely to have suicidal behavior or a substance abuse problem. For most of these problems, the likelihood of the disorder increased steadily with advancing paternal age, suggesting there is no particular paternal age at childbearing that suddenly becomes problematic. 
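For readers who want the comparison as data, the multipliers quoted above can be restated directly. The values are taken from this summary's comparison of a 45-year-old versus a 24-year-old father; the full adjusted risk estimates are in the paper itself.

```python
# Relative likelihood of each outcome for a child of a 45-year-old father
# versus a 24-year-old father, as quoted in this summary (illustration only).
relative_risk_at_45 = {
    "autism": 3.5,
    "ADHD": 13.0,
    "psychotic disorder": 2.0,
    "bipolar disorder": 25.0,
    "suicidal behavior or substance abuse": 2.5,
}

# Bipolar disorder shows the largest reported increase in this comparison.
most_elevated = max(relative_risk_at_45, key=relative_risk_at_45.get)
print(most_elevated, relative_risk_at_45[most_elevated])
```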
"We were shocked by the findings," said Brian D’Onofrio, lead author and associate professor in the Department of Psychological and Brain Sciences in the College of Arts and Sciences at IU Bloomington. "The specific associations with paternal age were much, much larger than in previous studies. In fact, we found that advancing paternal age was associated with greater risk for several problems, such as ADHD, suicide attempts and substance use problems, whereas traditional research designs suggested advancing paternal age may have diminished the rate at which these problems occur."
The study, “Parental Age at Childbearing and Offspring Psychiatric and Academic Morbidity,” was published today in JAMA Psychiatry.
Notably, the researchers found converging evidence for the associations with advancing paternal age at childbearing from multiple research designs for a broad range of problems in offspring. By comparing siblings, which accounts for all factors that make children living in the same house similar, researchers discovered that the associations with advancing paternal age were much greater than estimates in the general population. By comparing cousins, including first-born cousins, the researchers could examine whether birth order or the influences of one sibling on another could account for the findings.
The authors also statistically controlled for parents’ highest level of education and income, factors often thought to counteract the negative effects of advancing paternal age because older parents are more likely to be more mature and financially stable. The findings were remarkably consistent, however, as the specific associations with advancing paternal age remained.
"The findings in this study are more informative than many previous studies," D’Onofrio said. "First, we had the largest sample size for a study on paternal age. Second, we predicted numerous psychiatric and academic problems that are associated with significant impairment. Finally, we were able to estimate the association between paternal age at childbearing and these problems while comparing differentially exposed siblings, as well as cousins. These approaches allowed us to control for many factors that other studies could not."
In the past 40 years, the average age for childbearing has been increasing steadily for both men and women. Since 1970 for instance, the average age of first-time mothers in the U.S. has gone up four years from 21.5 to 25.4. For men the average is three years older. In the northeast, the ages are higher. Yet the implications of this fact — both socially and in terms of the long-term effects on the health and well-being of the population as a whole — are not yet fully understood.
Moreover, while maternal age has been under scrutiny for a number of years, a more recent body of research has begun to explore the possible effects of advancing paternal age on a variety of physical and mental health issues in offspring. Existing studies have pointed to increasing risks for some psychological disorders with advancing paternal age. Yet the results are often inconsistent with one another, statistically inconclusive or unable to take certain confounding factors into account.
The working hypothesis for D’Onofrio and his colleagues who study this phenomenon is that unlike women, who are born with all their eggs, men continue to produce new sperm throughout their lives. Each time sperm replicate, there is a chance for a mutation in the DNA to occur. As men age, they are also exposed to numerous environmental toxins, which have been shown to cause mutations in the DNA found in sperm. Molecular genetic studies have, in fact, shown that sperm of older men have more genetic mutations.
This study and others like it, however, perhaps signal some of the unforeseen, negative consequences of a relatively new trend in human history. As such, D’Onofrio said, it may have important social and public policy implications. Given the increased risk associated with advancing paternal age at childbearing, policy-makers may want to make it possible for men and women to accommodate children earlier in their lives without having to set aside other goals.
"While the findings do not indicate that every child born to an older father will have these problems," D’Onofrio said, "they add to a growing body of research indicating that advancing paternal age is associated with increased risk for serious problems. As such, the entire body of research can help to inform individuals in their personal and medical decision-making."

Study ties father’s age at childbearing to higher rates of psychiatric, academic problems in kids

An Indiana University study, conducted in collaboration with medical researchers from Karolinska Institutet in Stockholm, has found that advancing paternal age at childbearing is associated with higher rates of psychiatric and academic problems in offspring than previously estimated.

Examining an immense data set — everyone born in Sweden from 1973 until 2001 — the researchers documented a compelling association between advancing paternal age at childbearing and numerous psychiatric disorders and educational problems in their children, including autism, ADHD, bipolar disorder, schizophrenia, suicide attempts and substance abuse problems. Academic problems included failing grades, low educational attainment and low IQ scores.

Among the findings: When compared to a child born to a 24-year-old father, a child born to a 45-year-old father is 3.5 times more likely to have autism, 13 times more likely to have ADHD, two times more likely to have a psychotic disorder, 25 times more likely to have bipolar disorder and 2.5 times more likely to have suicidal behavior or a substance abuse problem. For most of these problems, the likelihood of the disorder increased steadily with advancing paternal age, suggesting there is no particular paternal age at childbearing that suddenly becomes problematic. 
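The comparisons above are relative risks: the rate of a diagnosis among children of older fathers divided by the rate among children of younger fathers. A minimal sketch of that arithmetic, using invented counts rather than the study's data (the published analysis used far more elaborate regression models):

```python
# Illustrative only: the counts below are invented, not the study's data.
def relative_risk(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
    """Risk in the exposed group divided by risk in the unexposed group."""
    return (cases_exposed / n_exposed) / (cases_unexposed / n_unexposed)

# Hypothetical example: 35 diagnoses per 10,000 children of 45-year-old
# fathers vs. 10 per 10,000 for 24-year-old fathers corresponds to a
# relative risk of 3.5, the figure reported above for autism.
print(relative_risk(35, 10_000, 10, 10_000))
```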

"We were shocked by the findings," said Brian D’Onofrio, lead author and associate professor in the Department of Psychological and Brain Sciences in the College of Arts and Sciences at IU Bloomington. "The specific associations with paternal age were much, much larger than in previous studies. In fact, we found that advancing paternal age was associated with greater risk for several problems, such as ADHD, suicide attempts and substance use problems, whereas traditional research designs suggested advancing paternal age may have diminished the rate at which these problems occur."

The study, “Parental Age at Childbearing and Offspring Psychiatric and Academic Morbidity,” was published today in JAMA Psychiatry.

Notably, the researchers found converging evidence for the associations with advancing paternal age at childbearing from multiple research designs for a broad range of problems in offspring. By comparing siblings, which accounts for all factors that make children living in the same house similar, researchers discovered that the associations with advancing paternal age were much greater than estimates in the general population. By comparing cousins, including first cousins, the researchers could examine whether birth order or the influence of one sibling on another could account for the findings.
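The sibling-comparison logic can be illustrated with a toy calculation: relate the within-family difference in paternal age at each child's birth to the within-family difference in outcome, so anything shared by siblings cancels out. A minimal sketch with made-up records (the study itself fit fixed-effects regressions on population registers):

```python
from statistics import mean

# (family_id, paternal_age_at_birth, outcome_score) -- invented records
records = [
    ("A", 24, 0.1), ("A", 32, 0.4),
    ("B", 28, 0.2), ("B", 40, 0.9),
    ("C", 30, 0.3), ("C", 35, 0.5),
]

def within_family_slope(records):
    """Average change in outcome per year of paternal age, within families."""
    families = {}
    for fam, age, score in records:
        families.setdefault(fam, []).append((age, score))
    slopes = []
    for sibs in families.values():
        (a1, s1), (a2, s2) = sorted(sibs)  # two siblings per family here
        slopes.append((s2 - s1) / (a2 - a1))
    return mean(slopes)

print(within_family_slope(records))
```

Because each slope is computed inside one family, stable family characteristics (income, neighborhood, parental education) drop out of the comparison.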

The authors also statistically controlled for parents’ highest level of education and income, factors often thought to counteract the negative effects of advancing paternal age because older parents are more likely to be more mature and financially stable. The findings were remarkably consistent, however, as the specific associations with advancing paternal age remained.

"The findings in this study are more informative than many previous studies," D’Onofrio said. "First, we had the largest sample size for a study on paternal age. Second, we predicted numerous psychiatric and academic problems that are associated with significant impairment. Finally, we were able to estimate the association between paternal age at childbearing and these problems while comparing differentially exposed siblings, as well as cousins. These approaches allowed us to control for many factors that other studies could not."

In the past 40 years, the average age for childbearing has been increasing steadily for both men and women. Since 1970, for instance, the average age of first-time mothers in the U.S. has risen nearly four years, from 21.5 to 25.4; for men, the average is about three years older. In the Northeast, the ages are higher still. Yet the implications of this trend, both socially and in terms of the long-term effects on the health and well-being of the population as a whole, are not yet fully understood.

Moreover, while maternal age has been under scrutiny for a number of years, a more recent body of research has begun to explore the possible effects of advancing paternal age on a variety of physical and mental health issues in offspring. Existing studies have pointed to increasing risks for some psychological disorders with advancing paternal age. Yet the results are often inconsistent with one another, statistically inconclusive or unable to take certain confounding factors into account.

The working hypothesis for D’Onofrio and his colleagues who study this phenomenon starts from the fact that, unlike women, who are born with all of their eggs, men continue to produce new sperm throughout their lives. Each time sperm-producing cells replicate, there is a chance for a mutation in the DNA to occur. As men age, they are also exposed to numerous environmental toxins, which have been shown to cause mutations in the DNA found in sperm. Molecular genetic studies have, in fact, shown that the sperm of older men carry more genetic mutations.
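A back-of-the-envelope rendering of that hypothesis: sperm-producing cells divide continually after puberty, so the expected number of replication-driven mutations grows roughly linearly with paternal age. The division interval and per-division mutation rate below are placeholder values for illustration, not measured figures:

```python
def expected_mutations(age_years, puberty=13, days_per_division=16,
                       mutations_per_division=0.03):
    """Expected new germline mutations from post-puberty replications.

    All parameter defaults are illustrative placeholders, not measured values.
    """
    divisions = max(0, (age_years - puberty) * 365 // days_per_division)
    return divisions * mutations_per_division

# Older fathers accumulate more replication cycles, hence more mutations.
for age in (24, 45):
    print(age, round(expected_mutations(age), 1))
```

Whatever the exact rates, the linear accumulation is why the risk in the study rises steadily with paternal age rather than jumping at a threshold.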

This study and others like it, however, perhaps signal some of the unforeseen negative consequences of a relatively new trend in human history. As such, D’Onofrio said, it may have important social and public policy implications. Given the increased risk associated with advancing paternal age at childbearing, policy-makers may want to make it easier for men and women to have children earlier in their lives without having to set aside other goals.

"While the findings do not indicate that every child born to an older father will have these problems," D’Onofrio said, "they add to a growing body of research indicating that advancing paternal age is associated with increased risk for serious problems. As such, the entire body of research can help to inform individuals in their personal and medical decision-making."

Filed under autism ADHD parenting schizophrenia psychology neuroscience science

122 notes

One gene influences recovery from traumatic brain injury

Researchers report that one tiny variation in the sequence of a gene may cause some people to be more impaired by traumatic brain injury (TBI) than others with comparable wounds.

The study, described in the journal PLOS ONE, measured general intelligence in a group of 156 Vietnam War veterans who suffered penetrating head injuries during the war. All of the study subjects had damage to the prefrontal cortex, a brain region behind the forehead that is important to cognitive tasks such as planning, problem-solving, self-restraint and complex thought.

The researchers controlled for the size and location of subjects’ brain injuries and other factors, such as intelligence prior to injury, which might have contributed to differences in cognitive function. (Prior to combat, the veterans had completed the Armed Forces Qualifications Test, which included measures of intelligence that provided a baseline for the new analysis.)

“We administered a large, cognitive battery of tests to investigate how they performed after their injury,” said study leader Aron Barbey, a professor of speech and hearing science, of psychology and of neuroscience at the University of Illinois. “And we had a team of neurologists who helped characterize the nature and scope of the patients’ brain injuries.”

The researchers also collected blood for a genetic analysis, focusing on a gene known as BDNF (brain-derived neurotrophic factor).

The team found that a single polymorphism (a difference in one “letter” of the sequence) in the BDNF gene accounted for significant differences in intelligence among those with similar injuries and comparable intelligence before being injured.

“BDNF is a basic growth factor and it’s related to neurogenesis, the production of new neurons,” Barbey said. “What we found is that if people have a specific polymorphism in the BDNF gene, they recovered to a greater extent than those with a different variant of the gene.”

The change in the gene alters the BDNF protein: The amino acid methionine (Met) is incorporated at a specific site in the protein instead of valine (Val). Since people inherit two versions of each gene, one from each parent, they have either Val/Val, Val/Met or Met/Met variants of the gene.

“The effects of this difference were large – very large,” Barbey said. “If an individual had the Val/Val combination, then their performance on a battery of cognitive tests (conducted long after the injury occurred) was remarkably lower than that of individuals who had the Val/Met or Met/Met combination.”

On average, those with the Val/Val genotype scored about eight IQ points lower on tests of general intelligence than those with the Val/Met or Met/Met variants, Barbey said. Those with the Val/Val variant also were significantly more impaired in “specific competencies for intelligence like verbal comprehension, perceptual organization, working memory and processing speed,” he said.
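The group comparison behind those numbers can be sketched as follows, pooling Met carriers (Val/Met and Met/Met) against Val/Val. The scores are invented for illustration; only the direction and rough size of the gap mirror the reported finding:

```python
from statistics import mean

scores = {  # genotype -> hypothetical post-injury IQ scores
    "Val/Val": [92, 95, 90, 93],
    "Val/Met": [100, 102, 99],
    "Met/Met": [101, 98],
}

val_val = mean(scores["Val/Val"])
met_carriers = mean(scores["Val/Met"] + scores["Met/Met"])
print(round(met_carriers - val_val, 1))  # 7.5 with these made-up scores
```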

To test these results, the researchers did the analysis over again “in a subset of individuals who had very similar (brain injuries) to the other group,” Barbey said. “We found the same kind of effects, suggesting that lesion location isn’t a factor influencing the difference between the groups.”

The finding opens a new avenue of exploration for treatments to aid the process of recovery from TBI, Barbey said.

(Source: news.illinois.edu)

Filed under prefrontal cortex brain-derived neurotrophic factor TBI memory brain injury neuroscience science

166 notes

In one ear and out the other
Remember that sound bite you heard on the radio this morning? The grocery items your spouse asked you to pick up? Chances are, you won’t.
Researchers at the University of Iowa have found that when it comes to memory, we don’t remember things we hear nearly as well as things we see or touch.
“As it turns out, there is merit to the Chinese proverb ‘I hear, and I forget; I see, and I remember,’” says lead author of the study and UI graduate student James Bigelow.
“We tend to think that the parts of our brain wired for memory are integrated. But our findings indicate our brain may use separate pathways to process information. Even more, our study suggests the brain may process auditory information differently than visual and tactile information, and alternative strategies—such as increased mental repetition—may be needed when trying to improve memory,” says Amy Poremba, associate professor in the UI Department of Psychology and corresponding author on the paper, published this week in the journal PLOS ONE.
Bigelow and Poremba discovered that when more than 100 UI undergraduate students were exposed to a variety of sounds, visuals, and things that could be felt, the students were least apt to remember the sounds they had heard.
In an experiment testing short-term memory, participants were asked to listen to pure tones they heard through headphones, look at various shades of red squares, and feel low-intensity vibrations by gripping an aluminum bar. Each set of tones, squares and vibrations was separated by time delays ranging from one to 32 seconds.
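The trial structure can be sketched as a simple schedule generator, pairing one modality with one retention delay per trial. Names and values here are illustrative stand-ins, not the published procedure:

```python
import random

DELAYS_S = [1, 2, 4, 8, 16, 32]              # retention intervals (seconds)
MODALITIES = ["tone", "red_square", "vibration"]

def make_trial(rng):
    """One delayed-comparison trial: modality, delay, match/non-match."""
    return {
        "modality": rng.choice(MODALITIES),
        "delay_s": rng.choice(DELAYS_S),
        "is_match": rng.random() < 0.5,      # half the trials match on average
    }

rng = random.Random(0)                        # fixed seed for reproducibility
trials = [make_trial(rng) for _ in range(12)]
print(len(trials), "trials generated")
```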
Although students’ memory declined across the board when time delays grew longer, the decline was much greater for sounds, and began as early as four to eight seconds after being exposed to them.
While this seems like a short time span, it’s akin to forgetting a phone number that wasn’t written down, notes Poremba. “If someone gives you a number, and you dial it right away, you are usually fine. But do anything in between, and the odds are you will have forgotten it,” she says.
In a second experiment, Bigelow and Poremba tested participants’ memory using things they might encounter on an everyday basis. Students listened to audio recordings of dogs barking, watched silent videos of a basketball game, and touched and held common objects blocked from view, such as a coffee mug. The researchers found that between an hour and a week later, students were worse at remembering the sounds they had heard, but their memory for visual scenes and tactile objects was about the same.
Both experiments suggest that the way your mind processes and stores sound may be different from the way it processes and stores other types of memories. And that could have big implications for educators, design engineers, and advertisers alike.
“As teachers, we want to assume students will remember everything we say. But if you really want something to be memorable you may need to include a visual or hands-on experience, in addition to auditory information,” says Poremba.
Previous research has suggested that humans may have superior visual memory, and that hearing words associated with sounds—rather than hearing the sounds alone—may aid memory. Bigelow and Poremba’s study builds upon those findings by confirming that, indeed, we remember less of what we hear, regardless of whether sounds are linked to words.
The study also is the first to show that our ability to remember what we touch is roughly equal to our ability to remember what we see. The finding is important, because experiments with nonhuman primates such as monkeys and chimpanzees have shown that they similarly excel at visual and tactile memory tasks, but struggle with auditory tasks. Based on these observations, the authors believe humans’ weakness for remembering sounds likely has its roots in the evolution of the primate brain.

Filed under sound sound processing memory visual memory neuroscience science

247 notes

Phantom limb pain relieved when amputated arm is put back to work
Max Ortiz Catalan has developed a new method for the treatment of phantom limb pain (PLP) after an amputation. The method is based on a unique combination of several technologies, and has been initially tested on a patient who has suffered from severe phantom limb pain for 48 years. A case study shows a drastic reduction of pain.
People who lose an arm or a leg often experience phantom sensations, as if the missing limb were still there. Seventy per cent of amputees experience pain in the amputated limb even though it no longer exists. Phantom limb pain can be a serious chronic condition that considerably reduces a person’s quality of life. The exact cause of phantom limb pain and other phantom sensations remains unknown.
Phantom limb pain is currently treated with several different methods. Examples include mirror therapy, different types of medication, acupuncture and hypnosis. In many cases, however, nothing helps. This was the case for the patient that Chalmers researcher Max Ortiz Catalan selected for a case study of the new treatment method he has envisaged as a potential solution.
The patient lost his arm 48 years ago, and had since that time suffered from phantom pain varying from moderate to unbearable. He was never entirely free of pain.
The patient’s pain was drastically reduced after a period of treatment with the new method. He now has periods where he is entirely free of pain, and he is no longer awakened by intense periods of pain at night like he was previously. The new method uses muscle signals from the patient’s arm stump to drive a system known as augmented reality. The electrical signals in the muscles are sensed by electrodes on the skin. The signals are then translated into arm movements by complex algorithms. The patient can see himself on a screen with a superimposed virtual arm, which is controlled using his own neural command in real time.
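The chain described above (skin electrodes -> decoding algorithms -> virtual arm command) can be sketched as a toy pipeline. The feature, thresholds, and movement labels here are invented stand-ins; the actual system uses its own pattern-recognition algorithms:

```python
from statistics import mean

MOVEMENTS = ["open_hand", "close_hand", "rotate_wrist", "rest"]

def extract_features(emg_window):
    """One simple time-domain feature per channel: mean absolute value."""
    return [mean(abs(sample) for sample in channel) for channel in emg_window]

def classify(features, thresholds):
    """Toy rule: the most active channel above threshold picks a movement."""
    strongest = max(range(len(features)), key=lambda i: features[i])
    if features[strongest] < thresholds[strongest]:
        return "rest"
    return MOVEMENTS[strongest % len(MOVEMENTS)]

# One window of 3-channel surface-EMG samples (invented numbers).
window = [[0.1, -0.2, 0.15], [0.8, -0.9, 0.7], [0.05, 0.0, -0.1]]
print(classify(extract_features(window), thresholds=[0.3, 0.3, 0.3]))  # close_hand
```

The decoded movement would then drive the on-screen virtual arm, closing the visual feedback loop the therapy relies on.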
“There are several features of this system which, combined, might be the cause of pain relief,” says Max Ortiz Catalan. “The motor areas in the brain needed for movement of the amputated arm are reactivated, and the patient obtains visual feedback that tricks the brain into believing there is an arm executing such motor commands. He experiences himself as a whole, with the amputated arm back in place.”
Modern therapies that use conventional mirrors or virtual reality are based on visual feedback via the opposite arm or leg. For this reason, people who have lost both arms or both legs cannot be helped using these methods.
“Our method differs from previous treatments because the control signals are retrieved from the arm stump, and thus the affected arm is in charge,” says Max Ortiz Catalan. “The promotion of motor execution and the vivid sensation of completion provided by augmented reality may be the reason for the patient’s improvement, while mirror therapy and medication did not help previously.”
A clinical study will now be conducted of the new treatment, which has been developed in a collaboration between Chalmers University of Technology, Sahlgrenska University Hospital, the University of Gothenburg and Integrum. Three Swedish hospitals and other European clinics will cooperate during the study, which will target patients with conditions resembling the one in the case study – that is, people who suffer from phantom pain and who have not responded to other currently available treatments.
The research group has also developed a system that can be used at home. Patients will be able to apply this therapy on their own, once it has been approved. An extension of the treatment is that it can be used by other patient groups that need to rehabilitate their mobility, such as stroke victims or some patients with spinal cord injuries.

Filed under amputation phantom limb phantom limb pain prosthetics virtual reality technology neuroscience science

351 notes

Researchers generate new neurons in brains, spinal cords of living adult mammals
UT Southwestern Medical Center researchers created new nerve cells in the brains and spinal cords of living mammals without the need for stem cell transplants to replenish lost cells.
Although the research indicates it may someday be possible to regenerate neurons from the body’s own cells to repair traumatic brain injury or spinal cord damage or to treat conditions such as Alzheimer’s disease, the researchers stressed that it is too soon to know whether the neurons created in these initial studies resulted in any functional improvements, a goal for future research.
Spinal cord injuries can lead to an irreversible loss of neurons, and along with scarring, can ultimately lead to impaired motor and sensory functions. Scientists are hopeful that regenerating cells can be an avenue to repair damage, but adult spinal cords have limited ability to produce new neurons. Biomedical scientists have transplanted stem cells to replace neurons, but have faced other hurdles, underscoring the need for new methods of replenishing lost cells.
Scientists in UT Southwestern’s Department of Molecular Biology first successfully turned astrocytes – the most common non-neuronal brain cells – into neurons that formed networks in mice. They have now successfully turned scar-forming astrocytes in the spinal cords of adult mice into neurons. The latest findings are published today in Nature Communications and follow previous findings published in Nature Cell Biology.
“Our earlier work was the first to clearly show in vivo (in a living animal) that mature astrocytes can be reprogrammed to become functional neurons without the need of cell transplantation. The current study did something similar in the spine, turning scar-forming astrocytes into progenitor cells called neuroblasts that regenerated into neurons,” said Dr. Chun-Li Zhang, assistant professor of molecular biology at UT Southwestern and senior author of both studies.
“Astrocytes are abundant and widely distributed both in the brain and in the spinal cord. In response to injury, these cells proliferate and contribute to scar formation. Once a scar has formed, it seals the injured area and creates a mechanical and biochemical barrier to neural regeneration,” Dr. Zhang explained. “Our results indicate that the astrocytes may be ideal targets for in vivo reprogramming.”
The scientists’ two-step approach first introduces a biological substance that regulates the expression of genes, called a transcription factor, into areas of the brain or spinal cord where that factor is not highly expressed in adult mice. Of 12 transcription factors tested, only SOX2 switched fully differentiated, adult astrocytes to an earlier neuronal precursor, or neuroblast, stage of development, Dr. Zhang said.
In the second step, the researchers gave the mice a drug called valproic acid (VPA) that encouraged the survival of the neuroblasts and their maturation (differentiation) into neurons. VPA has been used to treat epilepsy for more than half a century and also is prescribed to treat bipolar disorder and to prevent migraine headaches, he said.
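The two-step protocol reads naturally as a small state machine: SOX2 moves an astrocyte to the neuroblast stage, and VPA then supports maturation into a neuron. A purely illustrative rendering:

```python
# Toy state machine for the two-step reprogramming described above.
TRANSITIONS = {
    ("astrocyte", "SOX2"): "neuroblast",   # step 1: transcription factor
    ("neuroblast", "VPA"): "neuron",       # step 2: drug-supported maturation
}

def apply_factor(state, factor):
    """Advance the cell state if the factor acts on it; otherwise no change."""
    return TRANSITIONS.get((state, factor), state)

state = "astrocyte"
for factor in ["SOX2", "VPA"]:
    state = apply_factor(state, factor)
print(state)  # neuron
```

The ordering matters in this rendering, just as in the protocol: VPA applied to an astrocyte leaves it unchanged, since only SOX2 initiates the conversion.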
The current study reports that neurogenesis (neuron creation) occurred in the spinal cords of both adult and aged (over one year old) mice of both sexes, although the response was much weaker in the aged mice, Dr. Zhang said. Researchers are now searching for ways to boost the number and speed of neuron creation. Neuroblasts took four weeks to form and eight weeks to mature into neurons, slower than the neurogenesis reported in lab-dish experiments, so the researchers plan to conduct experiments to determine whether the slower pace helps the newly generated neurons properly integrate into their environment.
In the spinal cord study, SOX2-induced mature neurons created from reprogramming of astrocytes persisted for 210 days after the start of the experiment, the longest time the researchers examined, he added.
Because tumor growth is a concern when cells are reprogrammed to an earlier stage of development, the researchers followed the mice in the Nature Cell Biology study for nearly a year to look for signs of tumor formation and reported finding none.
(Image: Shutterstock)

Researchers generate new neurons in brains, spinal cords of living adult mammals

UT Southwestern Medical Center researchers created new nerve cells in the brains and spinal cords of living mammals without the need for stem cell transplants to replenish lost cells.

Although the research indicates it may someday be possible to regenerate neurons from the body’s own cells to repair traumatic brain injury or spinal cord damage or to treat conditions such as Alzheimer’s disease, the researchers stressed that it is too soon to know whether the neurons created in these initial studies resulted in any functional improvements, a goal for future research.

Spinal cord injuries can lead to an irreversible loss of neurons, and along with scarring, can ultimately lead to impaired motor and sensory functions. Scientists are hopeful that regenerating cells can be an avenue to repair damage, but adult spinal cords have limited ability to produce new neurons. Biomedical scientists have transplanted stem cells to replace neurons, but have faced other hurdles, underscoring the need for new methods of replenishing lost cells.

Scientists in UT Southwestern’s Department of Molecular Biology first successfully turned astrocytes – the most common non-neuronal brain cells – into neurons that formed networks in mice. They have now turned scar-forming astrocytes in the spinal cords of adult mice into neurons as well. The latest findings are published today in Nature Communications and follow previous findings published in Nature Cell Biology.

“Our earlier work was the first to clearly show in vivo (in a living animal) that mature astrocytes can be reprogrammed to become functional neurons without the need of cell transplantation. The current study did something similar in the spine, turning scar-forming astrocytes into progenitor cells called neuroblasts that regenerated into neurons,” said Dr. Chun-Li Zhang, assistant professor of molecular biology at UT Southwestern and senior author of both studies.

“Astrocytes are abundant and widely distributed both in the brain and in the spinal cord. In response to injury, these cells proliferate and contribute to scar formation. Once a scar has formed, it seals the injured area and creates a mechanical and biochemical barrier to neural regeneration,” Dr. Zhang explained. “Our results indicate that the astrocytes may be ideal targets for in vivo reprogramming.”

The scientists’ two-step approach first introduces a biological substance that regulates the expression of genes, called a transcription factor, into areas of the brain or spinal cord where that factor is not highly expressed in adult mice. Of 12 transcription factors tested, only SOX2 switched fully differentiated, adult astrocytes to an earlier neuronal precursor, or neuroblast, stage of development, Dr. Zhang said.

In the second step, the researchers gave the mice a drug called valproic acid (VPA) that encouraged the survival of the neuroblasts and their maturation (differentiation) into neurons. VPA has been used to treat epilepsy for more than half a century and also is prescribed to treat bipolar disorder and to prevent migraine headaches, he said.
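The two-step logic described above — screen candidate transcription factors for one that converts astrocytes to neuroblasts, then mature the neuroblasts with VPA — can be sketched as a toy model. This is purely illustrative: the placeholder factor names (TF1–TF11) are hypothetical, and only the SOX2 result and the role of VPA come from the article.

```python
# Illustrative sketch of the two-step reprogramming approach.
# Placeholder names TF1..TF11 stand in for the 11 other factors tested;
# per the study, only SOX2 converted mature astrocytes to neuroblasts.

CANDIDATE_FACTORS = [f"TF{i}" for i in range(1, 12)] + ["SOX2"]  # 12 factors

def converts_to_neuroblast(factor: str) -> bool:
    """Step 1: does this transcription factor switch a fully differentiated
    astrocyte to the neuroblast (neuronal precursor) stage?"""
    return factor == "SOX2"

def mature_with_vpa(is_neuroblast: bool) -> str:
    """Step 2: valproic acid (VPA) supports neuroblast survival and
    differentiation; unconverted cells remain astrocytes."""
    return "neuron" if is_neuroblast else "astrocyte"

hits = [f for f in CANDIDATE_FACTORS if converts_to_neuroblast(f)]
print(hits)                    # ['SOX2']
print(mature_with_vpa(True))   # neuron
```

The point of the sketch is the screening structure: a broad first-step filter over many candidates, followed by a second pharmacological step applied only to the cells that passed.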

The current study reports neurogenesis (neuron creation) occurred in the spinal cords of both adult and aged (over one year old) mice of both sexes, although the response was much weaker in the aged mice, Dr. Zhang said. Researchers are now searching for ways to boost the number and speed of neuron creation. Neuroblasts took four weeks to form and another eight weeks to mature into neurons, slower than neurogenesis reported in lab-dish experiments, so the researchers plan experiments to determine whether the slower pace helps the newly generated neurons integrate properly into their environment.

In the spinal cord study, SOX2-induced mature neurons created from reprogramming of astrocytes persisted for 210 days after the start of the experiment, the longest time the researchers examined, he added.

Because tumor growth is a concern when cells are reprogrammed to an earlier stage of development, the researchers followed the mice in the Nature Cell Biology study for nearly a year to look for signs of tumor formation and reported finding none.

(Image: Shutterstock)

Filed under valproic acid spinal cord astrocytes neurons neurodegeneration genetics neuroscience science

105 notes

New risk gene illuminates Alzheimer’s disease

A team of international scientists, including a researcher from Simon Fraser University, has isolated a gene thought to play a causal role in the development of Alzheimer’s disease. The Proceedings of the National Academy of Sciences recently published the team’s study.

The newly identified gene affects accumulation of amyloid-beta, a protein believed to be one of the main causes of the damage that underpins this brain disease in humans.

The gene encodes a protein that is important for intracellular transportation. Each brain cell relies on an internal highway system that transports molecular signals needed for the development, communication, and survival of the cell. 

This system’s impairment can disrupt amyloid-beta processing, causing its eventual accumulation. This contributes to the development of amyloid plaques, which are a key hallmark of Alzheimer’s disease.

Teasing out contributing disease factors, whether genetic or environmental, has long posed a challenge for Alzheimer’s researchers.

“Alzheimer’s is a multifactorial disease where a build-up of subtle problems develop in the nervous system over a span of decades,” says Michael Silverman, an SFU biology associate professor. He worked on the study with a team of Japanese scientists led by Dr. Takashi Morihara at Osaka University.   

Identifying these subtle, yet perhaps critical genetic contributions is challenging. “Alzheimer’s, like many human disorders, has a genetic component, yet many environmental and lifestyle factors contribute to the disease as well,” says Silverman. “In a sense, it is like looking for a needle in a complex genetic haystack.”

Only a small fraction of cases, such as early-onset Alzheimer’s, have a strong hereditary component.

This breakthrough in Alzheimer’s research could open new avenues for the design of therapeutics and pave the way for early detection by helping healthcare professionals identify those who are predisposed to the disease.

“One possibility is that a genetic test for a particular variant of this newly discovered gene, along with other variants of genes that contribute to Alzheimer’s, will help to give a person their overall risk for the disease.  

“Lifestyle changes, such as improved diet, exercise, and an increase in cognitive stimulation may then help to slow the progression of Alzheimer’s,” says Silverman.
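The "overall risk" idea Silverman describes — combining a person's variants across several contributing genes into a single score — can be sketched as a simple additive model. This is a hypothetical illustration, not the study's method: APOE ε4 is a well-known Alzheimer's risk allele but is not named in the article, the other gene names are placeholders, and the weights are invented.

```python
# Minimal sketch of an additive multi-gene risk score.
# All weights and the non-APOE gene names are hypothetical placeholders.

RISK_WEIGHTS = {
    "APOE-e4": 1.0,          # well-known risk allele (weight invented here)
    "GENE_B_variant": 0.3,   # placeholder
    "GENE_C_variant": 0.2,   # placeholder
}

def overall_risk_score(genotype: dict) -> float:
    """Sum weighted risk-allele counts (0, 1, or 2 copies per variant)."""
    return sum(RISK_WEIGHTS[variant] * copies
               for variant, copies in genotype.items()
               if variant in RISK_WEIGHTS)

person = {"APOE-e4": 1, "GENE_B_variant": 2, "GENE_C_variant": 0}
print(overall_risk_score(person))  # 1.6
```

A score like this could then be combined with lifestyle and environmental factors — the "complex genetic haystack" Silverman refers to — though real polygenic risk models are fitted from large cohort data rather than hand-assigned weights.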

(Source: sfu.ca)

Filed under alzheimer's disease neurodegenerative diseases genetics neurons neuroscience science
