Neuroscience

Articles and news from the latest research reports.

36 notes

Statin Use Following Hemorrhagic Stroke Associated with Improved Survival

Patients who were treated with a statin in the hospital after suffering from a hemorrhagic stroke were significantly more likely to survive than those who were not, according to a study published today in JAMA Neurology. This study was conducted by the same researchers who recently discovered that the use of cholesterol-lowering statins can improve survival in victims of ischemic stroke.

Ischemic stroke is caused by a constriction or obstruction of a blood vessel that blocks blood from reaching areas of the brain, while hemorrhagic stroke, also known as intracerebral hemorrhage, is bleeding in the brain.

“Some previous research has suggested that treating patients with statins after they suffer hemorrhagic stroke may increase their long-term risk of continued bleeding,” said lead author Alexander Flint, MD, PhD, of the Kaiser Permanente Department of Neuroscience in Redwood City, Calif. “Yet the findings of our study suggest that stopping statin treatments for these patients may carry substantial risks.”

The study included 3,481 individuals who were admitted to any of 20 Kaiser Permanente hospitals in Northern California with a hemorrhagic stroke over a 10-year period. Researchers looked at patient survival and discharge status 30 days after the stroke.

Patients treated with a statin while in the hospital were more likely to be alive 30 days after suffering a hemorrhagic stroke than those who were not treated with a statin — 81.6 percent versus 61.3 percent. Patients treated with a statin while in the hospital were also more likely to be discharged to home or an acute rehabilitation facility than those who were not — 51.1 percent compared to 35.0 percent.

Patients whose statin therapy was discontinued — that is, patients taking a statin as an outpatient prior to experiencing a hemorrhagic stroke who did not receive a statin as an inpatient — had a mortality rate of 57.8 percent compared with a mortality rate of 18.9 percent for patients using a statin before and during hospitalization.
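
To put the reported percentages in perspective, here is a back-of-the-envelope calculation of the unadjusted ratios. This ignores the adjustments for confounders that the study itself performed, so it is only a rough illustration of effect size:

```python
# Back-of-the-envelope ratios from the percentages reported above.
# Unadjusted: the study's own analysis controlled for confounders.

def risk_stats(p_a, p_b):
    """Return (ratio, absolute difference in percentage points)."""
    return p_a / p_b, p_a - p_b

# 30-day survival: 81.6% (statin inpatients) vs. 61.3% (no statin)
rr_survival, diff_survival = risk_stats(81.6, 61.3)

# Mortality: 57.8% (statin discontinued) vs. 18.9% (statin continued)
rr_mortality, diff_mortality = risk_stats(57.8, 18.9)

print(f"Survival ratio: {rr_survival:.2f} (+{diff_survival:.1f} points)")
print(f"Mortality ratio after discontinuation: {rr_mortality:.2f}")
```

On these numbers, discontinuation is associated with roughly a threefold higher crude mortality, which is the pattern the authors describe as a substantial risk.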

The researchers concluded that statin use is strongly associated with improved outcomes after hemorrhagic stroke, and that discontinuing statin use is strongly associated with worsened outcomes after hemorrhagic stroke.

(Source: share.kaiserpermanente.org)

Filed under stroke statin intracerebral hemorrhage neuroscience science

94 notes

Brainwave Test Could Improve Autism Diagnosis and Classification

A new study by researchers at Albert Einstein College of Medicine of Yeshiva University suggests that measuring how fast the brain responds to sights and sounds could help objectively classify people on the autism spectrum and may help diagnose the condition earlier. The paper was published today in the online edition of the Journal of Autism and Developmental Disorders.

The U.S. Centers for Disease Control and Prevention estimates that 1 in 68 children has been identified with an autism spectrum disorder (ASD). The signs and symptoms of ASD vary significantly from person to person, ranging from mild social and communication difficulties to profound cognitive impairments.

“One of the challenges in autism is that we don’t know how to classify patients into subgroups or even what those subgroups might be,” said study leader Sophie Molholm, Ph.D., associate professor in the Dominick P. Purpura Department of Neuroscience and the Muriel and Harold Block Faculty Scholar in Mental Illness in the department of pediatrics at Einstein. “This has greatly limited our understanding of the disorder and how to treat it.”

Autism is diagnosed based on a patient’s behavioral characteristics and symptoms. “These assessments can be highly subjective and require a tremendous amount of clinical expertise,” said Dr. Molholm. “We clearly need a more objective way to diagnose and classify this disorder.”

An earlier study by Dr. Molholm and colleagues suggested that brainwave electroencephalogram (EEG) recordings could potentially reveal how severely ASD individuals are affected. That study found that children with ASD process sensory information—such as sound, touch and vision—less rapidly than typically developing children do.

The current study was intended to see whether sensory processing varies along the autism spectrum. Forty-three ASD children aged 6 to 17 were presented with either a simple auditory tone, a visual image (red circle), or a tone combined with an image, and instructed to press a button as soon as possible after hearing the tone, seeing the image or seeing and hearing the two stimuli together. Continuous EEG recordings were made via 70 scalp electrodes to determine how fast the children’s brains were processing the stimuli.

The speed with which the subjects processed auditory signals strongly correlated with the severity of their symptoms: the more time required for an ASD individual to process the auditory signals, the more severe that person’s autistic symptoms. “This finding is in line with studies showing that, in people with ASD, the microarchitecture in the brain’s auditory center differs from that of typically developing children,” Dr. Molholm said.
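
The analysis hinges on a simple idea: derive a per-subject processing latency from the EEG and correlate it with a symptom-severity score. A minimal sketch of that correlation step, using a plain Pearson coefficient and entirely hypothetical values (the study's actual data and pipeline are not reproduced here):

```python
# Illustrative sketch: correlating per-subject auditory processing
# latency with a symptom-severity score. All values are hypothetical.
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical latencies (ms) and severity scores for six subjects
latency_ms = [120, 135, 150, 160, 175, 190]
severity   = [8, 10, 14, 13, 18, 21]

r = pearson_r(latency_ms, severity)
print(f"r = {r:.2f}")
```

A strongly positive r corresponds to the reported pattern: longer auditory processing times tracking more severe symptoms.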

The study also found a significant though weaker correlation between the speed of processing combined audio-visual signals and ASD severity. No link was observed between visual processing and ASD severity.

“This is a first step toward developing a biomarker of autism severity—an objective way to assess someone’s place on the ASD spectrum,” said Dr. Molholm. “Using EEG recordings in this way might also prove useful for objectively evaluating the effectiveness of ASD therapies.”

In addition, EEG recordings might help diagnose ASD earlier. “Early diagnosis allows for earlier treatment—which we know increases the likelihood of a better outcome,” said Dr. Molholm. “But currently, fewer than 15 percent of children with ASD are diagnosed before age 4. We might be able to adapt this technology to allow for early ASD detection and therapy for a much larger percentage of children.”

Filed under autism ASD EEG brainwaves neuroscience science

68 notes

(Image caption: A neuron in which the axon originates at a dendrite. Signals arriving at this dendrite are forwarded more efficiently than signals received elsewhere. Credit: Alexei V. Egorov, 2014)

Communication without detours

Certain nerve cells take a shortcut when transmitting information: signals are not conducted via the cell’s center, but around it, as on a bypass road. The previously unknown nerve cell shape is now presented in the journal “Neuron” by a research team from Heidelberg, Mannheim and Bonn.

Nerve cells communicate using electrical signals. Via widely ramified cell structures, the dendrites, they receive signals from other neurons and then transmit them over a thin cell extension, the axon, to other nerve cells. Axon and dendrites are usually interconnected by the neuron’s cell body. A team of scientists at the Bernstein Center Heidelberg-Mannheim, Heidelberg University, and the University of Bonn has now discovered neurons in which the axon arises directly from one of the dendrites. As on a bypass road, signal transmission within the cell is thus facilitated.

“Input signals at this dendrite do not need to be propagated across the cell body,” explains Christian Thome of the Bernstein Center Heidelberg-Mannheim and Heidelberg University, one of the two first authors of the study. For their analyses, the scientists specifically stained the sites where axons originate in so-called pyramidal cells of the hippocampus, a brain region involved in memory processes. The surprising result: “We found that in more than half of the cells, the axon does not emerge from the cell body, but arises from a lower dendrite,” Thome says.

The researchers then studied the effect of signals received at this special dendrite. For this purpose, they injected into the brain tissue of mice a form of the neurotransmitter glutamate that can be activated by light pulses. A high-resolution microscope allowed the neuroscientists to direct the light beam at a specific dendrite; by then activating the messenger substance, they simulated an excitatory input signal.

“Our measurements indicate that dendrites directly connected to the axon actively propagate even small input stimuli and activate the neuron,” says second first author Tony Kelly, a member of the Sonderforschungsbereich (SFB) 1089 at the University of Bonn. A computer simulation by the scientists predicts that this effect is particularly pronounced when the flow of information from other dendrites to the axon is suppressed by inhibitory input signals at the cell body.
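
The predicted interplay between somatic inhibition and the axon-carrying dendrite can be caricatured in a few lines. This is a toy illustration with invented numbers, not the team’s actual simulation:

```python
# Toy two-path model: an input can reach the axon either through the
# soma, where inhibition attenuates it, or directly from the
# axon-carrying dendrite, bypassing the soma. All numbers invented.

def drive_at_axon(epsp, via_soma, somatic_inhibition=0.0):
    """Effective depolarization reaching the axon initial segment."""
    if via_soma:
        return max(epsp - somatic_inhibition, 0.0)  # attenuated at soma
    return epsp  # privileged dendrite: bypasses somatic inhibition

THRESHOLD = 1.0  # spike threshold (arbitrary units)

# With strong somatic inhibition, only the privileged dendrite
# still drives the neuron past threshold
inhibition = 0.8
ordinary = drive_at_axon(1.2, via_soma=True, somatic_inhibition=inhibition)
privileged = drive_at_axon(1.2, via_soma=False)

print(ordinary >= THRESHOLD, privileged >= THRESHOLD)  # False True
```

The same input that fails when routed through an inhibited soma still fires the cell when it arrives at the privileged dendrite, which is the qualitative effect the simulation predicts.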

“That way, information transmitted by this special dendrite influences the behavior of the nerve cell more than input from any other dendrite,” Kelly says. In a next step, the researchers will attempt to figure out which biological function is strengthened through this specific dendrite, and what might therefore explain the unusual shape of these neurons.

Filed under hippocampus nerve cells pyramidal cells dendrites axons neuroscience science

102 notes

Evidence Supports Deep Brain Stimulation for Obsessive-Compulsive Disorder

Available research evidence supports the use of deep brain stimulation (DBS) for patients with obsessive-compulsive disorder (OCD) who don’t respond to other treatments, concludes a review in the October issue of Neurosurgery, official journal of the Congress of Neurological Surgeons (CNS). The journal is published by Lippincott Williams & Wilkins, a part of Wolters Kluwer Health.

Based on evidence, two specific bilateral DBS techniques are recommended for treatment of carefully selected patients with OCD, according to a new clinical practice guideline endorsed by the CNS and the American Association of Neurological Surgeons. While calling for further research in key areas, Dr. Clement Hamani of Toronto Western Hospital and coauthors emphasize that patients with OCD symptoms that don’t respond to other treatments should continue to have access to DBS.

Deep Brain Stimulation for OCD—What’s the Evidence?

Dr. Hamani led a multispecialty expert group in performing a systematic review of research on the effectiveness of DBS for OCD. Deep brain stimulation—placement of electrodes in specific areas of the brain, followed by electrical stimulation of those areas—has become an important treatment for patients with Parkinson’s disease and other movement disorders.

Although many patients with OCD respond well to medications and/or psychotherapy, 40 to 60 percent continue to experience symptoms despite treatment. Over the past decade, a growing number of reports have suggested that DBS may be an effective alternative in these “medically refractory” cases.

Dr. Hamani and colleagues were tasked with analyzing the supporting evidence and developing an initial clinical practice guideline for the use of DBS for patients with OCD. The review and guideline development process was sponsored by the American Society of Stereotactic and Functional Neurosurgery and the CNS. Out of more than 350 papers, the reviewers identified seven high-quality studies evaluating DBS for OCD.

Based on that evidence, they conclude that bilateral stimulation (on both sides of the brain) of two brain “targets”—areas called the subthalamic nucleus and the nucleus accumbens—can be regarded as effective treatments for OCD. In controlled clinical trials, both techniques improved OCD symptoms by around 30 percent on a standard rating scale.
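
For context on what a roughly 30 percent improvement means in practice: OCD severity is usually rated on the Yale-Brown Obsessive Compulsive Scale (Y-BOCS, 0 to 40), though the source says only "a standard rating scale." A small illustration with hypothetical scores:

```python
# Hypothetical illustration of "around 30 percent improvement".
# OCD severity is commonly rated on the Yale-Brown Obsessive
# Compulsive Scale (Y-BOCS, 0-40); these scores are invented.

def percent_improvement(baseline, follow_up):
    """Symptom reduction as a percentage of the baseline score."""
    return 100.0 * (baseline - follow_up) / baseline

baseline_ybocs = 32   # severe OCD before DBS (hypothetical)
follow_up_ybocs = 22  # after bilateral stimulation (hypothetical)

improvement = percent_improvement(baseline_ybocs, follow_up_ybocs)
print(f"{improvement:.0f}% improvement")
```

A drop of ten points from a severe baseline would land in the ballpark the trials report; clinically, responses are often defined by a fixed percentage reduction of this kind.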

While Research Proceeds, Well-Selected Patients with Severe, Treatment-Resistant OCD Should Have Access to DBS

That evidence forms the basis for a clinical guideline stating that bilateral DBS is a “reasonable therapeutic option” for patients with severe OCD that does not respond to other treatments. The guideline also notes that there is “insufficient evidence” supporting the use of any type of unilateral DBS target (one side of the brain) for OCD.

The review highlights the difficulties of studying the effectiveness of DBS for OCD—because most patients respond to medical treatment, studies of this highly specialized treatment typically include only small numbers of patients. Dr. Hamani and coauthors identify some priorities for future research: particularly to identify the most effective brain targets and the subgroups of patients most likely to benefit.

Despite the limited evidence base, DBS therapy for OCD has been approved by the Food and Drug Administration under a humanitarian device exemption. Dr. Hamani and coauthors note that various safeguards are in place to ensure appropriate use, and prevent overuse, of DBS for OCD.

While research continues, they believe that functional neurosurgeons should continue to work with other specialists to ensure that patients with severe, medically refractory OCD continue to have access to potentially beneficial DBS therapy.

(Source: wolterskluwerhealth.com)

Filed under OCD deep brain stimulation nucleus accumbens DBS neuroscience science

87 notes

Mechanism of Parkinson’s spread demonstrated

An international, interdisciplinary group of researchers led by Gabor G. Kovacs from the Clinical Institute of Neurology at the MedUni Vienna has demonstrated, through the use of a new antibody, how Parkinson’s disease spreads from cell to cell in the human brain. Until now, this mechanism has only been observed in experimental models, but has now been demonstrated for the first time in humans too.

At the focus of the study, recently published in the highly respected journal “Neurobiology of Disease”, is the protein α-synuclein. This protein is present in the human brain but develops into a pathologically modified form in the presence of Parkinson’s disease and a common type of age-related dementia (known as Lewy body dementia, which is responsible for up to a quarter of all dementia cases).

This study, which was carried out by a team from the MedUni Vienna in collaboration with researchers from the USA, Germany and Hungary, demonstrates for the first time that human nerve cells take up the pathological α-synuclein and thereby transfer the disease from one cell to the next. “This explains why patients with Parkinson’s disease deteriorate more and more from a clinical perspective and develop new symptoms, because the disease is able to spread to other parts of the brain through this infection-like process,” says Gabor G. Kovacs, commenting on the central finding of the study.

New antibody achieved major breakthrough
The researchers demonstrated this mechanism using an antibody that scientists from the MedUni Vienna played a key role in helping to develop in collaboration with the German biotech firm Roboscreen. As the study shows, this antibody is the first to distinguish between the physiologically present and disease-associated form of α-synuclein and reacts exclusively with the pathological form.

Newly demonstrated mechanism of spread could provide a basis for new Parkinson’s treatments
“For patients with Parkinson’s disease, this means that α-synuclein’s mechanism of spreading from cell to cell could serve as a point of therapeutic attack if we are able to block this cell-to-cell transfer,” continues Kovacs. In diagnostic terms, this antibody also represents a major breakthrough, since the antibodies used previously were unable to distinguish between the physiological and the disease-associated form, which meant they could not be used as easily for diagnostic purposes, e.g. in body fluids.

New antibody improves diagnosis
The fact that this is now possible for the first time has been demonstrated by a further study, also recently published in the specialist publication “Clinical Neuropathology”. According to this study, the new antibody can be used to detect disease-associated α-synuclein in the cerebrospinal fluid of patients with brain disease associated with α-synuclein. This is of major importance for clinical practice, because it means it will be possible to clinically determine whether the dementia is caused by Lewy bodies or not. This study arose through close collaboration between the Clinical Institute of Neurology (Gabor G. Kovacs) and the University Department of Neurology (Walter Pirker) at the MedUni Vienna.

(Source: meduniwien.ac.at)

Filed under parkinson's disease alpha synuclein mitochondria cerebrospinal fluid dementia neuroscience science

130 notes

Cooling of Dialysis Fluids Protects Against Brain Damage

While dialysis can cause blood pressure changes that damage the brain, cooling dialysis fluids can protect against such effects. The findings come from a study appearing in an upcoming issue of the Journal of the American Society of Nephrology (JASN). The cooling intervention can be delivered without additional cost and is simple to perform.

While dialysis is an essential treatment for many patients with kidney disease, it can cause damage to multiple organs, including the brain and heart, due to the sudden removal of bodily fluids.

To characterize dialysis-induced brain injury and to see whether cooled dialysis fluids (called dialysate) might help reduce such injury, Christopher McIntyre, DM, and his colleagues randomized 73 new dialysis patients to dialyze with body-temperature dialysate or dialysate cooled to 0.5°C below body temperature for 1 year.

The study demonstrated that dialysis drives progressive white matter brain injury due to blood pressure instability; however, patients who dialyzed at 0.5°C below body temperature were completely protected against such white matter changes.

“This study demonstrates that paying attention to improving the tolerability of dialysis treatment—in this case by the simple and safe intervention of reducing the temperature of dialysate—does not just make patients feel better, but also can completely protect the brain from progressive damage,” said Dr. McIntyre.

(Source: newswise.com)

Filed under hemodialysis white matter brain damage brain injury medicine science

180 notes

Scientists track the rise and fall of brain volume throughout life

We can witness our bodies mature, then gradually grow wrinkled and weaker with age, but it is only recently that scientists have been able to track a similar progression in the nerve bundles of our brains. That tissue increases in volume until around age 40, then slowly shrinks. By the end of our lives, the tissue is about the same volume as that of a 7-year-old.

So finds a team of Stanford scientists who used a new magnetic resonance imaging technique to show, for the first time, how human brain tissue changes throughout life. Knowing what’s normal at different ages, doctors can now image a patient’s brain, compare it to this standard curve and be able to tell if a person is out of the normal range, much like the way a growth chart can help identify kids who have fallen below their growth curve. The researchers have already used the technique to identify previously overlooked changes in the brains of people with multiple sclerosis.

“This allows us to look at people who have come into the clinic, compare them to the norm and potentially diagnose or monitor abnormalities due to different diseases or changes due to medications,” said Jason Yeatman, a graduate student in psychology and first author on a paper published today in Nature Communications. Aviv Mezer, a research associate, was senior author on the paper. Both collaborated with Brian Wandell, a professor of psychology, and his team.

For decades scientists have been able to image the brain using magnetic resonance imaging (MRI) and detect tumors, brain activity or abnormalities in people with some diseases, but those measurements were all subjective. A scientist measuring some aspect of the brain in one lab couldn’t directly compare findings with someone in another lab. And because no two scans could be compared, there was no way to look at a patient’s image and know whether it fell outside the normal range.

Limitation overcome

“A big problem in MRI is variation between instruments,” Mezer said. Last year Mezer and Wandell led an interdisciplinary team to develop a technique that can be used to compare MRI scans quantitatively between labs, described in Nature Medicine. “Now with that method we found a way to measure the underlying tissue and not the instrumental bias. So that means that we can measure 100 subjects here and Jason can measure another 100 in Seattle (where he is now a postdoctoral fellow) and we can put them all in a database for the community.”

The technique the team developed measures the amount of white matter tissue in the brain. That white matter comes primarily from an insulating covering called myelin that allows nerves to fire most efficiently and is a hallmark of brain maturation, though white matter can also be composed of other types of cells in the brain.

White matter plays a critical role in brain development and decline, and several diseases including schizophrenia and autism are associated with white matter abnormalities. Despite its importance in normal development and disease, no metric existed for determining whether any person’s white matter fell within a normal range, particularly if the people were imaged on different machines.

Mezer and Yeatman decided to use the newly developed quantitative technique to establish a normal curve for white matter levels throughout life. They imaged 24 regions within the brains of 102 people ages 7 to 85, and from that established a set of curves showing the increase and then eventual decrease in white matter in each of the 24 regions throughout life.

What they found is that the normal curve for brain composition is rainbow-shaped. It starts and ends with roughly the same amount of white matter and peaks between ages 30 and 50. But each of the 24 regions changes by a different amount. Some parts of the brain, like those that control movement, follow long, flat arcs, staying relatively stable throughout life. Others, like the areas involved in thinking and learning, follow steep arches, maturing dramatically and then falling off quickly. (The group did point out that their sample started at age 7, by which time a lot of brain development had already occurred.)

Continued collaboration

“Regions of the brain supporting high-level cognitive functions develop longer and have more degradation,” Yeatman said. “Understanding how that relates to cognition will be really important and interesting.” Yeatman is now a postdoctoral scholar at the University of Washington, and Mezer is now an assistant professor at the Hebrew University of Jerusalem. They plan to continue collaborating with each other and with other members of the Wandell lab, looking at how brain composition correlates with learning and how it could be used to diagnose diseases, learning disabilities or mental health issues.

The group has already shown that they can identify people with multiple sclerosis (MS) as falling outside the normal curve. People with MS develop what are known as lesions – regions in the brain or spinal cord where myelin is missing. In this paper, the team showed that they could identify people with MS as being off the normal curve throughout regions of the brain, including places where there are no visible lesions. This could provide an alternate method of monitoring and diagnosing MS, they say.

Wandell has had a particular interest in studying the changes that happen in the brain as a child learns to read. Until now, if a family brought a child into the clinic with learning disabilities, Wandell and other scientists had no way to diagnose whether the child’s brain was developing normally, or to determine the relationship between learning delays and white matter abnormalities.

“Now that we know what the normal distribution is, when a single person comes in you can ask how their child compares to the normal distribution. That’s where this is headed,” said Wandell, who is also the Isaac and Madeline Stein Family professor and a Stanford Bio-X affiliate. Wandell runs the Center for Cognitive and Neurobiological Imaging (CNI), where Mezer and the team developed the MRI technique to quantify white matter, and where the scans for this study were conducted.

The ability to share data among scientists is an issue Wandell has championed at the CNI and has been promoting in his work helping the Stanford Neurosciences Institute plan the computing strategy for their new facility. “Sharing of data and computational methods is critical for scientific progress,” Wandell said. In line with that goal, the new standard curve for white matter is something scientists around the world can use and contribute data to.
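
The growth-chart comparison the team describes can be made concrete: given a normative curve for a brain region, a new scan is judged by how far it falls from that curve. A minimal sketch with hypothetical parameters, modeling the rainbow-shaped trajectory as an inverted parabola peaking near age 40 (the study's real per-region curves are not reproduced here):

```python
# Sketch of the "standard curve" comparison with hypothetical numbers.
# The normative trajectory is modeled as an inverted parabola peaking
# near age 40; the study's real fitted curves are not reproduced.

def normative_wm(age, peak_age=40.0, peak_value=1.00, curvature=1.5e-4):
    """Hypothetical normative white-matter index at a given age."""
    return peak_value - curvature * (age - peak_age) ** 2

def z_score(measured, age, sd=0.03):
    """How many SDs a measurement sits from the normative curve."""
    return (measured - normative_wm(age)) / sd

# A 55-year-old with a markedly low value, as might occur in MS
z = z_score(measured=0.85, age=55)
outside_norm = abs(z) > 2  # conventional two-SD cutoff

print(f"z = {z:.1f}, outside normal range: {outside_norm}")
```

With these placeholder numbers the subject falls several standard deviations below the curve; deviations of this kind, even in regions without visible lesions, are what the team reports for MS patients.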

Scientists track the rise and fall of brain volume throughout life

We can witness our bodies mature, then gradually grow wrinkled and weaker with age, but it is only recently that scientists have been able to track a similar progression in the nerve bundles of our brains. That tissue increases in volume until around age 40, then slowly shrinks. By the end of our lives the tissue is about the volume of a 7-year-old.

So finds a team of Stanford scientists who used a new magnetic resonance imaging technique to show, for the first time, how human brain tissue changes throughout life. Now that doctors know what is normal at different ages, they can image a patient’s brain, compare it to this standard curve and tell whether the person falls outside the normal range, much as a growth chart helps identify kids who have fallen below their growth curve. The researchers have already used the technique to identify previously overlooked changes in the brains of people with multiple sclerosis.

"This allows us to look at people who have come into the clinic, compare them to the norm and potentially diagnose or monitor abnormalities due to different diseases or changes due to medications," said Jason Yeatman, a graduate student in psychology and first author on a paper published today in Nature Communications. Aviv Mezer, a research associate, was senior author on the paper. Both collaborated with Brian Wandell, a professor of psychology, and his team.

For decades scientists have been able to image the brain using magnetic resonance imaging (MRI) and detect tumors, brain activity or abnormalities in people with some diseases, but those measurements were all subjective. A scientist measuring some aspect of the brain in one lab couldn’t directly compare findings with someone in another lab. And because no two scans could be compared, there was no way to look at a patient’s image and know whether it fell outside the normal range.

Limitation overcome

"A big problem in MRI is variation between instruments," Mezer said. Last year Mezer and Wandell led an interdisciplinary team to develop a technique that can be used to compare MRI scans quantitatively between labs, described in Nature Medicine. “Now with that method we found a way to measure the underlying tissue and not the instrumental bias. So that means that we can measure 100 subjects here and Jason can measure another 100 in Seattle (where he is now a postdoctoral fellow) and we can put them all in a database for the community.”

The technique the team developed measures the amount of white matter tissue in the brain. White matter consists primarily of myelin, an insulating covering that allows nerves to fire efficiently and whose accumulation is a hallmark of brain maturation, though white matter also contains other types of brain cells.

White matter plays a critical role in brain development and decline, and several diseases including schizophrenia and autism are associated with white matter abnormalities. Despite its importance in normal development and disease, no metric existed for determining whether any person’s white matter fell within a normal range, particularly if the people were imaged on different machines.

Mezer and Yeatman decided to use the newly developed quantitative technique to develop a normal curve for white matter levels throughout life. They imaged 24 regions within the brains of 102 people ages 7 to 85, and from that established a set of curves showing the increase and then eventual decrease in white matter in each of the 24 regions throughout life.
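The idea of placing an individual on such a normative curve can be sketched in a few lines. Everything below is hypothetical: simulated ages, simulated white-matter values and a simple quadratic fit stand in for the study’s actual data and models.

```python
import numpy as np

# Hypothetical sketch: place one person's white-matter measurement on a
# normative age curve, the way a growth chart places a child's height.
# All data here are simulated, not the study's measurements.
rng = np.random.default_rng(0)

ages = rng.uniform(7, 85, size=102)          # 102 subjects, ages 7 to 85

def arc(a):
    """Illustrative rainbow-shaped trajectory peaking in midlife."""
    return 1.0 + 0.010 * a - 0.00012 * a**2

values = arc(ages) + rng.normal(0.0, 0.02, size=ages.size)

# Fit a quadratic normative curve to the cohort.
predict = np.poly1d(np.polyfit(ages, values, deg=2))
residual_sd = np.std(values - predict(ages))

def z_score(age, value):
    """Distance from the age-matched norm, in standard deviations."""
    return (value - predict(age)) / residual_sd

# A 40-year-old measuring well below the cohort's curve stands out:
print(z_score(40.0, predict(40.0) - 0.06))
```

A clinician-style readout would then flag any scan whose z-score falls outside, say, two standard deviations of the age-matched norm.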

What they found is that the normal curve for brain composition is rainbow-shaped. It starts and ends with roughly the same amount of white matter and peaks between ages 30 and 50. But each of the 24 regions changes a different amount. Some parts of the brain, like those that control movement, are long, flat arcs, staying relatively stable throughout life.

Others, like the areas involved in thinking and learning, are steep arches, maturing dramatically and then falling off quickly. (The group did point out that their sample started at age 7, by which point a great deal of brain development had already occurred.)

Continued collaboration

"Regions of the brain supporting high-level cognitive functions develop longer and have more degradation," Yeatman said. "Understanding how that relates to cognition will be really important and interesting." Yeatman is now a postdoctoral scholar at the University of Washington, and Mezer is now an assistant professor at the Hebrew University of Jerusalem. They plan to continue collaborating with each other and with other members of the Wandell lab, looking at how brain composition correlates with learning and how it could be used to diagnose diseases, learning disabilities or mental health issues.

The group has already shown that they can identify people with multiple sclerosis (MS) as falling outside the normal curve. People with MS develop what are known as lesions – regions in the brain or spinal cord where myelin is missing. In this paper, the team showed that they could identify people with MS as being off the normal curve throughout regions of the brain, including places where there are no visible lesions. This could provide an alternate method of monitoring and diagnosing MS, they say.

Wandell has had a particular interest in studying the changes that happen in the brain as a child learns to read. Until now, if a family brought a child into the clinic with learning disabilities, Wandell and other scientists had no way to diagnose whether the child’s brain was developing normally, or to determine the relationship between learning delays and white matter abnormalities.

"Now that we know what the normal distribution is, when a single person comes in you can ask how their child compares to the normal distribution. That’s where this is headed," said Wandell, who is also the Isaac and Madeline Stein Family professor and a Stanford Bio-X affiliate. Wandell runs the Center for Cognitive and Neurobiological Imaging (CNI), where Mezer and the team developed the MRI technique to quantify white matter, and where the scans for this study were conducted.

The ability to share data among scientists is an issue Wandell has championed at the CNI and has been promoting in his work helping the Stanford Neurosciences Institute plan the computing strategy for their new facility. “Sharing of data and computational methods is critical for scientific progress,” Wandell said. In line with that goal, the new standard curve for white matter is something scientists around the world can use and contribute data to.

Filed under brain tissue brain volume MS white matter neuroimaging neuroscience science

102 notes

(Image caption: Archer1 fluorescence in a cultured rat hippocampal neuron. By monitoring changes in this fluorescence at up to a thousand frames per second, researchers can track the electrical activity of the cell. Credit: Nicholas Flytzanis, Claire Bedbrook and Viviana Gradinaru/Caltech)

Sensing Neuronal Activity With Light

For years, neuroscientists have been trying to develop tools that would allow them to clearly view the brain’s circuitry in action—from the first moment a neuron fires to the resulting behavior in a whole organism. To get this complete picture, neuroscientists are working to develop a range of new tools to study the brain. Researchers at Caltech have developed one such tool that provides a new way of mapping neural networks in a living organism.

The work—a collaboration between Viviana Gradinaru (BS ‘05), assistant professor of biology and biological engineering, and Frances Arnold, the Dick and Barbara Dickinson Professor of Chemical Engineering, Bioengineering and Biochemistry—was described in two separate papers published this month.

When a neuron is at rest, channels and pumps in the cell membrane maintain a cell-specific balance of positively and negatively charged ions within and outside of the cell resulting in a steady membrane voltage called the cell’s resting potential. However, if a stimulus is detected—for example, a scent or a sound—ions flood through newly open channels causing a change in membrane voltage. This voltage change is often manifested as an action potential—the neuronal impulse that sets circuit activity into motion.

The tool developed by Gradinaru and Arnold detects and serves as a marker of these voltage changes.

"Our overarching goal for this tool was to achieve sensing of neuronal activity with light rather than traditional electrophysiology, but this goal had a few prerequisites," Gradinaru says. "The sensor had to be fast, since action potentials happen in just milliseconds. Also, the sensor had to be very bright so that the signal could be detected with existing microscopy setups. And you need to be able to simultaneously study the multiple neurons that make up a neural network."

The researchers began by optimizing Archaerhodopsin (Arch), a light-sensitive protein from archaea. In nature, opsins like Arch detect sunlight and initiate the microbes’ movement toward the light so that they can harvest its energy. However, researchers can also exploit the light-responsive qualities of opsins for a neuroscience method called optogenetics—in which an organism’s neurons are genetically modified to express these microbial opsins. Then, by simply shining a light on the modified neurons, the researchers can control the activity of the cells as well as their associated behaviors in the organism.

Gradinaru had previously engineered Arch for better tolerance and performance in mammalian cells as a traditional optogenetic tool used to control an organism’s behavior with light. When the modified neurons are exposed to green light, Arch acts as an inhibitor, controlling neuronal activity—and thus the associated behaviors—by preventing the neurons from firing.

However, Gradinaru and Arnold were most interested in another property of Arch: when exposed to red light, the protein acts as a voltage sensor, responding to changes in membrane voltages by producing a flash of light in the presence of an action potential. Although this property could in principle allow Arch to detect the activity of networks of neurons, the light signal marking this neuronal activity was often too dim to see.
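As a rough illustration of what reading action potentials out of such a fluorescence signal involves, the toy sketch below thresholds a synthetic trace sampled at 1 kHz. The trace, the frame rate and the 5-standard-deviation threshold are all assumptions made for illustration, not the team’s actual analysis pipeline.

```python
import numpy as np

# Illustrative sketch: pull action-potential events out of a noisy
# fluorescence trace by thresholding, assuming a 1 kHz frame rate.
# The trace is synthetic; real data would come from a camera.
rng = np.random.default_rng(7)

fs = 1000                               # frames per second
trace = rng.normal(0.0, 0.1, 2 * fs)    # 2 s of baseline noise
spike_frames = [300, 900, 1500]         # known event times (ground truth)
for f in spike_frames:
    trace[f:f + 3] += 1.0               # each spike: a brief bright flash

# Detect frames where fluorescence jumps above baseline + 5 SD.
baseline, sd = trace.mean(), trace.std()
above = trace > baseline + 5 * sd
# Keep only rising edges so one multi-frame flash counts as one event.
onsets = np.flatnonzero(above & ~np.roll(above, 1))

print(onsets.tolist())  # → [300, 900, 1500]
```

The recovered onsets match the planted spike times, which is the property a sensor needs: one flash per action potential, resolvable at millisecond timescales.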

To fix this problem, Arnold and her colleagues made the Arch protein brighter using a method called directed evolution—a technique Arnold originally pioneered in the early 1990s. The researchers introduced mutations into the Arch gene, thus encoding millions of variants of the protein. They transferred the mutated genes into E. coli cells, which produced the mutant proteins encoded by the genes. They then screened thousands of the resulting E. coli colonies for the intensities of their fluorescence. The genes for the brightest versions were isolated and subjected to further rounds of mutagenesis and screening until the bacteria produced proteins that were 20 times brighter than the original Arch protein.
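The mutate-screen-select cycle of directed evolution can be caricatured in code. The sketch below is a toy: a bit-string “gene” and an arbitrary brightness score stand in for real Arch variants and fluorescence screening of E. coli colonies.

```python
import random

# Toy sketch of the mutate-screen-select loop behind directed evolution.
# "Brightness" is a made-up fitness over a bit string; the real work
# screened bacterial colonies expressing mutant Arch proteins.
random.seed(1)

GENE_LEN = 60

def brightness(gene):
    # Stand-in fitness: fraction of positions set to 1.
    return sum(gene) / len(gene)

def mutate(gene, rate=0.02):
    # Flip each position independently with a small probability.
    return [b ^ (random.random() < rate) for b in gene]

population = [[random.randint(0, 1) for _ in range(GENE_LEN)]
              for _ in range(200)]

for generation in range(10):
    # Screen: rank variants by brightness, keep the top 10%.
    population.sort(key=brightness, reverse=True)
    parents = population[:20]
    # Diversify: refill the population with mutated copies of winners.
    population = [mutate(random.choice(parents)) for _ in range(200)]

best = max(population, key=brightness)
print(round(brightness(best), 2))  # climbs well above the ~0.5 random start
```

Each round trades a little diversity (mutation) for a little fitness (selection), which is why repeated rounds can yield variants far brighter than anything in the starting pool.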

A paper describing the process and the bright new protein variants that were created was published in the September 9 issue of the Proceedings of the National Academy of Sciences.

"This experiment demonstrates how rapidly these remarkable bacterial proteins can evolve in response to new demands. But even more exciting is what they can do in neurons, as Viviana discovered," says Arnold.

In a separate study led by Gradinaru’s graduate students Nicholas Flytzanis and Claire Bedbrook, who is also advised by Arnold, the researchers genetically incorporated the new, brighter Arch variants into rodent neurons in culture to see which of these versions was most sensitive to voltage changes—and therefore would be the best at detecting action potentials. One variant, Archer1, was not only bright and sensitive enough to mark action potentials in mammalian neurons in real time, it could also be used to identify which neurons were synaptically connected—and communicating with one another—in a circuit.

The work is described in a study published on September 15 in the journal Nature Communications.

"What was interesting is that we would see two cells over here light up, but not this one over there—because the first two are synaptically connected," Gradinaru says. "This tool gave us a way to observe a network where the perturbation of one cell affects another."

However, sensing activity in a living organism and correlating this activity with behavior remained the biggest challenge. To accomplish this goal Gradinaru’s team worked with Paul Sternberg, the Thomas Hunt Morgan Professor of Biology, to test Archer1 as a sensor in a living organism—the tiny nematode worm C. elegans. “There are a few reasons why we used the worms here: they are powerful organisms for quick genetic engineering and their tissues are nearly transparent, making it easy to see the fluorescent protein in a living animal,” she says.

After incorporating Archer1 into neurons that were a part of the worm’s olfactory system—a primary source of sensory information for C. elegans—the researchers exposed the worm to an odorant. When the odorant was present, a baseline fluorescent signal was seen, and when the odorant was removed, the researchers could see the circuit of neurons light up, meaning that these particular neurons are repressed in the presence of the stimulus and active in the absence of the stimulus. The experiment was the first time that an Arch variant had been used to observe an active circuit in a living organism.

Gradinaru next hopes to use tools like Archer1 to better understand the complex neuronal networks of mammals, using microbial opsins as sensing and actuating tools in optogenetically modified rodents.

"For the future work it’s useful that this tool is bifunctional. Although Archer1 acts as a voltage sensor under red light, with green light, it’s an inhibitor," she says. "And so now a long-term goal for our optogenetics experiments is to combine the tools with behavior-controlling properties and the tools with voltage-sensing properties. This would allow us to obtain all-optical access to neuronal circuits. But I think there is still a lot of work ahead."

One goal for the future, Gradinaru says, is to make Archer1 even brighter. Although the protein’s fluorescence can be seen through the nearly transparent tissues of the nematode worm, opaque organs such as the mammalian brain are still a challenge. More work, she says, will need to be done before Archer1 could be used to detect voltage changes in the neurons of living, behaving mammals.

And that will require further collaborations with protein engineers and biochemists like Arnold.

"As neuroscientists we often encounter experimental barriers, which open the potential for new methods. We then collaborate to generate tools through chemistry or instrumentation, then we validate them and suggest optimizations, and it just keeps going," she says. "There are a few things that we’d like to be better, and through these many iterations and hard work it can happen."

Filed under optogenetics archaerhodopsin opsins neural activity neurons neuroscience science

101 notes

Down syndrome helps researchers understand Alzheimer’s disease

The link between a protein typically associated with Alzheimer’s disease and its impact on memory and cognition may not be as clear as once thought, according to a new study from the University of Wisconsin-Madison’s Waisman Center. The findings are revealing more information about the earliest stages of the neurodegenerative disease.

The researchers — including lead study author Sigan Hartley, UW-Madison assistant professor of human development and family studies, and Brad Christian, UW-Madison associate professor of medical physics and psychiatry and director of PET Physics in the Waisman Laboratory for Brain Imaging and Behavior — looked at the role of the brain protein amyloid-β in adults living with Down syndrome, a genetic condition that leaves people more susceptible to developing Alzheimer’s. They published their findings in the September issue of the journal Brain.

"Our hope is to better understand the role of this protein in memory and cognitive function," says Hartley. "With this information we hope to better understand the earliest stages in the development of this disease and gain information to guide prevention and treatment efforts."

The findings may not only help scientists better understand the condition as it affects people living with Down syndrome; they are also relevant to adults without the genetic syndrome.

"There are many unanswered questions about at what point amyloid-β, together with other brain changes, begins to take a toll on memory and cognition and why certain individuals may be more resistant than others," says Hartley.

The UW-Madison scientists, along with collaborators at the University of Pittsburgh, studied 63 healthy adults with Down syndrome, aged 30 to 53, who did not exhibit clinical signs of Alzheimer’s or other forms of dementia. They found that many adults with Down syndrome had high levels of amyloid-β protein but did not suffer the expected negative consequences of the elevated protein.

Alzheimer’s disease is the sixth leading cause of death in the U.S. People with Down syndrome are born with an extra copy of chromosome 21, which carries the gene for the precursor of the amyloid-β protein.

For the study, which was conducted over the course of two days, researchers used magnetic resonance imaging (MRI) and positron emission tomography (PET) scans to capture images of the participants’ brains. Twenty-two of the 63 participants had elevated levels of amyloid-β but showed no evidence of diminished memory or cognitive function when compared to those without elevated levels of the protein. The researchers controlled for differences in age and intellectual level.

Similarly, when assessed as a continuous measure, amyloid-β levels were not tied to differences in memory or cognitive ability, such as changes in visual and verbal memory, attention and language.
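“Controlling for” a variable like age typically means including it as a covariate in a regression, so that the group comparison is made at matched ages. The sketch below simulates such an adjustment; all numbers are invented and unrelated to the study’s actual data.

```python
import numpy as np

# Hypothetical sketch of "controlling for age": regress a cognitive
# score on group membership plus age, so the group coefficient reflects
# the group difference at matched ages. Data are simulated.
rng = np.random.default_rng(3)

n = 63
age = rng.uniform(30, 53, n)
elevated = rng.random(n) < 22 / 63      # ~22 of 63 with high amyloid
# Simulate scores that decline with age but do NOT depend on group:
score = 100 - 0.5 * age + rng.normal(0, 2, n)

# Design matrix: intercept, group indicator, age covariate.
X = np.column_stack([np.ones(n), elevated.astype(float), age])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)

print(round(beta[1], 2))  # age-adjusted group effect, near zero here
```

Because the simulated scores depend only on age, the adjusted group coefficient comes out near zero, mirroring the study’s null result for elevated amyloid-β.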

(Source: news.wisc.edu)

Filed under alzheimer's disease beta amyloid down syndrome cognitive function neuroimaging neuroscience science

120 notes

Researchers Reveal Pathway that Contributes to Alzheimer’s Disease

Researchers at Jacksonville’s campus of Mayo Clinic have discovered a defect in a key cell-signaling pathway that they say contributes both to overproduction of toxic protein in the brains of Alzheimer’s disease patients and to loss of communication between neurons — two significant contributors to this type of dementia.

Their study, in the online issue of Neuron, offers the potential that targeting this specific defect with drugs “may rejuvenate or rescue this pathway,” says the study’s lead investigator, Guojun Bu, Ph.D., a neuroscientist at Mayo Clinic, Jacksonville, Fla.

“This defect is likely not the sole contributor to development of Alzheimer’s disease, but our findings suggest it is very important, and could be therapeutically targeted to possibly prevent Alzheimer’s or treat early disease,” he says.

The pathway, Wnt signaling, is known to play a critical role in cell survival, embryonic development and synaptic activity — the electrical and chemical signals necessary for learning and memory. Any imbalance in this pathway (too much or too little activity) leads to disease — the overgrowth of cells in cancer is one example of overactivation of this pathway.

While much research on Wnt has focused on diseases involved in overactive Wnt signaling, Dr. Bu’s team is one of the first to demonstrate the link between suppressed Wnt signaling and Alzheimer’s disease.

“Our finding makes sense, because researchers have long known that patients with cancer are at reduced risk of developing Alzheimer’s disease, and vice versa,” Dr. Bu says. “What wasn’t known is that Wnt signaling was involved in that dichotomy.”

Using a new mouse model, the investigators discovered the key defect that leads to suppressed Wnt signaling in Alzheimer’s. They found that the low-density lipoprotein receptor-related protein 6 (LRP6) is deficient, and that LRP6 regulates both production of amyloid beta, the protein that builds up in the brains of AD patients, and communication between neurons. That means lower-than-normal levels of LRP6 lead to a toxic buildup of amyloid and impair the ability of neurons to talk to each other.

Mice without LRP6 had impaired Wnt signaling, cognitive impairment, neuroinflammation and excess amyloid.

The researchers validated their findings by examining postmortem brain tissue from Alzheimer’s patients — they found that LRP6 levels were deficient and Wnt signaling was severely compromised in the human brain tissue they examined.

The good news is that specific inhibitors of this pathway are already being tested for cancer treatment. “Of course, we don’t want to inhibit Wnt in people with Alzheimer’s or at risk for the disease, but it may be possible to use the science invested in inhibiting Wnt to figure out how to boost activity in the pathway,” Dr. Bu says.

“Identifying small molecule compounds to restore LRP6 and the Wnt pathway, without inducing side effects, may help prevent or treat Alzheimer’s disease,” he says. “This is a really exciting new strategy — a new and fresh approach.”

Filed under alzheimer's disease LRP6 beta amyloid neurons dementia wnt neuroscience science
