Neuroscience

Articles and news from the latest research reports.

Posts tagged cognitive decline



Mayo Clinic Study: Blood Biomarker Could Mark Severe Cognitive Decline, Quicker Progression Among Parkinson’s Patients

Mutations in the GBA gene, which lead to early onset of Parkinson’s disease and severe cognitive impairment (in about 4 to 7 percent of all patients with the disease), also alter how specific lipids, ceramides and glucosylceramides, are metabolized. Mayo Clinic researchers have found that Parkinson’s patients who do not carry such a mutation can also have higher levels of these lipids in the blood. Further, patients who had Parkinson’s and high blood levels of these lipids were more likely to have cognitive impairment and dementia. The research was recently published online in the journal PLOS ONE.

The discovery could be an important warning for those with Parkinson’s disease. Parkinson’s is the second most common neurodegenerative disease after Alzheimer’s disease. There is no biomarker to tell who is going to develop the disease — and who is going to develop cognitive impairment after developing Parkinson’s, says Michelle Mielke, Ph.D., a Mayo Clinic researcher and first author of the study.

Cognitive impairment is a frequent symptom in Parkinson’s disease and can be even more debilitating for patients and their caregivers than the characteristic motor symptoms. The early identification of Parkinson’s patients at greatest risk of developing dementia is important for preventing or delaying the onset and progression of cognitive symptoms. Changing these blood lipids could be a way to stop the progression of the disease, says Dr. Mielke.

There is also preliminary evidence that this blood lipid marker could help predict who will develop Parkinson’s disease; that research is ongoing.

"There is currently no cure for Parkinson’s, but the earlier we catch it — the better chance we have to fight it," says Dr. Mielke. "It’s particularly important we find a biomarker and identify it in the preclinical phase of the disease, before the onset even begins."

Dr. Mielke’s lab is researching blood-based biomarkers for Parkinson’s disease because blood tests are less invasive and cheaper than a brain scan or spinal tap — other tools used to research the disease.

Filed under neurodegenerative diseases dementia cognitive decline parkinson's disease neuroscience science


Fat Marker Predicts Cognitive Decline in People With HIV

Similarities found between HIV-associated brain damage and impairment from genetic fat-storage disease

Johns Hopkins scientists have found that levels of certain fats found in cerebral spinal fluid can predict which patients with HIV are more likely to become intellectually impaired.

The researchers believe that these fat markers reflect disease-associated changes in how the brain metabolizes these fat molecules. These changes disrupt the brain cells’ ability to regulate the activity of cells’ “garbage disposals” meant to degrade and flush the brain of molecular debris. In this case, too much cholesterol and a fat known as sphingomyelin build up in the lysosomes — the garbage disposals — backing up waste and leading to often debilitating cognitive declines. 

As many as half of patients infected with HIV will develop some form of cognitive impairment, ranging from mild (trouble counting change or driving a car) to frank dementia (an inability to manage the activities of everyday life), but until now no tests have been available to predict which patients are more likely to suffer cognitive losses.

 “Every researcher of neurodegenerative disease is chasing biomarkers for the same reason: It’s better to identify problems before they strike,” says Norman J. Haughey, Ph.D., an associate professor in the departments of neurology and psychiatry and behavioral sciences at the Johns Hopkins University School of Medicine. He led the current study described online in the journal Neurology.

“It’s very hard to reverse brain damage after it starts,” he says. “Instead, we want to figure out who is likely to lose cognitive function and stop the damage before it happens.”

Haughey and his team analyzed 321 cerebral spinal fluid samples collected from seven test sites across the continental United States, Hawaii and Puerto Rico. The samples came from 291 HIV-positive participants and 30 HIV-negative subjects. The investigators found that early accumulations of a small number of these fat molecules could predict the probability of cognitive decline. As cognitive function declined in these patients, the number of different types of fat molecules that accumulated increased. The types of accumulating fat molecules in HIV were very similar to those that accumulate in inherited forms of a class of diseases called lysosomal storage disorders. This suggests that in some HIV-infected patients, the brain is retaining more of these fats, and this may disrupt the function of lysosomes.

Haughey says he believes some of the impairments in the metabolism of these fats found in people with HIV stem from the infection itself, while others may be linked to the lifesaving antiretroviral therapy taken by most people with HIV. The medications have been associated with elevated blood cholesterol and triglycerides, along with a host of other side effects. Those with HIV are now taking these drugs for decades, and the complications from long-term use have not been well studied, he says.

The similarities between the metabolic disturbances in HIV-infected individuals and those apparent in lysosomal storage disorders are enabling Haughey and his team to collaborate with researchers who study genetic lysosomal storage diseases and who are developing experimental treatments to clear the fat buildup. They are currently exploring dietary and pharmacological interventions designed to restore this balance, which could normalize brain metabolism in HIV-infected individuals and promote good brain health by ensuring the lysosomes function properly.

(Source: hopkinsmedicine.org)

Filed under cognitive decline HIV lysosomes lysosomal storage disorders sphingomyelin neurology neuroscience science


Brain circuitry loss may be a very early sign of cognitive decline in healthy elderly people

The degeneration of a small, wishbone-shaped structure deep inside the brain may provide the earliest clues to future cognitive decline, long before healthy older people exhibit clinical symptoms of memory loss or dementia, a study by researchers with the UC Davis Alzheimer’s Disease Center has found.


The longitudinal study found that the only discernible brain differences between cognitively normal people who later developed cognitive impairment and those who did not were changes in their fornix, a white matter bundle that carries messages to and from the hippocampus and that has long been known to play a role in memory.

“This could be a very early and useful marker for future incipient decline,” said Evan Fletcher, the study’s lead author and a project scientist with the UC Davis Alzheimer’s Disease Center.

“Our results suggest that fornix variables are measurable brain factors that precede the earliest clinically relevant deterioration of cognitive function among cognitively normal elderly individuals,” Fletcher said.

The research is published online today in JAMA Neurology.

Hippocampal atrophy occurs in the later stages of cognitive decline and is one of the most studied changes associated with the Alzheimer’s disease process. However, changes to the fornix and other regions of the brain structurally connected to the hippocampus have not been as closely examined. The study found that degeneration of the fornix in relation to cognition was detectable even earlier than changes in the hippocampus.

“Although hippocampal measures have been studied much more deeply in relation to cognitive decline, our direct comparison between fornix and hippocampus measures suggests that fornix properties have a superior ability to identify incipient cognitive decline among healthy individuals,” Fletcher said.

The study was conducted over five years in a group of 102 diverse, cognitively normal people with an average age of 73 who were recruited through community outreach at the Alzheimer’s Disease Center. The researchers conducted magnetic resonance imaging (MRI) studies that measured the volume and integrity of the participants’ brains. A different type of MRI was used to determine the integrity of the myelin, the fatty coating that sheaths and protects the axons. The axons are analogous to the copper wiring of the brain’s circuitry, and the myelin is like the wiring’s plastic insulation.

Either one of those things being lost will “degrade the signal transmission” in the brain, Fletcher said.

The researchers also conducted psychological tests and cognitive evaluations of the study participants to gauge their level of cognitive functioning. The participants returned for updated MRIs and cognitive testing at approximately one-year intervals. At the outset, none of the study participants exhibited symptoms of cognitive decline. Over time about 20 percent began to show symptoms that led to diagnoses of either mild cognitive impairment (MCI) or, in a minority of cases, Alzheimer’s disease.

“We found that if you looked at various brain factors there was one — and only one — that seemed to be predictive of whether a person would have cognitive decline, and that was the degradation of the fornix,” Fletcher said.

The study measured two relevant fornix characteristics predicting future cognitive impairment — low fornix white matter volume and reduced axonal integrity. Each of these was stronger than any other brain factor in models predicting cognitive loss, Fletcher said. 
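As a sketch of the kind of predictive model this description suggests, the hypothetical code below fits a logistic regression of incipient decline on two standardized fornix measures (white matter volume and axonal integrity). The cohort size matches the article, but every measurement is simulated; nothing here reproduces the study's actual data or methods:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 102  # matches the cohort size; the measurements themselves are synthetic

# Hypothetical standardized predictors: fornix white matter volume and axonal integrity
fornix_vol = rng.normal(0, 1, n)
fornix_fa = rng.normal(0, 1, n)

# Simulate decline being more likely when both fornix measures are low,
# with a base rate of roughly 20 percent, as in the article
logit = -1.5 - 1.2 * fornix_vol - 1.2 * fornix_fa
declined = (rng.uniform(0, 1, n) < 1 / (1 + np.exp(-logit))).astype(float)

# Logistic regression by plain gradient descent (no external ML library assumed)
X = np.column_stack([np.ones(n), fornix_vol, fornix_fa])
w = np.zeros(3)
for _ in range(5000):
    p = 1 / (1 + np.exp(-X @ w))        # predicted probability of decline
    w -= 0.1 * X.T @ (p - declined) / n  # gradient step on the log-loss

print("coefficients (intercept, volume, integrity):", np.round(w, 2))
```

In this toy setup the fitted coefficients on both fornix measures come out negative: lower volume or integrity means higher predicted risk of decline, the direction the study reports.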

He said that routine MRI examination of the fornix could conceivably be used clinically in the future as a predictor of abnormal cognitive decline.

“Our findings suggest that if your fornix volume or integrity is within a certain range you’re at an increased risk of cognitive impairment down the road. But developing the use of the fornix as a predictor in a clinical setting will take some time, in the same way that it took time for evaluation of cholesterol levels to be used to predict future heart disease,” he said.

Fletcher also said that the finding may mark a paradigm shift toward evaluation of the brain’s white matter, rather than its gray matter, as among the very earliest indicators of developing cognitive loss. There is currently a strong research focus on understanding brain processes that lead eventually to Alzheimer’s disease. He said the current finding could fill in one piece of the picture and motivate new directions in research to understand why and how fornix and other white matter change is such an important harbinger of cognitive impairment. 

“The key importance of this finding is that it suggests that white matter tract measures may prove to be promising candidate biomarkers for predicting incipient cognitive decline among cognitively normal individuals in a clinical setting, possibly more so than gray matter measures,” he said.

(Source: ucdmc.ucdavis.edu)

Filed under alzheimer's disease dementia cognitive decline fornix hippocampus neuroscience science



Mild B-12 Deficiency May Speed Dementia

Study finds that the vitamin shortage might affect more people than previously thought

Being even mildly deficient in vitamin B-12 may put older adults at a greater risk for accelerated cognitive decline, an observational study from the Jean Mayer USDA Human Nutrition Research Center on Aging at Tufts suggests.

Martha Savaria Morris, an epidemiologist in the Nutrition Epidemiology Program at the HNRCA, and colleagues examined data from 549 men and women enrolled in a cohort of the Framingham Heart Study. The subjects, who had an average age of 75 at the start, were divided into five groups based on their vitamin B-12 blood levels.

Being in the two lowest groups was associated with significantly accelerated cognitive decline, based on scores from dementia screening tests given over eight years.
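The study's grouping, dividing subjects into five groups (quintiles) by blood level and comparing decline across them, can be sketched with pandas. The sample size matches the article, but the B-12 values, the decline rates, and the simulated effect are all invented for illustration:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 549  # matches the study's sample size; all values below are synthetic

df = pd.DataFrame({
    "b12_pmol_l": rng.lognormal(mean=5.6, sigma=0.4, size=n),
})

# Simulate faster decline (more negative yearly change) in the lowest two quintiles
low_b12 = df["b12_pmol_l"] < df["b12_pmol_l"].quantile(0.4)
df["decline_per_year"] = -0.1 - 0.2 * low_b12 + rng.normal(0, 0.05, n)

# Divide subjects into quintiles by blood level, as in the study design
df["b12_quintile"] = pd.qcut(df["b12_pmol_l"], 5, labels=[1, 2, 3, 4, 5])

print(df.groupby("b12_quintile", observed=True)["decline_per_year"].mean())
```

In this simulation the two lowest quintiles show similar, and clearly faster, decline than the highest quintile, mirroring the study's finding that the second-lowest group fared no better than the lowest.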

“Men and women in the second-lowest group did not fare any better in terms of cognitive decline than those with the worst vitamin B-12 blood levels,” Morris says. It is well known that severe B-12 deficiency speeds up dementia, but the finding suggests that even more seniors may be affected.

The study appeared in the Journal of the American Geriatrics Society.

“While we emphasize our study does not show causation, our associations raise the concern that some cognitive decline may be the result of inadequate vitamin B-12 in older adults, for whom maintaining normal blood levels can be a challenge,” says Professor Paul Jacques, the study’s senior author and director of the HNRCA Nutrition Epidemiology Program.

Animal proteins, such as lean meats, poultry and eggs, are good sources of vitamin B-12. Because older adults may have a hard time absorbing vitamin B-12 from food, the USDA’s 2010 Dietary Guidelines for Americans recommend that people over age 50 incorporate foods fortified with B-12 or supplements in their diets.

The subjects in this study were mostly Caucasian women who had earned at least a high school diploma. The authors said future research might include more diverse populations and explore whether vitamin B-12 status affects particular cognitive skills.

This article first appeared in the Summer 2013 issue of Tufts Nutrition magazine. 

Filed under vitamin B-12 B-12 deficiency cognitive decline dementia neuroscience science


Cognitive decline with age is normal, routine – but not inevitable

If you forget where you put your car keys and you can’t seem to remember things as well as you used to, the problem may well be with the GluN2B subunits in your NMDA receptors.

And don’t be surprised if by tomorrow you can’t remember the name of those darned subunits.

They help you remember things, but you’ve been losing them almost since the day you were born, and it’s only going to get worse. An older adult may have only half as many of them as a young adult.

Research on these biochemical processes in the Linus Pauling Institute at Oregon State University is making it clear that cognitive decline with age is a natural part of life, and scientists are tracking the problem down to highly specific components of the brain. Separate from some more serious problems like dementia and Alzheimer’s disease, virtually everyone loses memory-making and cognitive abilities as they age. The process is well under way by the age of 40 and picks up speed after that.

But of considerable interest: It may not have to be that way.

“These are biological processes, and once we fully understand what is going on, we may be able to slow or prevent it,” said Kathy Magnusson, a neuroscientist in the OSU Department of Biomedical Sciences, College of Veterinary Medicine, and professor in the Linus Pauling Institute. “There may be ways to influence it with diet, health habits, continued mental activity or even drugs.”

The processes are complex. In a study just published in the Journal of Neuroscience, researchers found that one protein that stabilizes receptors in a young animal – a good thing conducive to learning and memory – can have just the opposite effect if there’s too much of it in an older animal.

But complexity aside, progress is being made. In recent research, supported by the National Institutes of Health, OSU scientists used a genetic therapy in laboratory mice, in which a virus helped carry complementary DNA into appropriate cells and restored some GluN2B subunits. Tests showed that it helped mice improve their memory and cognitive ability.

The NMDA receptor has been known for decades, Magnusson said. It plays a role in memory and learning but isn’t active all the time – it takes a fairly strong stimulus of some type to turn it on and allow you to remember something. The routine of getting dressed in the morning is ignored and quickly lost to the fog of time, but the day you had an auto accident earns a permanent etching in your memory.

Within the NMDA receptor are various subunits, and Magnusson said that research keeps pointing back to the GluN2B subunit as one of the most important. Infants and children have lots of them, and as a result are like a sponge in soaking up memories and learning new things. But they gradually dwindle in number with age, and it also appears the ones that are left work less efficiently.

“You can still learn new things and make new memories when you are older, but it’s not as easy,” Magnusson said. “Fewer messages get through, fewer connections get made, and your brain has to work harder.”

Until more specific help is available, she said, some of the best advice for maintaining cognitive function is to keep using your brain. Break old habits, do things different ways. Get physical exercise, maintain a good diet and ensure social interaction. Such activities help keep these “subunits” active and functioning.

Gene therapy such as that already used in mice would probably be a last choice for humans, rather than a first option, Magnusson said. Dietary or drug options would be explored first.

“The one thing that does seem fairly clear is that cognitive decline is not inevitable,” she said. “It’s biological, we’re finding out why it happens, and it appears there are ways we might be able to slow or stop it, perhaps repair the NMDA receptors. If we can determine how to do that without harm, we will.”

(Source: oregonstate.edu)

Filed under aging cognitive decline NMDA receptors GluN2B subunit memory neuroscience science



Injuries From Teen Fighting Deal a Blow to IQ

New study explores connection between physical fights, cognitive decline

A new Florida State University study has found that adolescent boys who are hurt in just two physical fights suffer a loss in IQ that is roughly equivalent to missing an entire year of school. Girls experience a similar loss of IQ after only a single fighting-related injury.

The findings are significant because decreases in IQ are associated with lower educational achievement and occupational performance, mental disorders, behavioral problems and even reduced longevity, the researchers said.

“It’s no surprise that being severely physically injured results in negative repercussions, but the extent to which such injuries affect intelligence was quite surprising,” said Joseph A. Schwartz, a doctoral student who conducted the study with Professor Kevin Beaver in FSU’s College of Criminology and Criminal Justice.

Their findings are outlined in the paper, “Serious Fighting-Related Injuries Produce a Significant Reduction in Intelligence,” which was published in the Journal of Adolescent Health. The study is among the first to look at the long-term effects of fighting during adolescence, a critical period of neurological development.

About 4 percent of high school students are injured as a result of a physical fight each year, the researchers said.

Schwartz and Beaver used data from the National Longitudinal Study of Adolescent Health collected between 1994 and 2002 to examine whether serious fighting-related injuries resulted in significant decreases in IQ over a 5- to 6-year time span. The longitudinal study began with a nationally representative sample of 20,000 middle and high school students who were tracked into adulthood through subsequent waves of data collection. At each wave of data collection, respondents were asked about a wide variety of topics, including personality traits, social relationships and the frequency of specific behaviors.

Perhaps not surprisingly, boys experienced a higher number of injuries from fighting than girls; however, the consequences for girls were more severe, a fact the researchers attributed to physiological differences that give males an increased ability to withstand physical trauma.

The researchers found that each fighting-related injury resulted in a loss of 1.62 IQ points for boys, while girls lost an average of 3.02 IQ points, even after controlling for changes in socio-economic status, age and race for both genders. Previous studies have indicated that missing a single year of school is associated with a loss of 2 to 4 IQ points.
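The per-injury estimates above come from a longitudinal model of IQ change with statistical controls. A simplified sketch of that kind of analysis, ordinary least squares with controls for socio-economic status and age, is below. The sample, the distributions, and the simulated effect of 1.6 points per injury are all invented for illustration and are not the study's data or code:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500  # hypothetical sample; the real study tracked thousands of adolescents

# Synthetic predictors: injury counts, standardized SES, and age
injuries = rng.poisson(0.3, n)   # fighting-related injuries per subject
ses = rng.normal(0, 1, n)        # socio-economic status (standardized)
age = rng.uniform(12, 18, n)

# Simulate a true effect of -1.6 IQ points per injury, plus noise
iq_change = -1.6 * injuries + 0.5 * ses + rng.normal(0, 3, n)

# Ordinary least squares: iq_change ~ intercept + injuries + ses + age
X = np.column_stack([np.ones(n), injuries, ses, age])
beta, *_ = np.linalg.lstsq(X, iq_change, rcond=None)

print(f"estimated IQ change per injury: {beta[1]:.2f}")
```

Because the injury effect was built into the simulated outcome, the fitted coefficient on injuries recovers a value close to -1.6 even with the other covariates in the model, which is the logic of "controlling for" confounders in the study.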

The impact on IQ may be even greater when considering only head injuries, the researchers said. The data they studied took into account all fighting-related physical injuries.

The findings highlight the importance of schools and communities developing policies aimed at limiting injuries suffered during adolescence whether through fighting, bullying or contact sports, Schwartz said.

“We tend to focus on factors that may result in increases in intelligence over time, but examining the factors that result in decreases may be just as important,” he said. “The first step in correcting a problem is understanding its underlying causes. By knowing that fighting-related injuries result in a significant decrease in intelligence, we can begin to develop programs and protocols aimed at effective intervention.”

Filed under cognitive decline brain injury fighting IQ adolescence neuroscience psychology science


Certain blood pressure drugs slow dementia deterioration

ACE inhibitors, a class of drugs used to lower blood pressure, slow the rate of cognitive decline typical of dementia, suggests research published in the online journal BMJ Open.

Furthermore, these drugs may even boost brain power, the research indicates.

The researchers compared the rates of cognitive decline in 361 patients who had either been diagnosed with Alzheimer’s disease, vascular dementia, or a mix of both. 

Eighty-five of the patients were already taking ACE inhibitors; the rest were not.

The researchers also assessed the impact of ACE inhibitors on the brain power of 30 patients newly prescribed these drugs, during their first six months of treatment. The average age of all the participants was 77.

Between 1999 and 2010, the cognitive decline of each patient was assessed using either the Standardised Mini Mental State Examination (SMMSE) or the Quick Mild Cognitive Impairment (Qmci) screen on two separate occasions, six months apart.

Compared with those not taking ACE inhibitors, those on these drugs experienced marginally slower rates of cognitive decline. 

In those whose brain power had been assessed by Qmci, which is a more sensitive screen than the SMMSE, the difference was small, but significant.
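A minimal sketch of the comparison described above: compute each patient's change in screening score over six months, then average within each group. The group sizes match the article (85 of 361 patients on ACE inhibitors), but the scores themselves are simulated, not study data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Group sizes from the article: 85 on ACE inhibitors, 276 not
n_ace, n_no = 85, 276

# Synthetic baseline screening scores (on a 0-30 scale, like the SMMSE)
base_ace = rng.normal(22, 3, n_ace)
base_no = rng.normal(22, 3, n_no)

# Assume slightly slower decline on ACE inhibitors, as the study reports
follow_ace = base_ace - rng.normal(0.5, 1.0, n_ace)
follow_no = base_no - rng.normal(1.0, 1.0, n_no)

change_ace = follow_ace - base_ace
change_no = follow_no - base_no

print(f"mean 6-month change, ACE inhibitors:    {change_ace.mean():.2f}")
print(f"mean 6-month change, no ACE inhibitors: {change_no.mean():.2f}")
```

In this toy comparison both groups decline, but the ACE inhibitor group's mean change is less negative, the marginal difference the study describes. A real analysis would also test whether that difference is statistically significant.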

And the brain power of those patients newly prescribed ACE inhibitors actually improved over the six month period, compared with those already taking them, and those not taking them at all.

This might be because these patients stuck to their medication regimen better, or it might be a by-product of better blood pressure control, or improved blood flow to the brain, suggest the authors.

But this is the first evidence to suggest that blood pressure lowering drugs may not only halt cognitive decline, but may actually improve brain power.

“This [study] supports the growing body of evidence for the use of ACE inhibitors and other [blood pressure lowering] agents in the management of dementia,” write the authors. 

“Although the differences were small and of uncertain clinical significance, if sustained over years, the compounding effects may well have significant clinical benefits,” they add.

They caution, however, that recent evidence indicates that ACE inhibitors may be harmful in some cases, so if larger studies confirm that they work well in dementia, it may be only certain groups of patients with the condition who stand to benefit.

(Source: group.bmj.com)

Filed under ACE inhibitors dementia cognitive decline neuroscience science


New clues illuminate Alzheimer’s roots
Scientists at Rice University and the University of Miami have figured out how synthetic molecules designed at Rice latch onto the amyloid peptide fibrils thought to be responsible for Alzheimer’s disease. Their discovery could point the way toward therapies to halt or even reverse the insidious disease.
The metallic dipyridophenazine ruthenium molecules strongly bind to pockets created when fibrils form from misfolded proteins that cells fail to destroy. When excited under a spectroscope, the molecules luminesce, which indicates the presence of the fibrils. That much was known by Rice researchers, but until now the process was a mystery.
By combining their talents in biophysics (at Rice) and computer simulation (at Miami), researchers pinpointed four such pockets along the fibril where the hydrophobic (water-averse) molecules can bind. They believe their work will help chemists design molecules to keep the fibrils from forming the plaques found in Alzheimer’s patients.
The teams led by Rice chemist Angel Martí and Miami chemist Rajeev Prabhakar reported their results in the Journal of the American Chemical Society this month.
Two years ago, Martí and Nathan Cook, a graduate student in his lab and lead author of the new paper, combined ruthenium complexes with solutions containing the spaghetti-like amyloid fibrils. The complexes don’t luminesce by themselves, but when they link to an amyloid fibril, they can be triggered by light at one wavelength to glow at another; this helps the researchers “see” the fibrils.
This ability to track amyloids was a great step forward, but left open the question of why the complexes latched onto the fibrils at all, Cook said.
“We had no way to figure it out because our experimental techniques can’t identify binding sites,” he said. “The standard (used to analyze proteins) is to crystallize your material and use X-rays to determine where everything is positioned. The problem with amyloid beta is the fibrils are not uniform, and you can’t crystallize them. All you would get is an amorphous lump.”
But a door opened when Prabhakar, a theoretical and computational chemist who specializes in amyloids, contacted Martí and suggested a collaboration. “We both knew the other was working with amyloid betas,” Martí said. “We were able to figure out how many amyloid beta monomers (molecules that can bind with each other) had to come together to form fibrils, while he modeled the interactions. When we brought all the data together, we had a perfect match.”
“Basically, we learned from the model that we need two monomers to form a binding site,” Martí said. “The cleft where the ruthenium complex binds is completely hydrophobic, the same as the complex. Neither wants to be exposed to water, so when they find each other, they don’t have a choice but to come together. It turns out that’s exactly what needs to happen to turn on the photoluminescent response of the compound.”
Martí said testing various concentrations of monomers with ruthenium complexes helped them determine that a little more than two monomers, on average, was sufficient to get the “light switch” effect. Prabhakar’s analysis found four specific locations along the aggregating monomers where the ruthenium complexes could bind: two at the ends where the monomers tend to bind to each other, and two in the middle.
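The titration logic described here can be sketched numerically. This is a hedged toy illustration, not the Martí lab’s actual analysis; the fluorescence values and the half-maximum threshold are invented purely for demonstration:

```python
# Hypothetical titration: luminescence signal as a function of the
# monomer-to-ruthenium-complex ratio. In this toy dataset the "light
# switch" turns on once roughly two monomers per complex are present.
ratios = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 4.0]
signal = [0.02, 0.03, 0.05, 0.40, 0.85, 0.95, 0.97]

# Crude onset estimate: the first ratio at which the signal exceeds
# half of its maximum value.
threshold = max(signal) / 2
onset_ratio = next(r for r, s in zip(ratios, signal) if s >= threshold)
print(onset_ratio)  # 2.5 with this toy data
```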
“It was a complicated system to model and we tried hard, using a variety of computational techniques,” Prabhakar said. “In the end, we were amazed to find our results in perfect agreement with the experiments performed in the Martí lab.”
The researchers called the end locations “A and B,” and the middle clefts “C and D.” The hydrophobic A and B sites exist only at the edges of the fibrils, which limits their exposure to the complexes, Martí said. “But there are lots of C and D sites,” he said. “That explains why the ruthenium complexes don’t inhibit the aggregation of fibrils. It seems the system prefers to bind another monomer, rather than a ruthenium complex, at the ends.
“But now that we understand the mechanism, we can design more hydrophobic complexes that could bind strongly to the ends and prevent further elongation of the fibril,” he said.
“There’s a whole variety of ways to tweak this that could potentially disrupt a binding pocket,” Cook said.
More challenges lie beyond the new discovery, he said. New research indicates toxic oligomers may be catalyzed by the formation of amyloid fibrils. “We might be able to prevent the formation of these oligomeric species by binding ruthenium complexes to the surface, which would completely change the surface chemistry of the fibrils,” Martí said. “These are the things we are really interested in doing right now.”

Filed under alzheimer's disease beta amyloid dementia cognitive decline oligomers amyloid fibrils neuroscience science

39 notes

Path of Plaque Buildup in Brain Shows Promise as Early Biomarker for Alzheimer’s Disease

The trajectory of amyloid plaque buildup (clumps of abnormal proteins in the brain linked to Alzheimer’s disease) may serve as a more powerful biomarker for early detection of cognitive decline than the total amount of plaque, researchers from Penn Medicine’s Department of Radiology suggest in a new study published online July 15 in the journal Neurobiology of Aging.

Amyloid plaque that starts to accumulate relatively early in the temporal lobe, compared to other areas and in particular to the frontal lobe, was associated with cognitively declining participants, the study found. “Knowing that certain brain abnormality patterns are associated with cognitive performance could have pivotal importance for the early detection and management of Alzheimer’s,” said senior author Christos Davatzikos, PhD, professor in the Department of Radiology, the Center for Biomedical Image Computing and Analytics, at the Perelman School of Medicine at the University of Pennsylvania.

Today, memory decline and Alzheimer’s, which 5.4 million Americans live with, are often assessed with a variety of tools, including physical and biofluid tests and neuroimaging of total amyloid plaque in the brain. Past studies have linked higher amounts of plaque in dementia-free people with greater risk of developing the disorder. However, it has more recently been shown that nearly a third of people with plaque in their brains never show signs of cognitive decline, raising questions about its specific role in the disease.

Now, Dr. Davatzikos and his Penn colleagues, in collaboration with a team led by Susan M. Resnick, PhD, Chief, Laboratory of Behavioral Neuroscience at the National Institute on Aging (NIA), used Pittsburgh compound B (PiB) brain scans from the Baltimore Longitudinal Study of Aging’s Imaging Study and discovered a stronger association between memory decline and spatial patterns of amyloid plaque progression than the total amyloid burden.

“It appears to be more about the spatial pattern of this plaque progression, and not so much about the total amount found in brains. We saw a difference in the spatial distribution of plaques among cognitive declining and stable patients whose cognitive function had been measured over a 12-year period. They had similar amounts of amyloid plaque, just in different spots,” Dr. Davatzikos said. “This is important because it potentially answers questions about the variability seen in clinical research among patients presenting plaque. It accumulates in different spatial patterns for different patients, and it’s that pattern growth that may determine whether your memory declines.”

The team, including first author Rachel A. Yotter, PhD, a postdoctoral researcher in the Section for Biomedical Image Analysis, retrospectively analyzed the PET PiB scans of 64 patients from the NIA’s Baltimore Longitudinal Study of Aging whose average age was 76. For the study, researchers created a unique picture of patients’ brains by combining and analyzing PET images measuring the density, volume, and spatial distribution of amyloid plaque within the brain. The PiB radiotracer allowed investigators to see temporal changes in amyloid deposition.

Those images were then compared with participants’ California Verbal Learning Test (CVLT) scores, among other tests, to determine longitudinal cognitive decline. The group was then split into two subgroups: the most stable and the most declining individuals (26 participants).

Despite lack of significant difference in the total amount of amyloid in the brain, the spatial patterns between the two groups (stable and declining) were different, with the former showing relatively early accumulation in the frontal lobes and the latter in the temporal lobes.   
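The distinction the study draws (same total burden, different spatial pattern) can be made concrete with a toy example. The regional values below are invented purely to illustrate the idea and are not taken from the BLSA data:

```python
regions = ["frontal", "temporal", "parietal", "occipital"]

# Two hypothetical patients with identical total amyloid burden,
# distributed differently across lobes:
stable    = {"frontal": 4.0, "temporal": 1.0, "parietal": 2.0, "occipital": 1.0}
declining = {"frontal": 1.0, "temporal": 4.0, "parietal": 2.0, "occipital": 1.0}

assert sum(stable.values()) == sum(declining.values())  # totals match: 8.0

def pattern(burden):
    """Normalise a regional burden map to proportions of the total."""
    total = sum(burden.values())
    return {r: burden[r] / total for r in regions}

# Identical totals, but the proportions expose the spatial difference:
print(pattern(stable)["temporal"], pattern(declining)["temporal"])  # 0.125 0.5
```

A total-burden measure cannot separate these two patients; only the normalised spatial pattern can, which is the study’s central point.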

A particular area of the brain may be affected early or later depending on the amyloid trajectory, according to the authors, which in turn would affect cognitive impairment. Areas affected early with the plaque include the lateral temporal and parietal regions, with sparing of the occipital lobe and motor cortices until later in disease progression.

“This finding has broad implications for our understanding of the relationship between cognitive decline and resistance and amyloid plaque location, as well as the use of amyloid imaging as a biomarker in research and the clinic,” said Dr. Davatzikos. “The next step is to investigate more individuals with mild cognitive impairment, and to further investigate the follow-up scans of these individuals via the BLSA study, which might shed further light on its relevance for early detection of Alzheimer’s.”

(Source: uphs.upenn.edu)

Filed under alzheimer's disease dementia cognitive decline amyloid plaques temporal lobe neuroscience science

136 notes

Protein Linked to Cognitive Decline in Alzheimer’s Identified
Researchers at Columbia University Medical Center (CUMC) have demonstrated that a protein called caspase-2 is a key regulator of a signaling pathway that leads to cognitive decline in Alzheimer’s disease. The findings, made in a mouse model of Alzheimer’s, suggest that inhibiting this protein could prevent the neuronal damage and subsequent cognitive decline associated with the disease. The study was published this month in the online journal Nature Communications.
One of the earliest events in Alzheimer’s is disruption of the brain’s synapses (the small gaps across which nerve impulses are passed), which can lead to neuronal death. Although what drives this process has not been clear, studies have indicated that caspase-2 might be involved, according to senior author Michael Shelanski, MD, PhD, the Delafield Professor of Pathology & Cell Biology, chair of the Department of Pathology and Cell Biology, and co-director of the Taub Institute for Research on Alzheimer’s Disease and the Aging Brain at CUMC.
Several years ago, in tissue culture studies of mouse neurons, Dr. Shelanski found that caspase-2 plays a critical role in the death of neurons in the presence of amyloid beta, the protein that accumulates in the neurons of people with Alzheimer’s. Other researchers have shown that caspase-2 also contributes to the maintenance of normal synaptic functions.
Dr. Shelanski and his team hypothesized that aberrant activation of caspase-2 may cause synaptic changes in Alzheimer’s disease. To test this hypothesis, the researchers crossed J20 transgenic mice (a common mouse model of Alzheimer’s) with caspase-2 null mice (mice that lack caspase-2). They compared the animals’ ability to negotiate a radial-arm water maze, a standard test of cognitive ability, with that of regular J20 mice and of normal mice at 4, 9, and 14 months of age.
The results for the three groups of mice were similar at the first two intervals. At 14 months, however, the J20/caspase-2 null mice did significantly better in the water maze test than the J20 mice and similarly to the normal mice. “We showed that removing caspase-2 from J20 mice prevented memory impairment — without significant changes in the level of soluble amyloid beta,” said co-lead author Roger Lefort, PhD, associate research scientist at CUMC.
Analysis of the neurons showed that the J20/caspase-2 null mice had a higher density of dendritic spines than the J20 mice. The more spines a neuron has, the more impulses it can transmit.
“The J20/caspase-2 null mice showed the same dendritic spine density and morphology as the normal mice—as opposed to the deficits in the J20 mice,” said co-lead author Julio Pozueta, PhD. “This strongly suggests that caspase-2 is a critical regulator in the memory decline associated with beta-amyloid in Alzheimer’s disease.”
The researchers further validated the results in studies of rat neurons in tissue culture.
Finally, the researchers found that caspase-2 interacts with RhoA, a critical regulator of the morphology (form and structure) of dendritic spines. “It appears that in normal neurons, caspase-2 and RhoA form an inactive complex outside the dendritic spines,” said Dr. Lefort. “When the complex is exposed to amyloid beta, it breaks apart, activating the two components.” Once activated, caspase-2 and RhoA enter the dendritic spines and contribute to their demise, possibly by interacting with a third molecule, the enzyme ROCK-II.
“This raises the possibility that if you can inhibit one or all of these molecules, especially early in the course of Alzheimer’s, you might be able to protect neurons and slow down the cognitive effects of the disease,” said Dr. Lefort.

Filed under alzheimer's disease beta amyloid dementia cognitive decline neurotransmission neuroscience science
