Neuroscience

Articles and news from the latest research reports.

Posts tagged cognitive decline


New Alzheimer’s research suggests possible cause: the interaction of proteins in the brain
Research shows interaction of tau and amyloid-beta in the brain may cause cognitive decline
For years, Alzheimer’s researchers have focused on two proteins that accumulate in the brains of people with Alzheimer’s and may contribute to the disease: plaques made up of the protein amyloid-beta, and tangles of another protein, called tau.
But for the first time, an Alzheimer’s researcher has looked closely not at the two proteins independently but at how they interact with each other, both in post-mortem brain tissue from Alzheimer’s patients and in the brains of mouse models of the disease. The research found that the interaction between the two proteins might be the key: as these interactions increased, the progression of Alzheimer’s disease worsened.
The research, by Hemachandra Reddy, Ph.D., an associate scientist at the Oregon National Primate Research Center at Oregon Health & Science University, is detailed in the June 2013 edition of the Journal of Alzheimer’s Disease.
Reddy’s paper suggests that when phosphorylated tau and amyloid-beta, particularly in its toxic form, interact at brain synapses, they can damage those synapses. And that can lead to cognitive decline in Alzheimer’s patients.
"This complex formation between amyloid-beta and tau — it is actually blocking the neural communication," Reddy said. "If we could somehow find a molecule that could inhibit the binding of these two proteins at the synapses, that very well might be the cure to Alzheimer’s disease."
To conduct the research, Reddy and his team studied three different lines of mice that had been bred to show some of the brain characteristics of Alzheimer’s disease, including the accumulation of amyloid-beta and phosphorylated tau. Reddy also analyzed postmortem brain tissue from people who had Alzheimer’s disease.
Using multiple antibodies that recognize amyloid-beta and phosphorylated tau, Reddy and Maria Manczak, Ph.D., a research associate in Reddy’s laboratory, looked specifically for evidence of amyloid-beta and phosphorylated tau interactions. They found amyloid-beta/tau complexes in the human Alzheimer’s brain tissue and in the Alzheimer’s disease mouse brains, and far more of these complexes in brains where Alzheimer’s disease had progressed the furthest.
Reddy found very little or no evidence of the same interaction in the “control” subjects — mice that did not have the Alzheimer’s traits and human brain tissue of people who did not have Alzheimer’s.
"So much Alzheimer’s research has been done to look at amyloid-beta and tau," Reddy said. "But ours is the first paper to strongly demonstrate that yes, there is an amyloid-beta/phosphorylated tau interaction. And that interaction might be causing the synaptic damage and cognitive decline in persons with Alzheimer’s disease."
Reddy and his lab are already working on the next crucial questions. One is to identify the binding site or sites and pinpoint exactly where within the neuron the interaction of amyloid-beta and tau first occurs. The second is to find a way to inhibit that interaction, and thus perhaps prevent or slow the progression of Alzheimer’s.
Manczak was a co-author on the Journal of Alzheimer’s Disease article.


Filed under alzheimer's disease dementia tau protein cognitive decline phosphorylated tau neuroscience science


Rapid, Irregular Heartbeat May Be Linked to Problems with Memory and Thinking 
People who develop a type of irregular heartbeat common in old age called atrial fibrillation may also be more likely to develop problems with memory and thinking, according to new research published in the June 5, 2013, online issue of Neurology®, the medical journal of the American Academy of Neurology.
“Problems with memory and thinking are common for people as they get older. Our study shows that on average, problems with memory and thinking may start earlier or get worse more quickly in people who have atrial fibrillation,” said study author Evan L. Thacker, PhD, of the University of Alabama at Birmingham. “This means that heart health is an important factor related to brain health.”
The study involved people age 65 and older from four communities in the United States who were enrolled in the Cardiovascular Health Study. Participants did not have a history of atrial fibrillation or stroke at the start of the study. They were followed for an average of seven years, and received a 100-point memory and thinking test every year. People who had a stroke were not included in this analysis after the stroke. Of the 5,150 participants, 552, or about 11 percent, developed atrial fibrillation during the study.
The study found that people with atrial fibrillation were more likely to experience lower memory and thinking scores at earlier ages than people with no history of atrial fibrillation. For example, from age 80 to age 85 the average score on the 100-point test went down by about 6 points for people without atrial fibrillation, but it went down by about 10 points for people with atrial fibrillation.
For participants ages 75 and older, the average rate of decline was about three to four points faster per five years of aging with atrial fibrillation compared to those without the condition.
“This suggests that on average, people with atrial fibrillation may be more likely to develop cognitive impairment or dementia at earlier ages than people with no history of atrial fibrillation,” Thacker said.
Thacker noted that scores below 78 points on the 100-point test are suggestive of dementia. People without atrial fibrillation in the study were predicted on average to score below 78 points at age 87, while people with atrial fibrillation were predicted to score below 78 points at age 85, two years earlier.
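The threshold figures above can be illustrated with a simple back-of-the-envelope projection. This sketch assumes a purely linear decline and a hypothetical starting score of 86 points at age 80; it illustrates the arithmetic only, not the study’s actual statistical model:

```python
# Back-of-the-envelope sketch: project the age at which a linearly
# declining cognitive score drops below the dementia-suggestive
# threshold of 78 points. The starting score (86 at age 80) is
# hypothetical; the decline rates are the averages quoted above.

def age_crossing_threshold(score_at_80, points_lost_per_5_years, threshold=78):
    rate_per_year = points_lost_per_5_years / 5.0
    return 80 + (score_at_80 - threshold) / rate_per_year

# Without atrial fibrillation: about 6 points lost per 5 years.
print(round(age_crossing_threshold(86, 6)))   # roughly age 87

# With atrial fibrillation: about 10 points lost per 5 years.
print(round(age_crossing_threshold(86, 10)))  # age 84
```

With these illustrative inputs, the gap between the two projected ages is about three years, on the same order as the two-year gap the study reports.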
“If there is indeed a link between atrial fibrillation and memory and thinking decline, the next steps are to learn why that decline happens and how we can prevent that decline,” said Thacker.


Filed under atrial fibrillation cognitive decline cognition irregular heartbeat medicine neuroscience science


Heart Health Matters to Your Brain
People suffering from type 2 diabetes and cardiovascular disease (CVD) are at an increased risk of cognitive decline, according to a new study from Wake Forest Baptist Medical Center.
Lead author Christina E. Hugenschmidt, Ph.D., an instructor of gerontology and geriatric medicine at Wake Forest Baptist, said the results from the Diabetes Heart Study-Mind (DHS-Mind) suggest that CVD is playing a role in cognition problems before it is clinically apparent in patients. The research appears online ahead of print in the Journal of Diabetes and Its Complications.
“There has been a lot of research looking at the links between type 2 diabetes and increased risk for dementia, but this is the first study to look specifically at subclinical CVD and the role it plays,” Hugenschmidt said. “Our research shows that CVD risk caused by diabetes even before it’s at a clinically treatable level might be bad for your brain.
"The results imply that additional CVD factors, especially calcified plaque and vascular status, and not diabetes status alone, are major contributors to type 2 diabetes related cognitive decline."
Hugenschmidt said DHS-Mind is a follow-up study to the Diabetes Heart Study (DHS), which examined relationships between cognitive function, vascular calcified plaque and other major diabetes risk factors associated with cognition. The DHS investigated CVD in siblings with a high incidence and prevalence of type 2 diabetes, where extensive measurements of CVD risk factors were obtained during exams that occurred from 1998 to 2006.
The study was supported by the National Institutes of Health through NINDS R01NS058700-02S109 and NIDDK 1F32DK083214-01.
The DHS-Mind study added cognitive testing to existing measures with the express purpose of exploring the relationships between measures of atherosclerosis and cognition in a population heavily affected by diabetes, a novel approach given that previous studies have focused on diabetes and cognition in the context of clinically evident CVD, Hugenschmidt said. The researchers followed up with as many of the original 1,443 DHS participants with cardiovascular measures as possible. Of the 516 who took part, 422 had type 2 diabetes and 94 did not.
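As a quick sanity check (not part of the study itself), the participant counts reported above are internally consistent:

```python
# Cross-check of the DHS-Mind participant counts quoted above.
original_cohort = 1443  # participants in the original Diabetes Heart Study
followed_up = 516       # DHS-Mind participants with cardiovascular measures
with_t2d = 422          # affected with type 2 diabetes
without_t2d = 94        # unaffected

assert with_t2d + without_t2d == followed_up
print(f"follow-up: {100 * followed_up / original_cohort:.0f}% of the original cohort")
```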
Hugenschmidt said the researchers ran a battery of cognitive tests that looked at different kinds of thinking, such as memory and processing speed, as well as executive function, a set of mental skills coordinated in the brain’s frontal lobe that includes stop-and-think processes such as managing time and attention, planning and organizing. She said that having siblings as the comparison group, some of whom had a high level of CVD themselves, made the results more clinically relevant because the participants shared the same environmental and genetic background.
"We still saw a difference between these two groups. Even compared to their own siblings who were not disease free, those with diabetes and subclinical cardiovascular disease had a higher risk of cognitive dysfunction," Hugenschmidt said.
CVD explains a lot of the cognitive problems that people with diabetes experience, Hugenschmidt said. “One possibility is that your brain requires a really steady blood flow and it’s possible that the cardiovascular disease that accompanies diabetes might be the main driver behind the cognitive deficits that we see.”
Hugenschmidt said the takeaway for clinicians is to take CVD risk factors into consideration when treating patients with type 2 diabetes, because even at borderline clinical levels, CVD might have long-term implications for patients’ cognitive health.


Filed under cardiovascular disease diabetes cognitive decline neurodegeneration neuroscience science


Microbleeding in Brain May Be Behind Senior Moments
People may grow wiser with age, but they don’t grow smarter. Many of our mental abilities decline after midlife, and now researchers say that they’ve fingered a culprit. A study presented last week at the annual meeting of the Association for Psychological Science points to microbleeding in the brain caused by stiffening arteries. The finding may lead to new therapies to combat senior moments.
This isn’t the first time that microbleeds have been suspected as a cause of cognitive decline. “We have known [about them] for some time thanks to neuroimaging studies,” says Matthew Pase, a psychology Ph.D. student at Swinburne University of Technology in Melbourne, Australia. The brains of older people are sometimes peppered with dark splotches where blood vessels have burst and created tiny dead zones of tissue. How important these microbleeds are to cognitive decline, and what causes them, have remained open questions, however.
Pase wondered if high blood pressure might be behind the microbleeds. The brain is a very blood-hungry organ, he notes. “It accounts for only 2% of the body weight yet receives 15% of the cardiac output and consumes 20% of the body’s oxygen expenditure.” Rather than getting the oxygen in pulses, the brain needs a smooth, continuous supply. So the aorta, the largest blood vessel branching off the heart, smooths out blood pressure before it reaches the brain by absorbing the pressure with its flexible walls. But as people age, the aorta stiffens. That translates to higher pressure on the brain, especially during stress. The pulse of blood can be strong enough to burst vessels in the brain, resulting in microbleeds.
A stumbling block has been accurately measuring the blood pressure that the brain experiences. The hand-pumped armband devices commonly used in doctors’ offices measure only the local pressure of blood in the arm, known as the brachial pressure. To calculate aorta stiffness, the “central blood pressure” in the aorta is needed. A technique for measuring central blood pressure, called applanation tonometry (AT), was developed in the late 1990s. It works by comparing the pressure wave of blood leaving the heart with the reflected pressure wave from the vessels farthest from the heart; aorta stiffness is calculated from the difference between the two. Fast, painless AT devices have since appeared on the market.
To see if central blood pressure and aorta stiffening are related to cognitive abilities, Pase and colleagues recruited 493 people in Melbourne, 20 to 82 years old. They made traditional blood pressure measurements and also used AT to measure central blood pressure and estimate aorta stiffness. They also measured their subjects’ cognitive abilities with a standard battery of computer tests.
Central blood pressure and aorta stiffness alone were sensitive predictors of cognitive abilities, Pase reported at the meeting. The higher the central pressure and aorta stiffness, the worse people tended to perform on tests of visual processing and memory. The traditional measures of blood pressure in the arm were correlated only with scores on one test of visual processing.
To prove that aorta stiffening causes microbleeds, the researchers will need to repeat the experiment on the same people over the course of several years, using neuroimaging as well to establish that aorta stiffening leads to both microbleeding and cognitive decline. Pase notes that other causes of microbleeding have been proposed, such as weakening of blood vessels in the brain.
"This work is so important because the problem is so pervasive," says Earl Hunt, a veteran intelligence researcher at the University of Washington, Seattle, who was not involved in the work. The individual effects of these microbleeds are probably too small to measure. "But even a trifling difference multiplied a million times is big," he says. Pase’s collaborator at Swinburne, Con Stough, is now leading a study of how to prevent microbleeding through dietary supplements. He proposes that the elasticity of the aorta could be preserved by providing fatty acids or antioxidants that help maintain its structure. The results are expected in 2015.


Filed under brain microbleeding cognitive decline blood vessels blood pressure psychology neuroscience science


Exposure to general anaesthesia could increase the risk of dementia in elderly by 35 percent

Exposure to general anaesthesia is associated with a 35% increase in the risk of dementia in the elderly, according to new research presented at Euroanaesthesia, the annual congress of the European Society of Anaesthesiology (ESA). The research is by Dr Francois Sztark, INSERM and University of Bordeaux, France, and colleagues.

Postoperative cognitive dysfunction, or POCD, could be associated with dementia several years later. POCD is a common complication in elderly patients after major surgery. It has been proposed that there is an association between POCD and the development of dementia due to a common pathological mechanism through the amyloid β peptide. Several experimental studies suggest that some anaesthetics could promote inflammation of neural tissues leading to POCD and/or Alzheimer’s disease (AD) precursors including β-amyloid plaques and neurofibrillary tangles. But it remains uncertain whether POCD can be a precursor of dementia.

In this new study, the researchers analysed the risk of dementia associated with anaesthesia within a prospective population-based cohort of elderly patients (aged 65 years and over). The team used data from the Three-City study, designed to assess the risk of dementia and cognitive decline due to vascular risk factors. Between 1999 and 2001, the 3C study included 9294 community-dwelling French people aged 65 years and over in three French cities (Bordeaux, Dijon and Montpellier).

Participants aged 65 years and over were interviewed at baseline and again 2, 4, 7 and 10 years later. Each examination included a complete cognitive evaluation with systematic screening for dementia. From the 2-year follow-up onwards, 7008 non-demented participants were asked at each follow-up whether they had had any anaesthesia (general anaesthesia (GA) or local/locoregional anaesthesia (LRA)) since the previous follow-up. The data were adjusted for potential confounders such as socioeconomic status and comorbidities.

The mean age of participants was 75 years, and 62% were women. At the 2-year follow-up, 33% of the participants (n=2309) reported anaesthesia over the previous 2 years, with 19% (n=1333) reporting a GA and 14% (n=948) an LRA. A total of 632 participants (9%) developed dementia over the 8 subsequent years of follow-up: 284 cases of probable AD, 228 of possible AD, and 120 of non-Alzheimer’s dementia. The researchers found that demented patients were more likely to have received anaesthesia (37%) than non-demented patients (32%). This difference was driven by general anaesthesia, with 22% of demented patients reporting a GA compared with 19% of non-demented patients. After adjustment, participants with at least one GA during follow-up had a 35% higher risk of developing dementia than participants without anaesthesia.
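The percentages quoted above can be checked against the raw counts; this quick script (counts taken directly from the article) reproduces them:

```python
# Sanity check: do the reported counts match the reported percentages?
followed = 7008  # non-demented participants from the 2-year follow-up

counts = {
    "any anaesthesia": 2309,    # reported as 33%
    "general (GA)": 1333,       # reported as 19%
    "locoregional (LRA)": 948,  # reported as 14%
    "developed dementia": 632,  # reported as 9%
}

for label, n in counts.items():
    print(f"{label}: {100 * n / followed:.0f}%")
```

Each rounded proportion matches the figure in the text (33%, 19%, 14% and 9%).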

Dr Sztark concludes: “These results are in favour of an increased risk for dementia several years after general anaesthesia. Recognition of POCD is essential in the perioperative management of elderly patients. A long-term follow-up of these patients should be planned.”

(Source: eurekalert.org)

Filed under anaesthesia dementia amyloid plaques cognitive decline socioeconomic status neuroscience science


Anti-cancer drug viewed as possible Alzheimer’s treatment doesn’t work in UF study

An anti-cancer drug about to be tested in a clinical trial by a biomedical company in Ohio as a possible treatment for Alzheimer’s disease has failed to reduce the same type of brain plaques that plague Alzheimer’s patients, according to results of a study by University of Florida researchers.

David Borchelt, Ph.D., a professor of neuroscience affiliated with the Evelyn F. and William L. McKnight Brain Institute of the University of Florida, emphasized the importance of verifying promising research results before investing in clinical studies or testing potential therapies in people. Bexarotene has known side effects on the liver, blood and other metabolic systems.

“We wanted to repeat the study to see if we could build on it, and we couldn’t,” he said. “We thought it was important that something like this, which got a lot of publicity and patients were immediately looking to try to get access to this drug, that it was important to publish the fact that we couldn’t reproduce the most exciting part of the study. Maybe there should be some caution going forward in regard to patients.”

Borchelt and Kevin Felsenstein, Ph.D., an associate professor of neuroscience, said a drug called bexarotene that their team orally administered to mice did not reduce amyloid plaques, waxy buildups on the brain that are a key culprit in Alzheimer’s disease. Their findings will be published in the May 24, 2013 issue of the journal Science, alongside two additional articles detailing similar results from other researchers.

The research follows up on a 2012 Science article that claimed bexarotene had reversed Alzheimer’s-like symptoms in mice afflicted with the plaques. Authors of that study also administered the drug orally.

The paper “indicated that with as little as three days of treatment, they basically cleared the amyloid deposits from these animals, as well as restored cognitive abilities,” Felsenstein said of the 2012 paper. He said the results of the original study were surprising, given decades of research that had failed to find a therapy capable of dismantling amyloid plaques.

“We can shut down the production of amyloid in these animal models and the deposits in these animal models don’t disappear,” Felsenstein said. “These deposits have been described by some as cement, and it will take a lot to get rid of them. The fact that something could actually make them disappear in literally a couple of days is — again — very remarkable.”

Interested to see how bexarotene might work to break down amyloid plaques, Felsenstein and Borchelt selected mice approximately the same age as those used in the 2012 study and orally administered the drug to the mice. Tests confirmed the drug had reached its target genes in the mice, and that it elevated levels of a protein called apolipoprotein E. Some scientists believe one of the forms of this protein may prevent the buildup of amyloid brain plaques in people who don’t have Alzheimer’s disease.

But elevated levels of the protein in the mice studied by UF researchers seemed to have no effect on the animals’ amyloid plaques. Samples taken after seven days of treatment with bexarotene showed no significant difference in the number or size of plaques in the animals’ brains. Two teams of researchers from other institutions also were unable to replicate the breakdown of amyloid plaques.

Felsenstein emphasized that his team does not claim the previous study indicating bexarotene’s effectiveness is “totally wrong.”

“We’re just saying right now it’s extremely difficult to replicate and there may be little nuances, that there’s something that we don’t quite understand,” he added. Felsenstein and Borchelt both work at UF’s Center for Translational Research in Neurodegenerative Disease.

(Source: ufhealth.org)

Filed under alzheimer's disease cognitive decline amyloid plaques anti-cancer drug bexarotene neuroscience science

65 notes

White matter imaging provides insight into human and chimpanzee aging

The instability of “white matter” in humans may contribute to greater cognitive decline during the aging of humans compared with chimpanzees, scientists from Yerkes National Primate Research Center, Emory University have found.

Yerkes scientists have discovered that white matter — the wires connecting the computing centers of the brain — begins to deteriorate earlier in the human lifespan than in the lives of aging chimpanzees.

This was the first examination of white matter integrity in aging chimpanzees. The results were published April 24 and are available online before print in the journal Neurobiology of Aging.

“Our study demonstrates that the price we pay for greater longevity than other primates may be the unique vulnerability of humans to neurodegenerative disease,” says research associate Xu (Jerry) Chen, first author of the paper. “The breakdown of white matter in later life could be part of that vulnerability.”

Both humans’ longer life spans and distinctive metabolism could lie behind the differences in the patterns of brain aging, says co-author Todd Preuss, PhD, associate research professor in Yerkes’ Division of Neuropharmacology and Neurologic Diseases.

“White matter integrity actually peaks around the same absolute age in both chimpanzees and humans, but humans may experience more degradation because they live longer. Perhaps the need to retain brain capacity late in life is one reason increased brain size was selected for in human evolution,” Preuss says.

The senior author is James Rilling, PhD, Yerkes researcher, associate professor of anthropology at Emory and director of the Laboratory for Darwinian Neuroscience. Collaborators at the University of Oslo also contributed to the paper.

In the brain, gray matter represents information processing centers, while white matter represents wires connecting these centers. White matter looks white because it is made up of myelin, a fatty electrical insulator that coats the axons of neurons.

If myelin deteriorates, neurons’ electrical signals are not transmitted as effectively, which contributes to cognitive decline. Myelin breakdown has been linked with cognitive decline both in healthy aging and in the context of Alzheimer’s disease.

The team’s data show that white matter integrity, as measured through a form of magnetic resonance imaging (MRI), peaks at age 31 in chimpanzees and at age 30 in humans. The average lifespan of chimpanzees is between 40 and 45 years, although in zoos or research facilities some have lived to 60. For comparison, human life expectancy in some developed countries is more than 80 years.

“The human equivalent of a 31-year-old chimpanzee is about 47 years,” Rilling says. “Extrapolating from chimpanzees, we could expect that human white matter integrity would peak at age 47, but instead it peaks and begins to decline at age 30.”

The researchers collected MRI scans from 32 female chimpanzees and 20 female rhesus macaques and compared them with a pre-existing set of scans from human females. They used diffusion-weighted imaging (a form of MRI) to examine age-related changes in white matter integrity.

Diffusion-weighted imaging picks up microscopic changes in white matter by detecting directional differences in the ability of water molecules to diffuse. When the myelin coating of axons breaks down, water molecules in the brain can diffuse more freely, especially in directions perpendicular to axon bundles, Chen says.
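A standard scalar summary of such directional diffusion measurements is fractional anisotropy (FA), computed from the three eigenvalues of the diffusion tensor; FA falls as diffusion perpendicular to the axon bundles rises. The sketch below uses the standard FA formula with purely illustrative eigenvalues, not the study’s data:

```python
from math import sqrt

def fractional_anisotropy(l1, l2, l3):
    """FA from the three diffusion-tensor eigenvalues (standard formula)."""
    mean = (l1 + l2 + l3) / 3
    num = sqrt((l1 - mean) ** 2 + (l2 - mean) ** 2 + (l3 - mean) ** 2)
    den = sqrt(l1 ** 2 + l2 ** 2 + l3 ** 2)
    return sqrt(1.5) * num / den

# Illustrative eigenvalues (units: um^2/ms). With intact myelin, diffusion
# along the axon (l1) dominates; myelin breakdown raises the perpendicular
# eigenvalues (l2, l3), so FA drops.
fa_intact = fractional_anisotropy(1.7, 0.3, 0.3)    # ~0.80
fa_degraded = fractional_anisotropy(1.7, 0.7, 0.7)  # ~0.51
assert fa_degraded < fa_intact
```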

(Source: news.emory.edu)

Filed under brain primates aging cognitive decline white matter evolution neuroscience science

74 notes

Alzheimer’s markers predict start of mental decline

Scientists at Washington University School of Medicine in St. Louis have helped identify many of the biomarkers for Alzheimer’s disease that could potentially predict which patients will develop the disorder later in life. Now, studying spinal fluid samples and health data from 201 research participants at the Charles F. and Joanne Knight Alzheimer’s Disease Research Center, the researchers have shown the markers are accurate predictors of Alzheimer’s years before symptoms develop.

“We wanted to see if one marker was better than the other in predicting which of our participants would get cognitive impairment and when they would get it,” said Catherine Roe, PhD, research assistant professor of neurology. “We found no differences in the accuracy of the biomarkers.”

The study, supported in part by the National Institute on Aging, appears in Neurology.

The researchers evaluated markers such as the buildup of amyloid plaques in the brain, newly visible thanks to an imaging agent developed in the last decade; levels of various proteins in the cerebrospinal fluid, such as the amyloid fragments that are the principal ingredient of brain plaques; and the ratios of one protein to another in the cerebrospinal fluid, such as different forms of the brain cell structural protein tau.

The markers were studied in volunteers whose ages ranged from 45 to 88. On average, the data available on study participants spanned four years, with the longest recorded over 7.5 years.

The researchers found that all of the markers were equally good at identifying subjects who were likely to develop cognitive problems and at predicting how soon they would become noticeably impaired.

Next, the scientists paired the biomarkers data with demographic information, testing to see if sex, age, race, education and other factors could improve their predictions.

“Sex, age and race all helped to predict who would develop cognitive impairment,” Roe said. “Older participants, men and African Americans were more likely to become cognitively impaired than those who were younger, female and Caucasian.”

Roe described the findings as providing more evidence that scientists can detect Alzheimer’s disease years before memory loss and cognitive decline become apparent.

“We can better predict future cognitive impairment when we combine biomarkers with patient characteristics,” she said. “Knowing how accurate biomarkers are is important if we are going to some day be able to treat Alzheimer’s before symptoms and slow or prevent the disease.”

Clinical trials are already underway at Washington University and elsewhere to determine if treatments prior to symptoms can prevent or delay inherited forms of Alzheimer’s disease. Reliable biomarkers for Alzheimer’s should one day make it possible to test the most successful treatments in the much more common sporadic forms of Alzheimer’s.

(Source: news.wustl.edu)

Filed under biomarkers alzheimer's disease cognitive decline amyloid plaques neuroimaging neuroscience science

71 notes

A little brain training goes a long way

People who use a ‘brain-workout’ program for just 10 hours have a mental edge over their peers even a year later, researchers report today in PLoS ONE.

The search for a regimen of mental callisthenics to stave off age-related cognitive decline is a booming area of research — and a multimillion-dollar business. But critics argue that even though such computer programs can improve performance on specific mental tasks, there is scant proof that they have broader cognitive benefits.

For the study, adults aged 50 and older played a computer game designed to boost the speed at which players process visual stimuli. Processing speed is thought to be “the first domino that falls in cognitive decline”, says Fredric Wolinsky, a public-health researcher at the University of Iowa in Iowa City, who led the research.

The game was developed by academic researchers but is now sold under the name Double Decision by Posit Science, based in San Francisco, California. (Posit did not fund the study.) Players are timed on how fast they click on an image in the centre of the screen and on others that appear around the periphery. The program ratchets up the difficulty as a player’s performance improves.

Participants played the training game for 10 hours on site, some with an extra 4-hour ‘booster’ session later, or for 10 hours at home. A control group worked on computerized crossword puzzles for 10 hours on site. Researchers measured the mental agility of all 621 subjects before the brain training began, and again one year later, using eight well-established tests of cognitive performance.

The control group’s scores did not increase over the course of that year, but all the brain-training groups significantly upped their scores in the Useful Field of View test — which requires a subject to identify items in a scene with just a quick glance — and four others. When they compared the study participants’ scores to those expected for people their ages, the researchers found improvements that translated to 3 to 4.1 years of protection against age-related decline for the field-of-view test and 1.5 to 6.6 years for the other tasks.

“It was interesting that it didn’t matter whether you were on site at the clinic or just did this at home — you got basically the same bang for your buck,” says Frederick Unverzagt, a neuropsychologist at the Indiana University School of Medicine in Indianapolis, who was not involved with the study.

But Peter Snyder, a neuropsychologist at Brown University in Providence, Rhode Island, points out that players’ performance could have improved simply because they were familiar with the game — not because their cognitive skills improved. “To me, that makes it hard to interpret the results with the same degree of certainty” that the authors have, he says.

Snyder also doubts that 10 hours of training could affect brain wiring enough to provide long-lasting general benefits, but Henry Mahncke, chief executive of Posit Science, disagrees. “If you’ve never played piano before and spend 10 hours practising, a year later you will be better than when you started,” he says. “The new study shows that there’s science to be done here. Some things you can do with your brain are highly productive and others are not.”

Filed under cognitive training aging cognitive decline visual processing performance psychology neuroscience science

48 notes

Going Places: Rat Brain ‘GPS’ Maps Routes to Rewards

Research has implications for understanding memory and imagination

While studying rats’ ability to navigate familiar territory, Johns Hopkins scientists found that one particular brain structure uses remembered spatial information to imagine routes the rats then follow. Their discovery has implications for understanding why damage to that structure, called the hippocampus, disrupts specific types of memory and learning in people with Alzheimer’s disease and age-related cognitive decline. And because these mental trajectories guide the rats’ behavior, the research model the scientists developed may be useful in future studies on higher-level tasks, such as decision-making.

The details of their work were published online in the journal Nature on April 17.

“For the first time, we believe we have evidence that before a rat returns to an important place, it actually plans out its path,” says David Foster, Ph.D., assistant professor of neuroscience at the Johns Hopkins University School of Medicine. “The rat finds that location in its mind’s eye and knows how to get there.”

Foster and his team found that, at least for the purposes of navigation, the “mind’s eye” is located in the hippocampus, which is composed of two banana-shaped segments under the cerebral cortex on both sides of the brain. It is best known for creating memories. In people with Alzheimer’s, it is one of the first parts of the brain to sustain damage.

The Foster lab experiments focused on a group of neurons in the hippocampus called place cells because they are known to fire when animals are at a given location within a given environment. What was not known, Foster says, was how and when the brain uses that information.

By miniaturizing an existing technology, Foster and a postdoc in his lab, Brad Pfeiffer, Ph.D., were able to implant 20 microwires into each side of the hippocampus of four rats. The tiny wires let them record electrical activity from as many as 250 individual place cells at the same time, more than ever achieved before.

Over a two-week training period, the rats became familiar with the testing area, which was surrounded by a variety of objects so that the rats could tell where they were in relation to the objects outside. The space was 2 meters square with 36 tiny “dishes” placed at regular intervals in a grid. A single dish at a time would be filled with the rats’ reward: liquid chocolate.

The rats’ navigation tests involved as many as 40 sets of alternating “odd” and “even” trials per day. The odd trials required the rats to “forage” through the arena to find a chocolate-filled dish in a random location; the even trials required the rats to return each time to a “home” dish to receive their reward. While the rats fulfilled their tasks, the researchers recorded the firing of their place cells.

They found that as a rat travels randomly through the arena without knowing where it needs to go, different combinations of place cells fire at each location along its path. The same set of cells fires every time the rat passes through the same spot. These unique combinations of firings “mark” each spot in the rat’s brain and can be reconstructed into what amounts to a virtual map when needed.

When a rat is about to go to a specific location, e.g., “home,” place cells in its hippocampus fire in a sequence that creates a predictive path, which the rat then follows, somewhat like Hansel and Gretel following an imagined bread crumb trail.
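The idea that unique combinations of place-cell firing can be read back out as a position is the basis of standard population-decoding methods. The sketch below is a toy one-dimensional illustration of that principle — Gaussian tuning curves plus a maximum-likelihood Poisson decoder — not the analysis used in the Nature paper; the cell count, tuning width and firing rates are all invented for illustration:

```python
import math
import random

random.seed(1)

# 20 model place cells with Gaussian tuning curves tiling a 2 m track.
CENTERS = [i * 0.1 for i in range(20)]   # preferred positions (m)
WIDTH, PEAK, BASE = 0.15, 20.0, 0.1      # tuning width (m), peak/baseline rate (Hz)

def rate(cell, x):
    """Expected firing rate of one place cell when the rat is at position x."""
    return PEAK * math.exp(-(x - CENTERS[cell]) ** 2 / (2 * WIDTH ** 2)) + BASE

def sample_poisson(lam):
    """Poisson spike count via Knuth's method (fine for small means)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while p > threshold:
        k += 1
        p *= random.random()
    return k - 1

def decode(spikes, dt):
    """Maximum-likelihood position estimate from one window of spike counts."""
    grid = [i * 0.01 for i in range(200)]  # candidate positions, 1 cm apart
    def loglik(x):
        return sum(n * math.log(rate(c, x) * dt) - rate(c, x) * dt
                   for c, n in enumerate(spikes))
    return max(grid, key=loglik)

# Simulate spike counts at the rat's true position, then decode it back;
# the estimate should land within a few centimetres of true_x.
true_x, dt = 1.2, 0.5
spikes = [sample_poisson(rate(c, true_x) * dt) for c in range(20)]
estimate = decode(spikes, dt)
```

The decoder simply asks which candidate position makes the observed combination of firing rates most probable — a computational restatement of the “unique combinations of firings mark each spot” idea above.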

Foster says that “unlike a Hansel and Gretel bread crumb trail, which only allows you to leave by the same route by which you entered, the rats’ memories of their surroundings are flexible and can be reconstructed in a way that allows them to ‘picture’ how to quickly get from point A to point B.” In order to do this, he says, the rats must already be familiar with the terrain between point A and point B, but, like a GPS, they don’t have to have previously started at point A with the goal of reaching point B.

Foster says the elderly can get lost easily, and research on aged mice shows that their place cells can fail to distinguish between different environments. His team’s research suggests that defective place cells would also affect a person’s ability to “look ahead” in their imaginations to predict a way home. Similarly, he says, higher-order brain functions, like problem solving, also require people to “look ahead” and imagine themselves in a different scenario.

“The hippocampus seems to be directing the movement of the rats, making decisions for them in real time,” says Foster. “Our model allows us to see this happening in a way that’s not been possible before. Our next question is, what will these place cells do when we put obstacles in the rats’ paths?”

Filed under cerebral cortex hippocampus cognitive decline spatial information rats neuroscience science
