Neuroscience

Articles and news from the latest research reports.

Posts tagged cognitive decline

Regular aerobic exercise boosts memory area of brain in older women

Regular aerobic exercise seems to boost the size of the area of the brain (hippocampus) involved in verbal memory and learning among women whose intellectual capacity has been affected by age, indicates a small study published online in the British Journal of Sports Medicine.

The hippocampus has become a focus of interest in dementia research because it is the area of the brain involved in verbal memory and learning, but it is very sensitive to the effects of ageing and neurological damage.

The researchers tested the impact of different types of exercise on the hippocampal volume of 86 women who reported mild memory problems, known as mild cognitive impairment, a common risk factor for dementia.

All the women were aged between 70 and 80 and were living independently at home.

Roughly equal numbers were assigned to twice-weekly, hour-long sessions of aerobic training (brisk walking); resistance training, such as lunges, squats, and weights; or balance and muscle-toning exercises, for a period of six months.

The size of their hippocampus was assessed at the start and end of the six-month period by MRI scan, and their verbal memory and learning capacity was assessed before and afterward using a validated test (RAVLT).

Only 29 of the women had before-and-after MRI scans, but the results showed that total hippocampal volume in the group that completed the full six months of aerobic training was significantly larger than in those who completed the balance and muscle-toning programme.

No such difference in hippocampal volume was seen in those doing resistance training compared with the balance and muscle-toning group.

However, despite an earlier finding in the same sample of women that aerobic exercise improved verbal memory, there was some evidence to suggest that an increase in hippocampal volume was associated with poorer verbal memory.

This suggests that the relationship between brain volume and cognitive performance is complex and requires further research, say the authors.

But at the very least, aerobic exercise seems able to slow shrinkage of the hippocampus and maintain its volume in a group of women at risk of developing dementia, they say.

They therefore recommend regular aerobic exercise to stave off mild cognitive decline, which is especially important given the mounting evidence that regular exercise benefits cognitive function and overall brain health, and the rising toll of dementia.

Worldwide, one new case of dementia is diagnosed every four seconds, with the number of those afflicted set to rise to more than 115 million by 2050, they point out.
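As a quick sanity check, the incidence figure quoted above can be converted into an annual count with simple arithmetic (a back-of-the-envelope sketch; the only input taken from the article is the one-case-per-four-seconds rate):

```python
# One new dementia diagnosis every 4 seconds: implied cases per year.
SECONDS_PER_YEAR = 365.25 * 24 * 60 * 60  # ~31.6 million seconds

cases_per_year = SECONDS_PER_YEAR / 4
print(f"Implied new cases per year: {cases_per_year / 1e6:.1f} million")
```

That works out to roughly 7.9 million new cases a year, which gives a sense of scale for the projected 115 million people living with dementia by 2050.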

Filed under aerobic exercise memory hippocampus dementia cognitive decline psychology neuroscience science

Older People with Faster Decline In Memory and Thinking Skills May Have Lower Risk of Cancer Death

Older people who are starting to have memory and thinking problems, but do not yet have dementia, may have a lower risk of dying from cancer than people who have no memory and thinking problems, according to a study published in the April 9, 2014, online issue of Neurology®, the medical journal of the American Academy of Neurology.

“Studies have shown that people with Alzheimer’s disease are less likely to develop cancer, but we don’t know the reason for that link,” said study author Julián Benito-León, MD, PhD, of University Hospital 12 of October in Madrid, Spain. “One possibility is that cancer is underdiagnosed in people with dementia, possibly because they are less likely to mention their symptoms or caregivers and doctors are focused on the problems caused by dementia. The current study helps us discount that theory.”

The study involved 2,627 people age 65 and older in Spain who did not have dementia at the start of the study. They took tests of memory and thinking skills at the start of the study and again three years later, and were followed for an average of almost 13 years. The participants were divided into three groups: those whose scores on the thinking tests were declining the fastest, those whose scores improved on the tests, and those in the middle.

During the study, 1,003 of the participants died, including 339 deaths, or 34 percent, among those with the fastest decline in thinking skills and 664 deaths, or 66 percent, among those in the other two groups. A total of 21 percent of those in the group with the fastest decline died of cancer, according to their death certificates, compared to 29 percent of those in the other two groups.

People in the fastest declining group were still 30 percent less likely to die of cancer when the results were adjusted to control for factors such as smoking, diabetes and heart disease, among others.

“We need to understand better the relationship between a disease that causes abnormal cell death and one that causes abnormal cell growth,” Benito-León said. “With the increasing number of people with both dementia and cancer, understanding this association could help us better understand and treat both diseases.”
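The reported percentages allow a crude reconstruction of the effect size. Note this sketch only reproduces the unadjusted ratio of cancer's share of deaths in each group; the study's 30 percent figure comes from a model adjusted for smoking, diabetes, heart disease and other factors:

```python
# Crude comparison of cancer's share of deaths in the two groups,
# using only the percentages reported in the article.
cancer_share_fast_decline = 0.21   # fastest-declining group
cancer_share_others = 0.29         # remaining two groups combined

ratio = cancer_share_fast_decline / cancer_share_others
print(f"Unadjusted ratio: {ratio:.2f} "
      f"(~{(1 - ratio) * 100:.0f}% lower share of cancer deaths)")
```

The unadjusted figure (about 28 percent lower) lands close to the adjusted 30 percent the authors report, suggesting the confounders did not move the estimate much.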

Filed under memory dementia cancer cognitive decline aging neurology neuroscience science

Blood Test Identifies Those At-Risk for Cognitive Decline, Alzheimer’s Within 3 Years

Researchers have discovered and validated a blood test that can predict with greater than 90 percent accuracy if a healthy person will develop mild cognitive impairment or Alzheimer’s disease within three years.

Described in the April issue of Nature Medicine, the study heralds the potential for developing treatment strategies for Alzheimer’s at an earlier stage, when therapy would be more effective at slowing or preventing onset of symptoms. It is the first known published report of blood-based biomarkers for preclinical Alzheimer’s.

The test identifies 10 lipids, or fats, in the blood that predict disease onset. It could be ready for use in clinical studies in as few as two years and, researchers say, other diagnostic uses are possible.

“Our novel blood test offers the potential to identify people at risk for progressive cognitive decline and can change how patients, their families and treating physicians plan for and manage the disorder,” says the study’s corresponding author Howard J. Federoff, MD, PhD, professor of neurology and executive vice president for health sciences at Georgetown University Medical Center.

There is no cure or effective treatment for Alzheimer’s. Worldwide, about 35.6 million individuals have the disease and, according to the World Health Organization, the number will double every 20 years to 115.4 million people with Alzheimer’s by 2050.

Federoff explains there have been many efforts to develop drugs to slow or reverse the progression of Alzheimer’s disease, but all of them have failed. He says one reason may be the drugs were evaluated too late in the disease process.

“The preclinical state of the disease offers a window of opportunity for timely disease-modifying intervention,” Federoff says. “Biomarkers such as ours that define this asymptomatic period are critical for successful development and application of these therapeutics.”

The study included 525 healthy participants aged 70 and older who gave blood samples upon enrolling and at various points in the study. Over the course of the five-year study, 74 participants met the criteria for either mild Alzheimer’s disease (AD) or a condition known as amnestic mild cognitive impairment (aMCI), in which memory loss is prominent. Of these, 46 were diagnosed upon enrollment and 28 developed aMCI or mild AD during the study (the latter group called converters).

In the study’s third year, the researchers selected 53 participants who developed aMCI/AD (including 18 converters) and 53 cognitively normal matched controls for the lipid biomarker discovery phase of the study. The lipids were not targeted before the start of the study, but rather, were an outcome of the study.

A panel of 10 lipids was discovered, which researchers say appears to reveal the breakdown of neural cell membranes in participants who develop symptoms of cognitive impairment or AD. The panel was subsequently validated using the remaining 21 aMCI/AD participants (including 10 converters), and 20 controls. Blinded data were analyzed to determine if the subjects could be characterized into the correct diagnostic categories based solely on the 10 lipids identified in the discovery phase.

“The lipid panel was able to distinguish with 90 percent accuracy these two distinct groups: cognitively normal participants who would progress to MCI or AD within two to three years, and those who would remain normal in the near future,” Federoff says.

The researchers examined if the presence of the APOE4 gene, a known risk factor for developing AD, would contribute to accurate classification of the groups, but found it was not a significant predictive factor in this study.
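The discovery-then-validation workflow described above can be illustrated with a toy sketch. Everything below is simulated: the article does not describe the actual statistical model used, so this uses synthetic "lipid" data and a simple nearest-class-mean classifier purely to show how a panel learned on one cohort is tested blind on a held-out one:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the study design: 10 lipid levels per subject,
# a "discovery" cohort to build the classifier and a held-out
# "validation" cohort to test it. All data here are simulated.
n_lipids = 10

def simulate(n_per_group, shift):
    controls = rng.normal(0.0, 1.0, size=(n_per_group, n_lipids))
    # Converters are modelled with uniformly depleted lipid levels.
    converters = rng.normal(-shift, 1.0, size=(n_per_group, n_lipids))
    X = np.vstack([controls, converters])
    y = np.array([0] * n_per_group + [1] * n_per_group)
    return X, y

X_disc, y_disc = simulate(53, shift=1.0)   # discovery phase
X_val, y_val = simulate(20, shift=1.0)     # blinded validation phase

# Nearest-class-mean classifier learned only on the discovery cohort.
mean_control = X_disc[y_disc == 0].mean(axis=0)
mean_converter = X_disc[y_disc == 1].mean(axis=0)

def predict(X):
    d0 = np.linalg.norm(X - mean_control, axis=1)
    d1 = np.linalg.norm(X - mean_converter, axis=1)
    return (d1 < d0).astype(int)

accuracy = (predict(X_val) == y_val).mean()
print(f"Validation accuracy: {accuracy:.0%}")
```

The essential discipline is that the classifier never sees the validation cohort during training, which is what makes the reported 90 percent figure meaningful.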

Filed under alzheimer's disease neurodegeneration memory cognitive decline blood test neuroscience medicine science

Forget about forgetting – The elderly know more and use it better

What happens to our cognitive abilities as we age? If you think our brains go into a steady decline, research reported this week in the journal Topics in Cognitive Science may make you think again. The work, headed by Dr. Michael Ramscar of Tübingen University, takes a critical look at the measures usually thought to show that our cognitive abilities decline across adulthood. Instead of finding evidence of decline, the team discovered that most standard cognitive measures, which date back to the early twentieth century, are flawed. “The human brain works slower in old age,” says Ramscar, “but only because we have stored more information over time.”

Computers were trained, like humans, to read a certain amount each day, and to learn new things. When the researchers let a computer “read” only so much, its performance on cognitive tests resembled that of a young adult. But if the same computer was exposed to the experiences we might encounter over a lifetime – with reading simulated over decades – its performance now looked like that of an older adult. Often it was slower, but not because its processing capacity had declined. Rather, increased “experience” had caused the computer’s database to grow, giving it more data to process – which takes time.

Technology now allows researchers to make quantitative estimates of the number of words an adult can be expected to learn across a lifetime, enabling the Tübingen team to separate the challenge that increasing knowledge poses to memory from the actual performance of memory itself. “Imagine someone who knows two people’s birthdays and can recall them almost perfectly. Would you really want to say that person has a better memory than a person who knows the birthdays of 2000 people, but can ‘only’ match the right person to the right birthday nine times out of ten?” asks Ramscar.

The answer appears to be “no.” When Ramscar’s team trained their computer models on huge linguistic datasets, they found that standardized vocabulary tests, which are used to take account of the growth of knowledge in studies of ageing, massively underestimate the size of adult vocabularies. It takes computers longer to search databases of words as their sizes grow, which is hardly surprising but may have important implications for our understanding of age-related slowdowns. The researchers found that to get their computers to replicate human performance in word recognition tests across adulthood, they had to keep their capacities the same. “Forget about forgetting,” explained Tübingen researcher Peter Hendrix, “if I wanted to get the computer to look like an older adult, I had to keep all the words it learned in memory and let them compete for attention.”

The research shows that studies of the problems older people have with recalling names suffer from a similar blind spot: there is a far greater variety of given names today than there was two generations ago. This cultural shift toward greater name diversity means the number of different names anyone learns over their lifetime has increased dramatically. The work shows how this makes locating a name in memory far harder than it used to be. Even for computers.

Ramscar and his colleagues’ work provides more than an explanation of why, in the light of all the extra information they have to process, we might expect older brains to seem slower and more forgetful than younger brains. Their work also shows how changes in test performance that have been taken as evidence for declining cognitive abilities in fact demonstrate older adults’ greater mastery of the knowledge they have acquired.

Take “paired-associate learning,” a commonly used cognitive test that involves learning to connect words like “up” to “down” or “necktie” to “cracker” in memory. Using Big Data sets to quantify how often different words appear together in English, the Tübingen team shows that younger adults do better when asked to learn to pair “up” with “down” than “necktie” and “cracker” because “up” and “down” appear in close proximity to one another more frequently. However, older adults also understand which words don’t usually go together, something young adults notice less. When the researchers examined performance on this test across a range of word pairs that go together more and less in English, they found older adults’ scores to be far more closely attuned to the actual information in hundreds of millions of words of English than those of their younger counterparts.

As Prof. Harald Baayen, who heads the Alexander von Humboldt Quantitative Linguistics research group where the work was carried out, puts it, “If you think linguistic skill involves something like being able to choose one word given another, younger adults seem to do better in this task. But, of course, proper understanding of language involves more than this. You also have to not put plausible but wrong pairs of words together. The fact that older adults find nonsense pairs – but not connected pairs – harder to learn than young adults simply demonstrates older adults’ much better understanding of language. They have to make more of an effort to learn unrelated word pairs because, unlike the youngsters, they know a lot about which words don’t belong together.”

The Tübingen researchers conclude that we need different tests for the cognitive abilities of older people – taking into account the nature and amount of information our brains process. “The brains of older people do not get weak,” says Michael Ramscar. “On the contrary, they simply know more.”
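The core claim, that a constant-capacity system slows down simply because its store grows, can be shown with a toy simulation. The model and the lexicon sizes below are illustrative assumptions, not taken from the study: retrieval is caricatured as checking stored entries one by one, so mean lookup time scales with how much has been learned even though per-check speed never changes.

```python
import random

random.seed(1)

# Toy model: retrieval capacity (speed per check) is constant, but a
# larger lexicon means more stored entries compete for each lookup.
def mean_lookup_steps(lexicon_size, trials=2000):
    steps = 0
    for _ in range(trials):
        target = random.randrange(lexicon_size)  # word to be recognised
        steps += target + 1  # sequential checks until the target matches
    return steps / trials

young = mean_lookup_steps(20_000)   # smaller lexicon ("young adult")
older = mean_lookup_steps(60_000)   # larger lexicon ("older adult")
print(f"Mean lookup steps, small lexicon: {young:.0f}")
print(f"Mean lookup steps, large lexicon: {older:.0f}")
```

The larger lexicon takes proportionally more steps per lookup, mirroring the paper's point that slower responses can reflect more knowledge rather than degraded processing.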

Filed under cognitive decline aging forgetting memory learning psychology neuroscience science

Heavy Drinking in Middle Age May Speed Memory Loss by up to Six Years in Men

Middle-aged men who drink more than 36 grams of alcohol a day, or about two and a half US drinks, may see their memory decline up to six years faster later in life, according to a study published in the January 15, 2014, online issue of Neurology®, the medical journal of the American Academy of Neurology. On the other hand, the study found no differences in memory and executive function between men who do not drink, former drinkers, and light or moderate drinkers. Executive function deals with attention and reasoning skills in achieving a goal.

“Much of the research evidence about drinking and a relationship to memory and executive function is based on older populations,” said study author Séverine Sabia, PhD, of University College London in the United Kingdom. “Our study focused on middle-aged participants and suggests that heavy drinking is associated with faster decline in all areas of cognitive function in men.”

The study involved 5,054 men and 2,099 women whose drinking habits were assessed three times over 10 years. A drink was considered wine, beer or liquor. Then, when the participants were an average age of 56, they took their first memory and executive function test. The tests were repeated twice over the next 10 years.

The study found that there were no differences in memory and executive function decline between men who did not drink and those who were light or moderate drinkers—those who drank less than 20 grams, or less than two US drinks, per day. Heavy drinkers showed memory and executive function declines between one and a half and six years faster than those who had fewer drinks per day.
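The gram thresholds above can be converted to US standard drinks directly. Only the 36 g and 20 g cut-offs come from the study; the 14 g-per-drink conversion is the standard US (NIAAA) convention, which the article rounds to "two and a half" drinks:

```python
# Converting the study's alcohol thresholds into US standard drinks
# (one US standard drink = 14 g of pure alcohol, per NIAAA convention).
GRAMS_PER_US_DRINK = 14.0

heavy_threshold_g = 36.0   # "heavy" drinking cut-off in the study
moderate_limit_g = 20.0    # light/moderate upper bound in the study

print(f"Heavy:    >{heavy_threshold_g / GRAMS_PER_US_DRINK:.1f} drinks/day")
print(f"Moderate: <{moderate_limit_g / GRAMS_PER_US_DRINK:.1f} drinks/day")
```

Readers in countries with different standard-drink sizes (e.g. 8 g in the UK, 10 g in Australia) would divide by their local figure instead.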

Filed under aging cognitive decline alcohol memory psychology neuroscience science

153 notes

Study Shows Where Alzheimer’s Starts and How It Spreads
Using high-resolution functional MRI (fMRI) imaging in patients with Alzheimer’s disease and in mouse models of the disease, Columbia University Medical Center (CUMC) researchers have clarified three fundamental issues about Alzheimer’s: where it starts, why it starts there, and how it spreads. In addition to advancing understanding of Alzheimer’s, the findings could improve early detection of the disease, when drugs may be most effective. The study was published today in the online edition of the journal Nature Neuroscience.
“It has been known for years that Alzheimer’s starts in a brain region known as the entorhinal cortex,” said co-senior author Scott A. Small, MD, Boris and Rose Katz Professor of Neurology, professor of radiology, and director of the Alzheimer’s Disease Research Center. “But this study is the first to show in living patients that it begins specifically in the lateral entorhinal cortex, or LEC. The LEC is considered to be a gateway to the hippocampus, which plays a key role in the consolidation of long-term memory, among other functions. If the LEC is affected, other aspects of the hippocampus will also be affected.”
The study also shows that, over time, Alzheimer’s spreads from the LEC directly to other areas of the cerebral cortex, in particular, the parietal cortex, a brain region involved in various functions, including spatial orientation and navigation. The researchers suspect that Alzheimer’s spreads “functionally,” that is, by compromising the function of neurons in the LEC, which then compromises the integrity of neurons in adjoining areas.
A third major finding of the study is that LEC dysfunction occurs when changes in tau and amyloid precursor protein (APP) co-exist. “The LEC is especially vulnerable to Alzheimer’s because it normally accumulates tau, which sensitizes the LEC to the accumulation of APP. Together, these two proteins damage neurons in the LEC, setting the stage for Alzheimer’s,” said co-senior author Karen E. Duff, PhD, professor of pathology and cell biology (in psychiatry and in the Taub Institute for Research on Alzheimer’s Disease and the Aging Brain) at CUMC and at the New York State Psychiatric Institute.
In the study, the researchers used a high-resolution variant of fMRI to map metabolic defects in the brains of 96 adults enrolled in the Washington Heights-Inwood Columbia Aging Project (WHICAP). All of the adults were free of dementia at the time of enrollment.
“Dr. Richard Mayeux’s WHICAP study enables us to follow a large group of healthy elderly individuals, some of whom have gone on to develop Alzheimer’s disease,” said Dr. Small. “This study has given us a unique opportunity to image and characterize patients with Alzheimer’s in its earliest, preclinical stage.”
The 96 adults were followed for an average of 3.5 years, at which time 12 individuals were found to have progressed to mild Alzheimer’s disease. An analysis of the baseline fMRI images of those 12 individuals found significant decreases in cerebral blood volume (CBV) — a measure of metabolic activity — in the LEC compared with that of the 84 adults who were free of dementia.
A second part of the study addressed the role of tau and APP in LEC dysfunction. While previous studies have suggested that entorhinal cortex dysfunction is associated with both tau and APP abnormalities, it was not known how these proteins interact to drive this dysfunction, particularly in preclinical Alzheimer’s.
To answer this question, explained first author Usman Khan, an MD-PhD student based in Dr. Small’s lab, the team created three mouse models, one with elevated levels of tau in the LEC, one with elevated levels of APP, and one with elevated levels of both proteins. The researchers found that the LEC dysfunction occurred only in the mice with both tau and APP.
The study has implications for both research and treatment. “Now that we’ve pinpointed where Alzheimer’s starts, and shown that those changes are observable using fMRI, we may be able to detect Alzheimer’s at its earliest preclinical stage, when the disease might be more treatable and before it spreads to other brain regions,” said Dr. Small. In addition, say the researchers, the new imaging method could be used to assess the efficacy of promising Alzheimer’s drugs during the disease’s early stages.

Filed under alzheimer's disease entorhinal cortex aging memory dementia cognitive decline neuroscience science

57 notes

Statin Use Not Linked to a Decline in Cognitive Function
Based on the largest comprehensive systematic review to date, researchers at the Perelman School of Medicine at the University of Pennsylvania concluded that available evidence does not support an association between statins and memory loss or dementia. The new study, a collaborative effort between faculty in Penn Medicine’s Preventive Cardiovascular Program, the Penn Memory Center, and the Penn Center for Evidence-Based Practice, will be published in Annals of Internal Medicine.
“Statins are prescribed to approximately 30 million people in the United States, and these numbers may increase as a result of the national cholesterol guidelines recently released,” said senior study author Emil deGoma, MD, assistant professor of Medicine and medical director of the Preventive Cardiovascular Program at Penn. “A wealth of data supports a benefit of these cholesterol-lowering medications among individuals at risk for cardiovascular disease in terms of a reduction in the risk of heart attack and stroke; however, potential side effects of statins are less well understood. In February 2012, largely based on anecdotal reports, the U.S. Food and Drug Administration (FDA) issued a safety statement warning patients of possible adverse cognitive effects associated with statin use. Many concerned patients have asked if there is a relationship between statins and memory problems. Their concerns, along with the FDA statement, prompted us to pursue a rigorous analysis of all available evidence to better answer the question – are statins associated with changes in cognition?”
The research team conducted a systematic review of the published literature and identified 57 statin studies reporting measures of cognitive function. Dr. deGoma and colleagues found no evidence of an increased risk of dementia with statin therapy. In fact, in cohort studies, statin users had a 13 percent lower risk of dementia, a 21 percent lower risk of Alzheimer’s disease, and a 34 percent lower risk of mild cognitive impairment compared to people who did not take statins.
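Those "percent lower risk" figures are relative-risk statements: a 13 percent lower risk corresponds to a relative risk of 0.87 for statin users versus non-users. This is arithmetic on the reported numbers only, not a reconstruction of the study's statistical model:

```python
# Translating "X percent lower risk" into a relative risk (RR).
# Values below 1.0 favor statin users; this restates the reported
# cohort figures and is not the study's actual analysis.

def percent_lower_to_rr(percent_lower: float) -> float:
    """A 'percent_lower' risk reduction corresponds to RR = 1 - percent_lower/100."""
    return 1.0 - percent_lower / 100.0

for outcome, pct in [("dementia", 13),
                     ("Alzheimer's disease", 21),
                     ("mild cognitive impairment", 34)]:
    print(f"{outcome}: RR ~ {percent_lower_to_rr(pct):.2f}")
```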
Most importantly, cognitive test scores were not adversely affected by statin treatment in randomized controlled trials. In these trials, roughly half of the study participants received statins and the other half received placebo. All study participants underwent formal testing of memory and other cognitive domains through tests such as the ability to recall a set of numbers. The analysis of 155 cognitive tests spanning eight categories of cognitive function, including 26 tests of memory, revealed no differences between study participants treated with statins and those provided placebo.
The research team additionally performed an analysis of the FDA post-marketing surveillance databases and found no difference in the frequency of cognitive adverse event reports between statins and two commonly prescribed cardiovascular medications that have not been associated with cognitive impairment, namely, clopidogrel and losartan.
“Overall, these findings are quite reassuring. I wouldn’t let concerns about adverse effects on cognition influence the decision to start a statin in patients suffering from atherosclerotic disease or at risk for cardiovascular disease. I also wouldn’t jump to the conclusion that statins are the culprit when an individual who is taking a statin describes forgetfulness. We may be doing more harm than good if we withhold or stop statins – medications proven to reduce the risk of heart attack and stroke – due to fears that statins might possibly cause memory loss,” said Dr. deGoma.
The team acknowledges that while their analysis is reassuring, large, high-quality randomized controlled trials are needed to confirm their findings. 
“For many of the cognitive outcomes that we examined, the identified studies were small, were at risk for bias, used varying diagnostic tests to assess cognitive domains, and did not include patients on high-dose statins, which is important given the increasing use of high-dose statins for secondary prevention,” noted study co-author Craig Umscheid, MD, MSCE, assistant professor of Medicine and Epidemiology and director of the Penn Center for Evidence-based Practice. “Thus, additional trials addressing these limitations would strengthen our conclusions. Despite this, the totality of the evidence does reassure us that there’s unlikely to be a significant link between statins and cognitive impairment.”

Filed under cognitive decline statins cardiovascular disease memory neurodegenerative diseases medicine neuroscience science

148 notes

Can Certain Herbs Stave Off Alzheimer’s Disease?
Enhanced extracts made from special antioxidants in spearmint and rosemary improve learning and memory, a study in an animal model at Saint Louis University found.
"We found that these proprietary compounds reduce deficits caused by mild cognitive impairment, which can be a precursor to Alzheimer’s disease," said Susan Farr, Ph.D., research professor of geriatrics at Saint Louis University School of Medicine.
Farr added, “This probably means eating spearmint and rosemary is good for you. However, our experiments were in an animal model and I don’t know how much — or if any amount — of these herbs people would have to consume for learning and memory to improve. In other words, I’m not suggesting that people chew more gum at this point.”
Farr presented the early findings at Neuroscience 2013, the Society for Neuroscience annual meeting of some 32,000 scientists, on Monday, Nov. 11. She tested a novel antioxidant-based ingredient made from spearmint extract and two different doses of a similar antioxidant made from rosemary extract on mice that have age-related cognitive decline.
She found that the higher dose rosemary extract compound was the most powerful in improving memory and learning in three tested behaviors. The lower dose rosemary extract improved memory in two of the behavioral tests, as did the compound made from spearmint extract.
Further, there were signs of reduced oxidative stress, which is considered a hallmark of age-related decline, in the part of the brain that controls learning and memory.
"Our research suggests these extracts made from herbs might have beneficial effects on altering the course of age-associated cognitive decline," Farr said. "It’s worth additional study."

Filed under alzheimer's disease cognitive decline rosemary spearmint Neuroscience 2013 neuroscience science

348 notes

Menstrual Cycle Influences Concussion Outcomes
Researchers found that women injured during the two weeks leading up to their period (the premenstrual phase) had a slower recovery and poorer health one month after injury compared to women injured during the two weeks directly after their period or women taking birth control pills.
The University of Rochester study was published today in the Journal of Head Trauma Rehabilitation. If confirmed in subsequent research, the findings could alter the treatment and prognosis of women who suffer head injuries from sports, falls, car accidents or combat.
Several recent studies have confirmed what women and their physicians anecdotally have known for years: Women experience greater cognitive decline, poorer reaction times, more headaches, extended periods of depression, longer hospital stays and delayed return-to-work compared to men following head injury. Such results are particularly pronounced in women of childbearing age; girls who have not started their period and post-menopausal women have outcomes similar to men.
Few studies have explored why such differences occur, but senior author Jeffrey J. Bazarian, M.D., M.P.H. says it stands to reason that sex hormones such as estrogen and progesterone, which are highest in women of childbearing age, may play a role.
“I don’t think doctors consider menstrual history when evaluating a patient after a concussion, but maybe we should,” noted Bazarian, associate professor of Emergency Medicine at the University of Rochester School of Medicine and Dentistry who treats patients and conducts research on traumatic brain injury and long-term outcomes among athletes. “By taking into account the stage of their cycle at the time of injury we could better identify patients who might need more aggressive monitoring or treatment. It would also allow us to counsel women that they’re more – or less – likely to feel poorly because of their menstrual phase.”
Although media coverage tends to focus on concussions in male professional athletes, studies suggest that women have a higher incidence of head injuries than men playing sports with similar rules, such as ice hockey, soccer and basketball. Bazarian estimates that 70 percent of the patients he treats in the URMC Sport Concussion Clinic are young women. He believes the number is so high because they often need more follow-up care. In his experience, soccer is the most common sport leading to head injuries in women, but lacrosse, field hockey, cheerleading, volleyball and basketball can lead to injuries as well.
Sex hormone levels often change after a head injury, as women who have suffered a concussion and subsequently missed one or more periods can attest. According to Kathleen M. Hoeger, M.D., M.P.H., study co-author and professor of Obstetrics and Gynecology at the University of Rochester School of Medicine and Dentistry, any stressful event, like a hit to the head, can shut down the pituitary gland in the brain, which is the body’s hormone generator. If the pituitary doesn’t work, the level of estrogen and progesterone would drop quickly.  
According to Bazarian, progesterone is known to have a calming effect on the brain and on mood. Knowing this, his team came up with the “withdrawal hypothesis”: If a woman suffers a concussion in the premenstrual phase when progesterone levels are naturally high, an abrupt drop in progesterone after injury produces a kind of withdrawal which either contributes to or worsens post concussive symptoms like headache, nausea, dizziness and trouble concentrating. This may be why women recover differently than men, who have low pre-injury levels of the hormone.     
Hoeger and Bazarian tested their theory by recruiting 144 women ages 18 to 60 who arrived within four hours of a head hit at five emergency departments in upstate New York and one in Pennsylvania. Participants gave blood within six hours of injury, and progesterone levels were used to determine the menstrual cycle phase at the time of injury. Based on the results, participants fell into three groups: 37 in the premenstrual/high progesterone group; 72 in the low progesterone group (progesterone is low in the two weeks directly after a period); and 35 in the birth control group based on self-reported use.
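The three-way grouping amounts to a simple classification rule. A sketch follows; note that the progesterone cutoff below is hypothetical, chosen only for illustration, since the article does not report the serum threshold the researchers actually used:

```python
# Sketch of the study's three-way participant grouping. The progesterone
# cutoff is HYPOTHETICAL (the article gives no actual serum threshold);
# birth control use was self-reported.

HYPOTHETICAL_CUTOFF_NG_ML = 3.0  # illustrative high-progesterone cutoff only

def assign_group(progesterone_ng_ml: float, on_birth_control: bool) -> str:
    """Classify a participant by self-reported pill use, then by measured
    progesterone at the time of injury."""
    if on_birth_control:
        return "birth control"
    if progesterone_ng_ml >= HYPOTHETICAL_CUTOFF_NG_ML:
        return "premenstrual/high progesterone"
    return "low progesterone"

print(assign_group(8.2, on_birth_control=False))  # premenstrual/high progesterone
print(assign_group(0.4, on_birth_control=False))  # low progesterone
print(assign_group(5.0, on_birth_control=True))   # birth control
```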
One month later, women in the premenstrual/high progesterone group were twice as likely to score in a worse percentile on standardized tests that measure concussion recovery and quality of life – as defined by mobility, self-care, usual activity, pain and emotional health – compared to women in the low progesterone group. Women in the premenstrual/high progesterone group also scored the lowest (average 65) on a health rating scale that went from 0, being the worst health imaginable, to 100, being the best. Women in the birth control group had the highest scores (average 77).
“If you get hit when progesterone is high and you experience a steep drop in the hormone, this is what makes you feel lousy and causes symptoms to linger,” said Bazarian. “But, if you are injured when progesterone is already low, a hit to the head can’t lower it any further, so there is less change in the way you feel.”
The team suspected that women taking birth control pills, which contain synthetic hormones that mimic the action of progesterone, would have similar outcomes to women injured in the low progesterone phase of their cycle. As expected, there was no clear difference between these groups, as women taking birth control pills have a constant stream of sex hormones and don’t experience a drop following a head hit, so long as they continue to take the pill.    
“Women who are very athletic get several benefits from the pill; it protects their bones and keeps their periods predictable,” noted Hoeger. “If larger studies confirm our data, this could be one more way in which the pill is helpful in athletic women, especially women who participate in sports like soccer that present lots of opportunities for head injuries.”
In addition to determining menstrual cycle phase at the time of injury, Bazarian plans to scrutinize a woman’s cycles after injury to make sure they are not disrupted. If they are, the woman should make an appointment with her gynecologist to discuss the change.

Menstrual Cycle Influences Concussion Outcomes

Researchers found that women injured during the two weeks leading up to their period (the premenstrual phase) had a slower recovery and poorer health one month after injury compared to women injured during the two weeks directly after their period or women taking birth control pills.

The University of Rochester study was published today in the Journal of Head Trauma Rehabilitation. If confirmed in subsequent research, the findings could alter the treatment and prognosis of women who suffer head injuries from sports, falls, car accidents or combat.

Several recent studies have confirmed what women and their physicians anecdotally have known for years: Women experience greater cognitive decline, poorer reaction times, more headaches, extended periods of depression, longer hospital stays and delayed return-to-work compared to men following head injury. Such results are particularly pronounced in women of childbearing age; girls who have not started their period and post-menopausal women have outcomes similar to men.

Few studies have explored why such differences occur, but senior author Jeffrey J. Bazarian, M.D., M.P.H. says it stands to reason that sex hormones such as estrogen and progesterone, which are highest in women of childbearing age, may play a role.

“I don’t think doctors consider menstrual history when evaluating a patient after a concussion, but maybe we should,” noted Bazarian, associate professor of Emergency Medicine at the University of Rochester School of Medicine and Dentistry who treats patients and conducts research on traumatic brain injury and long-term outcomes among athletes. “By taking into account the stage of their cycle at the time of injury we could better identify patients who might need more aggressive monitoring or treatment. It would also allow us to counsel women that they’re more – or less – likely to feel poorly because of their menstrual phase.”

Although media coverage tends to focus on concussions in male professional athletes, studies suggest that women have a higher incidence of head injuries than men playing sports with similar rules, such as ice hockey, soccer and basketball. Bazarian estimates that 70 percent of the patients he treats in the URMC Sport Concussion Clinic are young women. He believes the number is so high because they often need more follow-up care. In his experience, soccer is the most common sport leading to head injuries in women, but lacrosse, field hockey, cheerleading, volleyball and basketball can lead to injuries as well.

Sex hormone levels often change after a head injury, as women who have suffered a concussion and subsequently missed one or more periods can attest. According to Kathleen M. Hoeger, M.D., M.P.H., study co-author and professor of Obstetrics and Gynecology at the University of Rochester School of Medicine and Dentistry, any stressful event, like a hit to the head, can shut down the pituitary gland in the brain, which is the body’s hormone generator. If the pituitary doesn’t work, the level of estrogen and progesterone would drop quickly.  

According to Bazarian, progesterone is known to have a calming effect on the brain and on mood. Knowing this, his team came up with the “withdrawal hypothesis”: If a woman suffers a concussion in the premenstrual phase when progesterone levels are naturally high, an abrupt drop in progesterone after injury produces a kind of withdrawal which either contributes to or worsens post concussive symptoms like headache, nausea, dizziness and trouble concentrating. This may be why women recover differently than men, who have low pre-injury levels of the hormone.     

Hoeger and Bazarian tested their theory by recruiting144 women ages 18 to 60 who arrived within four hours of a head hit at five emergency departments in upstate New York and one in Pennsylvania. Participants gave blood within six hours of injury and progesterone level determined the menstrual cycle phase at the time of injury. Based on the results, participants fell into three groups: 37 in the premenstrual/high progesterone group; 72 in the low progesterone group (progesterone is low in the two weeks directly after a period); and 35 in the birth control group based on self-reported use.

One month later, women in the premenstrual/high progesterone group were twice as likely as women in the low progesterone group to score in a worse percentile on standardized tests that measure concussion recovery and quality of life – defined by mobility, self-care, usual activity, pain and emotional health. Women in the premenstrual/high progesterone group also scored the lowest (average 65) on a health rating scale running from 0 (the worst health imaginable) to 100 (the best). Women in the birth control group had the highest scores (average 77).

“If you get hit when progesterone is high and you experience a steep drop in the hormone, this is what makes you feel lousy and causes symptoms to linger,” said Bazarian. “But, if you are injured when progesterone is already low, a hit to the head can’t lower it any further, so there is less change in the way you feel.”

The team suspected that women taking birth control pills, which contain synthetic hormones that mimic the action of progesterone, would have similar outcomes to women injured in the low progesterone phase of their cycle. As expected, there was no clear difference between these groups, as women taking birth control pills have a constant stream of sex hormones and don’t experience a drop following a head hit, so long as they continue to take the pill.    

“Women who are very athletic get several benefits from the pill; it protects their bones and keeps their periods predictable,” noted Hoeger. “If larger studies confirm our data, this could be one more way in which the pill is helpful in athletic women, especially women who participate in sports like soccer that present lots of opportunities for head injuries.”

In addition to determining menstrual cycle phase at the time of injury, Bazarian plans to scrutinize a woman’s cycles after injury to make sure they are not disrupted. If they are, the woman should make an appointment with her gynecologist to discuss the change.

Filed under concussion brain injury estrogen progesterone cognitive decline neuroscience science

65 notes

Alzheimer’s progression tracked prior to dementia

For years, scientists have attempted to understand how Alzheimer’s disease harms the brain before memory loss and dementia are clinically detectable. Most researchers think this preclinical stage, which can last a decade or more before symptoms appear, is the critical phase when the disease might be controlled or stopped, possibly preventing the failure of memory and thinking abilities in the first place.


Important progress in this effort is reported in October in Lancet Neurology. Scientists at the Charles F. and Joanne Knight Alzheimer Disease Research Center at Washington University School of Medicine in St. Louis, working in collaboration with investigators at the University of Maastricht in the Netherlands, helped to validate a proposed new system for identifying and classifying individuals with preclinical Alzheimer’s disease.

Their findings indicate that preclinical Alzheimer’s disease can be detected during a person’s life, is common in cognitively normal elderly people and is associated with future mental decline and mortality. According to the scientists, this suggests that preclinical Alzheimer’s disease could be an important target for therapeutic intervention.

A panel of Alzheimer’s experts, convened by the National Institute on Aging in association with the Alzheimer’s Association, proposed the classification system two years ago. It is based on earlier efforts to define and track biomarker changes during preclinical disease.

According to the Washington University researchers, the new findings offer reason for encouragement, showing, for example, that the system can help predict which cognitively normal individuals will develop symptoms of Alzheimer’s and how rapidly their brain function will decline. But they also highlight additional questions that must be answered before the classification system can be adapted for use in clinical care.

“For new treatments, knowing where individuals are on the path to Alzheimer’s dementia will help us improve the design and assessment of clinical trials,” said senior author Anne Fagan, PhD, research professor of neurology. “There are many steps left before we can apply this system in the clinic, including standardizing how we gather and assess data in individuals, and determining which of our indicators of preclinical disease are the most accurate. But the research data are compelling and very encouraging.”

The classification system divides preclinical Alzheimer’s into three stages:

  • Stage 1: Levels of amyloid beta, a protein fragment produced by the brain, begin to fall in the spinal fluid. This indicates that the substance is beginning to form plaques in the brain.
  • Stage 2: Levels of tau protein start to rise in the spinal fluid, indicating that brain cells are beginning to die. Amyloid beta levels are still abnormal and may continue to fall.
  • Stage 3: In the presence of abnormal amyloid and tau biomarker levels, subtle cognitive changes can be detected by neuropsychological testing. By themselves, these changes cannot establish a clinical diagnosis of dementia.
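
The staging rules above amount to a simple decision procedure. The sketch below assumes each marker has already been dichotomized as normal or abnormal (the article does not give the biomarker cutoffs), and it omits the study's separate non-Alzheimer's categories; `preclinical_stage` is an invented name:

```python
# Minimal sketch of the three-stage preclinical classification described above.
# Inputs are assumed to be pre-computed normal/abnormal flags; the actual
# biomarker cutoffs are not given in the article.

def preclinical_stage(amyloid_abnormal, tau_abnormal, subtle_cognitive_change):
    """Return preclinical Alzheimer's stage (0-3) from dichotomized markers."""
    if not amyloid_abnormal:
        return 0  # no biomarker evidence of amyloid pathology
    if not tau_abnormal:
        return 1  # falling CSF amyloid beta only
    if not subtle_cognitive_change:
        return 2  # abnormal amyloid plus rising CSF tau
    return 3      # abnormal biomarkers plus subtle cognitive change
```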

The researchers applied these criteria to research participants studied from 1998 through 2011 at the Knight Alzheimer Disease Research Center. The center annually collects extensive cognitive, biomarker and other health data on normal and cognitively impaired volunteers for use in Alzheimer’s studies.

The scientists analyzed information on 311 individuals age 65 or older who were cognitively normal when first evaluated. Each participant was evaluated at the center at least twice, with assessments conducted annually; the participant with the most data had been followed for 15 years.

At the initial testing, 41 percent of the participants had no indicators of Alzheimer’s disease (stage 0); 15 percent were in stage 1 of preclinical disease; 12 percent were in stage 2; and 4 percent were in stage 3. The remaining participants were classified as having cognitive impairments caused by conditions other than Alzheimer’s (23 percent) or did not meet any of the proposed criteria (5 percent).

“A total of 31 percent of our participants had preclinical disease,” said Fagan. “This percentage matches findings from autopsy studies of the brains of older individuals, which have shown that about 30 percent of people who were cognitively normal had preclinical Alzheimer’s pathology in their brain.”

Scientists believe the rate of cognitive decline increases as people move through the stages of preclinical Alzheimer’s. The new data support this idea. Five years after their initial evaluation, 11 percent of the stage 1 group, 26 percent of the stage 2 group, and 52 percent of the stage 3 group had been diagnosed with symptomatic Alzheimer’s.

Individuals with preclinical Alzheimer’s disease were six times more likely to die over the next decade than older adults without preclinical Alzheimer’s disease, but researchers don’t know why.

“Risk factors for Alzheimer’s disease might also be associated with other life-threatening illnesses,” Fagan said. “It’s also possible that the presence of Alzheimer’s hampers the diagnosis and treatment of other conditions or contributes to health problems elsewhere in the body. We don’t have enough data yet to say, but it’s an issue we’re continuing to investigate.”

(Source: news.wustl.edu)

Filed under alzheimer's disease beta amyloid cognitive decline tau proteins neuroscience science
