Neuroscience

Articles and news from the latest research reports.

Posts tagged cognitive decline

Number of People with Alzheimer’s Disease May Triple by 2050
The number of people with Alzheimer’s disease is expected to triple in the next 40 years, according to a new study by researchers from Rush University Medical Center published in the February 6, 2013, online issue of Neurology, the medical journal of the American Academy of Neurology.
“This increase is due to an aging baby boom generation. It will place a huge burden on society, disabling more people who develop the disease, challenging their caregivers, and straining medical and social safety nets,” said co-author, Jennifer Weuve, MPH, ScD, assistant professor of medicine, Rush Institute for Healthy Aging at Rush University Medical Center in Chicago. “Our study draws attention to an urgent need for more research, treatments and preventive strategies to reduce the impact of this epidemic.”
For the study, researchers analyzed information from 10,802 African-American and Caucasian people living in Chicago, ages 65 and older between 1993 and 2011. Participants were interviewed and assessed for dementia every three years. Age, race and level of education were factored into the research.
The data were combined with U.S. death rates, education data, and current and future population estimates from the U.S. Census Bureau.
The study found that the total number of people with Alzheimer’s dementia in 2050 is projected to be 13.8 million, up from 4.7 million in 2010. About 7 million of those with the disease would be age 85 or older in 2050.
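As a back-of-the-envelope check on the figures quoted above (an illustrative calculation using the article’s headline numbers, not the study’s demographic model):

```python
# Rough arithmetic check on the projection figures quoted in the article.
cases_2010_millions = 4.7
cases_2050_millions = 13.8
aged_85_plus_millions = 7.0

growth_factor = cases_2050_millions / cases_2010_millions
share_85_plus = aged_85_plus_millions / cases_2050_millions

print(f"growth factor: {growth_factor:.2f}")   # ~2.94, i.e. roughly a tripling
print(f"share aged 85+: {share_85_plus:.0%}")  # ~51% of projected 2050 cases
```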
“Our projections use sophisticated methods and the most up-to-date data, but they echo projections made years and decades ago. All of these projections anticipate a future with a dramatic increase in the number of people with Alzheimer’s and should compel us to prepare for it,” said Weuve.

Filed under alzheimer's disease dementia memory cognitive decline medicine science

Damaged Blood Vessels Loaded with Amyloid Worsen Cognitive Impairment in Alzheimer’s Disease
A team of researchers at Weill Cornell Medical College has discovered that amyloid peptides are harmful to the blood vessels that supply the brain with blood in Alzheimer’s disease, thus accelerating cognitive decline by limiting oxygen-rich blood and nutrients. In their animal studies, the investigators reveal how amyloid-β accumulates in blood vessels and how such accumulation and damage might ultimately be prevented.
Their study, published in the Feb. 4 online edition of the Proceedings of the National Academy of Sciences (PNAS), is the first to identify the role that the innate immunity receptor CD36 plays in damaging cerebral blood vessels and promoting the accumulation of amyloid deposits in these vessels, a condition known as cerebral amyloid angiopathy (CAA).
Importantly, the study provides a rational basis for targeting CD36 to slow or reverse some of the cognitive deficits in Alzheimer’s disease by preventing CAA.
"Our findings strongly suggest that amyloid, in addition to damaging neurons, also threatens the cerebral blood supply and increases the brain’s susceptibility to damage through oxygen deprivation," says the study’s senior investigator, Dr. Costantino Iadecola, the Anne Parrish Titzell Professor of Neurology at Weill Cornell Medical College and director of the Brain and Mind Research Institute at Weill Cornell Medical College and NewYork-Presbyterian Hospital. "If we can stop accumulation of amyloid in these blood vessels, we might be able to significantly improve cognitive function in Alzheimer’s disease patients. Furthermore, we might be able to improve the effectiveness of amyloid immunotherapy, which is in clinical trials but has been hampered by the accumulation of amyloid in cerebral blood vessels."
Mounting scientific evidence shows that changes in the structure and function of cerebral blood vessels contribute to brain dysfunction underlying Alzheimer’s disease, but no one has truly understood how this happens until now.

Filed under alzheimer's disease cognitive decline oxygen deprivation blood vessels brain neuroscience science

How the brain copes with multi-tasking alters with age
The pattern of blood flow in the prefrontal cortex alters with age during multi-tasking, finds a new study in BioMed Central’s open access journal BMC Neuroscience. Blood volume, measured using oxygenated haemoglobin (Oxy-Hb), increased at the start of multitasking in all age groups, but to perform the same tasks, healthy older people had a higher and more sustained increase in Oxy-Hb than younger people.
Age-related changes to the brain occur earliest in the prefrontal cortex, the area of the brain associated with memory, emotion, and higher decision-making functions. Changes to this area of the brain are also associated with dementia, depression and other neuropsychiatric disorders. Some studies have shown that regular physical activity and cognitive training can prevent cognitive decline (use it or lose it!), but to establish what occurs in a healthy aging brain, researchers from Japan and the USA compared brain activity during single and dual tasks for young (aged 21 to 25) and older (over 65) people.
Near infrared spectroscopy (NIRS) measurements of Oxy-Hb showed that blood flow to the prefrontal cortex was not affected by the physical task for either age group but was affected by the mental task. For both the young and the over-65s, the start of the calculation task coincided with an increase in blood volume, which returned to baseline once the task was completed.
The main difference between the groups was seen only when performing the physical and mental tasks at the same time: older people had a higher prefrontal cortex response that lasted longer than the younger group’s.
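The group comparison described above can be sketched as a toy baseline correction. This is an illustration with made-up Oxy-Hb values in arbitrary units, not the study’s actual NIRS analysis pipeline:

```python
# Toy illustration: size of a baseline-corrected Oxy-Hb response.
# The first few samples are the pre-task baseline; the rest are the task period.

def response_magnitude(signal, baseline_n):
    """Mean task-period signal minus the mean of the pre-task baseline."""
    baseline = sum(signal[:baseline_n]) / baseline_n
    task = signal[baseline_n:]
    return sum(task) / len(task) - baseline

# Made-up traces: the "older" trace rises higher and stays elevated longer.
young = [0.0, 0.1, 0.0, 0.6, 0.7, 0.5, 0.2]
older = [0.0, 0.1, 0.0, 0.9, 1.1, 1.0, 0.8]

print(response_magnitude(young, 3))  # smaller response
print(response_magnitude(older, 3))  # larger, more sustained response
```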
Hironori Ohsugi, from Seirei Christopher University, and one of the team who performed this research explained “From our observations during the dual task it seems that the older people turn their attention to the calculation at the expense of the physical task, while younger people are able to maintain concentration on both. Since our subjects were all healthy it seems that this requirement for increased activation of the prefrontal cortex is part of normal decrease in brain function associated with aging. Further study will show whether or not dual task training can be used to maintain a more youthful brain.”
(Image: Photos.com)

Filed under brain brain activity prefrontal cortex cognitive decline aging multi-tasking neuroscience science

Cell Loss in the Brain Relates to Variations in Individual Symptoms in Huntington’s Disease

Scientists have wrestled to understand why Huntington’s disease, which is caused by a single gene mutation, can produce such variable symptoms. An authoritative review by a group of leading experts summarizes the progress relating cell loss in the striatum and cerebral cortex to symptom profile in Huntington’s disease, suggesting a possible direction for developing targeted therapies. The article is published in the latest issue of the Journal of Huntington’s Disease.

Huntington’s disease (HD) is an inherited progressive neurological disorder for which there is presently no cure. It is caused by a dominant mutation in the HD gene leading to expression of mutant huntingtin (HTT) protein. Expression of mutant HTT causes subtle changes in cellular functions, which ultimately results in jerking, uncontrollable movements, progressive psychiatric difficulties, and loss of mental abilities.

Although it is caused by a single gene, there are major variations in the symptoms of HD. The pattern of symptoms shown by each individual during the course of the disease can differ considerably and present as varying degrees of movement disturbances, cognitive decline, and mood and behavioral changes. Disease duration is typically between ten and twenty years.

Recent investigations have focused on what the presence of the defective gene does to various structures in the brain and understanding the relationship between changes in the brain and the variability in symptom profiles in Huntington’s disease.

Analyses of post-mortem human HD tissue suggest that the variation in clinical symptoms in HD is strongly associated with the variable pattern of neurodegeneration in two major regions of the brain, the striatum and the cerebral cortex. The neurodegeneration of the striatum generally follows an ordered and topographical distribution, but comparison of post-mortem human HD tissue and in vivo neuroimaging techniques reveal that the disease produces a striking bilateral atrophy of the striatum, which in these recent studies has been found to be highly variable.

“What is especially interesting is that recent findings suggest that the pattern of striatal cell death shows regional differences between cases in the functionally and neurochemically distinct striosomal and matrix compartments of the striatum which correspond with symptom variation,” says author Richard L.M. Faull, MB, ChB, PhD, DSc, Director of the Centre for Brain Research, University of Auckland, New Zealand.

“Our own recent detailed quantitative study using stereological cell counting in the post-mortem human HD cortex has complemented and expanded the neuroimaging studies by providing a cortical cellular basis of symptom heterogeneity in HD,” continues Dr Faull. “In particular, HD cases which were dominated by motor dysfunction showed a major total cell loss (28% loss) in the primary motor cortex but no cell loss in the limbic cingulate cortex, whereas cases where mood symptoms predominated showed a total of 54% neuronal loss in the limbic cingulate cortex but no cell loss in the motor cortex. This suggests that the variable neuronal loss and alterations in the circuitry of the primary motor cortex and anterior cingulate cortex associated with the variable compartmental pattern of cell degeneration in the striatum contribute to the differential impairments of motor and mood functions in HD.”

The authors note that there are still questions to be answered in the field of HD pathology, such as, how and when pathological neuronal loss occurs; whether the progressive loss of neurons in the striatum is the primary process or is consequential to cortical cell dysfunction; and how these changes relate to symptom profiles.

“What is clear however is that the diverse symptoms of HD patients appear to relate to the heterogeneity of cell loss in both the striatum and cerebral cortex,” the authors conclude. “While there is currently no cure, this contemporary evidence suggests that possible genetic therapies aimed at HD gene silencing should be directed towards intervention at both the cerebral cortex and the striatum in the human brain. This poses challenging problems requiring the application of gene silencing therapies to quite widespread regions of the forebrain which may be assisted via CSF delivery systems using gene suppression agents that cross the CSF/brain barrier.”

(Source: iospress.nl)

Filed under huntington’s disease neurodegeneration cell loss neuroimaging cognitive decline neuroscience science

Detrimental effect of obesity on lesions associated with Alzheimer’s disease
Researchers from Inserm and the Université Lille/Université Lille Nord de France have recently used a neurodegeneration model of Alzheimer’s disease to provide experimental evidence of the relationship between obesity and disorders linked to the tau protein. This research was conducted on mice and is published in the journal Diabetes; it corroborates the theory that metabolic anomalies contribute massively to the development of dementia.
In France, more than 860,000 people suffer from Alzheimer’s disease and related disorders, making them the leading cause of age-related loss of intellectual function. Cognitive impairments observed in Alzheimer’s disease result from the accumulation of abnormal tau proteins in nerve cells undergoing degeneration. We know that obesity, a major risk factor in the development of insulin resistance and type 2 diabetes, increases the risk of dementia during the aging process. However, the effects of obesity on ‘tauopathies’ (i.e. tau protein-related disorders), including Alzheimer’s disease, were not clearly understood. In particular, researchers assumed that insulin resistance played a major role in the effects of obesity.
The “Alzheimer & Tauopathies” team from mixed research unit 837 (Inserm/Université Lille 2/Université Lille Nord de France), directed by Dr. Luc Buée, in collaboration with mixed research unit 1011 “Nuclear receptors, cardiovascular diseases and diabetes”, has just demonstrated, in mice, that obese subjects develop aggravated disorders. To achieve this result, young transgenic mice, which develop tau-related neurodegeneration progressively with age, were put on a high-fat diet for five months, leading to progressive obesity.
“At the end of this diet, the obese mice had developed an aggravated disorder both from the point of view of memory and modifications to the Tau protein”, explains David Blum, in charge of research at Inserm.
This study uses a neurodegeneration model of Alzheimer’s disease to provide experimental evidence of the relationship between obesity and disorders linked to the tau protein. Furthermore, it indicates that insulin resistance is not the aggravating factor, as was suggested in previous studies.
“Our research supports the theory that environmental factors contribute massively to the development of this neurodegenerative disorder,” underlines the researcher. “Our work is now focusing on identifying the factors responsible for this aggravation,” he adds.

Filed under tau protein neurodegenerative disorders obesity alzheimer's disease cognitive decline neuroscience science

Can Going Hungry As a Child Slow Down Cognitive Decline in Later Years?
People who sometimes went hungry as children had slower cognitive decline once they were elderly than people who always had enough food to eat, according to a new study published in the December 11, 2012, print issue of Neurology®, the medical journal of the American Academy of Neurology.
“These results were unexpected because other studies have shown that people who experience adversity as children are more likely to have problems such as heart disease, mental illness and even lower cognitive functioning than people whose childhoods are free of adversity,” said study author Lisa L. Barnes, PhD, of Rush University Medical Center in Chicago.
For the African American participants, the 5.8 percent who reported that they went without enough food to eat sometimes, often or always were more likely to have a slower rate of cognitive decline, or decline that was reduced by about one-third, than those who rarely or never went without enough food to eat. The 8.4 percent of African American participants who reported that they were much thinner at age 12 than other kids their age also were more likely to have a slower rate of cognitive decline, also by one-third, than those who said they were about the same size or heavier than other kids their age. For Caucasians, there was no relationship between any of the childhood adversity factors and cognitive decline.

Barnes said researchers aren’t sure why childhood hunger could have a possible protective effect on cognitive decline. One potential explanation for the finding could be found in research that has shown that calorie restriction can delay the onset of age-related changes in the body and increase the life span. Another explanation could be a selective survival effect. The older people in the study who experienced childhood adversity may be the hardiest and most resilient of their era; those with the most extreme adversity may have died before they reached old age.
Barnes noted that the results stayed the same after researchers adjusted for factors such as amount of education and health problems. The results also did not change when researchers repeated the analysis excluding people with the lowest cognitive function at the beginning of the study, to help rule out the possibility that people with mild, undiagnosed Alzheimer’s disease were included in the study.
Because relatively few Caucasians in the study reported childhood adversity, the study may not have been able to detect an effect of adversity on cognitive decline in Caucasians, Barnes said.
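To make the “reduced by about one-third” figure concrete, here is a hypothetical illustration with made-up numbers, not data from the study:

```python
# Hypothetical illustration of a decline rate "reduced by about one-third":
# the slower-declining group loses roughly two-thirds of the reference rate.
reference_decline_per_year = 0.09   # made-up cognitive-score units lost per year
slower_decline_per_year = reference_decline_per_year * (1 - 1 / 3)

print(f"{slower_decline_per_year:.2f}")  # 0.06: two-thirds of the reference rate
```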

Filed under cognitive decline children hunger cognitive functioning childhood adversity neuroscience science

Smoking ‘rots’ brain, says King’s College study
Smoking “rots” the brain by damaging memory, learning and reasoning, according to researchers at King’s College London. A study of 8,800 people over 50 showed high blood pressure and being overweight also seemed to affect the brain, but to a lesser extent.
Scientists involved said people needed to be aware that lifestyles could damage the mind as well as the body. Their study was published in the journal Age and Ageing.
Researchers at King’s were investigating links between the likelihood of a heart attack or stroke and the state of the brain. Data about the health and lifestyle of a group of over-50s was collected and brain tests, such as making participants learn new words or name as many animals as they could in a minute, were also performed.
They were all tested again after four and then eight years. The results showed that the overall risk of a heart attack or stroke was “significantly associated with cognitive decline” with those at the highest risk showing the greatest decline.
It also said there was a “consistent association” between smoking and lower scores in the tests. One of the researchers, Dr Alex Dregan, said: “Cognitive decline becomes more common with ageing and for an increasing number of people interferes with daily functioning and well-being.
“We have identified a number of risk factors which could be associated with accelerated cognitive decline, all of which could be modifiable.” He added: “We need to make people aware of the need to do some lifestyle changes because of the risk of cognitive decline.”
The researchers do not know how such a decline could affect people going about their daily life. They are also unsure whether the early drop in brain function could lead to conditions such as dementia.


(Image: Alamy)

Filed under brain smoking cognitive decline memory dementia neuroscience psychology science

Multivitamin lifts brain activity

A daily multivitamin supplement may improve brain efficiency in older women, according to new research from Swinburne University of Technology.

A four-month study of the commercial product Swisse Women’s Ultivite 50+, led by Dr Helen Macpherson, a researcher at Swinburne’s Centre for Human Psychopharmacology, found some evidence that multivitamin supplements may influence cognitive function by altering electrical activity in the brain.

"The main finding of the study was that 16 weeks supplementation with the Swisse Women’s 50+ multivitamin modulated brain activity," Dr Macpherson said.

"This is an important result as it shows there are direct effects of multivitamins on the brain.

"Previous research has used measures of behaviour to determine whether multivitamins can affect brain function, but this is the first trial to directly measure brain activity."

The study was conducted over 16 weeks with 56 women aged between 64 and 79 who were concerned about their memory or experiencing memory difficulties. They were randomly assigned to take the multivitamin supplement or a placebo daily.

Volunteers underwent a recording of their brain electrical activity whilst performing a spatial working memory task.

The research was published in Physiology and Behavior.

A previous paper published in Psychopharmacology reported that multivitamin supplementation improved behavioural performance on a similar task, in the same group of participants.

The study concluded that 16 weeks of supplementation with a combined multivitamin, mineral and herbal formula may benefit memory by enabling the brain to work in a more efficient way.

"When considered with our other findings of benefits to memory performance, there is increasing evidence that multivitamins may be useful to combat cognitive decline in the elderly," Dr Macpherson said.

(Source: swinburne.edu.au)

Filed under brain cognitive decline memory brain function multivitamin neuroscience psychology science

59 notes


Berkeley Lab Scientists Help Develop Promising Therapy for Huntington’s Disease

There’s new hope in the fight against Huntington’s disease. A group of researchers that includes scientists from the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) have designed a compound that suppresses symptoms of the devastating disease in mice.

The compound is a synthetic antioxidant that targets mitochondria, an organelle within cells that serves as a cell’s power plant. Oxidative damage to mitochondria is implicated in many neurodegenerative diseases including Alzheimer’s, Parkinson’s, and Huntington’s.

The scientists administered the synthetic antioxidant, called XJB-5-131, to mice that have a genetic mutation that triggers Huntington’s disease. The compound improved mitochondrial function and enhanced the survival of neurons. It also inhibited weight loss and stopped the decline of motor skills, among other benefits. In short, the Huntington’s mice looked and behaved like normal mice.

Based on their findings, the scientists believe that XJB-5-131 is a promising therapeutic compound that deserves further investigation as a way to fight neurodegenerative diseases.

They report their research in a paper that appears online Nov. 1 in the journal Cell Reports.

Filed under neurodegenerative disorders Huntington’s disease genetic mutation cognitive decline neuroscience science

20 notes

Drug shows promise in animal model of Alzheimer’s and Parkinson’s with dementia

New research presented in October at the 6th Neurodegenerative Conditions Research and Development Conference in San Francisco demonstrates the role of the investigational compound IRX4204 in alleviating cognitive decline in animal models of Alzheimer’s disease (AD). The presentation entitled “Investigation of the RXR-specific agonist IRX4204 as a Disease Modifying Agent of Alzheimer’s Disease Neuropathology and Cognitive Impairment” was made by lead researcher Giulio Maria Pasinetti, MD, PhD, of the Mount Sinai School of Medicine in New York City.

IRX4204 is a retinoid X receptor (RXR) agonist, meaning it stimulates the retinoid receptor in the brain. The data demonstrate attenuation of AD pathology, including prevention of the plaque deposits associated with cognitive deterioration, in an IRX4204-treated mouse model genetically determined to develop AD. IRX4204 also prevents neuropathological features associated with abnormal processing of tau, another abnormal protein also found in a form of Parkinson’s disease associated with dementia.

"The treatment of AD remains a serious unmet medical need which IRX4204 may be able to address," Dr. Pasinetti said. "Our research shows that IRX4204 and other RXR agonists have potential for slowing, and possibly reversing, pathology and cognitive deficits in Alzheimer’s disease patients."

Translational studies in subjects with Alzheimer’s disease and Parkinson’s disease with dementia are currently being developed.

Alzheimer’s disease currently afflicts more than 5 million Americans and may triple in prevalence to more than 16 million Americans by 2050, according to data from The Alzheimer’s Association.

(Source: eurekalert.org)

Filed under animal model alzheimer alzheimer's disease cognitive decline retinoid receptor neuroscience science
