Neuroscience

Articles and news from the latest research reports.

Posts tagged aging


Molecular Switches for Age-Related Memory Decline? A Genetic Variant Protects Against Brain Aging

Even among the healthiest individuals, memory and cognitive abilities decline with age. This aspect of normal aging can affect an individual’s quality of life and ability to live independently, but the rate of decline varies across individuals. Many factors can influence this trajectory, but perhaps none more important than genetics.

Scientists are seeking to identify key molecular switches that control age-related memory impairment. When new molecules are identified as critical to the process of memory consolidation, they are then tested to determine whether they contribute to the memory problems of the elderly.

One of these proteins is called KIBRA, and the gene responsible for its production is WWC1. KIBRA is known to play a role in human memory, so researchers at the Lieber Institute for Brain Development and the National Institute of Mental Health, led by senior author Dr. Venkata Mattay, conducted a study to determine the effects of genetic variants in WWC1 on memory. Their findings are published in the current issue of Biological Psychiatry.

“Identifying these genetic factors, while helping us better understand the neurobiology of cognitive aging, will also aid in identifying mechanisms that confer individuals with resilience to withstand the inevitable age-related changes in neural architecture and function,” explained Mattay.

Using imaging genetics, a method that combines genetics with brain imaging technology, the team explored the effect of a variant in the WWC1 gene on age-related changes in memory function. The particular WWC1 variant under investigation has three potential forms: CC, CT, or TT.

They recruited 233 healthy volunteers, ranging in age from 18 to 89 years. The volunteers completed a battery of cognitive tests, underwent genotyping, and performed a memory task during a brain imaging scan.

They found that individuals who carry the T allele, as either CT or TT, performed better on the memory task and showed more active engagement of the hippocampus, a brain region vital for memory, with increasing age.

“Our results show a dynamic relationship between this gene and increasing age on hippocampal function and episodic memory, with the non-T allele group showing a significant decline across the adult life span,” said Mattay. “A similar relationship was not observed in the T-allele carrying group, suggesting that this variant of the gene may confer a protective effect.”

Dr. John Krystal, Editor of Biological Psychiatry, commented, “The risk mechanisms for age-related memory impairment that we identify today may become the targets for the prevention and treatment of this problem in the future.”
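The genotype-by-age pattern the study reports can be illustrated with a minimal sketch: fit an age slope separately for T-carriers (CT/TT) and non-carriers (CC) and compare. All data below are simulated, and the effect size is an assumption made up for illustration; only the sample size (233) and group labels come from the article.

```python
# Hypothetical sketch of a genotype-by-age analysis: compare the age slope of
# a memory score between T-carriers (CT/TT) and non-carriers (CC).
# The data are simulated; a decline is built into the CC group only,
# mirroring the reported pattern.
import random

random.seed(0)

def simulate_subject():
    """One (genotype, age, memory score) record; CC declines with age."""
    genotype = random.choice(["CC", "CT", "TT"])
    age = random.uniform(18, 89)
    decline = -0.5 if genotype == "CC" else 0.0  # assumed effect size
    score = 100 + decline * (age - 18) + random.gauss(0, 5)
    return genotype, age, score

subjects = [simulate_subject() for _ in range(233)]  # study's sample size

def age_slope(records):
    """Ordinary least-squares slope of score on age."""
    n = len(records)
    mean_age = sum(a for _, a, _ in records) / n
    mean_score = sum(s for _, _, s in records) / n
    cov = sum((a - mean_age) * (s - mean_score) for _, a, s in records)
    var = sum((a - mean_age) ** 2 for _, a, _ in records)
    return cov / var

carriers = [r for r in subjects if "T" in r[0]]       # CT or TT
non_carriers = [r for r in subjects if r[0] == "CC"]  # CC only

print(f"T-carrier age slope:   {age_slope(carriers):+.2f} points/year")
print(f"non-carrier age slope: {age_slope(non_carriers):+.2f} points/year")
```

In the simulated data, as in the study, the non-carrier slope is clearly negative while the carrier slope stays near zero; the real analysis additionally modeled hippocampal activation from the imaging scans.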


Filed under aging hippocampus memory episodic memory WWC1 KIBRA neuroscience science


Simple technique may help older adults better remember written information

University of Florida researchers have advice for older adults who need to remember detailed written information: Don’t just read it, tell someone about it.

That recommendation comes from a new UF study that showed that older adults who read a text and then described what they had read to someone else remembered more details of the text than older adults who simply re-read the passage multiple times.

The findings appear in the April issue of the journal Aphasiology.

Older adults are better able than younger adults to recall the gist of information they learn, but they have more difficulty remembering details, said lead investigator Yvonne Rogalski, who conducted the research as part of her doctoral dissertation work at the UF College of Public Health and Health Professions.

“Older adults can rely on things they’ve learned in the past and they can build on that vast wealth of semantic information that they’ve collected over the years. That works as long as the information is familiar, but where it breaks down is when they have to read something that is unfamiliar and has a lot of details,” said Rogalski, now an assistant professor in the department of speech-language pathology and audiology at Ithaca College.

As a doctoral student, Rogalski developed a training technique called Read Attentively, Summarize and Review, or RASR, which requires participants to read a passage aloud and then summarize from memory what they’ve read after each paragraph. The training is designed to help people “encode” information and commit it to memory.

“In the reading aloud portion, attention is heightened because you know you’re going to have to recall something,” she said. “Then retrieving that information through the summaries has the ability to act as a secondary encoding. Reading and recalling the text paragraph by paragraph instead of the whole text is designed to reduce the information processing demands.”  

For the UF study, 44 healthy adults ages 60 to 75 used one of two methods to recall details from texts on real — but unusual — animals. Participants who used a technique called Read and Reread Attentively read the entire passage aloud once, and then re-read each paragraph three times aloud in succession. Those in the RASR group read the whole text aloud once, then for each paragraph they read it aloud, summarized it from memory and then re-read it aloud again. Participants in both groups were tested immediately after studying and 24 hours later.
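The two study conditions can be sketched as simple procedures. This is an illustrative outline only; `read_aloud` and `summarize_from_memory` are hypothetical callables standing in for the participant’s spoken steps, not anything from the study materials.

```python
# Illustrative outline of the two study conditions described above.

def rasr_session(paragraphs, read_aloud, summarize_from_memory):
    """Read Attentively, Summarize and Review: one full read-through, then
    per paragraph: read aloud, summarize from memory, re-read aloud."""
    for p in paragraphs:
        read_aloud(p)                    # one pass over the whole text
    summaries = []
    for p in paragraphs:
        read_aloud(p)                    # attentive read of one paragraph
        summaries.append(summarize_from_memory())  # retrieval = second encoding
        read_aloud(p)                    # final review of the same paragraph
    return summaries

def reread_session(paragraphs, read_aloud):
    """Read and Reread Attentively: one full read-through, then each
    paragraph re-read aloud three times in succession."""
    for p in paragraphs:
        read_aloud(p)
    for p in paragraphs:
        for _ in range(3):
            read_aloud(p)
```

Note the design choice this makes visible: each paragraph is engaged four times in both conditions; they differ only in whether one of those engagements is a retrieval attempt rather than another re-read, which is what isolates the effect of summarization.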

The researchers found that participants who summarized the information aloud remembered more details about the texts than those who just re-read the material. In addition, combining the summarization method with an immediate post-test showed the most benefit for remembering text details after a 24-hour delay.

“We think it is effective because by reading the information and then putting it into your own words you have to do quite a bit of processing of not only the information, but also the relationships among bits of information,” said Lori Altmann, an associate professor in the UF department of speech, language, and hearing sciences, and a study co-author along with John Rosenbek, also a professor in the department. “Picking out the relationships that are important to you as you see them can help to order the information in your own memory.”

Older adults can put the principles of the summarization technique to work for themselves whenever they want or need to learn detailed information, such as a magazine article or medication plan, the researchers say. They suggest that people read the information and then describe it from memory to a partner who can check for accuracy.

“The RASR method is a very functional treatment and it’s something that healthy older adults or even people with mild dementias could use on their own to try and improve their memory,” Altmann said. “It doesn’t involve anything high-tech, and that’s the beauty of it.”

(Source: news.ufl.edu)

Filed under aging reading memory memory technique RASR psychology neuroscience science


What Our Ancestors Can Teach Us About Exercise, Alzheimer’s and Human Longevity

Our ancient ancestors’ exercise routines could provide important clues about how best to prevent and treat Alzheimer’s disease and other modern age-related diseases, according to a new paper by two University of Arizona researchers.

The article, featured on the cover of the May issue of the journal Trends in Neurosciences, explores the evolutionary links between physical activity, brain aging and the lifespan of humans, who outlive all other primates.

"This is an effort to try to understand the relationship between exercise and an important genetic risk factor for Alzheimer’s disease and vascular disease, and how the human lifespan evolved, which is a fundamental question that’s been considered in the scientific literature for many years," said UA psychology professor Gene Alexander, who co-authored the paper with David Raichlen, a UA associate professor of anthropology.

While many studies today tout the health benefits of exercise, Alexander and Raichlen consider the link between physical activity and health from an evolutionary perspective, beginning about 2 million years ago. It was around that time that humans shifted from a more apelike, sedentary lifestyle to a highly active hunter-gatherer lifestyle and began living longer.

During that period, humans likely carried two copies of a gene variant known as ApoE4, which is directly linked to higher risk for Alzheimer’s disease and cardiovascular disease. Yet, despite the presence of the problematic variant, longer lifespans began to evolve.

"Having this risk allele (ApoE4) is our ancestral condition," Raichlen said. "The lower-risk alleles evolved relatively recently, so our question was: How do you evolve a long lifespan when you have this ApoE4 risk allele?"

The answer, Raichlen and Alexander believe, lies in humans’ high level of physical activity 2 million years ago.

"To engage in this hunter-gatherer lifestyle you have to be an aerobically active organism. There’s no way around it. You have to go long distances to find your food," Raichlen said.

"We developed a hypothesis that suggests that exercise may be an important modulating factor that helps to compensate for the negative impact of the (genetic) risk factor for Alzheimer’s and vascular disease, and ultimately might help us to understand why humans are able to live much longer than other primate species," said Alexander, who also teaches in the UA Graduate Interdisciplinary Programs in Neuroscience and Physiological Sciences.

As the human lifestyle has become increasingly sedentary, this evolutionary link may be important in the development of new prevention therapies and treatments for Alzheimer’s and other age-related diseases, Alexander said.

"We are fundamentally endurance athletes, based on our ancestry. Our recent change to a more sedentary lifestyle may have led to a situation where this (ApoE4) genotype has become a problem for us, where it might not have been before," he said.

"With our current tendencies towards less active lifestyles, we need to be thinking about exercise as a potentially important intervention. Considering the evolutionary significance of ApoE4 also gives us some clues about why exercise might be especially important for us."

Today, it is estimated that about 25 percent of the general U.S. population carries the ApoE4 variant, and about 2 percent carry two copies of it, putting them at even greater risk for Alzheimer’s or vascular disease. However, the prevalence of the variant in some subgroups of the U.S. population and in some other parts of the world is much higher.
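As a quick consistency check on those prevalence figures (this back-of-the-envelope calculation is not from the paper, and it assumes Hardy-Weinberg equilibrium): taking the 2 percent homozygote figure at face value implies an ApoE4 allele frequency of about 14 percent, which in turn predicts a carrier prevalence close to the quoted 25 percent.

```python
# Hardy-Weinberg check of the quoted U.S. figures (an assumption, not stated
# in the article): if 2% of people carry two ApoE4 copies, the allele
# frequency q satisfies q^2 = 0.02, and carrier prevalence is 2pq + q^2.
q = 0.02 ** 0.5          # allele frequency implied by 2% homozygotes, q ≈ 0.141
p = 1 - q                # frequency of the non-E4 alleles combined

heterozygotes = 2 * p * q    # people with exactly one ApoE4 copy
homozygotes = q ** 2         # people with two copies (the 2% we started from)
carriers = heterozygotes + homozygotes

print(f"implied allele frequency q ≈ {q:.3f}")
print(f"implied carrier prevalence ≈ {carriers:.1%}")  # ≈ 26%, close to the quoted 25%
```

The implied 26 percent is close enough to the quoted 25 percent that the two figures are roughly self-consistent under this simple model.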
"There are parts of equatorial Africa where the frequency of the ApoE4 allele is something like 40 percent of the population," Raichlen said, "so thinking about how to use exercise to alter risk around the world is important."

Raichlen has studied in depth the evolution and effects of physical activity in humans. His research covers a range of topics, including the effects of exercise on happiness, the link between aerobic activity and brain size, the walking patterns of human hunter-gatherers and the role of the runner’s high in human evolution.

Alexander, a member of the UA’s Evelyn F. McKnight Brain Institute and the Arizona Alzheimer’s Consortium, has done extensive research on aging and age-related diseases.

The two came together to explore the connection between their areas of study by considering research literature in anthropology, brain imaging and neuroscience.

"We’ve generated a new hypothesis from these different scientific literatures that typically don’t cross over," Alexander said. "We are drawing on these different disciplines to look at this question in a new way, and I think it really has important implications for how we understand health issues today. Using what we know about ancestral genotypes, their risks, and how our behaviors evolved over time may help us to gain a better understanding of the underlying mechanisms of Alzheimer’s and age-related cognitive decline."


Filed under alzheimer's disease ApoE4 physical activity exercise dementia aging longevity psychology neuroscience science


Functioning of aged brains and muscles in mice made younger

Harvard Stem Cell Institute (HSCI) researchers have shown that a protein they previously demonstrated can make the failing hearts of aging mice appear more like those of young, healthy mice similarly improves brain and skeletal muscle function in aging mice.

In two separate papers given early online release today by the journal Science, which is publishing them this coming Friday, Professors Amy Wagers, PhD, and Lee Rubin, PhD, of Harvard’s Department of Stem Cell and Regenerative Biology (HSCRB), report that injections of a protein known as GDF11, which is found in humans as well as mice, improved the exercise capability of mice equivalent in age to about a 70-year-old human, and also improved the function of the olfactory region of the older mice’s brains: they could detect smells as younger mice do.

Rubin and Wagers, who also has a laboratory at the Joslin Diabetes Center, each said that, barring unexpected developments, they expect to have GDF11 in initial human clinical trials within three to five years.

Postdoctoral fellow Lida Katsimpardi, PhD, is the lead author on the Rubin group’s paper, and postdocs Manisha Sinha, PhD, and Young Jang, PhD, are the lead authors on the paper from the Wagers group.

Both studies examined the effect of GDF11 in two ways: first, by using a parabiotic system, in which two mice are surgically joined and the blood of the younger mouse circulates through the older mouse; and second, by injecting the older mice with GDF11, which an earlier study by Wagers and Richard Lee, MD, of Brigham and Women’s Hospital (also an author on the two papers released today) showed to be sufficient to reverse characteristics of aging in the heart.

Doug Melton, PhD, co-chair of HSCRB and co-director of HSCI, reacted to the two papers by saying that he couldn’t “recall a more exciting finding to come from stem cell science and clever experiments. This should give us all hope for a healthier future. We all wonder why we were stronger and mentally more agile when young, and these two unusually exciting papers actually point to a possible answer: the higher levels of the protein GDF11 we have when young. There seems to be little question that, at least in animals, GDF11 has an amazing capacity to restore aging muscle and brain function.”

Melton, Harvard’s Xander University Professor, added that the ongoing collaboration between Wagers, a stem cell biologist whose focus has been on muscle, Rubin, whose focus is on neurodegenerative diseases and on using patient-generated stem cells as targets for drug discovery, and Lee, a practicing cardiologist and researcher, “is a perfect example of the power of the Harvard Stem Cell Institute as an engine of truly collaborative efforts and discovery, bringing together people with big, unique ideas and expertise in different biological areas.”

As Melton noted, GDF11 is naturally found in much higher concentrations in young mice than in older mice, and raising its levels in the older mice has improved the function of every organ system studied thus far.

Wagers first began using the parabiotic system in mice 14 years ago as a postdoctoral fellow at Stanford University, when she and colleagues Thomas Rando, MD, PhD, of Stanford, Irina Conboy, PhD, of the University of California, Berkeley, and Irving Weissman, MD, of Stanford, observed that the blood of young mice circulating in old mice seemed to have some rejuvenating effects on muscle repair after injury.

Last year, she and Richard Lee published a paper reporting that when exposed to the blood of young mice, the enlarged, weakened hearts of older mice returned to a more youthful size and their function improved. Then, working with a Colorado firm, the pair reported that GDF11 was the factor in the blood apparently responsible for the rejuvenating effect. That finding has raised hopes that GDF11 may prove, in some form, to be a possible treatment for diastolic heart failure, a condition in the elderly that is currently irreversible and fatal.

“From the previous work it could have seemed that GDF11 was heart specific,” said Wagers, “but this shows that it is active in multiple organs and cell types. Prior studies of skeletal muscle and the parabiotic effect really focused on regenerative biology. Muscle was damaged and assayed on how well it could recover.”

She continued: “The additional piece is that while prior studies of young blood factors have shown that we achieve restoration of muscle stem cell function and they repair the muscle better, in this study we also saw repair of DNA damage associated with aging, and we got it in association with recovery of function, and we saw improvements in unmanipulated muscle. Based on other studies, we think that the accumulation of DNA damage in muscle stem cells might reflect an inability of the cells to properly differentiate to make mature muscle cells, which is needed for adequate muscle repair.”

Wagers noted that there is still a great deal to be learned about the mechanics of aging in muscle and its repair. “I don’t think we fully understand how this is happening or why. We might say that the damage is modification to the genetic material; the genome does have breaks in it. But whether it’s damaging, or a necessary part of repair, we don’t know yet.”

Rubin, whose primary research focus is on developing treatments for neurodegenerative diseases, particularly in children, said that when his group began its GDF11 experiments, “we knew that in the old mouse things were bad in the brain: there is a reduced amount of neurogenesis (the development of neurons), and it’s well known that cognition goes down. It wasn’t obvious to me that those things that can be repaired in peripheral tissue could be fixed in the brain.”

Rubin said that postdoctoral fellow Lida Katsimpardi, the lead author on his group’s paper, was taught the parabiotic experimental technique by Wagers but conducted the Rubin group’s experiments independently, and “she saw an increase in neural stem cells, and increased development of blood vessels in the brain.” Rubin said that 3D reconstruction and magnetic resonance imaging (MRI) of the mouse brain showed “more new blood vessels and more blood flow,” both of which are normally associated with younger, healthier brain tissue.

Younger mice, Rubin said, “have a keen sense of olfactory discrimination”: they can sense fine differences in odor. “When we tested the young mice, they avoided the smell of mint; the old mice didn’t. But the old mice exposed to the blood of the young mice, and those treated with GDF11, did.”

“We think an effect of GDF11 is the improved vascularity and blood flow, which is associated with increased neurogenesis,” Rubin said. “However, the increased blood flow should have more widespread effects on brain function. We do think that, at least in principle, there will be a way to reverse some of the cognitive decline that takes place during aging, perhaps even with a single protein. It could be that a molecule like GDF11, or GDF11 itself, could” reverse the damage of aging.

“It isn’t out of the question that GDF11,” or a drug developed from it, “might be capable of slowing some of the cognitive defects associated with Alzheimer’s disease, a disorder whose main risk factor is aging itself,” Rubin said. It is even possible that this could occur without directly changing the “plaque and tangle burden” that is the pathological hallmark of Alzheimer’s. Thus, a future treatment for this disease might combine a therapeutic that reduces plaques and tangles, such as an antibody directed against the β-amyloid peptide, with a potential cognition enhancer like GDF11.

Wagers said that the two research groups are in discussions with a venture capital group to obtain funding for “the additional preclinical work” necessary before moving GDF11 into human trials.

“I would wager that the results of this work, together with the other work, will translate into a clinical trial and a treatment,” said the stem cell biologist. “But of course that’s just a wager.”

Functioning of aged brains and muscles in mice made younger

Harvard Stem Cell Institute (HSCI) researchers have shown that a protein they previously demonstrated can make the failing hearts in aging mice appear more like those of young health mice, similarly improves brain and skeletal muscle function in aging mice.

In two separate papers given early online release today by the journal Science—which is publishing the papers this coming Friday, Professors Amy Wagers, PhD, and Lee Rubin, PhD, of Harvard’s Department of Stem Cell and Regenerative Biology (HSCRB), report that injections of a protein known as GDF11, which is found in humans as well as mice, improved the exercise capability of mice equivalent in age to that of about a 70-year-old human, and also improved the function of the olfactory region of the brains of the older mice—they could detect smell as younger mice do.

Rubin, and Wagers, who also has a laboratory at the Joslin Diabetes Center, each said that, baring unexpected developments, they expect to have GDF11 in initial human clinical trials within three to five years.

Postdoctoral fellow Lida Katsimpardi, PhD, is the lead author on the Rubin group’s paper, and postdocs Manisha Sinha, PhD, and Young Jang, PhD, are the lead authors on the paper from the Wagers group.

Both studies examined the effect of GDF11 in two ways. First, by using what is called a parabiotic system, in which two mice are surgically joined and the blood of the younger mouse circulates through the older mouse. And second, by injecting the older mice with GDF11, which in an earlier study by Wagers and Richard Lee, MD, of Brigham and Women’s Hospital who is also an author on the two papers released today, was shown to be sufficient to reverse characteristics of aging in the heart.

Doug Melton, PhD, co-chair of HSCRB and co-director of HSCI, reacted to the two papers by saying that he couldn’t “recall a more exciting finding to come from stem cell science and clever experiments. This should give us all hope for a healthier future. We all wonder why we were stronger and mentally more agile when young, and these two unusually exciting papers actually point to a possible answer: the higher levels of the protein GDF11 we have when young. There seems to be little question that, at least in animals, GDF11 has an amazing capacity to restore aging muscle and brain function,” he said.

Melton, Harvard’s Xander University Professor, continued, saying that the ongoing collaboration between Wagers, a stem cell biologist whose focus has been on muscle, Rubin, whose focus is on neurodegenerative diseases and using patient generated stem cells as targets for drug discovery, and Lee, a practicing cardiologist and researcher, “is a perfect example of the power of the Harvard Stem Cell Institute as an engine of truly collaborative efforts and discovery, bringing together people with big, unique ideas and expertise in different biological areas.”

As Melton noted, GDF11 is naturally found in much higher concentrations in young mice than in older mice, and raising its levels in the older mice has improved the function of every organ system thus far studied.

Wagers first began using the parabiotic system in mice 14 years ago as a postdoctoral fellow at Stanford University, when she and colleagues Thomas Rando, MD, PhD, of Stanford, Irina Conboy, PhD, of the University of California, Berkley, and Irving Weissman, MD, of Stanford, observed that the blood of young mice circulating in old mice seemed to have some rejuvenating effects on muscle repair after injury.

Last year, she and Richard Lee published a paper in which they reported that when exposed to the blood of young mice, the enlarged, weakened hearts of older mice returned to a more youthful size, and their function improved. Then, working with a Colorado firm, the pair reported that GDF11 was the factor in the blood apparently responsible for the rejuvenating effect. That finding has raised hopes that GDF11 may prove, in some form, to be a possible treatment for diastolic heart failure, an irreversible and ultimately fatal condition in the elderly.

“From the previous work it could have seemed that GDF11 was heart specific,” said Wagers, “but this shows that it is active in multiple organs and cell types. Prior studies of skeletal muscle and the parabiotic effect really focused on regenerative biology. Muscle was damaged and assayed on how well it could recover,” Wagers explained.

She continued: “The additional piece is that while prior studies of young blood factors have shown that we achieve restoration of muscle stem cell function and they repair the muscle better, in this study, we also saw repair of DNA damage associated with aging, and we got it in association with recovery of function, and we saw improvements in unmanipulated muscle. Based on other studies, we think that the accumulation of DNA damage in muscle stem cells might reflect an inability of the cells to properly differentiate to make mature muscle cells, which is needed for adequate muscle repair.”

Wagers noted that there is still a great deal to be learned about the mechanics of aging in muscle, and its repair. “I don’t think we fully understand how this is happening or why. We might say that the damage is modification to the genetic material; the genome does have breaks in it. But whether it’s damaging, or a necessary part of repair, we don’t know yet.”

Rubin, whose primary research focus is on developing treatment for neurodegenerative diseases, particularly in children, said that when his group began its GDF11 experiments, “we knew that in the old mouse things were bad in the brain, there is a reduced amount of neurogenesis (the development of neurons), and it’s well known that cognition goes down. It wasn’t obvious to me that those things that can be repaired in peripheral tissue could be fixed in the brain.”

Rubin said that postdoctoral fellow Lida Katsimpardi, the lead author on his group’s paper, was taught the parabiotic experimental technique by Wagers, but conducted the Rubin group’s experiments independently of the Wagers group, and “she saw an increase in neural stem cells, and increased development of blood vessels in the brain.” Rubin said that 3D reconstruction of the brain, and magnetic resonance imaging (MRI) of the mouse brain, showed “more new blood vessels and more blood flow,” both of which are normally associated with younger, healthier brain tissue.

Younger mice, Rubin said, “have a keen sense of olfactory discrimination,” they can sense fine differences in odor. “When we tested the young mice, they avoided the smell of mint; the old mice didn’t. But the old mice exposed to the blood of the young mice, and those treated with GDF11 did.”

“We think an effect of GDF11 is the improved vascularity and blood flow, which is associated with increased neurogenesis,” Rubin said. “However, the increased blood flow should have more widespread effects on brain function. We do think that, at least in principle, there will be a way to reverse some of the cognitive decline that takes place during aging, perhaps even with a single protein. It could be that a molecule like GDF11, or GDF11 itself, could” reverse the damage of aging.

“It isn’t out of question that GDF11,” or a drug developed from it, “might be capable of slowing some of the cognitive defects associated with Alzheimer’s disease, a disorder whose main risk factor is aging itself,” Rubin said. It is even possible that this could occur without directly changing the “plaque and tangle burden” that are the pathological hallmarks of Alzheimer’s. Thus, a future treatment for this disease might be a combination of a therapeutic that reduces plaques and tangles, such as an antibody directed against the β-amyloid peptide, with a potential cognition enhancer like GDF11.

Wagers said that the two research groups are in discussions with a venture capital group to obtain funding to “be able to do the additional preclinical work” necessary before moving GDF11 into human trials.

“I would wager that the results of this work, together with the other work, will translate into a clinical trial and a treatment,” said the stem cell biologist. “But of course that’s just a wager.”

Filed under GDF11 aging alzheimer's disease muscle cells brain function medicine science

163 notes

Fight Memory Loss with a Smile (or Chuckle) 
Too much stress can take its toll on the body, mood, and mind. As we age it can contribute to a number of health problems, including high blood pressure, diabetes, and heart disease. Recent research has shown that the stress hormone cortisol damages certain neurons in the brain and can negatively affect memory and learning ability in the elderly. Researchers at Loma Linda University have delved deeper into cortisol’s relationship to memory and whether humor and laughter—a well-known stress reliever—can help lessen the damage that cortisol can cause. Their findings were presented on Sunday, April 27, at the Experimental Biology meeting.
Gurinder Singh Bains et al. showed a 20-minute laugh-inducing funny video to a group of healthy elderly individuals and a group of elderly people with diabetes. The groups were then asked to complete a memory assessment that measured their learning, recall, and sight recognition. Their performance was compared to a control group of elderly people who also completed the memory assessment, but were not shown a funny video. Cortisol concentrations for both groups were also recorded at the beginning and end of the experiment.
The research team found a significant decrease in cortisol concentrations among both groups who watched the video. Video-watchers also showed greater improvement in all areas of the memory assessment when compared to controls, with the diabetic group seeing the most dramatic benefit in cortisol level changes and the healthy elderly seeing the most significant changes in memory test scores.
From the authors: “Our research findings offer potential clinical and rehabilitative benefits that can be applied to wellness programs for the elderly,” Dr. Bains said. “The cognitive components—learning ability and delayed recall—become more challenging as we age and are essential to older adults for an improved quality of life: mind, body, and spirit. Although older adults have age-related memory deficits, complementary, enjoyable, and beneficial humor therapies need to be implemented for these individuals.”
Study co-author and long-time psychoneuroimmunology humor researcher Dr. Lee Berk added, “It’s simple: the less stress you have, the better your memory. Humor reduces detrimental stress hormones like cortisol that decrease hippocampal memory neurons, lowers your blood pressure, and increases blood flow and your mood state. The act of laughter—or simply enjoying some humor—increases the release of endorphins and dopamine in the brain, which provides a sense of pleasure and reward. These positive and beneficial neurochemical changes, in turn, make the immune system function better. There are even changes in brain wave activity towards what’s called the ‘gamma wave band frequency’, which also amps up memory and recall. So, indeed, laughter is turning out to be not only a good medicine, but also a memory enhancer adding to our quality of life.”

Filed under aging memory memory loss laughter stress cortisol Experimental Biology Meeting 2014 neuroscience science

105 notes

In Old Age, Lack of Emotion and Interest May Signal Your Brain Is Shrinking

Older people who have apathy but not depression may have smaller brain volumes than those without apathy, according to a new study published in the April 16, 2014, online issue of Neurology®, the medical journal of the American Academy of Neurology. Apathy is a lack of interest or emotion.

“Just as signs of memory loss may signal brain changes related to brain disease, apathy may indicate underlying changes,” said Lenore J. Launer, PhD, with the National Institute on Aging at the National Institutes of Health (NIH) in Bethesda, MD, and a member of the American Academy of Neurology. “Apathy symptoms are common in older people without dementia. And the fact that participants in our study had apathy without depression should turn our attention to how apathy alone could indicate brain disease.”

Launer’s team used brain volume as a measure of accelerated brain aging. Brain volume losses occur during normal aging, but in this study, larger amounts of brain volume loss could indicate brain diseases.

For the study, 4,354 people without dementia and with an average age of 76 underwent an MRI scan. They were also asked questions that measure apathy symptoms, which include lack of interest, lack of emotion, dropping activities and interests, preferring to stay at home and having a lack of energy.

The study found that people with two or more apathy symptoms had 1.4 percent smaller gray matter volume and 1.6 percent less white matter volume than those with fewer than two symptoms of apathy. Excluding people with depression symptoms did not change the results.

Gray matter is where learning takes place and memories are stored in the brain. White matter acts as the communication cables that connect different parts of the brain.

“If these findings are confirmed, identifying people with apathy earlier may be one way to target an at-risk group,” Launer said.

Filed under apathy emotion aging gray matter white matter brain structure neuroimaging neuroscience science

176 notes

Study Examines Vitamin D Deficiency and Cognition Relationship
Vitamin D deficiency and cognitive impairment are common in older adults, but there isn’t a lot of conclusive research into whether there’s a relationship between the two.
A new study from Wake Forest Baptist Medical Center published online ahead of print this month in the Journal of the American Geriatrics Society enhances the existing literature on the subject.
“This study provides increasing evidence that suggests there is an association between low vitamin D levels and cognitive decline over time,” said lead author Valerie Wilson, M.D., assistant professor of geriatrics at Wake Forest Baptist. “Although this study cannot establish a direct cause and effect relationship, it would have a huge public health implication if vitamin D supplementation could be shown to improve cognitive performance over time because deficiency is so common in the population.”
Wilson and colleagues were interested in the association between vitamin D levels and cognitive function over time in older adults. They used data from the Health, Aging and Body Composition (Health ABC) study to look at the relationship. The researchers looked at 2,777 well-functioning adults aged 70 to 79 whose cognitive function was measured at the study’s onset and again four years later. Vitamin D levels were measured at the 12-month follow-up visit.
The Health ABC study cohort consists of 3,075 Medicare-eligible, white and black, well-functioning, community-dwelling older adults who were recruited between April 1997 and June 1998 from Pittsburgh, Pa., and Memphis, Tenn.
“With just the baseline observational data, you can’t conclude that low vitamin D causes cognitive decline. When we looked four years down the road, low vitamin D was associated with worse cognitive performance on one of the two cognitive tests used,” Wilson said. “It is interesting that there is this association and ultimately the next question is whether or not supplementing vitamin D would improve cognitive function over time.”
Wilson said randomized, controlled trials are needed to determine whether vitamin D supplementation can prevent cognitive decline and definitively establish a causal relationship.
“Doctors need this information to make well-supported recommendations to their patients,” Wilson said. “Further research is also needed to evaluate whether specific cognitive domains, such as memory versus concentration, are especially sensitive to low vitamin D levels.”

Filed under cognitive impairment vitamin deficiency vitamin d aging cognitive performance neuroscience science

236 notes

Study says we’re over the hill at 24
It’s a hard pill to swallow, but if you’re over 24 years of age you’ve already reached your peak in terms of your cognitive motor performance, according to a new Simon Fraser University study.
SFU’s Joe Thompson, a psychology doctoral student, associate professor Mark Blair, Thompson’s thesis supervisor, and Andrew Henrey, a statistics and actuarial science doctoral student, deliver the news in a paper just published in the journal PLOS ONE.
In one of the first social science experiments to rest on big data, the trio investigates when we start to experience an age-related decline in our cognitive motor skills and how we compensate for that.
The researchers analyzed the digital performance records of 3,305 StarCraft 2 players, aged 16 to 44. StarCraft 2 is a ruthlessly competitive intergalactic computer war game that players often undertake to win serious money.
Their performance records, which can be readily replayed, constitute big data because they represent thousands of hours worth of strategic real-time cognitive-based moves performed at varied skill levels.
Using complex statistical modeling, the researchers distilled meaning from this colossal compilation of information about how players responded to their opponents and more importantly, how long they took to react.
“After around 24 years of age, players show slowing in a measure of cognitive speed that is known to be important for performance,” explains Thompson, the lead author of the study, which is his thesis. “This cognitive performance decline is present even at higher levels of skill.”
But there’s a silver lining in this earlier-than-expected slippery slope into old age. “Our research tells a new story about human development,” says Thompson.
“Older players, though slower, seem to compensate by employing simpler strategies and using the game’s interface more efficiently than younger players, enabling them to retain their skill, despite cognitive motor-speed loss.”
For example, older players more readily use shortcuts and sophisticated command keys to compensate for declining speed in executing real-time decisions.
The findings, says Thompson, suggest “that our cognitive-motor capacities are not stable across our adulthood, but are constantly in flux, and that our day-to-day performance is a result of the constant interplay between change and adaptation.”
Thompson says this study doesn’t inform us about how our increasingly distracting computerized world may ultimately affect our use of adaptive behaviours to compensate for declining cognitive motor skills.
But he does say our increasingly digitized world is providing a growing wealth of big data that will be a goldmine for future social science studies such as this one.

Filed under motor skills cognition aging memory cognitive performance psychology neuroscience science

66 notes

Older People with Faster Decline In Memory and Thinking Skills May Have Lower Risk of Cancer Death
Older people who are starting to have memory and thinking problems, but do not yet have dementia may have a lower risk of dying from cancer than people who have no memory and thinking problems, according to a study published in the April 9, 2014, online issue of Neurology®, the medical journal of the American Academy of Neurology.
“Studies have shown that people with Alzheimer’s disease are less likely to develop cancer, but we don’t know the reason for that link,” said study author Julián Benito-León, MD, PhD, of University Hospital 12 of October in Madrid, Spain. “One possibility is that cancer is underdiagnosed in people with dementia, possibly because they are less likely to mention their symptoms or caregivers and doctors are focused on the problems caused by dementia. The current study helps us discount that theory.”
The study involved 2,627 people age 65 and older in Spain who did not have dementia at the start of the study. They took tests of memory and thinking skills at the start of the study and again three years later, and were followed for an average of almost 13 years. The participants were divided into three groups: those whose scores on the thinking tests were declining the fastest, those whose scores improved on the tests, and those in the middle.
During the study, 1,003 of the participants died, including 339 deaths, or 34 percent, among those with the fastest decline in thinking skills and 664 deaths, or 66 percent, among those in the other two groups. A total of 21 percent of those in the group with the fastest decline died of cancer, according to their death certificates, compared to 29 percent of those in the other two groups.
People in the fastest declining group were still 30 percent less likely to die of cancer when the results were adjusted to control for factors such as smoking, diabetes and heart disease, among others.
“We need to understand better the relationship between a disease that causes abnormal cell death and one that causes abnormal cell growth,” Benito-León said. “With the increasing number of people with both dementia and cancer, understanding this association could help us better understand and treat both diseases.”

Filed under memory dementia cancer cognitive decline aging neurology neuroscience science

256 notes

Dog watch - How attention changes in the course of a dog’s life
Dogs are known to be Man’s best friend. No other pet has adjusted to the human lifestyle as well as this four-legged companion. Scientists at the Messerli Research Institute at the Vetmeduni Vienna have been the first to investigate how dogs’ attentiveness changes over the course of their lives and to what extent it resembles that of humans. The outcome: dogs’ attentional and sensorimotor control developmental trajectories are very similar to those found in humans. The results were published in the journal Frontiers in Psychology.
Dogs are individual personalities, possess awareness, and are particularly known for their learning capabilities, or trainability. To learn successfully, they must display sufficient attention and concentration. However, dogs’ attentiveness changes over the course of their lives, as it does in humans. Lead author Lisa Wallis and her colleagues investigated 145 Border Collies aged 6 months to 14 years in the Clever Dog Lab at the Vetmeduni Vienna and determined, for the first time, how attentiveness changes across the entire course of a dog’s life, using a cross-sectional study design.
Humans are more interesting for dogs than objects
To determine how rapidly dogs of various age groups pay attention to objects or humans, the scientists performed two tests. In the first situation the dogs were confronted with a child’s toy suspended suddenly from the ceiling. The scientists measured how rapidly each dog reacted to this occurrence and how quickly the dogs became accustomed to it. Initially all dogs reacted with similar speed to the stimulus, but older dogs lost interest in the toy more rapidly than younger ones did.
In the second test situation, a person known to the dog entered the room and pretended to paint the wall. All dogs reacted by watching the person and the paint roller in the person’s hands for a longer duration than the toy hanging from the ceiling. 
Wallis’ conclusion: “So-called ‘social’ attentiveness was more pronounced in all dogs than ‘non-social’ attentiveness. The dogs generally tended to react by watching the person with the object for longer than an object on its own. We found that older dogs - like older human beings - demonstrated a certain calmness. They were less affected by new items in the environment and thus showed less interest than younger dogs.”
Selective attention is highest in mid-adulthood
In a further test the scientists investigated so-called selective attention. The dogs participated in an alternating attention task in which they had to perform two tasks consecutively. First, they needed to find a food reward thrown onto the floor by the experimenter; then, after eating the food, they had to establish eye contact with the experimenter. These tasks were repeated for a further twenty trials. The establishment of eye contact was marked by a clicking sound produced by a “clicker”, and small pieces of hot dog were used as a reward. The time spans to find the food and to look up into the experimenter’s face were measured. With respect to both time spans, middle-aged dogs (3 to 6 years) reacted most rapidly.
“Under these test conditions, sensorimotor abilities were highest among dogs of middle age. Younger dogs probably fared more poorly because of their general lack of experience. Motor abilities in dogs, as in humans, deteriorate with age. Humans between the ages of 20 and 39 experience a similar peak in sensorimotor abilities,” says Wallis.
Adolescent dogs have the steepest learning curve
Dogs also go through a difficult phase during adolescence (1-2 years) which affects their ability to pay attention. This phase of hormonal change may be compared to puberty in Man. Therefore, young dogs occasionally reacted with some delay to the clicker test. However, Wallis found that adolescent dogs improved their performance more rapidly than other age groups after several repetitions of the clicker test. In other words, the learning curve was found to be steepest in puberty. “Thus, dogs in puberty have great potential for learning and therefore trainability” says Wallis.
Dogs as a model for ADHD and Alzheimer’s disease
As the development of attentiveness in the course of a dog’s life is similar to human development in many respects, dogs make appropriate animal models for various human psychological diseases. For instance, the course of diseases like ADHD (attention deficit/hyperactivity disorder) or Alzheimer’s can be studied by observing the behavior of dogs. In her current project Wallis is investigating the effects of diet on cognition in older dogs together with her colleague Durga Chapagain. The scientists are still looking for dog owners who would like to participate in a long-term study.

Dog watch - How attention changes in the course of a dog’s life

Dogs are known to be Man’s best friend. No other pet has adapted to Man’s lifestyle as well as this four-legged companion. Scientists at the Messerli Research Institute at the Vetmeduni Vienna have been the first to investigate how dogs’ attentiveness develops over the course of their lives and to what extent it resembles that of humans. The outcome: the developmental trajectories of attentional and sensorimotor control in dogs are very similar to those found in humans. The results were published in the journal Frontiers in Psychology.

Dogs are individual personalities, possess awareness, and are particularly known for their learning capabilities, or trainability. To learn successfully, they must display sufficient attention and concentration. However, the attentiveness of dogs changes over the course of their lives, as it does in humans. The lead author Lisa Wallis and her colleagues investigated 145 Border Collies aged 6 months to 14 years in the Clever Dog Lab at the Vetmeduni Vienna and determined, for the first time, how attentiveness changes over the entire course of a dog’s life, using a cross-sectional study design.

Humans are more interesting for dogs than objects

To determine how rapidly dogs of various age groups pay attention to objects or humans, the scientists performed two tests. In the first situation the dogs were confronted with a child’s toy suspended suddenly from the ceiling. The scientists measured how rapidly each dog reacted to this occurrence and how quickly the dogs became accustomed to it. Initially all dogs reacted with similar speed to the stimulus, but older dogs lost interest in the toy more rapidly than younger ones did.

In the second test situation, a person known to the dog entered the room and pretended to paint the wall. All dogs reacted by watching the person and the paint roller in the person’s hands for a longer duration than the toy hanging from the ceiling.

Wallis’ conclusion: “So-called social attentiveness was more pronounced in all dogs than ‘non-social’ attentiveness. The dogs generally tended to watch the person with the object for longer than an object on its own. We found that older dogs, like older human beings, demonstrated a certain calmness. They were less affected by new items in the environment and thus showed less interest than younger dogs.”

Selective attention is highest in mid-adulthood

In a further test the scientists investigated so-called selective attention. The dogs participated in an alternating attention task in which they had to perform two tasks consecutively. First, they needed to find a food reward thrown onto the floor by the experimenter; then, after the dog had eaten the food, the experimenter waited for it to establish eye contact with her. These tasks were repeated for a further twenty trials. The establishment of eye contact was marked by a clicking sound produced by a “clicker”, and small pieces of hot dog were used as a reward. The times taken to find the food and to look up into the experimenter’s face were measured. On both measures, middle-aged dogs (3 to 6 years) reacted most rapidly.

“Under these test conditions, sensorimotor abilities were highest among dogs of middle age. Younger dogs fared more poorly, probably because of their general lack of experience. Motor abilities in dogs, as in humans, deteriorate with age. Humans between the ages of 20 and 39 years experience a similar peak in sensorimotor abilities,” says Wallis.

Adolescent dogs have the steepest learning curve

Dogs also go through a difficult phase during adolescence (1-2 years) which affects their ability to pay attention. This phase of hormonal change may be compared to puberty in Man. During this phase, young dogs occasionally reacted with some delay in the clicker test. However, Wallis found that adolescent dogs improved their performance more rapidly than other age groups over repeated trials of the clicker test. In other words, the learning curve was steepest in puberty. “Thus, dogs in puberty have great potential for learning and therefore trainability,” says Wallis.

Dogs as a model for ADHD and Alzheimer’s disease

As the development of attentiveness in the course of a dog’s life is similar to human development in many respects, dogs make appropriate animal models for various human psychological diseases. For instance, the course of diseases like ADHD (attention deficit/hyperactivity disorder) or Alzheimer’s can be studied by observing the behavior of dogs. In her current project Wallis is investigating the effects of diet on cognition in older dogs together with her colleague Durga Chapagain. The scientists are still looking for dog owners who would like to participate in a long-term study.

Filed under: attention, learning, social attentiveness, dogs, aging, animal model, psychology, neuroscience, science
