The feeling of hunger itself may protect against Alzheimer’s disease, according to a study published today in the journal PLOS ONE. Interestingly, the results of this study in mice suggest that mild hunger pangs, and related hormonal pathways, may be as important to the much-discussed value of “caloric restriction” as actually eating less.

Caloric restriction is a regimen where an individual consumes fewer calories than average, but not so few that they become malnourished. Studies in many species have suggested that it could protect against neurodegenerative disorders and extend lifespans, but the effect has not been confirmed in human randomized clinical trials.
Efforts to understand how cutting calories may protect the brain have grown increasingly important with news that American Alzheimer’s deaths are increasing, and because the best available treatments only delay onset in a subset of patients.
Study authors argue that hormonal signals are the middlemen between an empty gut and the perception of hunger in the brain, and that manipulating them may effectively counter age-related cognitive decline in the same way as caloric restriction.
“This is the first paper, as far as we are aware, to show that the sensation of hunger can reduce Alzheimer’s disease pathology in a mouse model of the disease,” said Inga Kadish, Ph.D., assistant professor in the Department of Cell, Developmental and Integrative Biology (CDIB) within the School of Medicine at the University of Alabama at Birmingham. “If the mechanisms are confirmed, hormonal hunger signaling may represent a new way to combat Alzheimer’s disease, either by itself or combined with caloric restriction.”
The team theorizes that feeling hungry creates mild stress. That, in turn, fires up metabolic signaling pathways that counter plaque deposits known to destroy nerve cells in Alzheimer’s patients. The idea is an example of hormesis theory, where damaging stressors like starvation are thought to be good for you when experienced to a lesser degree.
To study the sensation of hunger, the research team analyzed the effects of the hormone ghrelin, which is known to make us feel hungry. They administered a synthetic ghrelin agonist in pill form, which let them control the dosage so that treated mice felt steadily, mildly hungry.
If it could be developed, a treatment that affected biochemical pathways downstream of hunger signals might help delay cognitive decline without consigning people to a life of feeling hungry. Straight caloric restriction would not be tolerable for many people over the long run, but manipulating post-hunger signaling might be.
This line of thinking becomes important because any protective benefit brought about by drugs or diets that mildly adjust post-hunger signals might be most useful if started in those at risk as early in life as possible. Attempts to treat the disease years later – when nerve networks are damaged enough for neurological symptoms to appear – may be too late. In the current study, it was long-term treatment with a ghrelin agonist that improved cognitive performance in mice tested when they had reached an advanced age.
Study details
The study looked at whether the feeling of hunger, in the absence of caloric restriction, could counter Alzheimer’s pathology in mice engineered to carry three mutations known to cause the disease in humans.
Study mice were divided into three groups: one that received the ‘synthetic ghrelin’ (ghrelin agonist), a second that underwent caloric restriction (20 percent less food) and a third group that was fed normally. Study measures looked at each group’s ability to remember, their degree of Alzheimer’s pathology and their level of related, potentially harmful immune cell activation.
Results of such studies are most appropriately presented in terms of general trends in the data and statistical assessments of how likely those trends would be if only chance were in play, a likelihood captured in each result’s P value (the smaller, the better). Thus, the first formal result of the study is that, in mice with the human Alzheimer’s mutations, both the group treated with the ghrelin agonist LY444711 and the group that underwent caloric restriction performed significantly better in a water maze than mice fed normally (p=0.023).
The water maze is the standard test used to measure mouse memory. Researchers put mice in a pool with an invisible platform on which they could rest, and measured how quickly the mice found the platform in a series of tests. Mice with normal memory will remember where the platform is, and find it more quickly each time they are placed in the pool. Ghrelin agonist-treated mice found the hidden platform 26 percent faster than control mice, with caloric restricted mice doing so 23 percent faster than control mice.
The second result was a measure of the buildup of a protein fragment called amyloid beta in the forebrain, an early step in the destruction of nerve cells that accompanies Alzheimer’s disease. The formal amyloid beta results show that mice either treated with the ghrelin agonist or calorically restricted had significantly less buildup of amyloid beta in the dentate gyrus, a region of the hippocampus central to memory function, than mice fed normally (control, 3.95±0.83%; LY444711, 2.05±0.26%; CR, 1.28±0.17%; Wilcoxon p=0.04).
The above results translate roughly into a 67 percent reduction of this pathology in caloric-restricted mice as compared to control mice, and a 48 percent reduction of amyloid beta deposits when comparing the ghrelin-treated mice with the control group. These percentages are neither final nor translatable to humans, but are simply meant to convey the idea of “better.”
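Those percentages follow directly from the group means quoted earlier. As a quick sanity check, the arithmetic can be reproduced in a few lines of Python (the numbers are the reported means; the function name is ours):

```python
# Group means for amyloid beta load in the dentate gyrus (% area), as reported:
control = 3.95  # normally fed mice
ly = 2.05       # ghrelin agonist (LY444711) group
cr = 1.28       # caloric restriction group

def pct_reduction(treated, reference):
    """Percent reduction of the treated group's mean relative to the reference mean."""
    return 100.0 * (reference - treated) / reference

print(round(pct_reduction(cr, control), 1))  # 67.6 -> "roughly 67 percent"
print(round(pct_reduction(ly, control), 1))  # 48.1 -> "roughly 48 percent"
```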
Finally, the team examined the difference in immune responses related to Alzheimer’s pathology in each of the three groups. Microglia are the immune cells of the brain, engulfing and removing invading pathogens and dead tissue. They have also been implicated in several diseases in which their misplaced activation damages tissues. The team found that mice receiving the ghrelin agonist had reduced levels of microglial activation compared with the control group, similar to the effect of caloric restriction.
The ghrelin agonist used in the study does not lend itself to clinical use and will not play a role in the future prevention of Alzheimer’s disease, said Kadish. It was meant instead to prove a principle that hormonal hunger signaling itself can counter Alzheimer’s pathology in a mammal. The next step is to understand exactly how it achieved this as a prerequisite to future treatment design.
Ghrelin is known to create hunger signals by interacting with the arcuate nucleus in the part of the brain called the hypothalamus, which then sends out signaling neuropeptides that help the body sense and respond to energy needs. Studies already underway in Kadish’s lab seek to determine the potential role of these pathways and related genes in countering disease.
“Our group in the School of Public Health was studying whether or not a ghrelin agonist could make mice hungry as we sought to unravel mechanisms contributing to the life-prolonging effects of caloric restriction,” said David Allison, Ph.D., associate dean for Science in the UAB School of Public Health and the project’s initiator.
“Because of the interdisciplinary nature of UAB, our work with Dr. Allison led to an amazing conversation with Dr. Kadish about how we might combine our research with her longtime expertise in neurology because caloric restriction had been shown in early studies to counter Alzheimer’s disease,” said Emily Dhurandhar, Ph.D., a trainee in the UAB Nutrition Obesity Research Center and first study author. “The current study is the result.”
Risk prediction tools that estimate future risk of heart disease and stroke may be more useful predictors of future decline in cognitive abilities, or memory and thinking, than a dementia risk test, according to a new study published in the April 2, 2013, print issue of Neurology®, the medical journal of the American Academy of Neurology.
“This is the first study that compares these risk scores with a dementia risk score to study decline in cognitive abilities 10 years later,” said Sara Kaffashian, PhD, with the French National Institute of Health and Medical Research (INSERM) in Paris, France.
The study involved 7,830 men and women with an average age of 55. Risk of heart disease and stroke (cardiovascular disease) and risk of dementia were calculated for each participant at the beginning of the study. The heart disease risk score included the following risk factors: age, blood pressure, treatment for high blood pressure, high density lipoprotein (HDL) cholesterol, total cholesterol, smoking, and diabetes. The stroke risk score included age, blood pressure, treatment for high blood pressure, diabetes, smoking, history of heart disease, and presence of cardiac arrhythmia (irregular heart beat).
The dementia risk score included age, education, blood pressure, body mass index (BMI), total cholesterol, exercise, and whether a person had the APOE ε4 gene, a gene associated with dementia.
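For readers curious how such scores work mechanically: risk scores of this kind typically combine a handful of factors into a weighted sum that is then mapped to a probability. The sketch below is purely illustrative (the weights and intercept are invented for demonstration and are not the published Framingham, stroke, or CAIDE coefficients):

```python
import math

# Illustrative only: these weights are invented and are NOT the published
# coefficients of any real cardiovascular or dementia risk score.
WEIGHTS = {
    "age": 0.066,          # per year
    "systolic_bp": 0.018,  # per mmHg
    "smoker": 0.65,        # 1 if current smoker, else 0
    "diabetes": 0.57,      # 1 if diabetic, else 0
}
INTERCEPT = -9.0

def risk_probability(person):
    """Map a weighted sum of risk factors to a probability via the logistic function."""
    score = INTERCEPT + sum(WEIGHTS[k] * person[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-score))

p = risk_probability({"age": 55, "systolic_bp": 140, "smoker": 1, "diabetes": 0})
print(f"Illustrative 10-year risk: {p:.1%}")
```

The appeal for prevention is that several of the weighted inputs (blood pressure, smoking, cholesterol) are modifiable, so the same formula shows how changing a risk factor changes the predicted risk.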
Memory and thinking abilities were measured three times over 10 years.
The study found that all three risk scores predicted 10-year decline in multiple cognitive tests. However, heart disease risk scores showed stronger links with cognitive decline than a dementia risk score. Both heart and stroke risk were associated with decline in all cognitive tests except memory; dementia risk was not linked with decline in memory and verbal fluency.
“Although the dementia and cardiovascular risk scores all predict cognitive decline starting in late middle age, cardiovascular risk scores may have an advantage over the dementia risk score for use in prevention and for targeting changeable risk factors since they are already used by many physicians. The findings also emphasize the importance of risk factors for cardiovascular disease such as high cholesterol and high blood pressure in not only increasing risk of heart disease and stroke but also having a negative impact on cognitive abilities,” said Kaffashian.
A team of researchers at The New York Stem Cell Foundation Research Institute led by Scott Noggle, PhD, Director of the NYSCF Laboratory and the NYSCF – Charles Evans Senior Research Fellow for Alzheimer’s Disease, and Michael W. Nestor, PhD, a NYSCF Postdoctoral Research Fellow, has developed a technique to produce three-dimensional cultures of induced pluripotent stem (iPS) cells called embryoid bodies, amenable to live cell imaging and to electrical activity measurement. As reported in their Stem Cell Research study, these cell aggregates enable scientists both to model and to study diseases such as Alzheimer’s and Parkinson’s disease.
The NYSCF Alzheimer’s disease research team aims to better understand this disease and to find treatments for it through stem cell research. For such disorders, in which neurons misfire or degenerate, the NYSCF team creates “disease in a dish” models by reprogramming patients’ skin or blood samples into induced pluripotent stem (iPS) cells that can become neurons and the other brain cells affected in these diseases.
The cells in our body form three-dimensional networks, essential to tissue function and overall health; however, previous techniques to form complex brain tissue resulted in structures that, while similar in form to naturally occurring neurons, undermined imaging or electrical recording attempts.
In the current study, Noggle and Nestor, together with NYSCF scientists, adapted two-dimensional culture methods to grow three-dimensional neuron structures from iPS cells. The resultant structures were “thinned out,” enabling calcium-imaging studies, which measure the electrical activity of cells such as neurons.
"Combining the advantages of iPS cells grown in a 3D environment with those of a 2D system, our technique produces cells that can be used to observe electrical activity of putative networks of biologically active neurons, while simultaneously imaging them," said Nestor. "This is key to modeling and studying neurodegenerative diseases."
Neural networks, thought to underlie learning and memory, become disrupted in Alzheimer’s disease. By generating aggregates from iPS cells and comparing these to an actual patient’s brain tissue, scientists may uncover how disease interferes with these cell-to-cell interactions and understand how to intervene to slow or stop Alzheimer’s disease.
"This critical new tool developed by our Alzheimer’s team will accelerate Alzheimer’s research, enabling more accurate manipulation of cells to find a cure to this disease," said Susan L. Solomon, CEO of NYSCF.
Clumps of proteins that accumulate in brain cells are a hallmark of neurological diseases such as dementia, Parkinson’s disease and Alzheimer’s disease. Over the past several years, there has been much controversy over the structure of one of those proteins, known as alpha synuclein.

MIT computational scientists have now modeled the structure of that protein, most commonly associated with Parkinson’s, and found that it can take on either of two proposed states — floppy or rigid. The findings suggest that forcing the protein to switch to the rigid structure, which does not aggregate, could offer a new way to treat Parkinson’s, says Collin Stultz, an associate professor of electrical engineering and computer science at MIT.
“If alpha synuclein can really adopt this ordered structure that does not aggregate, you could imagine a drug-design strategy that stabilizes these ordered structures to prevent them from aggregating,” says Stultz, who is the senior author of a paper describing the findings in a recent issue of the Journal of the American Chemical Society.
For decades, scientists have believed that alpha synuclein, which forms clumps known as Lewy bodies in brain cells and other neurons, is inherently disordered and floppy. However, in 2011 Harvard University neurologist Dennis Selkoe and colleagues reported that after carefully extracting alpha synuclein from cells, they found it to have a very well-defined, folded structure.
That surprising finding set off a scientific controversy. Some groups tried and failed to replicate it, but scientists at Brandeis University, led by Thomas Pochapsky and Gregory Petsko, also found folded (ordered) structures in the alpha synuclein protein.
Stultz and his group decided to jump into the fray, working with Pochapsky’s lab, and developed a computer-modeling approach to predict what kind of structures the protein might take. Working with the structural data obtained by the Brandeis researchers, Stultz created a model that calculates the probabilities of many different possible structures, to determine what set of structures would best explain the experimental data.
The calculations suggest that the protein can rapidly switch among many different conformations. At any given time, about 70 percent of individual proteins will be in one of the many possible disordered states, which exist as single molecules of the alpha synuclein protein. When three or four of the proteins join together, they can assume a mix of possible rigid structures, including helices and beta strands (protein chains that can link together to form sheets).
“On the one hand, the people who say it’s disordered are right, because a majority of the protein is disordered,” Stultz says. “And the people who would say that it’s ordered are not wrong; it’s just a very small fraction of the protein that is ordered.”
“This paper seems to bridge the gap” between the two camps, says Trevor Creamer, an associate professor of molecular and cellular biochemistry at the University of Kentucky who was not involved in this research. Also important is the model’s prediction of new structures for the protein that experimental biologists can now look for, Creamer adds.
The MIT researchers also found that when alpha synuclein adopts an ordered structure, similar to that described by Selkoe and co-workers, the portions of the protein that tend to aggregate with other molecules are buried deep within the structure, explaining why those ordered forms do not clump together.
Stultz is now working to figure out what controls the protein’s configuration. There is some evidence that other molecules in the cell can modify alpha synuclein, forcing it to assume one conformation or another.
“If this structure really does exist, we have a new way now of potentially designing drugs that will prevent aggregation of alpha synuclein,” he says.
Scientists have known for some time that the human brain’s ability to stay calm and focused is limited and can be overwhelmed by the constant noise and hectic, jangling demands of city living, sometimes resulting in a condition informally known as brain fatigue.
With brain fatigue, you are easily distracted, forgetful and mentally flighty — or, in other words, me.
But an innovative new study from Scotland suggests that you can ease brain fatigue simply by strolling through a leafy park.

The idea that visiting green spaces like parks or tree-filled plazas lessens stress and improves concentration is not new. Researchers have long theorized that green spaces are calming, requiring less of our so-called directed mental attention than busy, urban streets do. Instead, natural settings invoke “soft fascination,” a beguiling term for quiet contemplation, during which directed attention is barely called upon and the brain can reset those overstretched resources and reduce mental fatigue.
But this theory, while agreeable, has been difficult to put to the test. Previous studies have found that people who live near trees and parks have lower levels of cortisol, a stress hormone, in their saliva than those who live primarily amid concrete, and that children with attention deficits tend to concentrate and perform better on cognitive tests after walking through parks or arboretums. More directly, scientists have brought volunteers into a lab, attached electrodes to their heads and shown them photographs of natural or urban scenes, and found that the brain wave readouts show that the volunteers are more calm and meditative when they view the natural scenes.
But it had not been possible to study the brains of people while they were actually outside, moving through the city and the parks. Or it wasn’t, until the recent development of a lightweight, portable version of the electroencephalogram, a technology that studies brain wave patterns.
For the new study, published this month in The British Journal of Sports Medicine, researchers at Heriot-Watt University in Edinburgh and the University of Edinburgh attached these new, portable EEGs to the scalps of 12 healthy young adults. The electrodes, hidden unobtrusively beneath an ordinary looking fabric cap, sent brain wave readings wirelessly to a laptop carried in a backpack by each volunteer.
The researchers, who had been studying the cognitive impacts of green spaces for some time, then sent each volunteer out on a short walk of about a mile and a half that wound through three different sections of Edinburgh.
The first half mile or so took walkers through an older, historic shopping district, with fine, old buildings and plenty of pedestrians on the sidewalk, but only light vehicle traffic.
The walkers then moved onto a path that led through a park-like setting for another half mile.
Finally, they ended their walk strolling through a busy, commercial district, with heavy automobile traffic and concrete buildings.
The walkers had been told to move at their own speed, not to rush or dawdle. Most finished the walk in about 25 minutes.
Throughout that time, the portable EEGs on their heads continued to feed information about brain wave patterns to the laptops they carried.
Afterward, the researchers compared the read-outs, looking for wave patterns that they felt were related to measures of frustration, directed attention (which they called “engagement”), mental arousal and meditativeness or calm.
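The study itself relied on the headset software’s affective metrics, but the underlying idea of reading arousal and calm from brain waves can be illustrated with standard band-power analysis. The sketch below is our own, not the study’s pipeline: it compares power in the relaxation-linked alpha band against the arousal-linked beta band of a synthetic EEG trace, using NumPy and SciPy.

```python
import numpy as np
from scipy.signal import welch

FS = 128  # sampling rate in Hz (an assumption, typical of portable EEG headsets)

def avg_band_power(signal, fs, low, high):
    """Mean spectral power density of `signal` within [low, high] Hz, via Welch's method."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

# Synthetic one-minute recording: a 10 Hz "alpha" rhythm (often read as
# calm or meditative) buried in broadband noise.
rng = np.random.default_rng(0)
t = np.arange(FS * 60) / FS
eeg = np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)

alpha = avg_band_power(eeg, FS, 8, 12)   # relaxation-linked band
beta = avg_band_power(eeg, FS, 13, 30)   # arousal/engagement-linked band
print(alpha > beta)  # True: the injected alpha rhythm dominates
```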
What they found confirmed the idea that green spaces lessen brain fatigue.
When the volunteers made their way through the urbanized, busy areas, particularly the heavily trafficked commercial district at the end of their walk, their brain wave patterns consistently showed that they were more aroused, attentive and frustrated than when they walked through the parkland, where brain-wave readings became more meditative.
While traveling through the park, the walkers were mentally quieter.
Which is not to say that they weren’t paying attention, said Jenny Roe, a professor in the School of the Built Environment at Heriot-Watt University, who oversaw the study. “Natural environments still engage” the brain, she said, but the attention demanded “is effortless. It’s called involuntary attention in psychology. It holds our attention while at the same time allowing scope for reflection,” and providing a palliative to the nonstop attentional demands of typical, city streets.
Of course, her study was small, more of a pilot study of the nifty new, portable EEG technology than a definitive examination of the cognitive effects of seeing green.
But even so, she said, the findings were consistent and strong and, from the viewpoint of those of us over-engaged in attention-hogging urban lives, valuable. The study suggests that, right about now, you should consider “taking a break from work,” Dr. Roe said, and “going for a walk in a green space or just sitting, or even viewing green spaces from your office window.” This is not unproductive lollygagging, Dr. Roe helpfully assured us. “It is likely to have a restorative effect and help with attention fatigue and stress recovery.”
-by Gretchen Reynolds, The New York Times
Focusing on the present rather than letting the mind drift may help to lower levels of the stress hormone cortisol, suggests new research from the Shamatha Project at the University of California, Davis.

The ability to focus mental resources on immediate experience is an aspect of mindfulness, which can be improved by meditation training.
"This is the first study to show a direct relation between resting cortisol and scores on any type of mindfulness scale," said Tonya Jacobs, a postdoctoral researcher at the UC Davis Center for Mind and Brain and first author of a paper describing the work, published this week in the journal Health Psychology.
High levels of cortisol, a hormone produced by the adrenal gland, are associated with physical or emotional stress. Prolonged release of the hormone contributes to wide-ranging, adverse effects on a number of physiological systems.
The new findings are the latest to come from the Shamatha Project, a comprehensive long-term, control-group study of the effects of meditation training on mind and body.
Led by Clifford Saron, associate research scientist at the UC Davis Center for Mind and Brain, the Shamatha Project has drawn the attention of both scientists and Buddhist scholars including the Dalai Lama, who has endorsed the project.
In the new study, Jacobs, Saron and their colleagues used a questionnaire to measure aspects of mindfulness among a group of volunteers before and after an intensive, three-month meditation retreat. They also measured cortisol levels in the volunteers’ saliva.
During the retreat, Buddhist scholar and teacher B. Alan Wallace of the Santa Barbara Institute for Consciousness Studies trained participants in such attentional skills as mindfulness of breathing, observing mental events, and observing the nature of consciousness. Participants also practiced cultivating benevolent mental states, including loving kindness, compassion, empathic joy and equanimity.
At an individual level, high mindfulness scores correlated with low cortisol levels, both before and after the retreat. Individuals whose mindfulness score increased after the retreat showed a decrease in cortisol.
"The more a person reported directing their cognitive resources to immediate sensory experience and the task at hand, the lower their resting cortisol," Jacobs said.
The research did not show a direct cause and effect, Jacobs emphasized. Indeed, she noted that the effect could run either way — reduced levels of cortisol could lead to improved mindfulness, rather than the other way around. Scores on the mindfulness questionnaire increased from pre- to post-retreat, while levels of cortisol did not change overall.
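The relationship reported here is a simple correlation across individuals. As an illustration with invented numbers (not the study’s data), a Pearson correlation between mindfulness scores and cortisol levels comes out strongly negative when higher scores pair with lower cortisol:

```python
from statistics import mean, pstdev

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = mean((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (pstdev(xs) * pstdev(ys))

# Hypothetical participants: mindfulness questionnaire score vs.
# resting salivary cortisol (nmol/L). Values are invented.
mindfulness = [2.1, 2.8, 3.0, 3.6, 4.2, 4.9]
cortisol = [14.0, 12.5, 12.9, 10.8, 9.9, 8.1]

print(round(pearson_r(mindfulness, cortisol), 2))  # strongly negative
```

A negative coefficient like this says only that the two measures move in opposite directions across people; as the researchers stress, it cannot by itself say which one drives the other.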
According to Jacobs, training the mind to focus on immediate experience may reduce the propensity to ruminate about the past or worry about the future, thought processes that have been linked to cortisol release.
"The idea that we can train our minds in a way that fosters healthy mental habits and that these habits may be reflected in mind-body relations is not new; it’s been around for thousands of years across various cultures and ideologies," Jacobs said. "However, this idea is just beginning to be integrated into Western medicine as objective evidence accumulates. Hopefully, studies like this one will contribute to that effort."
Saron noted that in this study, the authors used the term “mindfulness” to refer to behaviors that are reflected in a particular mindfulness scale, which was the measure used in the study.
"The scale measured the participants’ propensity to let go of distressing thoughts and attend to different sensory domains, daily tasks, and the current contents of their minds. However, this scale may only reflect a subset of qualities that comprise the greater quality of mindfulness, as it is conceived across various contemplative traditions," he said.
Previous studies from the Shamatha Project have shown that the meditation retreat had positive effects on visual perception, sustained attention, socio-emotional well-being, resting brain activity and on the activity of telomerase, an enzyme important for the long-term health of body cells.
Johns Hopkins scientists say they have evidence from animal studies that a type of central nervous system cell other than motor neurons plays a fundamental role in the development of amyotrophic lateral sclerosis (ALS), a fatal degenerative disease. The discovery holds promise, they say, for identifying new targets for interrupting the disease’s progress.
In a study described online in Nature Neuroscience, the researchers found that, in mice bred with a gene mutation that causes human ALS, dramatic changes occurred in oligodendrocytes — cells that create insulation for the nerves of the central nervous system — long before the first physical symptoms of the disease appeared. Oligodendrocytes located near motor neurons — cells that govern movement — died off at very high rates, and new ones regenerated in their place were inferior and unhealthy.
The researchers also found, to their surprise, that suppressing an ALS-causing gene in oligodendrocytes of mice bred with the disease — while still allowing the gene to remain in the motor neurons — profoundly delayed the onset of ALS. It also prolonged survival of these mice by more than three months, a long time in the life span of a mouse. These observations suggest that oligodendrocytes play a very significant role in the early stage of the disease.
“The abnormalities in oligodendrocytes appear to be having a negative impact on the survival of motor neurons,” says Dwight E. Bergles, Ph.D., a co-author and a professor of neuroscience at the Johns Hopkins University School of Medicine. “The motor neurons seem to be dependent on healthy oligodendrocytes for survival, something we didn’t appreciate before.”
“These findings teach us that cells we never thought had a role in ALS not only are involved but also clearly contribute to the onset of the disease,” says co-author Jeffrey D. Rothstein, M.D., Ph.D., a professor of neurology at Johns Hopkins and director of the Johns Hopkins Medicine Brain Science Institute.
Scientists have long believed that oligodendrocytes functioned only as structural elements of the central nervous system. They wrap around nerves, making up the myelin sheath that provides the “insulation” that allows nerve signals to be transmitted rapidly and efficiently. However, Rothstein and others recently discovered that oligodendrocytes also deliver essential nutrients to neurons, and that most neurons need this support to survive.
The Johns Hopkins team of Bergles and Rothstein published a paper in 2010 describing an unexpected, massive proliferation of oligodendrocyte progenitor cells in the motor neuron regions of the spinal cord in mice with ALS, and showing that these progenitors were being mobilized to make new oligodendrocytes. The researchers believed these cells were multiplying in response to an injury to oligodendrocytes, but they weren’t sure what was happening. In the new study, using a genetic method to track the fate of oligodendrocytes, the researchers found that oligodendrocytes present in young mice with ALS were dying off at an increasing rate as the disease advanced. Moreover, the development of the newly formed oligodendrocytes was stalled, and they were not able to provide motor neurons with a needed source of cell nutrients.
To determine whether the changes to the oligodendrocytes were just a side effect of the death of motor neurons, the scientists used a poison to kill motor neurons in the ALS mice and found no response from the progenitors, suggesting, says Rothstein, that it is the mutant ALS gene that is damaging oligodendrocytes directly.
Meanwhile, in separate experiments, the researchers found similar changes in samples of tissues from the brains of 35 people who died of ALS. Rothstein says it may be possible to see those changes early on in the disease and use MRI technology to follow progression.
“If our research is confirmed, perhaps we can start looking at ALS patients in a different way, looking for damage to oligodendrocytes as a marker for disease progression,” Rothstein says. “This could not only lead to new treatment targets but also help us to monitor whether the treatments we offer are actually working.”
ALS, also known as Lou Gehrig’s disease, named for the Yankee baseball great who died from it, affects nerve cells in the brain and spinal cord that control voluntary muscle movement. The nerve cells waste away or die, and can no longer send messages to muscles, eventually leading to muscle weakening, twitching and an inability to move the arms, legs and body. Onset is typically around age 50 and death often occurs within three to five years of diagnosis. Some 10 percent of cases are hereditary.
There is no cure for ALS and there is only one FDA-approved drug treatment, which has just a small effect in slowing disease progression and increasing survival.
Even though myelin loss has not previously been thought to occur in the gray matter, a region in the brain where neurons process information, the researchers in the new study found in ALS patients a significant loss of myelin in one of every three samples of human tissue taken from the brain’s gray matter, suggesting that the oligodendrocytes were abnormal. It isn’t clear if the oligodendrocytes that form this myelin in the gray matter play a different role than in white matter — the region in the brain where signals are relayed.
The findings further suggest that clues to the treatment of other diseases long believed to be focused in the brain’s gray matter — such as Alzheimer’s disease, Huntington’s disease and Parkinson’s disease — may be informed by studies of diseases of the white matter, such as multiple sclerosis (MS). Bergles says ALS and MS researchers never really thought their diseases had much in common before.
Oligodendrocytes have been under intense scrutiny in MS, Bergles says. In MS, the disease over time can transform from a remitting-relapsing form — in which myelin is attacked but then is regenerated when existing progenitors create new oligodendrocytes to re-form myelin — to a more chronic stage in which oligodendrocytes are no longer regenerated. MS researchers are working to identify new ways to induce the creation of new oligodendrocytes and improve their survival. “It’s possible that we may be able to dovetail with some of the same therapeutics to slow the progression of ALS,” Bergles says.
Replicative aging (also known as replicative senescence) causes mammalian cells to undergo a process of growth arrest dependent on telomeres, the repeated sequences at the ends of chromosomes that shorten with each division. Neurons, on the other hand, are postmitotic and exempt from this form of aging, so the question of their actual lifespan has remained unanswered. Recently, however, scientists at the University of Pavia and the University of Turin demonstrated that neuronal lifespan is not limited by the donor organism’s maximum lifespan but, remarkably, continues when the neurons are transplanted into a longer-living host. The researchers accomplished this by transplanting embryonic mouse cerebellar precursors into the developing brains of longer-living rats, in which the grafted mouse neurons survived for up to three years – twice the average lifespan of the donor mice.

Dr. Lorenzo Magrassi discussed the challenges he and his colleagues, Dr. Ketty Leto and Dr. Ferdinando Rossi, encountered in their research. “Cell transplantation into the developing rat brain is a technique that was originally developed by us and other research groups in the early nineties of the last century,” Magrassi tells Medical Xpress. “In recent years, we improved the protocol that, now standardized, allows reliable implantation rates with good survival rates.” While not all implanted embryos develop into adult animals carrying a viable transplant, Magrassi adds, the percentage of those that do is sufficient to plan a long-term survival experiment involving roughly 100 such successfully-born animals.
In addressing these challenges, Magrassi says that in addition to the intrinsic advantage of studying cells inside the nervous system, which is immunoprivileged, they transplanted cells before development of the thymus (a specialized organ of the immune system) was complete. The latter can help induce immunological tolerance in the host to the engrafted cells.
One remaining question is whether their research can be extended to determine whether a maximum lifespan exists for any postmitotic mammalian cells – including neurons. “Similar techniques can, in principle, be extended to other organs containing perennial cells,” Magrassi notes, “but we don’t have direct experience with injecting cells into organs outside of the central nervous system.” Since the central nervous system is privileged compared to other organs that are more prone to immunological surveillance and attack, a major problem when transferring their experimental paradigm to other organs, he explains, could be an increase in immunological problems.
The scientists say their results suggest that neuronal survival and aging are coincidental but separable processes, increasing the hope that extending organismal lifespan by dietary, behavioral, and pharmacologic interventions will not necessarily result in a neuronally depleted brain. “Even after taking into account the obvious species differences, our results in rodents can be extrapolated by analogy to humans and other longer-living species, where this sort of experiment is impossible,” Magrassi explains. “Our findings suggest that extending average organismal lifespan – a hallmark of all technologically advanced societies – will not necessarily result in neuron-impoverished brains well before the longer-living individual dies.” This bodes well for those studying life extension: their efforts are not intrinsically futile, Magrassi notes, because in the absence of pathology, prolonging lifespan does not necessarily mean dementia due to widespread loss of neurons, as many people still think. “Roughly speaking,” Magrassi illustrates, “if the average lifespan of humans is now 80 years, our results suggest that at ages up to 160 years our neurons can survive if not hit by specific insults.”
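Magrassi’s “80 to 160” figure is a simple proportional extrapolation from the grafting result: neurons survived about twice the donor’s average lifespan when housed in a longer-living host. A minimal sketch of that arithmetic (the lifespan figures are illustrative assumptions drawn from the article’s numbers, not a prediction from the study):

```python
# Grafted mouse neurons survived ~3 years in rats, roughly twice the
# donor mice's ~1.5-year average lifespan: survival tracked the host,
# not the donor.
def extrapolated_neuron_survival(avg_lifespan_years, scaling_factor=2.0):
    """Scale an average organismal lifespan by the survival factor
    observed in the mouse-to-rat grafts (an analogy, not a forecast)."""
    return avg_lifespan_years * scaling_factor

mouse = extrapolated_neuron_survival(1.5)  # ~3 years, as observed in the rat hosts
human = extrapolated_neuron_survival(80)   # ~160 years, Magrassi's rough analogy
print(mouse, human)  # prints: 3.0 160.0
```

The point of the scaling is only that neuronal survival is not capped by the donor species’ lifespan; the factor of two reflects the particular mouse-to-rat pairing, not a biological constant.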
That said, Magrassi acknowledges that neuronal death is not the only effect of normal aging in the brain. “For example,” he illustrates, “cerebellar neurons – which in terms of synaptic loss behave like the majority of neurons in the brain – show a substantial loss of dendritic branches, spines and synapses in normal aging. In our research, we studied transplanted mouse Purkinje cells to determine if their spine density decreased with time at the same rate as Purkinje cells in the mouse or in the rat.” Purkinje cells are large GABAergic (that is, gamma-aminobutyric acid-producing) neurons with many branching extensions, found in the cortex of the cerebellum. “The results of our experiments indicate that age-related progressive spine loss of grafted mouse Purkinje cells follows a slower pace, typical of the longer-living rat, thus reaching absolute levels of spine loss comparable to those observed in aged mice only at the much longer survival times typical of the rat.”
Moreover, Magrassi adds that their experiments clearly show that by escaping immunological rejection, transplanted neurons can survive undisturbed for the entire life of the host. “This has implications for the ongoing discussion of the detrimental effects of immune attacks on neural cells transplanted for therapeutic purposes.”
Moving forward, in order to screen for intra- and extracellular changes that could be responsible for the long term survival of the mouse cells transplanted into rat brains – as well as the slowdown of dendritic spine loss – the team is planning to perform host and transplanted cell microdissection followed by a proteomic approach. “If we discover what factor or factors cause those changes,” Magrassi points out, “we could hopefully then develop more efficient drugs for treating all pathological neurodegenerative conditions in which neurons start to lose synaptic contacts and die well before organismal death – for example, dementia, memory loss and cognitive impairment. Of course,” he adds, “this work is still in progress and the results are preliminary.”
In addition, the scientists are currently testing xenotransplantation using different transgenic mouse strains with altered aging pathways as donors to characterize the pathways that led to their results.
Magrassi sees other areas of research that might benefit from their study. “Knowing that neuronal aging in rodents is not a cell-autonomous process is important not only for neuroscience,” he concludes. “It also has implications for evolutionary biology and epidemiology.”
The smooth operation of the brain requires a certain robustness to fluctuations in its home within the body. At the same time, its extraordinary power derives from an activity structure poised at criticality; in other words, it is highly responsive to many low-threshold events. When forced beyond its comfort zone in parameter space — its operating temperature, electrolytes, sugars, blood gases or even sensory input — the direct result is seizure, coma, or both. It would appear that anything rendered too hot or cold, too concentrated or scarce, precipitates seizure. In those genetically predisposed, or compromised by head trauma, the seizing tends toward full-blown epilepsy. A group in Hamburg, led by Michael Frotscher, has been chipping away at the causes of a common form of epilepsy, temporal lobe epilepsy (TLE). Their latest research, published in the journal Cerebral Cortex, takes a closer look at differentiated neurons in the dentate gyrus of the mouse hippocampus. Once thought to be completely immobilized by virtue of their broadly integrated dendritic trees, these neurons are now shown to become migratory once again in direct response to seizure activity.

Genetic predisposition to seizure can come in the form of an ongoing chemical or metabolic imbalance due to defects in enzymes, ion channels or receptors. Alternatively, it manifests as a direct structural defect resulting from a developmental flaw. In slice preparations, Frotscher looked at a particular form of TLE in which the granule cell layer (GCL) in the dentate gyrus is disrupted. The cells there have either failed to migrate along glial scaffolds into a compact layer with clearly defined margins, or aberrant clumps of cells congregate in the wrong places. Seizures secondary to fever have been known to cause this aberrant migration of granule cells, as has the mutation carried by a particular mouse strain known as the reeler mouse.
The catalog of mouse mutants is expansive; it is a veritable library of hopeless monsters. The reeler mutant, known since 1951, has a unique set of issues wherein cells fail to migrate to the right spots in the cerebellum, cortex, and hippocampus. The protein reelin was later discovered to be one of the causes of this particular phenotype. Reelin is an extracellular matrix protein that initially provides scaffolding for neuron migration, and later a fence to fix neurons in place. In mice with mutated reelin, cells in all parts of the hippocampus, not just the dentate gyrus, are spread out into a broad and diffuse layer.
By injecting kainate (KA), an excitotoxin that predictably produces seizures, into the dentate gyrus, Frotscher biased the granule cells into entering a phase of bursting activity. With their glutamate receptors fully activated by KA, the granule cells fire rapid volleys of spikes followed by deep depolarization periods. Cells that had been fluorescently labeled with GFP and observed with real-time video microscopy were also seen to become motile and disperse. The normal band of granule cells doubled, or tripled, in thickness. Next, Frotscher looked for a link between this response to KA and the reelin protein. Both reelin mRNA and reelin immunoreactivity were found to be reduced in the dentate granule cells that had been dispersed by KA.
Set against this tableau of complex responses to KA is the fact that adult neurogenesis of dentate granule cells occurs in many mammalian species: new granule cells normally arise locally in the dentate gyrus, while the narrowly defined rostral migratory stream delivers fresh cells to the olfactory bulb. Application of BrdU, a marker of newly born cells, labeled microglia and astrocytes near the injection site, but only a few of the granule cells. As an excitotoxin, KA may be expected to kill at least some cells outright and to cause significant dendritic degeneration in many more. An interesting question is how KA induces granule cell dispersion despite the cells’ dense interconnections with their neighbors.
During KA-induced motility, the nucleus was typically observed to translocate within the cell into one of the dendrites, pulling the soma along with it. This process is believed to involve a myosin-dependent forward flow of the structural protein actin within the cell. Outside the cell, changes to the reelin matrix appear to be involved as well. One potential mechanism that has emerged is that reelin induces serine phosphorylation of cofilin, an actin-associated protein involved in depolymerization. The authors conclude that reelin-induced cofilin phosphorylation controls neuronal migration during development and prevents abnormal motility in the mature brain.
Undoubtedly many mechanisms are involved in the KA-induced seizure and reelin story. Other cell types in the dentate gyrus need to be looked at in closer detail. For example, how reelin expression is regulated, and which cells manufacture it are current areas of study. It is important as well to differentiate between the causes of seizure, and its consequences. On paper they can be neatly packaged concepts but in the real tissue, and in intact animals, they can be anything but.
Clumps of α-synuclein protein in nerve cells are hallmarks of many degenerative brain diseases, most notably Parkinson’s disease.

“No one has been able to determine if Lewy bodies and Lewy neurites, hallmark pathologies in Parkinson’s disease, can be degraded,” says Virginia Lee, PhD, director of the Center for Neurodegenerative Disease Research at the Perelman School of Medicine, University of Pennsylvania.
“With the new neuron model system of Parkinson’s disease pathologies our lab has developed recently, we demonstrated that these aberrant clumps in cells resist degradation as well as impair the function of the macroautophagy system, one of the major garbage disposal systems within the cell.”
Macroautophagy, literally “self-eating,” is the degradation of unnecessary or dysfunctional cellular bits and pieces by a compartment in the cell called the lysosome.
Lee, also a professor of Pathology and Laboratory Medicine, and colleagues published their results in the early online edition of the Journal of Biological Chemistry this week.
Alpha-synuclein (α-syn) diseases all feature clumps of the protein and include Parkinson’s disease (PD) and an array of related disorders: PD with dementia, dementia with Lewy bodies, and multiple system atrophy. In most of these, α-syn forms insoluble aggregates of stringy fibrils that accumulate in the cell body and extensions of neurons.
These unwanted α-syn clumps are modified by abnormal attachments of many phosphate chemical groups as well as by the protein ubiquitin, a molecular tag for degradation. They are widely distributed in the central nervous system, where they are associated with neuron loss.
Using cell models in which intracellular α-syn clumps accumulate after taking up synthetic α-syn fibrils, the team showed that α-syn inclusions cannot be degraded, even though they are located near the lysosome and the proteasome, another type of garbage disposal in the cell.
The α-syn aggregates persist even after soluble α-syn levels within the cell are substantially reduced, suggesting that once formed, the α-syn inclusions are resistant to being cleared. What’s more, the team found that α-syn aggregates impair the overall autophagy degradative process by delaying the maturation of autophagic vesicles known as autophagosomes, which may contribute to the increased cell death seen in clump-filled nerve cells. Understanding the impact of α-syn aggregates on autophagy may help elucidate therapies for α-syn-related neurodegeneration.
Women who abruptly and prematurely lose estrogen through surgical menopause face a two-fold increased risk of cognitive decline and dementia.

"This is what the clinical studies indicate and our animal studies looking at the underlying mechanisms back this up," said Brann, corresponding author of the study in the journal Brain. “We wanted to find out why that is occurring. We suspect it’s due to the premature loss of estrogen.”
In an effort to mimic what occurs in women, Brann and his colleagues examined rats 10 weeks after removal of their estrogen-producing ovaries; the animals had either been started immediately on low-dose estrogen therapy, started on therapy 10 weeks later, or never given estrogen.
When the researchers caused a stroke-like event in the brain’s hippocampus, a center of learning and memory, they found the rodents treated late or not at all experienced more brain damage, specifically to a region of the hippocampus called CA3 that is normally stroke-resistant.
To make matters worse, untreated or late-treated rats also began an abnormal, robust production of Alzheimer’s disease-related proteins in the CA3 region, even becoming hypersensitive to one of the most toxic of the beta amyloid proteins that are a hallmark of Alzheimer’s.
Both problems appear associated with the increased production of free radicals in the brain. In fact, when the researchers blocked the excessive production, heightened stroke sensitivity and brain cell death in the CA3 region were reduced.
Interestingly, the brain’s increased sensitivity to stressors such as inadequate oxygen was sex-specific, Brann said. Removing the testes of male rats didn’t affect stroke size or damage.
Although exactly how it works is unknown, estrogen appears to help protect younger females from problems such as stroke and heart attack; after menopause, their risk of these maladies rises to about the same level as in males. Follow-up studies are needed to see if estrogen therapy also reduces sensitivity to the beta amyloid protein in the CA3 region, as the researchers expect, Brann noted.
Brann earlier showed that prolonged estrogen deprivation in aging rats dramatically reduces the number of brain receptors for the hormone as well as its ability to prevent strokes. Damage was forestalled if estrogen replacement was started shortly after hormone levels drop, according to the 2011 study in the journal Proceedings of the National Academy of Sciences.
The surprising results of the much-publicized Women’s Health Initiative – a 12-year study of 161,808 women ages 50-79 – found that hormone therapy generally increased rather than decreased the risk of stroke and other health problems. Critics said one problem with the study was that many of the women, like Brann’s aged rats, had gone years without hormone replacement, bolstering the case that timing is everything.