Deep inside the brains of people with dementia and Lou Gehrig’s disease, globs of abnormal protein gum up the inner workings of brain cells – dooming them to an early death.

But boosting those cells’ natural ability to clean up those clogs might hold the key to better treatment for such conditions.
That’s the key finding of new research from a University of Michigan Medical School physician scientist and his colleagues in California and the United Kingdom. They reported their latest findings this week in the journal Nature Chemical Biology.
Though the team demonstrated the effect in animals and in human neurons derived from stem cells, not in patients, their discoveries point the way to new medicines that boost the protein-clearing cleanup process.
The work also shows how an innovative microscope technique can help researchers see what’s going on inside brain cells, as they labor to clear out the protein buildup.
The researchers focused on a crucial cell-cleaning process called autophagy – a hot topic in basic medical research these days, as scientists discover its important role in many conditions. In autophagy, cells bundle unwanted materials up, break them down and push the waste products out.
In the newly published research, the team showed how the self-cleaning capacity of some brain cells gets overwhelmed if the cells make too much of an abnormal protein called TDP43. They found that cells vary greatly in how quickly their autophagy capacity gets swamped.

(Image caption: In brain cells made from stem cells derived from ALS patients, treatment with two autophagy-stimulating drugs led to longer cell survival (middle two lines).)
But they also showed how three drugs that boost autophagy – speeding up the clean-out process – could keep the brain cells alive longer.
Longer-living, TDP43-clearing brain cells are, in theory, exactly what people with Lou Gehrig’s disease (amyotrophic lateral sclerosis, or ALS) and certain forms of dementia (frontotemporal dementia) need. But only further research will show for sure.
Sami Barmada, M.D., Ph.D., the U-M neurologist and scientist who is first author of the new study, says the new findings are encouraging – and so is the success of a microscope technique used in the research. His new lab, in the U-M Department of Neurology, is continuing to refine ways to view the inner workings of nerve cells.
“Using this new visualization technique, we could truly see how the protein was being cleared, and therefore which compounds could enhance the pace of clearance and shorten the half-life of TDP43 inside cells,” he says. “This allowed us to see that increased autophagy was directly related to improved cell survival.”
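The half-life mentioned here has its usual first-order-kinetics meaning: if a protein is cleared at rate k, its half-life is ln(2)/k, so a compound that doubles the clearance rate halves the half-life. A minimal illustration (the clearance rates below are invented for the example, not values measured in the study):

```python
import math

def half_life(clearance_rate_per_hour):
    """Half-life under first-order clearance: t_half = ln(2) / k."""
    return math.log(2) / clearance_rate_per_hour

baseline = half_life(0.02)   # hypothetical baseline TDP43 clearance rate
boosted = half_life(0.04)    # clearance doubled by an autophagy-enhancing drug

print(f"baseline half-life: {baseline:.1f} h")  # ~34.7 h
print(f"boosted half-life:  {boosted:.1f} h")   # ~17.3 h
```

Tracking fluorescently tagged TDP43 over time is what lets this kind of rate, and hence the half-life, be estimated cell by cell.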
Barmada worked on the team at the Gladstone Institutes and the University of California San Francisco headed by Steven Finkbeiner, M.D., Ph.D., that published the new findings. The team used stem cells derived from the cells of people who have ALS to grow neurons and astrocytes – the two types of brain cell most crucial to normal brain function.
Because he both sees patients in clinic and studies neurological disease in the laboratory, Barmada brings a special perspective to the research.
At U-M, he specializes in treating patients who have neurological diseases that affect both thinking and muscle control. About a third of ALS patients develop signs of frontotemporal dementia, also called FTD – and about 10 percent of people with FTD also have a motor neuron disease that affects their brain’s ability to control muscle movement.
One of the drugs tested in the study, an antipsychotic developed in the 1960s to treat people with schizophrenia, had actually shown some anti-dementia promise in human ALS patients, but comes with many side effects. Barmada notes that Finkbeiner’s team at the Gladstone Institutes is already working to identify other compounds that could produce the same effect with fewer side effects.
Interestingly, small studies have suggested that people with schizophrenia who take antipsychotic drugs are much less likely to develop ALS.
Barmada’s work at U-M now focuses on the connection between brain cells’ ability to clear abnormal proteins and their regulation of the RNA molecules created as part of expressing protein-encoding genes. Looking further upstream in the protein-producing process could yield further clues to why disease develops and what can be done about it, he says.
An international team of researchers identified a pathogenic mechanism that is common to several neurodegenerative diseases. The findings suggest that it may be possible to slow the progression of dementia even after the onset of symptoms.

The relentless increase in the incidence of dementia in aging societies poses an enormous challenge to health-care systems. An international team of researchers led by Professor Christian Haass and Gernot Kleinberger at LMU’s Adolf-Butenandt-Institute and the German Center for Neurodegenerative Diseases (DZNE) has now elucidated the mode of action of a genetic defect that contributes to the development of several different dementia syndromes.
Neurodegenerative disorders such as Alzheimer’s and Parkinson’s diseases or frontotemporal dementia display a number of common features. They are all characterized by the appearance in the brains of affected patients of abnormally high levels of insoluble protein deposits, which are associated with massive loss of nerve cells. In order to minimize further damage to nerve cells in the vicinity of such deposits, dead cells and the proteinaceous aggregates released from them must be efficiently degraded and disposed of. This task is performed by specialized phagocytic cells – the so-called microglia – which act as “sanitary inspectors” in the brain to ensure the prompt removal of debris that presents a danger to the health of nearby cells. Microglia are found only in the central nervous system, but functionally they represent a division of the body’s innate immune system.
As Haass and his colleagues now report in the latest issue of the journal Science Translational Medicine, specific mutations in the gene for a protein called TREM2, which regulates the uptake of waste products by microglia, lead to its absence from the cell surface. TREM2 is normally inserted into the plasma membrane of microglial cells such that part of it extends through the membrane as an extracellular domain. This exposed portion of TREM2 is responsible for the recognition of waste products left behind by dead cells. “We believe that the genetic defect disrupts the folding of the protein chain early during its synthesis in the cell, so that it is degraded before it can reach the surface of the microglia,” says Kleinberger. As a result, the amount of debris that the microglia can cope with is significantly reduced. Consequently, the toxic protein deposits, as well as whole dead cells, cannot be efficiently removed and continue to accumulate in the brain. This is expected to trigger inflammatory reactions that may promote further nerve-cell loss.
The new study thus pinpoints a mechanism that influences the course of several different brain diseases. “In addition, our findings may perhaps point to ways of slowing the rate of progression of these illnesses even after the manifestation of overt signs of dementia, which has not been possible so far,” says Haass. “That this may indeed be feasible is suggested by the initial results of an experiment in which we were able to stimulate the phagocytic activity of microglia by pharmacological means.”
Men born in November, December or January are more likely to be left-handed than men born during the rest of the year. While the genetic basis of handedness is still under debate, scientists at the Faculty of Psychology, University of Vienna, have obtained indirect evidence of a hormonal mechanism promoting left-handedness among men. Psychologist Ulrich Tran and his colleagues published their findings in the scientific journal “Cortex”.

Various manual tasks in everyday life require the use of the right hand or are optimized for right-handers. Around 90 percent of the general population is right-handed; only about 10 percent is left-handed. The study by Ulrich Tran, Stefan Stieger, and Martin Voracek comprised two large, independent samples of nearly 13,000 adults from Austria and Germany. As in modern genetic studies, where a discovery-and-replication sample design is standard, using two samples allowed the replicability and robustness of the findings to be tested within a single study. Overall, 7.5 percent of women and 8.8 percent of men were left-handed. “We were surprised to see that this imbalance was caused by more left-handed men being born specifically during November, December, and January. On a monthly average, 8.2 percent of men born during the period February to October were left-handed. During November to January, this figure rose to 10.5 percent,” according to Ulrich Tran, lead author of the study.
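For scale, the reported rates imply a sizeable relative shift. A quick check of the arithmetic, using only the percentages quoted above:

```python
# Proportions of left-handers among men, as reported in the article
feb_to_oct = 0.082   # monthly average, February through October
nov_to_jan = 0.105   # monthly average, November through January

# Relative increase in the left-handedness rate for winter-born men
relative_increase = (nov_to_jan - feb_to_oct) / feb_to_oct
print(f"{relative_increase:.0%}")  # → 28%
```

A roughly 28 percent relative increase is small in absolute terms, which is why two large samples were needed to detect it reliably.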
A hormonal cause during embryonic development
"Presumably, the relative darkness during the period November to January is not directly connected to this birth seasonality of handedness. We assume that the relative brightness during the period May to July, half a year before, is its distal cause", explains Ulrich Tran. A theory, brought forth in the 1980s by US neurologists Norman Geschwind and Albert Galaburda, posits that testosterone delays the maturation of the left brain hemisphere during embryonic development. The left brain hemisphere is dominant among right-handers, the right brain hemisphere is dominant among left-handers. Intrauterine testosterone levels are higher in the male fetus, because of its own testosterone secretion, than in the female fetus. However, the testosterone level of the mother and external factors may also affect intrauterine testosterone levels. Specifically, more daylight may increase testosterone levels, making a seasonality effect plausible.
Previous studies on the subject provided mixed and inconsistent evidence. There was no clear indication of which season has an effect, or of whether seasonality affects men, women or both sexes equally. According to the current findings, there is a small, but robust and replicable, effect of birth seasonality on handedness, affecting only men. These results are consistent with a hormonal basis of handedness, thus corroborating an old and controversial theory. However, the exact causal pathway needs to be investigated in future studies.
Johns Hopkins researchers have begun to connect the dots between a schizophrenia-linked genetic variation and its effect on the developing brain. As they report July 3 in the journal Cell Stem Cell, their experiments show that the loss of a particular gene alters the skeletons of developing brain cells, which in turn disrupts the orderly layers those cells would normally form.

(Image caption: Left, human neural stem cells form rosettes as they grow into different cell types, with ringlike patterns of PKCλ protein in the center. A neural rosette with a 15q11.2 microdeletion, a risk factor for schizophrenia, appears disorganized and lacks the ringlike PKCλ protein structure, right, suggesting that this risk factor acts early in the neurodevelopmental process. Credit: Ki-Jun Yoon/Johns Hopkins Medicine)
“This is an important step toward understanding what physically happens in the developing brain that puts people at risk of schizophrenia,” says Guo-li Ming, M.D., Ph.D., a professor of neurology and neuroscience in the Johns Hopkins University School of Medicine’s Institute for Cell Engineering.
While no single genetic mutation is known to cause schizophrenia, so-called genome-wide association studies have identified variations that are more common in people with the condition than in the general population. One of these is a missing piece from an area of the genome labeled 15q11.2. “While the deletion is linked to schizophrenia, having extra copies of this part of the genome raises the risk of autism,” notes Ming.
For the new study, Ming’s research group, along with that of her husband and collaborator, neurology and neuroscience professor Hongjun Song, Ph.D., used skin cells from people with schizophrenia who were missing part of 15q11.2 on one of their chromosomes. (Because everyone carries two copies of their genome, the patients each had an intact copy of 15q11.2 as well.)
The researchers grew the human skin cells in a dish and coaxed them to become induced pluripotent stem cells, and then to form neural progenitor cells, a kind of stem cell found in the developing brain.
“Normally, neural progenitors will form orderly rings when grown in a dish, but those with the deletion didn’t,” Ming says. To find out which of the four known genes in the missing piece of the genome were responsible for the change, the researchers engineered groups of progenitors that each produced less protein than normal from one of the suspect genes. The crucial ingredient in ring formation turned out to be a gene called CYFIP1.
The team then altered the genomes of neural progenitors in mouse embryos so that they made less of the protein created by CYFIP1. The brain cells of the fetal mice turned out to have similar defects in structure to those in the dish-grown human cells. The reason, the team found, is that CYFIP1 plays a role in building the skeleton that gives shape to each cell, and its loss affects spots called adherens junctions where the skeletons of two neighboring cells connect.
Having less CYFIP1 protein also caused some neurons in the developing mice to end up in the wrong layer within the brain. “During development, new neurons get in place by ‘climbing’ the tendrils of neural progenitor cells,” Ming says. “We think that disrupted adherens junctions don’t provide a stable enough anchor for neural progenitors, so the ‘rope’ they form doesn’t quite get new neurons to the right place.”
The researchers say they also found that CYFIP1 is part of a complex of proteins called WAVE, which is key to building the cellular skeleton.
Many people with a CYFIP1 deletion do not get schizophrenia, so the team suspected the condition was more likely to arise in people with a second defect in the WAVE complex.
Analyzing data from genome-wide association studies, they found a variation in the WAVE complex signaling gene ACTR2/Arp2 that, combined with the CYFIP1 deletion, increased the risk of schizophrenia more than either genetic change did by itself.
In adding to science’s understanding of schizophrenia, the study also shows how other mental illnesses might be similarly investigated, the researchers say. “Using induced pluripotent stem cells from people with schizophrenia allowed us to see how their genes affected brain development,” says Song. “Next, we’d like to investigate what effects remain in the mature brain.”
Imagine feeling a slimy jellyfish or a prickly cactus, or tracing map directions, on your iPad mini Retina display – because that’s where tactile technology is headed. But you’ll need more than just an index finger to feel your way around.

New research at UC Berkeley has found that people are better and faster at navigating tactile technology when using both hands and several fingers. Moreover, blind people in the study outmaneuvered their sighted counterparts – especially when using both hands and several fingers – possibly because they’ve developed superior cognitive strategies for finding their way around.
Bottom line: Two hands are better than one in the brave new world of tactile or “haptic” technology, and the visually impaired can lead the way.
“Most sighted people will explore these types of displays with a single finger. But our research shows that this is a bad decision. No matter what the task, people perform better using multiple fingers and hands,” said Valerie Morash, a doctoral student in psychology at UC Berkeley and lead author of the study, just published in the online issue of the journal Perception.
“We can learn from blind people how to effectively use multiple fingers, and then teach these strategies to sighted individuals who have recently lost vision or are using tactile displays in high-stakes applications like controlling surgical robots,” she added.
For decades, scientists have studied how receptors on the fingertips relay information to the brain. Now, researchers at Disney and other media companies are implementing more tactile interfaces, which use vibrations and electrostatic or magnetic feedback to let users find their way around or experience how something feels.
In this latest study, Morash and fellow researchers at UC Berkeley and the Smith-Kettlewell Eye Research Institute in San Francisco tested 14 blind adults and 14 blindfolded sighted adults on several tasks using a tactile map. Using various hand and finger combinations, they were tasked with such challenges as finding a landmark or figuring out if a road looped around.
Overall, both blind and sighted participants performed better when using both hands and several fingers, although blind participants were, on average, 50 percent faster at completing the tasks, and even faster when they used both hands and all their fingers.
“As we move forward with integrating tactile feedback into displays, these technologies absolutely need to support multiple fingers,” Morash said. “This will promote the best tactile performance in applications such as the remote control of robotics used in space and high-risk situations, among other things.”
Neuroscientists leading the largest longitudinal adolescent brain imaging study to date have learned that predicting teenage binge-drinking is possible. In fact, say the researchers in the group’s latest publication, about 40 different variables – spanning genetics, brain function, personality and life history – can help scientists predict with about 70 percent accuracy which teens will become binge drinkers. The study appears online July 3, 2014 as an Advance Online Publication in the journal Nature.

First author Robert Whelan, Ph.D., a former University of Vermont (UVM) postdoctoral fellow in psychiatry and current lecturer at University College Dublin, and senior author Hugh Garavan, Ph.D., UVM associate professor of psychiatry, and colleagues conducted 10 hours of comprehensive assessments – these included neuroimaging to assess brain activity and brain structure, along with other measures such as IQ, cognitive task performance, personality and blood tests – on each of 2,400 14-year-old adolescents at eight different sites across Europe.
“Our goal was to develop a model to better understand the relative roles of brain structure and function, personality, environmental influences and genetics in the development of adolescent abuse of alcohol,” says Whelan. “This multidimensional risk profile of genes, brain function and environmental influences can help in the prediction of binge drinking at age 16 years.”
A 2012 Nature Neuroscience paper by the same researchers identified brain networks that predispose some teens to higher-risk behaviors like experimentation with drugs and alcohol. The new study builds on that earlier work by following those kids for years (the participants are now 19 years old) and identifying those who developed a pattern of binge-drinking. The 2014 Nature study aimed to predict who went on to drink heavily at age 16 using only data collected at age 14. The researchers applied a broad range of measures, developing a unique analytic method to predict which individuals would become binge-drinkers. The reliability of the results was confirmed by showing the same accuracy when the model was tested on a new, separate group of teenagers. The result was a list of predictors ranging from brain and genetic factors to personality and personal history.
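The discovery-and-replication logic described above can be sketched in miniature: fit a multivariable model on one sample, then check its accuracy on an independent sample. Everything below is synthetic and illustrative; it is not the IMAGEN data or the authors’ actual analysis, which combined far more heterogeneous measures.

```python
# Toy version of the two-sample design: fit a simple multivariable
# classifier on one sample, verify accuracy on an independent sample.
import math
import random

random.seed(42)

N_FEATURES = 40  # the study combined roughly 40 variables per teen


def make_sample(n):
    """Synthetic participants: 40 features plus a binary outcome driven by
    many small effects (no single variable dominates, as in the study)."""
    xs, ys = [], []
    for _ in range(n):
        x = [random.gauss(0, 1) for _ in range(N_FEATURES)]
        score = sum(0.3 * v for v in x[:10]) + random.gauss(0, 1)
        xs.append(x)
        ys.append(1 if score > 0 else 0)
    return xs, ys


def fit_logistic(xs, ys, epochs=100, lr=0.05):
    """Plain stochastic-gradient logistic regression, no external libraries."""
    w, b = [0.0] * N_FEATURES, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            z = b + sum(wi * xi for wi, xi in zip(w, x))
            z = max(min(z, 35.0), -35.0)  # clamp to avoid overflow in exp
            p = 1.0 / (1.0 + math.exp(-z))
            grad = p - y
            b -= lr * grad
            w = [wi - lr * grad * xi for wi, xi in zip(w, x)]
    return w, b


def accuracy(w, b, xs, ys):
    hits = sum((b + sum(wi * xi for wi, xi in zip(w, x)) > 0) == bool(y)
               for x, y in zip(xs, ys))
    return hits / len(ys)


discovery_x, discovery_y = make_sample(600)      # sample used to fit the model
replication_x, replication_y = make_sample(600)  # independent test sample

w, b = fit_logistic(discovery_x, discovery_y)
print(f"replication accuracy: {accuracy(w, b, replication_x, replication_y):.2f}")
```

The point of the sketch is only the validation step: an accuracy figure like 70 percent is credible precisely because it was measured on a group of teenagers the model had never seen.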
“Notably, it’s not the case that there’s a single one or two or three variables that are critical,” says Garavan. “The final model was very broad – it suggests that a wide mixture of reasons underlie teenage drinking.”
Some of the best predictors, shares Garavan, include variables like personality, sensation-seeking traits, lack of conscientiousness, and a family history of drug use. Having even a single drink at age 14 was also a powerful predictor. That type of risk-taking behavior – and the impulsivity that often accompanies it – was a critical predictor. In addition, teens who had experienced several stressful life events were among those at greater risk for binge-drinking.
One interesting finding, says Garavan, was that bigger brains were also predictive. Adolescents undergo significant brain changes; along with the formation of personality and social networks, it is normal for their brains to shrink to a more efficient size.
“There’s refining and sculpting of the brain; most of the gray matter – the neurons and the connections between them – is getting smaller, and the white matter is getting larger,” he explains. “Kids with more immature brains – those that are still larger – are more likely to drink.”
Garavan, Whelan and colleagues believe that by better understanding the probable causal factors for binge-drinking, targeted interventions for those most at risk could be applied.
Gunter Schumann, M.D., professor of biological psychiatry and head of the section at the Social, Genetic and Developmental Psychiatry Centre, Institute of Psychiatry, King’s College London, is the principal investigator of the IMAGEN study, which is the source of this latest paper. “We aimed to develop a ‘gold standard’ model for predicting teenage behavior, which can be used as a benchmark for the development of simpler, widely applicable prediction models,” says Schumann. “This work will inform the development of specific early interventions in carriers of the risk profile to reduce the incidence of adolescent substance abuse. We now propose to extend analysis of the IMAGEN data in order to investigate the development of substance use patterns in the context of moderating environmental factors, such as exposure to nicotine or drugs as well as psychosocial stress.”
In the future, the researchers hope to perform more in-depth analyses of the brain factors involved and to determine whether there are different predictors for abuse of other drugs. A similar analysis using the same dataset to examine predictors of cannabis use is planned for the near future.
Researchers at Duke-NUS Graduate Medical School Singapore (Duke-NUS) have found evidence that the less older adults sleep, the faster their brains age. These findings, relevant in the context of Singapore’s rapidly ageing society, pave the way for future work on sleep loss and its contribution to cognitive decline, including dementia.

Past research has examined the impact of sleep duration on cognitive functions in older adults. Though faster brain ventricle enlargement is a marker for cognitive decline and the development of neurodegenerative diseases such as Alzheimer’s, the effects of sleep on this marker have never been measured.
The Duke-NUS study examined data from 66 older Chinese adults enrolled in the Singapore-Longitudinal Aging Brain Study. Every two years, participants underwent structural MRI brain scans measuring brain volume and neuropsychological assessments testing cognitive function. Additionally, their sleep duration was recorded through a questionnaire. Those who slept fewer hours showed evidence of faster ventricle enlargement and decline in cognitive performance.
"Our findings relate short sleep to a marker of brain aging," said Dr June Lo, the lead author and a Duke-NUS Research Fellow. "Work done elsewhere suggests that seven hours a day for adults seems to be the sweet spot for optimal performance on computer-based cognitive tests. In coming years we hope to determine what’s good for cardio-metabolic and long-term brain health too," added Professor Michael Chee, senior author and Director of the Centre for Cognitive Neuroscience at Duke-NUS.
Research suggests that people at increased risk for developing addiction share many of the same neurobiological signatures of people who have already developed addiction. This similarity is to be expected, as individuals with family members who have struggled with addiction are over-represented in the population of addicted people.
However, a generation of animal research supports the hypothesis that the addiction process changes the brain in ways that converge with the distinctive neurobiology of the heritable risk for addiction. In other words, the more one uses addictive substances, the more one’s brain acquires the profile of someone who has inherited a risk for addiction.
One such change is a reduction in striatal dopamine release. Dopamine is a key brain chemical messenger involved in reward-related behaviors. Disturbances in dopamine signaling appear to contribute to reward processing that biases people to seek drug-like rewards and to develop drug-taking habits.
In the current issue of Biological Psychiatry, researchers at McGill University report that individuals at high risk for addiction show the same reduced dopamine response often observed in addicted individuals, identifying a new link between addiction risk and addiction in humans.
Dr. Marco Leyton and his colleagues recruited young adults, aged 18 to 25, who were classified into three groups: 1) a high-risk group of occasional stimulant users with an extensive family history of substance abuse; 2) a comparison group of occasional stimulant users with no family history; and 3) a second comparison group of individuals with no history of stimulant use and no known risk factors for addiction. Volunteers underwent a positron emission tomography (PET) scan involving the administration of amphetamine, which enabled the researchers to measure their dopamine response.
The authors found that the high-risk group of non-dependent young adults with extensive family histories of addiction displayed markedly reduced dopamine responses in comparison with both stimulant-naïve subjects and non-dependent users with no family history.
“This interesting new parallel between addiction risk and addiction may help to focus our attention on reward-related processes that contribute to the development of addiction, perhaps informing prevention strategies,” said Dr. John Krystal, Editor of Biological Psychiatry.
Leyton, a Professor at McGill University, said, “Young adults at risk of addictions have a strikingly disturbed brain dopamine reward system response when they are administered amphetamine. Past drug use also seemed to aggravate the dopamine response, but this was not a sufficient explanation. Instead, the disturbance may be a heritable biological marker that could identify those at highest risk.”
This finding suggests that there are common brain mechanisms that promote the use of addictive substances in vulnerable people and in people who have long-standing habitual substance use.
Better understanding this biology may help to advance our understanding of how people develop addiction problems, as well as providing hints related to biological mechanisms that might be targeted for prevention and treatment.
Patients with Alzheimer’s disease run a high risk of seizures. While the amyloid-beta protein involved in the development and progression of Alzheimer’s seems the most likely cause for this neuronal hyperactivity, how and why this elevated activity takes place hasn’t yet been explained — until now.

A new study by Tel Aviv University researchers, published in Cell Reports, pinpoints the precise molecular mechanism that may trigger an enhancement of neuronal activity in Alzheimer’s patients, which subsequently damages memory and learning functions. The research team, led by Dr. Inna Slutsky of TAU’s Sackler Faculty of Medicine and Sagol School of Neuroscience, discovered that the amyloid precursor protein (APP), in addition to its well-known role in producing amyloid-beta, also constitutes the receptor for amyloid-beta. According to the study, the binding of amyloid-beta to pairs of APP molecules triggers a signalling cascade, which causes elevated neuronal activity.
Elevated activity in the hippocampus — the area of the brain that controls learning and memory — has been observed in patients with mild cognitive impairment and early stages of Alzheimer’s disease. Hyperactive hippocampal neurons, which precede amyloid plaque formation, have also been observed in mouse models with early onset Alzheimer’s disease. “These are truly exciting results,” said Dr. Slutsky. “Our work suggests that APP molecules, like many other known cell surface receptors, may modulate the transfer of information between neurons.”
With the understanding of this mechanism, the potential for restoring memory and protecting the brain is greatly increased.
Building on earlier research
The research project was launched five years ago, following the researchers’ discovery of the physiological role played by amyloid-beta, previously regarded as an exclusively toxic molecule. The team found that amyloid-beta is essential for the normal day-to-day transfer of information through nerve cell networks. If the level of amyloid-beta is even slightly increased, it causes neuronal hyperactivity and greatly impairs the effective transfer of information between neurons.
In the search for the underlying cause of neuronal hyperactivity, TAU doctoral student Hilla Fogel and postdoctoral fellow Samuel Frere found that while unaffected “normal” neurons became hyperactive following a rise in amyloid-beta concentration, neurons lacking APP did not respond to amyloid-beta. “This finding was the starting point of a long journey toward decoding the mechanism of APP-mediated hyperactivity,” said Dr. Slutsky.
The researchers, collaborating with Prof. Joel Hirsch of TAU’s Faculty of Life Sciences, Prof. Dominic Walsh of Harvard University, and Prof. Ehud Isacoff of University of California Berkeley, harnessed a combination of cutting-edge high-resolution optical imaging, biophysical methods and molecular biology to examine APP-dependent signalling in neural cultures, brain slices, and mouse models. Using highly sensitive biophysical techniques based on fluorescence resonance energy transfer (FRET) between fluorescent proteins in close proximity, they discovered that APP exists as a dimer at presynaptic contacts, and that the binding of amyloid-beta triggers a change in the APP-APP interactions, leading to an increase in calcium flux and higher glutamate release — in other words, brain hyperactivity.
A new approach to protecting the brain
"We have now identified the molecular players in hyperactivity," said Dr. Slutsky. "TAU postdoctoral fellow Oshik Segev is now working to identify the exact spot where the amyloid-beta binds to APP and how it modifies the structure of the APP molecule. If we can change the APP structure and engineer molecules that interfere with the binding of amyloid-beta to APP, then we can break up the process leading to hippocampal hyperactivity. This may help to restore memory and protect the brain."
Previous studies by Prof. Lennart Mucke’s laboratory strongly suggest that a reduction in the expression level of “tau” (microtubule-associated protein), another key player in Alzheimer’s pathogenesis, rescues synaptic deficits and decreases abnormal brain activity in animal models. “It will be crucial to understand the missing link between APP and ‘tau’-mediated signalling pathways leading to hyperactivity of hippocampal circuits. If we can find a way to disrupt the positive signalling loop between amyloid-beta and neuronal activity, it may rescue cognitive decline and the conversion to Alzheimer’s disease,” said Dr. Slutsky.
A drug that blocks the action of the enzyme Cdk5 could substantially reduce brain damage if administered shortly after a stroke, UT Southwestern Medical Center research suggests.
The findings, reported in the June 11 issue of the Journal of Neuroscience, show in rodent models that aberrant Cdk5 activity causes nerve cell death during stroke.
“If you inhibit Cdk5, then the vast majority of brain tissue stays alive without oxygen for up to one hour,” said Dr. James Bibb, Associate Professor of Psychiatry and Neurology and Neurotherapeutics at UT Southwestern and senior author of the study. “This result tells us that Cdk5 is a central player in nerve cell death.”
More importantly, development of a Cdk5 inhibitor as an acute neuroprotective therapy has the potential to reduce stroke injury.
“If we could block Cdk5 in patients who have just suffered a stroke, we may be able to reduce the number of patients in our hospitals who become disabled or die from stroke. Doing so would have a major impact on health care,” Dr. Bibb said.
While several pharmaceutical companies worked to develop Cdk5 inhibitors years ago, these efforts were largely abandoned since research indicated blocking Cdk5 long-term could have detrimental effects. At the time, many scientists thought aberrant Cdk5 activity played a major role in the development of Alzheimer’s disease and that Cdk5 inhibition might be beneficial as a treatment.
Based on Dr. Bibb’s research and that of others, Cdk5 has both good and bad effects. When working normally, Cdk5 adds phosphates to other proteins that are important to healthy brain function. On the flip side, researchers have found that aberrant Cdk5 activity contributes to nerve cell death following brain injury and can lead to cancer.
“Cdk5 regulates communication between nerve cells and is essential for proper brain function. Therefore, blocking Cdk5 long-term may not be beneficial,” Dr. Bibb said. “Until now, the connection between Cdk5 and stroke injury was unknown, as was the potential benefit of acute Cdk5 inhibition as a therapy.”
In this study, researchers administered a Cdk5 inhibitor directly to brain slices dissected from adult rodents that had suffered a stroke, and also measured post-stroke effects in Cdk5 knockout mice.
“We are not yet at a point where this new treatment can be given for stroke. Nevertheless, this research brings us a step closer to developing the right kinds of drugs,” Dr. Bibb said. “We first need to know what mechanisms underlie the disease before targeted treatments can be developed that will be effective. As no Cdk5 blocker exists that works in a pill form, the next step will be to develop a systemic drug that could be used to confirm the study’s results and lead to a clinical trial at later stages.”
Currently, there is only one FDA-approved drug for acute treatment of stroke, the clot-busting drug tPA. Other treatment options include neurosurgical procedures to help minimize brain damage.
Without a steady supply of blood, neurons can’t work. That’s why one of the culprits behind Alzheimer’s disease is believed to be the persistent blood clots that often form in the brains of Alzheimer’s patients, contributing to the condition’s hallmark memory loss, confusion and cognitive decline.

New experiments in Sidney Strickland’s Laboratory of Neurobiology and Genetics at Rockefeller University have identified a compound that might halt the progression of Alzheimer’s by interfering with the role amyloid-β, a small protein that forms plaques in Alzheimer’s brains, plays in the formation of blood clots. This work is highlighted in the July issue of Nature Reviews Drug Discovery.
For more than a decade, potential Alzheimer’s drugs have targeted amyloid-β, but, in clinical trials, they have either failed to slow the progression of the disease or caused serious side effects. However, by targeting the protein’s ability to bind to a clotting agent in blood, the work in the Strickland lab offers a promising new strategy, according to the highlight published in print on July 1.
This latest study builds on previous work in Strickland’s lab showing amyloid-β can interact with fibrinogen, the clotting agent, to form difficult-to-break-down clots that alter blood flow, cause inflammation and choke neurons.
“Our experiments in test tubes and in mouse models of Alzheimer’s showed the compound, known as RU-505, helped restore normal clotting and cerebral blood flow. But the big pay-off came with behavioral tests in which the Alzheimer’s mice treated with RU-505 exhibited better memories than their untreated counterparts,” Strickland says. “These results suggest we have found a new strategy with which to treat Alzheimer’s disease.”
RU-505 emerged from a pool of 93,716 candidates selected from libraries of compounds, the researchers write in the June issue of the Journal of Experimental Medicine. Hyung Jin Ahn, a research associate in the lab, examined these candidates with a specific goal in mind: find one that interferes with the interaction between fibrinogen and amyloid-β. In a series of tests that began with a massive, automated screening effort at Rockefeller’s High Throughput Resource Center, Ahn and colleagues winnowed the field to five contenders. Test tube experiments then whittled the list down to one: RU-505, a small, synthetic compound. Because RU-505 binds to amyloid-β rather than to fibrinogen itself, it prevents abnormal blood clot formation without interfering with normal clotting. It is also capable of passing through the blood-brain barrier.
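The winnowing process described above is a classic screening funnel: a cheap automated assay discards most candidates, then progressively stricter (and costlier) follow-up tests narrow the survivors. A toy sketch of the pattern, with invented stage names, predicates, and data that are purely illustrative and not from the study:

```python
def screening_funnel(candidates, stages):
    """Apply each (name, predicate) stage in order, keeping only survivors."""
    survivors = list(candidates)
    for name, keep in stages:
        survivors = [c for c in survivors if keep(c)]
        print(f"after {name}: {len(survivors)} remain")
    return survivors

# Mock library: 100 hypothetical compounds with made-up assay flags.
library = [{"id": i, "binds": i % 10 == 0, "selective": i % 50 == 0}
           for i in range(100)]

hits = screening_funnel(library, [
    ("automated screen", lambda c: c["binds"]),      # 100 -> 10 survivors
    ("test-tube assay", lambda c: c["selective"]),   # 10 -> 2 survivors
])
```

Ordering the cheapest, highest-throughput filter first is what makes it feasible to start from tens of thousands of compounds and end with a handful worth testing in animals.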
“We tested RU-505 in mouse models of Alzheimer’s disease that over-express amyloid-β and have a relatively early onset of disease. Because Alzheimer’s disease is a long-term, progressive disease, these treatments lasted for three months,” Ahn says. “Afterward, we found evidence of improvement both at the cellular and the behavioral levels.”
The brains of the treated mice had less of the chronic and harmful inflammation associated with the disease, and blood flow in their brains was closer to normal than that of untreated Alzheimer’s mice. The RU-505-treated mice also did better when placed in a maze. Mice naturally want to escape the maze, and are trained to recognize visual cues to find the exit quickly. Even after training, Alzheimer’s mice have difficulty in exiting the maze. After these mice were treated with RU-505, they performed much better.
“While the behavior and the brains of the Alzheimer’s mice did not fully recover, the three-month treatment with RU-505 prevented much of the decline associated with the disease,” Strickland says.
The researchers have begun the next steps toward developing a human treatment. Refinements to the compound are being supported by the Robertson Therapeutic Development Fund and the Tri-Institutional Therapeutic Discovery Institute. As part of a goal to help bridge critical gaps in drug discovery, these initiatives support the early stages of drug development, as is being done with RU-505.
“At very high doses, RU-505 is toxic to mice and even at lower doses it caused some inflammation at the injection site, so we are hoping to find ways to reduce this toxicity, while also increasing RU-505’s efficacy so smaller doses can accomplish similar results,” Ahn says.
NYU Langone Medical Center is now using a novel technology that serves as a “flight simulator” for neurosurgeons, allowing them to rehearse complicated brain surgeries before making an actual incision on a patient.

The new simulator, called the Surgical Rehearsal Platform (SRP), creates an individualized walkthrough for neurosurgeons based on 3D imaging taken from the patient’s CT and MRI scans. Surgeons then plan and rehearse the surgeries using the unique software, which combines life-like tissue reaction with accurate modeling of surgical tools and clamps, to enable them to navigate multiple-angled models of a patient’s brain and vasculature.
The SRP was developed by Surgical Theater of Cleveland, Ohio. This augmented reality technology may help improve safety and efficiency during surgeries for conditions including pituitary tumors, skull base tumors, intrinsic brain tumors, aneurysms, and arteriovenous malformations (AVMs), and could potentially allow surgeons from around the world to simultaneously collaborate on a patient’s case in real-time.
“We are excited to partner with Surgical Theater to bring their Surgery Rehearsal Platform to our institution,” said John G. Golfinos, MD, chair of the Department of Neurosurgery at NYU Langone Medical Center and associate professor of neurosurgery at NYU School of Medicine. “The reaction of tissue in these 3D images is incredibly life-like and modeling of surgical tools is equally impressive. The SRP also will enhance the training of medical students, residents and fellows and help them hone their skills in new and more meaningful ways.”
When using the SRP, surgeons can rehearse a specific patient’s case on computer monitors connected to controllers that simulate surgical tools. For example, when rehearsing surgery for an aneurysm, the SRP reacts realistically as the surgeon virtually applies a clip to the blood vessel. The surgeon can then assess the tissue’s mechanical properties and view realistic microscopic characteristics, including shadowing and texture, to plan approaches. By the time the real operation begins, doctors have already rehearsed it and carry a mental picture of what they will see in the OR.
The SRP obtained clearance from the U.S. Food and Drug Administration (FDA) in February 2013 as pre-operative software for simulating and evaluating surgical treatment options.
In addition, a newer generation of this technology from Surgical Theater, the Surgical Navigation Advanced Platform (SNAP), has an application pending with the FDA that would allow the tool to be taken into the operating room, so surgeons can see behind arteries and other critical structures in real time.
Researchers believe they have learned how mutations in the gene that causes Huntington’s disease kill brain cells, a finding that could open new opportunities for treating the fatal disorder. Scientists first linked the gene to the inherited disease more than 20 years ago.

Huntington’s disease affects five to seven people out of every 100,000. Symptoms, which typically begin in middle age, include involuntary jerking movements, disrupted coordination and cognitive problems such as dementia. Drugs cannot slow or stop the progressive decline caused by the disorder, which leaves patients unable to walk, talk or eat.
Lead author Hiroko Yano, PhD, of Washington University School of Medicine in St. Louis, found in mice and in mouse brain cell cultures that the disease impairs the transfer of proteins to energy-making factories inside brain cells. The factories, known as mitochondria, need these proteins to maintain their function. When disruption of the supply line disables the mitochondria, brain cells die.
“We showed the problem could be fixed by making cells overproduce the proteins that make this transfer possible,” said Yano, assistant professor of neurological surgery, neurology and genetics. “We don’t know if this will work in humans, but it’s exciting to have a solid new lead on how this condition kills brain cells.”
The findings are available online in Nature Neuroscience.
Huntington’s disease is caused by a defect in the huntingtin gene, which makes the huntingtin protein. Life expectancy after initial onset is about 20 years.
Scientists have known for some time that the mutated form of the huntingtin protein impairs mitochondria and that this disruption kills brain cells. But they have had difficulty understanding specifically how the gene harms the mitochondria.
For the new study, Yano and collaborators at the University of Pittsburgh worked with mice that were genetically modified to simulate the early stages of the disorder.
Yano and her colleagues found that the mutated huntingtin protein binds to a group of proteins called TIM23. This protein complex normally helps transfer essential proteins and other supplies to the mitochondria. The researchers discovered that the mutated huntingtin protein impairs that process.
The problem seems to be specific to brain cells early in the disease. At the same point in the disease process, the scientists found no evidence of impairment in liver cells, which also produce the mutated huntingtin protein.
The researchers speculated that brain cells might be particularly reliant on their mitochondria to power the production and recycling of the chemical signals they use to transmit information. This reliance could make the cells vulnerable to disruption of the mitochondria.
Other neurodegenerative conditions, including Alzheimer’s disease and amyotrophic lateral sclerosis, also known as Lou Gehrig’s disease, have been linked to problems with mitochondria. Scientists may be able to build upon these new findings to better understand these disorders.
A specific preparation of cocoa-extract called Lavado may reduce damage to nerve pathways seen in Alzheimer’s disease patients’ brains long before they develop symptoms, according to a study conducted at the Icahn School of Medicine at Mount Sinai and published June 20 in the Journal of Alzheimer’s Disease (JAD).

Specifically, the study results, using mice genetically engineered to mimic Alzheimer’s disease, suggest that Lavado cocoa extract prevents the protein β-amyloid (Aβ) from gradually forming sticky clumps in the brain, which are known to damage nerve cells as Alzheimer’s disease progresses.
Lavado cocoa is rich in polyphenols, antioxidants also found in fruits and vegetables, which past studies suggest may help prevent degenerative diseases of the brain.
The Mount Sinai study results revolve around synapses, the gaps between nerve cells. Within healthy nerve pathways, each nerve cell sends an electric pulse down itself until it reaches a synapse where it triggers the release of chemicals called neurotransmitters that float across the gap and cause the downstream nerve cell to “fire” and pass on the message.
Disease-causing Aβ oligomers – groups of molecules loosely attracted to each other – build up around synapses. The theory is that these sticky clumps physically interfere with synaptic structures and disrupt mechanisms that maintain memory circuits’ fitness. In addition, Aβ triggers immune inflammatory responses, as an infection would, bringing on a rush of chemicals and cells meant to destroy invaders that instead damage our own cells.
“Our data suggest that Lavado cocoa extract prevents the abnormal formation of Aβ into clumped oligomeric structures, to prevent synaptic insult and eventually cognitive decline,” says lead investigator Giulio Maria Pasinetti, MD, PhD, Saunders Family Chair and Professor of Neurology at the Icahn School of Medicine at Mount Sinai. “Given that cognitive decline in Alzheimer’s disease is thought to start decades before symptoms appear, we believe our results have broad implications for the prevention of Alzheimer’s disease and dementia.”
The current study is the first to suggest that adequate quantities of specific cocoa polyphenols in the diet may, over time, prevent Aβ from glomming together into brain-damaging oligomers, offering a potential means of preventing Alzheimer’s disease.
The research team led by Dr. Pasinetti tested the effects of extracts from Dutched, Natural, and Lavado cocoa, which contain different levels of polyphenols. Each cocoa type was evaluated for its ability to reduce the formation of Aβ oligomers and to rescue synaptic function. Lavado extract, which has the highest polyphenol content and anti-inflammatory activity among the three, was also the most effective in both reducing formation of Aβ oligomers and reversing damage to synapses in the study mice.
“There have been some inconsistencies in medical literature regarding the potential benefit of cocoa polyphenols on cognitive function,” says Dr. Pasinetti. “Our finding of protection against synaptic deficits by Lavado cocoa extract, but not Dutched cocoa extract, strongly suggests that polyphenols are the active component that rescues synaptic transmission, since much of the polyphenol content is lost to the high alkalinity of the Dutching process.”
Because loss of synaptic function may have a greater role in memory loss than the loss of nerve cells, rescue of synaptic function may serve as a more reliable target for an effective Alzheimer’s disease drug, said Dr. Pasinetti.
The new study provides experimental evidence that Lavado cocoa extract may influence Alzheimer’s disease mechanisms by modifying the physical structure of Aβ oligomers. It also strongly supports further studies to identify the metabolites of Lavado cocoa extract that are active in the brain and identify potential drug targets.
In addition, turning cocoa-based Lavado into a dietary supplement may provide a safe, inexpensive and easily accessible means to prevent Alzheimer’s disease, even in its earliest, asymptomatic stages.
It has become increasingly common to hear reports that big brains are unnecessary, or even an evolutionary fluke. However, the new article found that increases in the size of brain areas, such as the visual cortex, are an essential element of evolution.

As part of the study, the researchers found that an increase in the size of the visual part of the brain in different primate species, including humans, apes, and monkeys, is associated with enhanced visual processing.
It is controversial whether overall brain size can predict intelligence. However, the size of specialised areas within the brain is associated with specific changes in behaviour, such as reduced susceptibility to visual illusions and increased visual acuity – the level of fine detail that can be seen.
First author Dr Alexandra de Sousa explained: “Primates with a bigger visual cortex have better visual resolution – the precision of vision – and reduced visual illusion strength. In essence, the bigger the brain area, the better the visual processing ability.
“The size of brain areas predicts not only the number of neurons (brain cells) in that area, but also the likelihood of connections between neurons. These connections allow for increasingly complex computations to be made that allow for more accurate, and more difficult, visual perception.”
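One way to read the claim above: if neuron count grows with an area’s volume, and any two neurons can in principle connect, then the number of possible pairwise connections grows roughly with the square of the neuron count. A toy sketch of that arithmetic (the density constant and the linear volume-to-neuron scaling are assumptions for illustration, not figures from the paper):

```python
def neurons_from_volume(volume_mm3: float, density_per_mm3: int = 50_000) -> int:
    """Assume neuron count scales linearly with area volume (illustrative)."""
    return int(volume_mm3 * density_per_mm3)

def possible_connections(n_neurons: int) -> int:
    """Number of distinct neuron pairs: n * (n - 1) / 2."""
    return n_neurons * (n_neurons - 1) // 2

# Doubling the volume doubles the neurons but roughly quadruples
# the possible pairwise connections.
small = neurons_from_volume(1.0)
large = neurons_from_volume(2.0)
ratio = possible_connections(large) / possible_connections(small)
```

The faster-than-linear growth in potential connections is one interpretation of why a modestly larger visual area could support disproportionately more complex visual computation.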
Co-author, Dr Michael Proulx, Senior Lecturer (Associate Professor) in Psychology, added: “This paper is a novel attempt to bring together the micro and macro anatomy of the brain with behaviour. We link visual abilities, the size of brain areas, and the number of neurons that make up those brain areas to provide a framework that ties brain structure and function together.
“The theory of brain size that we discuss can be tested in the future with more behavioural tests of other species, gathering more comparative neuroanatomical data, and by testing other senses and multi-sensory perception, too. We might be able to even predict how well extinct species could sense the world based on fossil data.”
For the study, Dr Alexandra de Sousa, an expert in brain evolution, provided brain size measurements from her own and others’ neuroanatomical research. Dr Michael Proulx, an expert in perception, gathered psychological studies of visual illusions and visual acuity in the same species or genera of animals.
The paper ‘What can volumes reveal about human brain evolution? A framework for bridging behavioral, histometric and volumetric perspectives’ is published today in Frontiers in Neuroanatomy – an online, open access journal.