Each year, approximately 2 million traumatic brain injuries (TBIs) occur in the USA, according to the Centers for Disease Control and Prevention. That number includes troops wounded in Iraq and Afghanistan, for whom TBI is considered an invisible wound of war, one that has few successful treatments. “We have nothing beyond ibuprofen for most TBIs,” said Dr. Angus Scrimgeour, who has been investigating the effects of low-zinc diets on cell stress following a blast injury. “The adult brain does not self-repair from this kind of trauma.”
Scrimgeour works for the US Army Research Institute of Environmental Medicine and recently looked at the effects of five weeks of low- or adequate-zinc diets on a specific protein in muscle cells called MMP. The study recreated, in 32 rats, blast injuries similar to those soldiers experience from IEDs, including loss of consciousness. An equal number of rats served as a control group. Results suggest that zinc supplementation reduces blast-induced cell stress. He presented the results of his research at the American Society for Nutrition’s Scientific Sessions & Annual Meeting at EB on Sunday, April 27.
“We know that soldiers’ brain tissue cannot repair on low zinc diets,” said Scrimgeour. “And they are losing zinc through diarrhea and sweating.” The question moving forward is whether prevention through diet supplementation or post-blast treatment works best to repair behavioral deficits associated with mild TBI.
Scrimgeour added that further research is planned to investigate nutrient combinations for treating mild TBI, including omega-3, vitamin D, glutamine and/or zinc. Although the Army is conducting this research, the results can be applied outside of the military, according to Scrimgeour. “As the blast impacts experienced by Soldiers are similar to those experienced during head injuries received in a car accident or during an NFL concussion, these findings could translate from the Soldier to the civilian population.” Scrimgeour cautioned, however, that what works in animals doesn’t always work in soldiers, which is why more research is needed.

Almost half of all homeless men who took part in a study by St. Michael’s Hospital had suffered at least one traumatic brain injury in their life and 87 per cent of those injuries occurred before the men lost their homes.
While assaults were a major cause of those traumatic brain injuries, or TBIs (60 per cent), many were caused by potentially non-violent mechanisms such as sports and recreation (44 per cent) and motor vehicle collisions and falls (42 per cent).
The study, led by Dr. Jane Topolovec-Vranic, a clinical researcher in the hospital’s Neuroscience Research Program, was published in the journal CMAJ Open.
Dr. Topolovec-Vranic said it’s important for health care providers and others who work with homeless people to be aware of any history of TBI because of the links between such injuries and mental health issues, substance abuse, seizures and general poorer physical health.
The fact that so many homeless men suffered a TBI before losing their home suggests such injuries could be a risk factor for becoming homeless, she said. That makes it even more important to monitor young people who suffer TBIs such as concussions for health and behavioural changes, she said.
Dr. Topolovec-Vranic looked at data on 111 homeless men aged 27 to 81 who were recruited from a downtown Toronto men’s shelter. She found that 45 per cent of these men had experienced a traumatic brain injury, and of these, 70 per cent were injured during childhood or teenage years and 87 per cent experienced an injury before becoming homeless.
In men under age 40, falls from drug/alcohol blackouts were the most common cause of traumatic brain injury while assault was the most common in men over 40 years old.
Recognition that a TBI sustained in childhood or early teenage years could predispose someone to homelessness may challenge some assumptions that homelessness is a conscious choice made by these individuals, or just the result of their addictions or mental illness, said Dr. Topolovec-Vranic.
This study received funding from the Canadian Institutes of Health Research and the Ontario Neurotrauma Foundation.
Separately, a recent study by Dr. Stephen Hwang of the hospital’s Centre for Research on Inner City Health found the number of people who are homeless or vulnerably housed and who have also suffered a TBI may be as high as 61 per cent—seven times higher than the general population.
Dr. Hwang’s study, published in the Journal of Head Trauma Rehabilitation, is one of the largest studies to date investigating TBI in homeless populations. The findings come from the Health and Housing in Transition Study, which tracks the health and housing status of homeless and vulnerably housed people in Toronto, Vancouver and Ottawa.
Methylphenidate, also known as Ritalin, may prevent the depletion of self-control, according to research published in Psychological Science, a journal of the Association for Psychological Science.

Self-control can be difficult — sticking with a diet or trying to focus attention on a boring textbook are hard things to do. Considerable research suggests one potential explanation for this difficulty: Exerting self-control for a long period seems to “deplete” our ability to exert self-control effectively on subsequent tasks.
“It is as if self-control is a limited resource that ‘runs out’ if it is used too much,” says lead researcher Chandra Sripada of the University of Michigan. “If we could figure out the brain mechanisms that cause regulatory depletion, then maybe we could find a way to prevent it.”
Previous research has implicated the neurotransmitters dopamine and norepinephrine in regulatory processing. Sripada and University of Michigan collaborators Daniel Kessler and John Jonides decided to see whether manipulating levels of these transmitters might affect regulatory depletion.
The researchers tested 108 adult participants, all of whom took a drug capsule 60 minutes prior to testing. Half of the participants received a capsule that contained methylphenidate, a medication used to treat ADHD that increases brain dopamine and norepinephrine. The other half received a placebo capsule. The study was double-blind, so neither the participants nor the researchers knew at the time of testing who had received which capsule.
The participants then completed a computer-based task in which they were required to press a button when a word containing the letter e appeared on screen. Some were given modified instructions that asked them to refrain from pressing the button if the letter e was next to or one extra letter away from another vowel — this version of the task was designed to tax participants’ self-control.
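To make the rule concrete, here is a minimal Python sketch of the taxing version of the task as described above. The vowel-distance logic is one plausible reading of the instructions, not the study’s actual stimulus code, and the function name is illustrative.

```python
VOWELS = set("aeiou")

def should_press(word: str, taxing: bool = True) -> bool:
    """Decide whether to press for a given word under the task rules.

    Baseline rule: press whenever the word contains an 'e'. Taxing rule
    (a plausible reading of the instructions above): withhold the press
    if any 'e' sits next to, or one letter away from, another vowel.
    """
    word = word.lower()
    if "e" not in word:
        return False
    if not taxing:
        return True
    for i, ch in enumerate(word):
        if ch != "e":
            continue
        # check neighbours one and two positions away on either side
        for j in (i - 2, i - 1, i + 1, i + 2):
            if 0 <= j < len(word) and word[j] in VOWELS:
                return False  # inhibit the prepotent "press" response
    return True
```

Under this reading, should_press("dress") is True, while should_press("take") is False because the ‘a’ sits two letters from the ‘e’; continually applying such exceptions is what makes the task taxing.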
All of the participants then completed a second computer task aimed at testing their ability to process competing information and exert regulatory control in order to make a correct response.
In line with the researchers’ hypotheses, participants who received the placebo and performed the taxing version of the first task showed greater variability in how quickly they responded in the second task, compared to those whose self-control hadn’t been depleted in the first task.
But for those participants who took the methylphenidate capsule, the first task didn’t have an effect on later performance — the methylphenidate seemed to counteract the self-regulatory depletion incurred by the harder version of the first task.
“These results indicate that depletion of self-control due to prior effort can be fully blocked pharmacologically,” says Sripada. “The task we give people to deplete their self-control is pretty cognitively demanding, so we were surprised at how effective methylphenidate was in blocking depletion of self-control.”
Sripada and colleagues suggest that methylphenidate may help to boost performance of the specific circuits in the brain’s prefrontal cortex that are normally compromised after sustained exertion of self-control.
This doesn’t mean, however, that those of us looking to boost our self-control should go out and get some Ritalin:
“Methylphenidate is a powerful psychotropic medicine that should only be taken with a prescription,” says Sripada. “We want to use this research to better understand the brain mechanisms that lead to depletion of self-control, and what interventions — pharmacological or behavioral — might prevent this.”
With a new generation of military veterans returning home from Iraq and Afghanistan, post-traumatic stress disorder (PTSD) has become a prominent concern in American medical institutions and the culture at large. Estimates indicate that as many as 35 percent of personnel deployed to Iraq and Afghanistan suffer from PTSD. New research from the University of South Carolina School of Medicine is shedding light on how PTSD is linked to other diseases in fundamental and surprising ways.
The rise in PTSD has implications beyond the impact of the psychiatric disorder and its immediate consequences, which include elevated suicide risk, an inability to lead a normal life, and approximately $3 billion in lost productivity every year. Over time, these PTSD patients will continue to experience increased risks of a myriad of medical conditions, such as cardiovascular disease, diabetes, gastrointestinal disease, fibromyalgia, musculoskeletal disorders and others, all of which share chronic inflammation as a common underlying cause.
The mechanisms that trigger PTSD, and that cause PTSD patients to suffer from higher rates of chronic-inflammation-related medical conditions, remain unknown. Additionally, PTSD is incurable, and though there are available treatments, they are often not completely effective. In an effort to get to the root of PTSD, and to begin to understand the links between PTSD and the secondary diseases that often come with it, a team at the University of South Carolina School of Medicine is investigating PTSD through the lens of inflammation. They have recently published the findings of a new study, “Dysregulation in microRNA Expression is Associated with Alterations in Immune Functions in Combat Veterans with Post-traumatic Stress Disorder,” in the journal PLOS ONE.
In this study, led by Drs. Prakash Nagarkatti and Mitzi Nagarkatti, the authors investigated microRNA profiles and tried to establish a link between the microRNA and inflammation in combat veterans of the Persian Gulf, Iraq and Afghanistan wars who are PTSD patients at the Dorn VA Medical Center. MicroRNA are small, noncoding RNA that can switch human genes on and off, effectively controlling gene expression. Some specific types of microRNA are known to regulate genes involved in inflammation, making them a kind of marker that can indicate when inflammation is present.
The role of microRNA in PTSD had not been investigated prior to this study, which found that PTSD patients had significant alterations in microRNA expression. The study analyzed 1,163 microRNA and found that the expression of microRNA that regulate genes involved in inflammation was altered in PTSD patients. These alterations were found to be linked to heightened inflammation in these patients.
Dr. Mitzi Nagarkatti sums up the significance of this study as follows: “We are very excited about these results. Thus far, no one had looked at the role of microRNA in the blood of PTSD patients. Thus, our finding that the alterations in these small molecules are connected to higher inflammation seen in these patients is very interesting and helps establish the connection between war trauma and microRNA changes.”
In addition to the alterations in microRNA expression, the study also found that PTSD patients had higher levels of inflammation driven by certain types of immune cells called T cells. These T cells produced higher levels of inflammatory mediators called cytokines, specifically interferon-gamma and interleukin-17. This finding was especially interesting because one of the inflammation-associated microRNAs, miR-125a, which targets the production of interferon-gamma, was found to have decreased expression in the PTSD patients studied. Overall, these results suggest that trauma may cause alterations in microRNA expression that promote inflammation in PTSD patients.
Commenting on this, Dr. Prakash Nagarkatti said, “These studies form the foundation to further analyze the role of microRNA in PTSD. Trauma experienced during war may trigger changes in microRNA which may in turn cause various clinical disorders seen in PTSD patients. Our long-term goal is to identify whether PTSD patients express a unique signature profile of microRNA which can be used towards early detection, prevention and treatment of PTSD.”
Researchers at Aarhus University, Denmark, have drawn up the most detailed ‘image of the enemy’ to date of one of the body’s most important players in the development of Parkinson’s disease. This provides much greater understanding of the battle taking place when the disease occurs – knowledge that is necessary if we are to understand and treat Parkinsonism. However, it also raises an existential question because part of the conclusion is that we do not live forever!
Parkinson’s disease is one of the most common neurological disorders, with about 7,000 people suffering from the disease in Denmark alone. There is no cure, and the symptoms continue to get worse. The disease occurs because various nerve cells in the brain die. These include the cells that produce dopamine, which is known as the brain’s ‘reward substance’ and which also helps control our fine motor skills.
A group of researchers from Aarhus University, the University of Southern Denmark (SDU) and the University of Cambridge has just published two studies in the prestigious Journal of the American Chemical Society (JACS) and Angewandte Chemie. These studies provide the best insight to date into the behaviour of a particular protein state that plays an important role in Parkinson’s disease. In other words, they have created a detailed image of what is presumed to be the arch enemy we are up against in our understanding of Parkinsonism. It is an advanced antagonist, and one that functions with a considerable degree of unpredictability. “Fighting the enemy is by no means a Sunday outing,” say the main authors of the results – Professor Daniel Otzen, Aarhus University, and his colleagues Nikolai Lorenzen and Wojciech Paslawski, who recently defended their PhD dissertations on this subject at Aarhus University’s Interdisciplinary Nanoscience Centre (iNANO).
Scientists have known that abnormal brain growth is associated with autism spectrum disorder. However, the relationship between the two has not been well understood.

Now, scientists from the Florida campus of The Scripps Research Institute (TSRI) have shown that mutations in a specific gene disrupted in some individuals with autism result in too much growth throughout the brain, and yet surprisingly specific problems in social interactions, at least in mouse models that mimic this risk factor in humans.
“What was striking is that these were basically normal animals in terms of behavior, but there were consistent deficits in tests of social interaction and recognition—which approximate a major symptom of autism,” said Damon Page, a TSRI biologist who led the study. “This suggests that when most parts of the brain are overgrown, the brain somehow adapts to it with minimal effects on behavior in general. However, brain circuits relevant to social behavior are more vulnerable or less able to tolerate this overgrowth.”
The study, which focuses on the gene phosphatase and tensin homolog (PTEN), was recently published online ahead of print by the journal Human Molecular Genetics.
Autism spectrum disorder is a neurodevelopmental disorder involving a range of symptoms and disabilities, including social deficits and communication difficulties, repetitive behaviors and interests, and sometimes cognitive delays. The disorder affects approximately one percent of the population; some 80 percent of those diagnosed are male.
In a previous study, Page and colleagues found that mutations in Pten cause increased brain size and social deficits, with both symptoms exacerbated by a second “hit” to a gene that regulates levels of the neurotransmitter serotonin in the brain. In the new study, the TSRI team set out to explore whether mutations in Pten result in widespread or localized overgrowth within the brain, and whether changes in brain growth are associated with broad or selective deficits in tests of autism-relevant behaviors in genetically altered mice. The team tested the mice for autism spectrum disorder-related behaviors, including abnormalities in mood, anxiety, intellectual function, and circadian rhythm and/or sleep.
The researchers found that Pten mutant mice showed altered social behavior, but few other changes—a more subtle change than would have been predicted given broad expression and critical cellular function of the gene.
Intriguingly, some of the more subtle impairments were sex-specific. In addition to social impairments, males with the mutated gene showed abnormalities related to repetitive behavior and mood/anxiety, while females exhibited additional circadian activity and emotional learning problems.
The results raise the question of how mutations in PTEN, a general regulator of growth, can have relatively selective effects on behavior and cognitive development. One idea is that PTEN mutations may desynchronize the normal pattern of growth in key cell types—the study points to dopamine neurons—that are relevant for social behavior.
“Timing is everything,” Page said. “Connections have to form in the right place at the right time for circuits to develop normally. Circuitry involved in social behavior may turn out to be particularly vulnerable to the effects of poorly coordinated growth.”
A new study suggests that targeting B cells, which are a type of white blood cell in the immune system, may be associated with reduced disease activity for people with multiple sclerosis (MS). The study is released today and will be presented at the American Academy of Neurology’s 66th Annual Meeting in Philadelphia, April 26 to May 3, 2014.
For the study, 231 people with relapsing-remitting MS received either a placebo or one of several low dosages of the drug ofatumumab, which is an anti-B cell antibody, for 24 weeks, with the first 12 weeks making up the placebo-controlled period. The main objective was to determine the effects of ofatumumab dosing regimens compared to placebo on the total number of new brain lesions assessed every four weeks over a 12-week period.
All dose groups, including placebo, showed lesion activity in the first four weeks, with lesion suppression in all ofatumumab dose groups from weeks four to 12. Researchers measured the amount of B cells in participants and compared that to the total number of new brain lesions that appeared on brain scans, which is a marker of disease activity.
The researchers found that when B cells were reduced to below a threshold of 64 cells per microliter, disease activity, as measured by appearance of new brain lesions, was significantly reduced. On average, participants had an annualized rate of less than one new brain lesion per year when B cells were maintained below a threshold of 32 to 64 cells per microliter, compared with 16 lesions without treatment.
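Read literally, the finding amounts to a simple threshold relationship, restated in the hedged Python sketch below. Only the two anchor points reported above are used; the behavior between them is not characterized in this summary, and the function name is illustrative.

```python
def reported_lesion_rate(b_cells_per_ul: float) -> str:
    """Map a B-cell count to the annualized lesion rates reported above.

    Two anchor points only: below the 64 cells/uL threshold (maintained
    in the 32-64 range), under one new lesion per year; untreated, about
    16 per year. Intermediate values are unknown from this summary.
    """
    if b_cells_per_ul < 64:
        return "less than 1 new brain lesion per year (activity suppressed)"
    return "about 16 new brain lesions per year (untreated average)"
```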
The most common side effects, defined as those occurring in at least five percent of participants and at a rate twice that of placebo for weeks zero to 12, were injection-related reaction, dizziness, anxiety, fever, respiratory tract infection and nerve pain.
Study author Daren Austin, PhD, of GlaxoSmithKline in Uxbridge, United Kingdom, and a member of the American Academy of Neurology, said the study results also suggest that peripheral, rather than central, B cells may be the most relevant target for anti-B cell therapy.
“These results need to be validated, of course, but the findings are interesting,” Austin said. “They provide new insight into the mechanism of B cells in MS and present a possible new target threshold for exploring the potential benefit of anti-B cell therapy.” Ofatumumab is not approved anywhere in the world for use in the treatment of multiple sclerosis.
Ellen’s (not her real name) adoptive parents weren’t surprised when the school counsellor suggested that she might have attention deficit hyperactivity disorder (ADHD).
Several professionals had made this suggestion over the years. Given that homework led to one explosion after another, and that at school Ellen, who is eleven, spent her days jiggling up and down in her seat, unable to concentrate for more than ten minutes, it seemed a reasonable assumption. Yet her parents always felt that ADHD didn’t quite capture the extent of Ellen’s issues over the years. Fortunately the school counsellor was familiar with fetal alcohol spectrum disorder (FASD). When she learned that Ellen’s birth mother had consumed alcohol during pregnancy, she raised the possibility that Ellen’s problems could be attributable to FASD and referred her for further assessment.
It’s a familiar story, and most of us reading about Ellen would assume that she did indeed suffer from ADHD.
But now researchers from McGill have suggested that there may be an overreporting of attention problems in children with FASD, simply because parents and teachers are using a misplaced basis for comparison. They are testing and comparing children with FASD with children of the same physical or chronological age, rather than with children of the same mental age, which is often quite a lot younger.

“Because the link between fetal alcohol syndrome and ADHD is so commonly described in the literature, both parents and teachers are more likely to expect these children to have attention problems,” says Prof. Jacob Burack, a professor in McGill’s Dept. of Educational and Counselling Psychology and the senior author on a recent study on the subject. “But what teachers often don’t recognize is that although the child they are dealing with is eleven years old in chronological terms, they are actually functioning at the developmental age of an eight-year-old. That’s a pretty big difference. And when you use mental age as the basis of comparison, many of the attention problems that have been described in children with FASD no longer seem of primary importance.”
The researchers recruited children with FASD whose average chronological age was just under twelve years old. But their average mental age, determined by standard tests, was actually closer to nine-and-a-half years old. (The children were recruited through the Asante Centre for Fetal Alcohol Syndrome in British Columbia, and though the number of children studied may appear small, this is a fairly typical size for studies on FASD, given the difficulties of the diagnostic process.)
These children were then compared with typically developing children whose average chronological age was about eight and a half years and whose average mental age was similar to that of the group of children diagnosed with FASD.
After using tests to measure specific aspects of attention, the researchers then compared the performance of children with FASD on these tests with the results of children of the same mental age. What they found was that while children like Ellen had difficulties with certain kinds of attention skills, notably in terms of shifting attention from one object to another, there were other areas, such as focus, where they had no significant difficulties at all. So, if we were to compare these aspects of attention to a hockey game, typically these children would have no difficulty focusing on the puck in the arena, but would have problems following the puck being passed from one player to another.
This suggests to Dr. Kimberly Lane, who conducted the research as a PhD student, that there is a need to develop a more nuanced understanding of attention skills. “We use words like attention loosely, but it’s really an umbrella term that covers various aspects of attending to different people or events or environments,” says Dr. Lane. “By using more complex assessment techniques for various aspects of attention, it will be possible to get a better picture of the attention difficulties faced by children with FASD,” she adds.
“But no matter what the tests say, it’s important for teachers and parents to understand that the difficulties these children have with attention may be less important than their more general problems, and we need to work with them as they are.”
Johns Hopkins researchers report that they have identified a protein essential to the formation of the tiny brain region in mice that coordinates sleep-wake cycles and other so-called circadian rhythms.

(Image caption: An illustration of the activity patterns of normal mice (left) and of mice whose “master clock,” or SCN, has been disrupted (right). Credit: Cell Reports, Bedont et al.)
By disabling the gene for that key protein in test animals, the scientists were able to home in on the mechanism by which that brain region, known as the suprachiasmatic nucleus or SCN, becomes the body’s master clock while the embryo is developing.
The results of their experiments, reported in Cell Reports, are an important step toward understanding how to better manage the disruptive effects experienced by shift workers, as well as the treatment of people with sleep disorders, the researchers say.
“Shift workers tend to have higher rates of diabetes, obesity, depression and cancer. Many researchers think that’s somehow connected to their irregular circadian rhythms, and thus to the SCN,” says Seth Blackshaw, Ph.D., an associate professor in the Department of Neuroscience and the Institute for Cell Engineering at the Johns Hopkins University School of Medicine. “Our new research will help us and other researchers isolate the specific impacts of the SCN on mammalian health.”
Blackshaw explains that every cell in the body has its own “clock” that regulates aspects such as its rate of energy use. The SCN is the master clock that synchronizes these individual timekeepers so that, for example, people feel sleepy at night and alert during the day, are hungry at mealtimes, and are prepared for the energy influx that hits fat cells after eating. “A unique property of the SCN is that if its cells are grown in a dish, they quickly synchronize their clocks with one another,” Blackshaw says.
But while evidence like this gave researchers an idea of the SCN’s importance, they hadn’t completely teased its role apart from that of the body’s other clocks, or from other parts of the brain.
The Johns Hopkins team looked for ways to knock down SCN function by targeting and disabling genes whose loss disrupts only the formation of the SCN clock. They analyzed which genes were active in different areas of developing mouse brains to identify those that were “turned on” only in the SCN. One of the “hits” was Lhx1, a member of a family of genes whose protein products affect development by controlling the activity of other genes. When the researchers turned off Lhx1 in the SCN of mouse embryos, the grown mice lacked distinctive biochemical signatures seen in the SCN of normal mice.
The genetically modified mice behaved differently, too. Some fell into a pattern of two to three separate cycles of sleep and activity per day, in contrast to the single daily cycle found in normal mice, while others’ rhythms were completely disorganized, Blackshaw says. Though an SCN is present in mutant mice, it communicates poorly with clocks elsewhere in the body.
Blackshaw says he expects that the mutant mice will prove a useful tool in finding out whether disrupted signaling from the SCN actually leads to the health problems that shift workers experience, and if so, how this might happen. Although mouse models do not correspond fully to human disease, their biochemical and genetic makeup is closely aligned with that of humans.
Blackshaw’s team also plans to continue studying the biochemical chain of events surrounding the Lhx1 protein to determine which proteins turn the Lhx1 gene on and which genes it, in turn, directly switches on or off. Those genes could be at the root of inherited sleep disorders, Blackshaw says, and the proteins they make could prove useful as starting points for the development of new drugs to treat insomnia and even jet lag.
An international team of researchers has identified a previously unknown neurodegenerative disorder and discovered it is caused by a single mutation in one individual born during the height of the Ottoman Empire in Turkey, about 16 generations ago.

(Image caption: An fMRI scan of the brain of a patient with CLP1 mutation reveals severe atrophy of the brainstem (red line) and cerebellum (blue) as well as lack of formation of the corpus callosum (green), which connects both sides of the cerebrum (yellow), which is also atrophied. The lines outline approximately the expected sizes of the brain areas. A study traced the mutation to a single individual born in Turkey during the Ottoman Empire, some 16 generations ago.)
The genetic cause of the rare disorder was discovered during a massive analysis of the individual genomes of thousands of Turkish children suffering from neurological disorders.
“The more we learn about basic mechanisms behind rare forms of neurodegeneration, the more novel insights we can gain into more common diseases such as Alzheimer’s or Lou Gehrig’s Disease,” said Murat Gunel, the Nixdorff-German Professor of Neurosurgery, and professor of genetics and neurobiology at Yale.
Gunel is a senior co-author of one of two papers published in the April 24 issue of the journal Cell that document the devastating effects of a mutation in the CLP1 gene. Gunel and colleagues at Yale Center for Mendelian Genomics along with Joseph Gleeson’s group at the University of California, San Diego compared DNA sequencing results of more than 2,000 children from different families with neurodevelopmental disorders. In four apparently unrelated families, they identified the exact same mutation in the CLP1 gene. Working with the Frank Baas group from the Netherlands, the researchers also studied how CLP1 mutations interfered with the transfer of information encoded within genes to cells’ protein-making machinery.
The discovery of the identical mutation in seemingly unrelated families originally from eastern Turkey suggested an ancestral mutation, dating back several generations, noted the researchers.
Affected children suffer from intellectual disability, seizures, and delayed or absent mental and motor development, and their imaging studies show atrophy affecting the cerebral cortex, cerebellum, and the brain stem.
The second Cell paper, by researchers from Baylor College of Medicine and Austria, also found the identical founder mutation in CLP1 in another 11 children from an additional five families originally from eastern Turkey.
Gunel said that the high prevalence of consanguineous marriages [between closely related people] in Turkey and the Middle East leads to these rare recessive genetic neurodegenerative disorders. Affected children inherit mutations in the same gene from both of their parents, who are closely related to each other, such as first cousins. Without consanguinity between parents, children are very unlikely to inherit two mutations in the same gene.
“By dissecting the genetic basis of these neurodevelopmental disorders, we are gaining fundamental insight into basic physiological mechanisms important for human brain development and function,” Gunel said. “We learn a lot about normal biology by studying what happens when things go wrong.”
Better-educated people appear to be significantly more likely to recover from a moderate to severe traumatic brain injury (TBI), suggesting that a brain’s “cognitive reserve” may play a role in helping people get back to their previous lives, new Johns Hopkins research shows.

The researchers, reporting in the journal Neurology, found that those with the equivalent of at least a college education are seven times more likely than those who didn’t finish high school to be disability-free one year after a TBI serious enough to warrant inpatient time in a hospital and rehabilitation facility.
The findings, while new among TBI investigators, mirror those in Alzheimer’s disease research, in which higher educational attainment — believed to be an indicator of a more active, or more effective, use of the brain’s “muscles” and therefore its cognitive reserve — has been linked to slower progression of dementia.
“After this type of brain injury, some patients experience lifelong disability, while others with very similar damage achieve a full recovery,” says study leader Eric B. Schneider, Ph.D., an epidemiologist at the Johns Hopkins University School of Medicine’s Center for Surgical Trials and Outcomes Research. “Our work suggests that cognitive reserve, the brain’s ability to be resilient in the face of insult or injury, could account for the difference.”
Schneider conducted the research in conjunction with Robert D. Stevens, M.D., a neuro-intensive care physician with Johns Hopkins’ Department of Anesthesiology and Critical Care Medicine.
For the study, the researchers examined 769 patients enrolled in the TBI Model Systems database, an ongoing multi-center cohort of patients funded by the National Institute on Disability and Rehabilitation Research. The patients had been hospitalized with a moderate to severe TBI and subsequently admitted to a rehabilitation facility.
Of the 769 patients, 219 — or 27.8 percent — were free of any detectable disability one year after their injury. Twenty-three patients who didn’t complete high school — 9.7 percent of those at that education level — recovered, while 136 patients with between 12 and 15 years of schooling — 30.8 percent of those at that educational level — did. Nearly 40 percent of patients — 76 of the 194 — who had 16 or more years of education fully recovered.
Schneider says researchers don’t currently understand the biological mechanisms that might account for the link between years of schooling and improved recovery.
“People with increased cognitive reserve capabilities may actually heal in a different way that allows them to return to their pre-injury function and/or they may be able to better adapt and form new pathways in their brains to compensate for the injury,” Schneider says. “Further studies are needed to not only find out, but also to use that knowledge to help people with less cognitive reserve.”
Meanwhile, he says, “What we learned may point to the potential value of continuing to educate yourself and engage in cognitively intensive activities. Just as we try to keep our bodies strong in order to help us recover when we are ill, we need to keep the brain in the best shape it can be.”
Adds Stevens: “Understanding the underpinnings of cognitive reserve in terms of brain biology could generate ideas on how to enhance recovery from brain injury.”
Neuroscientists have discovered a brain pathway that underlies the emotional behaviours critical for survival.

New research by the University of Bristol, published in the Journal of Physiology, has identified a chain of neural connections which links central survival circuits to the spinal cord, causing the body to freeze when experiencing fear.
Understanding how these central neural pathways work is a fundamental step towards developing effective treatments for emotional disorders such as anxiety, panic attacks and phobias.
An important brain region responsible for how humans and animals respond to danger is known as the PAG (periaqueductal grey), and it can trigger responses such as freezing, a raised heart rate, an increase in blood pressure and the urge for fight or flight.
This latest research has discovered a brain pathway leading from the PAG to a highly localised part of the cerebellum, called the pyramis. The research went on to show that the pyramis is involved in generating freezing behaviour when central survival networks are activated during innate and learnt threatening situations.
The pyramis may therefore serve as an important point of convergence for different survival networks in order to react to an emotionally challenging situation.
Dr Stella Koutsikou, first author of the study and Research Associate in the School of Physiology and Pharmacology at the University of Bristol, said: “There is a growing consensus that understanding the neural circuits underlying fear behaviour is a fundamental step towards developing effective treatments for behavioural changes associated with emotional disorders.”
Professor Bridget Lumb, Professor of Systems Neuroscience, added: “Our work introduces the novel concept that the cerebellum is a promising target for therapeutic strategies to manage dysregulation of emotional states such as panic disorders and phobias.”
The researchers involved in this work are all members of Bristol Neuroscience which fosters interactions across one of the largest communities of neuroscientists in the UK.
Professor Richard Apps said: “This is a great example of how Bristol Neuroscience brings together expertise in different fields of neuroscience leading to exciting new insights into brain function.”
A study of older adults at increased risk for Alzheimer’s disease shows that moderate physical activity may protect brain health and stave off shrinkage of the hippocampus – the brain region responsible for memory and spatial orientation that is attacked first in Alzheimer’s disease. Dr. J. Carson Smith, a kinesiology researcher in the University of Maryland School of Public Health who conducted the study, says that while all of us will lose some brain volume as we age, those with an increased genetic risk for Alzheimer’s disease typically show greater hippocampal atrophy over time. The findings are published in the open-access journal Frontiers in Aging Neuroscience.

"The good news is that being physically active may offer protection from the neurodegeneration associated with genetic risk for Alzheimer’s disease," Dr. Smith suggests. "We found that physical activity has the potential to preserve the volume of the hippocampus in those with increased risk for Alzheimer’s disease, which means we can possibly delay cognitive decline and the onset of dementia symptoms in these individuals. Physical activity interventions may be especially potent and important for this group."
Dr. Smith and colleagues, including Dr. Stephen Rao from the Cleveland Clinic, tracked four groups of healthy older adults ages 65-89, who had normal cognitive abilities, over an 18-month period and measured the volume of their hippocampus (using structural magnetic resonance imaging, or MRI) at the beginning and end of that time period. The groups were classified both for low or high Alzheimer’s risk (based on the absence or presence of the apolipoprotein E epsilon 4 allele) and for low or high physical activity levels.
Of all four groups studied, only those at high genetic risk for Alzheimer’s who did not exercise experienced a decrease in hippocampal volume (3 percent) over the 18-month period. All other groups, including those at high risk for Alzheimer’s but who were physically active, maintained the volume of their hippocampus.
"This is the first study to look at how physical activity may impact the loss of hippocampal volume in people at genetic risk for Alzheimer’s disease," says Dr. Kirk Erickson, an associate professor of psychology at the University of Pittsburgh. "There are no other treatments shown to preserve hippocampal volume in those that may develop Alzheimer’s disease. This study has tremendous implications for how we may intervene, prior to the development of any dementia symptoms, in older adults who are at increased genetic risk for Alzheimer’s disease."
Individuals were classified as high risk for Alzheimer’s if a DNA test identified the presence of a genetic marker – one or two copies of the apolipoprotein E epsilon 4 (APOE-e4) allele on chromosome 19 – which increases the risk of developing the disease. Physical activity levels were measured using a standardized survey, with low activity defined as two or fewer days per week of low-intensity activity, and high activity as three or more days per week of moderate to vigorous activity.
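As a rough illustration, the two classification rules in this paragraph can be written out directly. This Python sketch simplifies the survey scoring, and the handling of participants who fit neither activity definition is an assumption, since the article does not address such cases.

```python
def classify_participant(has_apoe_e4: bool,
                         days_per_week: int,
                         moderate_to_vigorous: bool) -> tuple[str, str]:
    """Assign a participant to one of the study's four groups.

    Risk follows the APOE-e4 rule described above; the activity cutoffs
    mirror the survey definitions given in the article.
    """
    risk = "high risk" if has_apoe_e4 else "low risk"
    if days_per_week >= 3 and moderate_to_vigorous:
        activity = "high activity"
    elif days_per_week <= 2 and not moderate_to_vigorous:
        activity = "low activity"
    else:
        activity = "unclassified"  # mixed cases aren't covered in the article
    return risk, activity
```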
"We know that the majority of people who carry the E4 allele will show substantial cognitive decline with age and may develop Alzheimer’s disease, but many will not. So, there is reason to believe that there are other genetic and lifestyle factors at work," Dr. Smith says. "Our study provides additional evidence that exercise plays a protective role against cognitive decline and suggests the need for future research to investigate how physical activity may interact with genetics and decrease Alzheimer’s risk."
Dr. Smith has previously shown that a walking exercise intervention for patients with mild cognitive decline improved cognitive function by improving the efficiency of brain activity associated with memory. He is planning to conduct a prescribed exercise intervention in a population of healthy older adults with genetic and other risk factors for Alzheimer’s disease and to measure the impact on hippocampal volume and brain function.
TAU discovers that protein clusters implicated in neurodegenerative diseases actually serve to protect brain cells
People diagnosed with Huntington’s disease, most in their mid-thirties and forties, face a devastating prognosis: complete mental, physical, and behavioral decline within two decades. “Mutant” protein clusters, long blamed for the progression of the genetic disease, have been the primary focus of therapies in development by pharmaceutical companies. But according to new research from Prof. Gerardo Lederkremer and Dr. Julia Leitman of Tel Aviv University’s Department of Cell Research and Immunology, in collaboration with Prof. Ulrich Hartl of the Max Planck Institute for Biochemistry, these drugs may not only be ineffective — they may pose a serious threat to patients.

In two ground-breaking studies, published in the journals PLOS ONE and Nature Communications, Prof. Lederkremer and his team demonstrated that protein clusters are not the cause of toxicity in Huntington’s disease. On the contrary, these aggregates actually serve as a defense mechanism for “stressed” brain cells. Conducted on tissue cultures using cutting-edge microscopic technology, their studies identified a different causative agent — the “stress response” of affected brain cells.
"The upsetting implication for therapy of this disease is that drugs being developed to interfere with the formation of protein aggregates may in fact be detrimental," said Prof. Lederkremer. "The identification of the new cause will hopefully lead to the development of new therapeutic approaches. This may hold true for other neurodegenerative diseases as well."
Starting from genetic scratch
Prof. Lederkremer and his team chose to examine the effect of protein aggregates in the pathology of Huntington’s disease because its genetic cause is well-known, unlike those of other neurodegenerative diseases, such as Parkinson’s, whose origins remain less clear.
"What we found in this study — a surprise, although we suspected it — was that damage to the cells, the cell ‘stress’ that leads to death of cells, appeared well before the protein aggregates did," said Prof. Lederkremer. "And even more surprising, when the aggregates finally appeared, the stress was reduced, in some cases even stopping. The actual process of forming an aggregate was protective, isolating and segregating the problematic proteins. This explains why in autopsies of people who died of Huntington’s and other diseases like Alzheimer’s or old age, the protein aggregates in the brains were all quite similar, reflecting no specific disease link."
By interfering with the stress response of brain cells, rather than the formation of protein clusters, scientists may be able to slow, or even halt, the progression of neurodegenerative diseases. According to Prof. Lederkremer, this research paves the way for a revolutionary new direction for pharmaceutical research to treat Huntington’s, Alzheimer’s, Parkinson’s, and other neurodegenerative diseases.
Response to stress
"The practical consequences are that several companies are already in advanced stages of development of drugs inhibiting this form of protein aggregate, interfering with the body’s natural process to protect the brain," said Prof. Lederkremer. "But the drugs should be focused on another area altogether, and the protein aggregates, a protective resource for the brain, should be left intact."
Samples of brain cells from mouse models afflicted with Huntington’s disease were examined using “live cell imaging,” the study of live cells through time-lapse microscopy. Prof. Lederkremer and his team were thus able to identify a compound that modified brain cells’ response to stress, promoting their survival.
"Our approach was to interfere with the stress response instead of the formation of the protein aggregates, and the lab succeeded in identifying a compound that altered the response, rescuing affected cells from death," said Prof. Lederkremer. "Our findings are most encouraging for the development of a therapy for this devastating disease, which is presently incurable."
Chimpanzees may throw tantrums like toddlers, but their total brain size suggests they have more self-control than, say, a gerbil or fox squirrel, according to a new study of 36 species of mammals and birds ranging from orangutans to zebra finches.

Scientists at Duke University, UC Berkeley, Stanford, Yale and more than two dozen other research institutions collaborated on this first large-scale investigation into the evolution of self-control, defined in the study as the ability to inhibit powerful but ultimately counter-productive behavior. They found that the species with the largest brain volume – not volume relative to body size – showed superior cognitive powers in a series of food-foraging experiments.
Moreover, animals with the most varied diets showed the most self-restraint, according to the study published in the Proceedings of the National Academy of Sciences.
“The study levels the playing field on the question of animal intelligence,” said UC Berkeley psychologist Lucia Jacobs, a co-author of this study and of its precursor, a 2012 paper in the journal Animal Cognition.
This latest study was led by evolutionary anthropologists Evan MacLean, Brian Hare and Charles Nunn of Duke University. The findings challenge prevailing assumptions that “relative” brain size is a more accurate predictor of intelligence than “absolute” brain size. One possibility, they posited, is that “as brains get larger, the total number of neurons increases and brains tend to become more modularized, perhaps facilitating the evolution of new cognitive networks.”
While participating researchers all performed the same series of experiments, they did so on their own turf and on their own animal subjects. Data was provided on bonobos, chimpanzees, gorillas, olive baboons, stump-tailed macaques, golden snub-nosed monkeys, brown, red-bellied and aye-aye lemurs, coyotes, dogs, gray wolves, Asian elephants, domestic pigeons, orange-winged amazons, Eurasian jays, western scrub jays, zebra finches and swamp sparrows.
Food inside a tube used as bait
In one experiment, creatures large and small were tested to see if they would advance toward a clear cylinder visibly containing food – showing a lack of self-restraint – after they had been trained to access the food through a side opening in an opaque cylinder. Large-brained primates such as gorillas quickly navigated their way to the treat or “bait.” Smaller-brained animals did so with mixed results.
Jacobs and UC Berkeley doctoral student Mikel Delgado contributed the only rodent data in the study, putting some of the campus’s fox squirrels and some Mongolian gerbils in their lab through food-foraging tasks.
Mixed results on campus squirrels’ self-restraint
In the case of the fox squirrels, the red-hued, bushy-tailed critters watched as the food was placed in a side opening of an opaque cylinder. Once they demonstrated a familiarity with the location of the opening, the food was moved to a transparent cylinder and the real test began. If the squirrels lunged directly at the food inside the cylinder, they had failed to inhibit their response. But if they used the side entrance, the move was deemed a success.
“About half of the squirrels and gerbils did well and inhibited the direct approach in more than seven out of 10 trials,” Delgado said. “The rest didn’t do so well.”
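A minimal scoring sketch for this test, assuming the “more than seven out of 10 trials” criterion is applied as a simple proportion of successful detours; the function name is illustrative.

```python
def passed_cylinder_task(detours: list[bool]) -> bool:
    """Score one animal on the transparent-cylinder test.

    Each entry is True if the animal detoured to the side opening
    (inhibiting the direct lunge at the visible food), False otherwise.
    Passing means inhibiting on more than 7 of 10 trials.
    """
    return len(detours) > 0 and sum(detours) / len(detours) > 0.7
```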
In a second test, three cups (A, B and C) were placed in a row on their sides so the animals could see which one contained food. It was usually cup A. The cups were then turned upside down so the “baited” cup could no longer be seen. If the squirrels touched the cup with the food three times in a row, they graduated to the next round. This time, the food was moved from cup A to cup C at the other end of the row.
“The question was, would they approach cup A, where they had originally learned the food was placed, or could they update this learned response to get the food from a new location?” Delgado said. “The squirrels and gerbils tended to go to the original place they had been trained to get food, showing a failure to inhibit what they originally learned.”
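The cup task’s two phases can be sketched the same way, assuming the three-in-a-row learning criterion and the A-to-C switch described above; both function names are illustrative.

```python
def learned_baited_cup(choices: list[str], baited: str = "A") -> bool:
    """True once the animal has picked the baited cup three times in a row."""
    streak = 0
    for choice in choices:
        streak = streak + 1 if choice == baited else 0
        if streak == 3:
            return True
    return False

def updated_after_switch(first_choice: str) -> bool:
    """After the food moves from cup A to cup C, going straight to cup C
    shows the learned response was updated; returning to A does not."""
    return first_choice == "C"
```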
“It might be that a squirrel’s success in life is affected the same way as in people,” Jacobs said. “By its ability to slow down and think a bit before it snatches at a reward.”
A recently FDA-approved device has been shown to reduce seizures in patients with medication-resistant epilepsy by as much as 50 percent. When coupled with an innovative electrode placement planning system developed by physicians at Rush, the device facilitated the complete elimination of seizures in nearly half of the implanted Rush patients enrolled in the decade-long clinical trials.

That’s good news for a large portion of the nearly 400,000 people in the U.S. living with epilepsy whose seizures can’t be controlled with medications and who are not candidates for brain surgery.
Epilepsy is a chronic neurological condition characterized by recurrent seizures that disrupt the senses, or can involve short periods of unconsciousness or convulsions. “Many people with epilepsy have scores of unpredictable seizures every day that make it impossible for them to drive, work or even get a good night’s sleep,” said Dr. Marvin Rossi, co-principal investigator of the NeuroPace Pivotal Clinical Trial and assistant professor of neurology at the Rush Epilepsy Center.
The NeuroPace RNS System uses responsive, or ‘on-demand,’ direct stimulation: it detects abnormal electrical activity in the brain and delivers small amounts of electrical stimulation to suppress seizures before they begin.
The device is surgically placed underneath the scalp within the skull and connected to electrodes that are strategically placed within the brain where the seizures originate (called the seizure focus). A programmed computer chip in the skull communicates with the system to record data and to help regulate responsive stimulation.
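The “responsive” behavior described above is, at bottom, a detect-then-stimulate loop. The Python sketch below is a conceptual illustration only: the band-power detector and threshold are stand-ins, not the RNS System’s actual (proprietary) detection algorithms.

```python
import numpy as np

def responsive_stimulation_step(ecog_window: np.ndarray,
                                power_threshold: float) -> bool:
    """One cycle of a conceptual detect-then-stimulate loop.

    Returns True when stimulation would be triggered for this window of
    recorded brain activity. The detector here is a crude stand-in:
    mean squared amplitude compared against a fixed threshold.
    """
    band_power = float(np.mean(ecog_window ** 2))
    return band_power > power_threshold
```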
The unique electrode placement planning system developed at Rush uses computer-intensive mapping to guide surgical placement of electrodes at the precise location in the brain’s temporal lobe circuitry. When stimulated, these extensive epileptic circuits are calmed. The model predicts where in the brain the activity begins and spreads, so that the device can better influence the maximal extent of the epileptic pathway.
The device also acts as an implanted EEG for recording brain activity. This function was first shown at Rush to help determine whether the patient will further benefit from a surgical resection, in which surgeons remove a portion of the temporal lobe network. Dr. Richard Byrne, chairman of Neurosurgery at Rush, implants the electrodes in the temporal lobes.
As a result, physicians at Rush can offer patients the new implantable neurostimulator device, a surgical resection or both with the possibility of completely eliminating seizures. “This device is also being used at Rush as a foundation and inspiration for building cutting-edge hybrid stimulation therapy-drug molecule delivery systems,” said Rossi.
“Devices that treat epilepsy may offer new hope to patients when medication is ineffective and resection is not an option,” said Rossi. “Not long ago, it was highly unlikely that these patients would ever be free of their seizures. Now, several of our Rush patients with this device are actually able to drive, lower or even eliminate their medications and aren’t as limited as they once were. There is no doubt that quality of life of the majority of our implanted patients is significantly improved.”
According to the Centers for Disease Control and Prevention, in 2010, epilepsy affected approximately 2.3 million adults in the U.S. and 467,711 children under the age of 17.
Researchers at the University of Toronto say a sleep disorder that causes people to act out their dreams is the best current predictor of brain diseases like Parkinson’s and many other forms of dementia.

"Rapid-eye-movement sleep behaviour disorder (RBD) is not just a precursor but also a critical warning sign of neurodegeneration that can lead to brain disease," says associate professor and lead author Dr. John Peever. In fact, as many as 80 to 90 per cent of people with RBD will develop a brain disease."
As the name suggests, the disturbance occurs during the rapid-eye-movement (REM) stage of sleep and causes people to act out their dreams, often resulting in injury to themselves and/or their bed partner. In healthy brains, the muscles are temporarily paralyzed during REM sleep to prevent this from happening.
"It’s important for clinicians to recognize RBD as a potential indication of brain disease in order to diagnose patients at an earlier stage," says Peever. "This is important because drugs that reduce neurodegeneration could be used in RBD patients to prevent (or protect) them from developing more severe degenerative disorders."
His research examines the idea that neurodegeneration might first affect the areas of the brain that control sleep before attacking the areas involved in more common brain diseases like Alzheimer’s.
Peever says he hopes the results of his study lead to earlier and more effective treatment of neurodegenerative diseases.