A new blood biomarker correctly predicted which concussion victims went on to have white matter tract structural damage and persistent cognitive dysfunction following a mild traumatic brain injury (mTBI). Researchers in the Perelman School of Medicine at the University of Pennsylvania, in conjunction with colleagues at Baylor College of Medicine, found that the blood levels of a protein called calpain-cleaved αII-spectrin N-terminal fragment (SNTF) were twice as high in a subset of patients following a traumatic injury. If validated in larger studies, this blood test could identify concussion patients at increased risk for persistent cognitive dysfunction or further brain damage and disability if returning to sports or military activities.

More than 1.5 million children and adults suffer concussions each year in the United States, and hundreds of thousands of military personnel endure these mild traumatic brain injuries worldwide. Current tests cannot determine the extent of the injury or whether the injured person will be among the 15-30 percent who experience significant, persistent cognitive deficits in processing speed, working memory, and the ability to switch between or balance multiple thoughts.
"New tests that are fast, simple, and reliable are badly needed to predict who may experience long-term effects from concussions, and as new treatments are developed in the future, to identify who should be eligible for clinical trials or early interventions," said lead author Robert Siman, PhD, research professor of Neurosurgery at Penn. "Measuring the blood levels of SNTF on the day of a brain injury may help to identify the subset of concussed patients who are at risk of persistent disability."
In a study published yesterday in Frontiers in Neurology, Penn and Baylor researchers evaluated blood samples and diffusion tensor images from a subgroup of 38 participants, ages 15 to 25, in a larger study of mTBI. Of these, 17 had sustained a head injury caused by blunt trauma or acceleration/deceleration forces, 13 had an orthopaedic injury, and eight were healthy, uninjured, demographically matched controls.
On neuropsychological and cognitive tests administered over the course of three months, results within the mTBI group varied considerably: some patients performed as well as the healthy controls throughout, others showed initial impairment that resolved by three months, and a third group had cognitive dysfunction persisting through three months. The nine patients with abnormally high levels of SNTF (seven mTBI and two orthopaedic patients) also had significant white matter damage apparent on radiological imaging.
"The blood test identified SNTF in some of the orthopaedic injury patients as well, suggesting that these injuries could also lead to abnormalities in the brain, such as a concussion, that may have been overlooked with existing tests," said Douglas Smith, MD, director of the Penn Center for Brain Injury and Repair and professor of Neurosurgery. "SNTF as a marker is consistent with our earlier research showing that calcium is dumped into neurons following a traumatic brain injury, as SNTF is a marker for neurodegeneration driven by calcium overload."
The blood test given on the day of the mild traumatic brain injury showed 100 percent sensitivity in predicting which concussions would lead to persistent cognitive problems, and 75 percent specificity in correctly ruling out those without functionally harmful concussions. If validated in larger studies, a blood test measuring levels of SNTF could help diagnose concussion and predict the risk of its long-term consequences. The Penn and Baylor researchers hope to confirm the robustness of these findings in a second, larger study, and to determine the best time after concussion to measure SNTF in the blood in order to predict persistent brain dysfunction. The team also wants to evaluate the blood test for identifying when repetitive concussions begin to cause brain damage and persistent disability.
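The reported accuracy figures follow directly from the standard definitions of sensitivity and specificity. Here is a minimal Python sketch; the counts are hypothetical, chosen only to reproduce the 100 percent and 75 percent figures (the study's raw confusion matrix is not given in this article):

```python
def sensitivity(tp, fn):
    """Fraction of truly affected patients the test flags: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of unaffected patients the test correctly clears: TN / (TN + FP)."""
    return tn / (tn + fp)

# Hypothetical counts for illustration only (not the study's data):
# catching all 6 affected patients (no false negatives) gives 100% sensitivity;
# correctly clearing 9 of 12 unaffected patients gives 75% specificity.
print(sensitivity(tp=6, fn=0))   # 1.0
print(specificity(tn=9, fp=3))   # 0.75
```

Note that perfect sensitivity with imperfect specificity means the test missed no at-risk patients but flagged some who went on to recover normally.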
Are monkeys, like humans, able to ascertain where objects are located without much more than a sideways glance? Quite likely, says Lau Andersen of Aarhus University in Denmark, lead author of a study conducted at the Yerkes National Primate Research Center of Emory University and published in Springer’s journal Animal Cognition. The study finds that monkeys are able to localize stimuli they do not perceive.
Humans are able to locate, and even side-step, objects in their peripheral vision, sometimes before they even perceive the objects as being present. Andersen and colleagues therefore wanted to find out whether visually guided action and visual perception also occur independently in other primates.
The researchers trained five adult male rhesus monkeys (Macaca mulatta) to perform a short-latency, highly stereotyped localization task. Using a touchscreen computer, the animals learned to touch one of four locations where an object was briefly presented. The monkeys also learned to perform a detection task using identical stimuli, in which they had to report the presence or absence of an object by pressing one of two buttons. These techniques are similar to those used to test normal humans, and therefore make an especially direct comparison between humans and monkeys possible. A method called “visual masking” was used to systematically reduce how easily a visual target was processed.
Andersen and his colleagues found that the monkeys were still able to locate targets that they could not detect. The animals performed the tasks very accurately when the stimuli were unmasked, and their performance dropped when visual masking was employed. But monkeys could still locate targets at masking levels for which they reported that no target had been presented. While these results cannot establish the existence of phenomenal vision in monkeys, the discrepancy between visually guided action and detection parallels the dissociation of conscious and unconscious vision seen in humans.
“Knowing whether similar independent brain systems are present in humans and nonverbal species is critical to our understanding of comparative psychology and the evolution of brains,” explains Andersen.
Patients with the most common form of focal epilepsy have widespread, abnormal connections in their brains that could provide clues toward diagnosis and treatment, according to a new study published online in the journal Radiology.

(Image: MP-RAGE volumes are segmented into 83 ROIs, which are further parcellated into 1000 cortical and 15 subcortical ROIs. Whole-brain white matter tractography is performed after voxelwise tensor calculation, and the density of fibers that connect each pair of cortical ROIs is used to calculate structural connectivity. T1w = T1-weighted. Credit: Courtesy of Radiology and RSNA)
Temporal lobe epilepsy is characterized by seizures emanating from the temporal lobes, which sit on each side of the brain just above the ear. Previously, experts believed that the condition was related to isolated injuries of structures within the temporal lobe, like the hippocampus. But recent research has implicated the default mode network (DMN), the set of brain regions activated during task-free introspection and deactivated during goal-directed behavior. The DMN consists of several hubs that are more active during the resting state.
To learn more, researchers performed diffusion tensor imaging, a type of MRI that tracks the movement, or diffusion, of water in the brain’s white matter, the nerve fibers that transmit signals throughout the brain. The study group consisted of 24 patients with left temporal lobe epilepsy who were slated for surgery to remove the site from where their seizures emanated. The researchers compared them with 24 healthy controls using an MRI protocol dedicated to finding white matter tracts with diffusion imaging at high resolution. The data was analyzed with a new technique that identifies and quantifies structural connections in the brain.
Patients with left temporal lobe epilepsy exhibited a decrease in long-range connectivity of 22 percent to 45 percent among areas of the DMN when compared with the healthy controls.
"Using diffusion MRI, we found alterations in the structural connectivity beyond the medial temporal lobe, especially in the default mode network," said Steven M. Stufflebeam, M.D., from the Athinoula A. Martinos Center for Biomedical Imaging at Massachusetts General Hospital in Boston.
In addition to reduced long-range connectivity, the epileptic patients had an 85 percent to 270 percent increase in local connectivity within and beyond the DMN. The researchers believe this may be an adaptation to the loss of the long-range connections.
"The increase in local connections could represent a maladaptive mechanism by which overall neural connectivity is maintained despite the loss of connections through important hub areas," Dr. Stufflebeam said.
The results are supported by prior functional MRI studies that have shown decreased functional connectivity in DMN areas in temporal lobe epilepsy. Researchers are not certain if the structural changes cause the functional changes, or vice versa.
"It’s probably a breakdown of myelin, which is the insulation of neurons, causing a slowdown in the propagation of information, but we don’t know for sure," Dr. Stufflebeam said.
Dr. Stufflebeam and colleagues plan to continue their research, using structural and functional MRI with electroencephalography and magnetoencephalography to track diffusion changes and look at real-time brain activity.
"Our long-term goal is to see if we can predict from diffusion studies who will respond to surgery and who will not," he said.
People with autism are more likely to also have synaesthesia, suggests new research in the journal Molecular Autism.

Synaesthesia involves people experiencing a ‘mixing of the senses’, for example, seeing colours when they hear sounds, or reporting that musical notes evoke different tastes. Autism is diagnosed when a person struggles with social relationships and communication, and shows unusually narrow interests and resistance to change. The team of scientists from Cambridge University found that whereas synaesthesia only occurred in 7.2% of typical individuals, it occurred in 18.9% of people with autism.
On the face of it, this is an unlikely result, as autism and synaesthesia seem as if they should not share anything. But at the level of the brain, synaesthesia involves atypical connections between brain areas that are not usually wired together (so that a sensation in one channel automatically triggers a perception in another). Autism has also been postulated to involve over-connectivity of neurons (so that the person over-focuses on small details but struggles to keep track of the big picture).
The scientists tested – and confirmed – the prediction that if both autism and synaesthesia involve neural over-connectivity, then synaesthesia might be disproportionately common in autism.
The team, led by Professor Simon Baron-Cohen at the Autism Research Centre at Cambridge University, tested 164 adults with an autism spectrum condition and 97 adults without autism. All volunteers were screened for synaesthesia. Among the 31 people with autism who also had synaesthesia, the most common forms of the latter were ‘grapheme-colour’ (18 of them reported black and white letters being seen as coloured) and ‘sound-colour’ (21 of them reported a sound triggering a visual experience of colour). Another 18 of them reported either tastes, pains, or smells triggering a visual experience of colour.
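The prevalence figures line up with the reported counts: 31 of 164 adults with autism gives the 18.9% quoted above, and, assuming the control figure corresponds to 7 of the 97 adults without autism, 7/97 gives the 7.2%. A quick Python check, with an odds ratio added for illustration:

```python
aut_syn, aut_n = 31, 164   # synaesthetes among adults with autism (reported)
ctl_syn, ctl_n = 7, 97     # assumed count behind the 7.2% control figure

prev_aut = aut_syn / aut_n             # ~0.189 -> the 18.9% in the text
prev_ctl = ctl_syn / ctl_n             # ~0.072 -> the 7.2% in the text

# Odds ratio: odds of synaesthesia given autism vs. without autism
odds_ratio = (aut_syn / (aut_n - aut_syn)) / (ctl_syn / (ctl_n - ctl_syn))

print(f"{prev_aut:.1%} vs {prev_ctl:.1%}, odds ratio ~ {odds_ratio:.1f}")
```

The odds ratio of roughly 3 is just another way of expressing the near-threefold difference in prevalence between the two groups.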
Professor Baron-Cohen said: “I have studied both autism and synaesthesia for over 25 years and I had assumed that one had nothing to do with the other. These findings will re-focus research to examine common factors that drive brain development in these traditionally very separate conditions. An example is the mechanism ‘apoptosis,’ the natural pruning that occurs in early development, where we are programmed to lose many of our infant neural connections. In both autism and synaesthesia apoptosis may not occur at the same rate, so that these connections are retained beyond infancy.”
Professor Simon Fisher, a member of the team, and Director of the Language and Genetics Department at Nijmegen’s Max Planck Institute, added: “Genes play a substantial role in autism and scientists have begun to pinpoint some of the individual genes involved. Synaesthesia is also thought to be strongly genetic, but the specific genes underlying this are still unknown. This new research gives us an exciting new lead, encouraging us to search for genes which are shared between these two conditions, and which might play a role in how the brain forms or loses neural connections.”
Donielle Johnson, who carried out the study as part of her Master’s degree in Cambridge, said: “People with autism report high levels of sensory hyper-sensitivity. This new study goes one step further in identifying synaesthesia as a sensory issue that has been overlooked in this population. This has major implications for educators and clinicians designing autism-friendly learning environments.”
Researchers from the University of Missouri School of Medicine have found that a new protocol that uses preventive blood-thinning medication in the treatment of patients with traumatic brain injuries reduces the risk of patients developing life-threatening blood clots without increasing the risk of bleeding inside the brain.
According to the Centers for Disease Control and Prevention, at least 1.7 million traumatic brain injuries occur each year. One of the most common complications of traumatic brain injury is the formation of dangerous blood clots elsewhere in the circulatory system; these clots can break loose and travel to the lungs or other areas, causing life-threatening complications.
"Our study found that treating traumatic brain-injured patients with an anticoagulant, or blood-thinning medication, is safe and decreases the risk of these dangerous clots," said N. Scott Litofsky, MD, chief of the MU School of Medicine’s Division of Neurological Surgery and director of neuro-oncology and radiosurgery at MU Health Care. "We found that patients treated with preventive blood thinners had a decreased risk of deep-vein blood clots and no increased risk of intracranial hemorrhaging."
In May 2009, Litofsky, along with study co-author Stephen Barnes, MD, acute care surgeon and chief of the MU Division of Acute Care Surgery, created a new protocol for treating head trauma patients in University Hospital’s Frank L. Mitchell Jr., M.D., Trauma Center using blood-thinning medications.
"One of the main challenges in treating patients with traumatic brain injuries is balancing the risk of intracranial bleeding with the risk of blood clots formed elsewhere in the body," Litofsky said.
In the study, the researchers compared the outcomes of 107 patients with traumatic brain injuries who were treated before the new protocol was put into place with the outcomes of 129 patients who were treated with the blood-thinning medication. Among the patients who did not receive blood thinners, six experienced deep-venous clotting, compared with zero instances of the condition in patients who received the medication. Among the patients who did not receive blood thinners, three patients experienced increased bleeding in the brain, compared with one patient who received the medication.
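Expressed as rates rather than raw counts, the comparison above looks like this; a small Python sketch using only the numbers reported in the article:

```python
# Outcome counts from the study
pre_n, post_n = 107, 129          # patients before vs. after the blood-thinner protocol
dvt_pre, dvt_post = 6, 0          # deep-venous clots
bleed_pre, bleed_post = 3, 1      # patients with increased intracranial bleeding

dvt_rate_pre = dvt_pre / pre_n        # ~5.6% without blood thinners
dvt_rate_post = dvt_post / post_n     # 0% with blood thinners
bleed_rate_pre = bleed_pre / pre_n    # ~2.8%
bleed_rate_post = bleed_post / post_n # ~0.8%

print(f"DVT: {dvt_rate_pre:.1%} -> {dvt_rate_post:.1%}")
print(f"Bleeding: {bleed_rate_pre:.1%} -> {bleed_rate_post:.1%}")
```

Both complication rates moved in the protocol's favor, though with events this rare, the larger multi-center comparisons the authors mention would be needed to establish statistical significance.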
"Based on our results, we will continue to follow the new protocol in our trauma center, and we believe that other trauma centers would benefit from adopting a similar protocol in their practice," Litofsky said. "If we look at this issue across the country, we should hopefully see this complication occurring less often in brain-injured patients."
The study, “Safety and Efficacy of Early Thromboembolism Chemoprophylaxis After Intracranial Hemorrhage from Traumatic Brain Injury,” was published online Sept. 20 by the Journal of Neurosurgery, the journal for the American Association of Neurological Surgeons.
Researchers from TAU demonstrate that hyperbaric oxygen therapy significantly revives brain function and quality of life

Every year, nearly two million people in the United States suffer traumatic brain injury (TBI), the leading cause of brain damage and permanent disabilities that include motor dysfunction, psychological disorders, and memory loss. Current rehabilitation programs help patients but often achieve limited success.
Now Dr. Shai Efrati and Prof. Eshel Ben-Jacob of Tel Aviv University’s Sagol School of Neuroscience have proven that it is possible to repair brains and improve the quality of life for TBI victims, even years after the occurrence of the injury.
In an article published in PLoS ONE, Dr. Efrati, Prof. Ben-Jacob, and their collaborators present evidence that hyperbaric oxygen therapy (HBOT) can repair chronically impaired brain functions and significantly improve the quality of life of mild TBI patients. The new findings challenge the often-dismissive stance of the US Food and Drug Administration, the Centers for Disease Control and Prevention, and the medical community at large, and offer new hope where there was none.
The research trial
The trial included 56 participants who had suffered mild traumatic brain injury one to five years earlier and were still bothered by headaches, difficulty concentrating, irritability, and other cognitive impairments. The patients’ symptoms were no longer improving prior to the trial.
The participants were randomly divided into two groups. One received two months of HBOT treatment, while the other, the control group, was not treated at all; the control group then received two months of treatment following the first control period. The treatments, administered at the Institute of Hyperbaric Medicine at Assaf Harofeh Medical Center, headed by Dr. Efrati, consisted of 40 one-hour sessions, five times a week over two months, in a high-pressure chamber. Patients breathed 100% oxygen at a pressure of 1.5 atmospheres, the pressure experienced when diving to a depth of 5 meters. The patients’ brain functions and quality of life were then assessed by computerized evaluations and compared with single photon emission computed tomography (SPECT) scans.
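The chamber pressure quoted above can be sanity-checked with the diver's common rule of thumb that every 10 meters of water depth adds about one atmosphere of pressure:

```python
def pressure_at_depth(depth_m, surface_atm=1.0):
    """Approximate absolute pressure in atmospheres at a given water depth,
    using the rule of thumb that each 10 m of depth adds about 1 atm."""
    return surface_atm + depth_m / 10.0

print(pressure_at_depth(5))   # 1.5 -- matches the chamber pressure used in the trial
```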
Persuasive confirmation
In both groups, the hyperbaric oxygen therapy sessions led to significant improvements in tests of cognitive function and quality of life. No significant improvements occurred by the end of the period of non-treatment in the control group. Analysis of brain imaging showed significantly increased neuronal activity after a two-month period of HBOT treatment compared to the control periods of non-treatment.
"What makes the results even more persuasive is the remarkable agreement between the cognitive function restoration and the changes in brain functionality as detected by the SPECT scans," explained Prof. Ben-Jacob. "The results demonstrate that neuroplasticity can be activated for months and years after acute brain injury."
"But most important, patients experienced improvements such as memory restoration and renewed use of language," Dr. Efrati said. "These changes can make a world of difference in daily life, helping patients regain their independence, go to work, and integrate back into society."
The regeneration process following brain injury involves complex processes, such as building new blood vessels and rebuilding connections between neurons, and requires much energy.
"This is where HBOT treatment can help," said Dr. Efrati. "The elevated oxygen levels during treatment supply the necessary energy for facilitating the healing process."
The findings offer new hope for millions of traumatic brain injury patients, including thousands of veterans wounded in action in Iraq and Afghanistan. The researchers call for additional larger-scale, multi-center clinical studies to further confirm the findings and determine the most effective and personalized treatment protocols. But since hyperbaric oxygen therapy is, they say, the only treatment proven to heal TBI patients, the researchers argue that the medical community and the US Armed Forces should permit the victims of TBI to benefit from the new hope right now, rather than waiting until additional studies are completed.
A study out today in the journal Nature Medicine suggests a potential new treatment for the seizures that often plague children with genetic metabolic disorders and individuals undergoing liver failure. The discovery hinges on a new understanding of the complex molecular chain reaction that occurs when the brain is exposed to too much ammonia.

The study shows that elevated levels of ammonia in the blood overwhelm the brain’s defenses, ultimately causing nerve cells to become overexcited. The researchers have also discovered that bumetanide – a diuretic drug used to treat high blood pressure – can restore normal electrical activity in the brains of mice with the condition and prevent seizures.
“Ammonia is a ubiquitous waste product of regular protein metabolism, but it can accumulate in toxic levels in individuals with metabolic disorders,” said Maiken Nedergaard, M.D., D.M.Sc., co-director of the University of Rochester Medical Center (URMC) Center for Translational Neuromedicine and lead author of the article. “It appears that the key to preventing the debilitating neurological effects of ammonia toxicity is to correct a molecular malfunction which causes nerve cells in the brain to become chemically unbalanced.”
In healthy people, ammonia is processed in the liver, converted to urea, and expelled from the body in urine. Because it is a gas, ammonia can slip through the blood-brain barrier and make its way into brain tissue. Under normal circumstances, the brain’s housekeeping cells – called astrocytes – sweep up this unwanted ammonia and convert it into a compound called glutamine, which can be more easily expelled from the brain.
However, individuals with certain genetic metabolic disorders and people with impaired liver function because of chronic hepatitis, alcoholism, acetaminophen overdose, and other toxic liver conditions cannot remove ammonia from their bodies quickly enough. The result is a larger than normal concentration of ammonia in the blood, a condition called hyperammonemia.
When too much ammonia makes its way into the central nervous system, it can lead to tremors, seizures and, in extreme cases, can cause comas and even lead to death. In children with metabolic disorders the frequent seizures can lead to long-term neurological impairment.
While ammonia has long been assumed to be the culprit behind the neurological problems associated with inherited metabolic disorders and liver failure, the precise mechanisms by which it triggers seizures and comas have not been fully understood. The new study reveals that ammonia causes a chain of events that alters the chemistry and electrical activity of the brain’s nerve cells, causing them to fire in uncontrolled bursts.
One of the keys to unraveling the effects of ammonia on the brain has been new imaging technologies, such as two-photon microscopy, which allow researchers to watch this phenomenon in real time in the living brains of mice. As suspected, they observed that when high levels of ammonia enter the brain, astrocytes quickly become overwhelmed and cannot remove it fast enough.
The abundant ammonia in the brain mimics the function of potassium, an important player in neurotransmission, and tricks neurons into becoming depolarized. This makes it more likely that electrical activity in the brain will exceed the threshold necessary to trigger seizures.
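Why a potassium-mimicking ion depolarizes neurons can be illustrated with the Nernst equation, which gives an ion's equilibrium potential from its concentrations on either side of the membrane. The sketch below uses textbook potassium concentrations, not measurements from this study:

```python
import math

def nernst_mV(out_mM, in_mM, z=1, temp_c=37.0):
    """Nernst equilibrium potential in millivolts for an ion of charge z."""
    R, F = 8.314, 96485.0        # gas constant (J/mol/K), Faraday constant (C/mol)
    T = temp_c + 273.15          # body temperature in kelvin
    return 1000 * (R * T / (z * F)) * math.log(out_mM / in_mM)

# Typical K+ concentrations: ~4 mM outside, ~140 mM inside a neuron.
print(round(nernst_mV(4, 140)))    # about -95 mV, the normal K+ equilibrium potential
# If an ammonium-like ion effectively raises the "potassium" the membrane sees
# outside, the equilibrium potential shifts toward zero, i.e. the cell depolarizes:
print(round(nernst_mV(10, 140)))   # about -71 mV
```

The shift toward zero is the depolarization described above, which brings neurons closer to their firing threshold.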
Furthermore, the researchers observed that one of the neuron’s key molecular gatekeepers – a transporter known as NKCC1 – was also fooled into thinking that the ammonia was potassium. As a result, it went into overdrive, loading neurons with too much chloride. This in turn prevented the cells from stabilizing themselves after spikes in activity, keeping them at a heightened level of electrical “excitability.”
The team found that the drug bumetanide, a known NKCC1 inhibitor, blocked this process and prevented the cells from overloading with chloride. By knocking down this “secondary” cellular effect of ammonia, the researchers were able to control the seizures in the mice and prolong their survival.
“The neurologic impact of hyperammonemia is a tremendous clinical problem without an effective medical solution,” said Nedergaard. “The fact that bumetanide is already approved for use gives us a tremendous head start in terms of developing a potential treatment for this condition. This study provides a framework to further explore the therapeutic potential of this and other NKCC1 inhibitors.”
Pregnant women may pass on the effects of stress to their fetuses by way of bacterial changes in the vagina, suggests a study in mice. This may affect how well their babies’ brains are equipped to deal with stress in adulthood.

The bacteria in our body outnumber our own cells by about 10 to 1, with most of them found in our gut. Over the last few years, it has become clear that the bacterial ecosystem in our body – our microbiome – is essential for developing and maintaining a healthy immune system.
Our gut bugs also help to prevent germs from invading our bodies, and help to absorb nutrients from food.
A baby gets its first major dose of bacteria in life as it passes through its mother’s birth canal. En route, the baby ingests the mother’s vaginal microbes, which begin to colonise the newborn’s gut.
Chris Howerton, then at the University of Pennsylvania in Philadelphia, and his colleagues wanted to know if this initial population of bacteria is important in shaping a baby’s neurological development, and whether that population is influenced by stress during pregnancy.
Stressful pregnancy
The first step was to figure out what features of the mother’s vaginal microbiome might be altered by stress, and then see if any of those changes were transmitted to the offspring’s gut.
To do this, the team exposed 10 pregnant mice to a different psychologically stressful experience each day, such as fox odour, cages kept lit at night, or temporary restraint, for what would be the equivalent of the first trimester of their pregnancy. Another 10 pregnant mice were housed normally during the same period.
The team took samples of their vaginal bacteria throughout the pregnancy and again just after the mice had given birth. These samples were genetically sequenced to see what types of bacteria were present.
The microbiomes of the stressed mice were remarkably different from those of the unstressed mice after each had given birth. More types of bacteria were present, and the proportion of one common gut bacterium, Lactobacillus, was significantly reduced.
Like mother, like pup
To see whether these changes had been passed on to the pups, the pups’ nascent gut bacteria were sampled from their colons a few days after birth and sequenced. Sure enough, the same bacterial patterns were seen in the pups of stressed mothers.
By analysing tissue from the pups’ hypothalamus – a brain area involved in hormone control, behaviour and sleep, among other things – the team was able to infer which genes were affected by the stress-induced changes in each mother’s microbiome.
They found that the expression of 20 genes was affected by the decrease in Lactobacillus, including genes related to the production of new neurons and the growth of synaptic connections in the brain.
These genetic outcomes in the brain are probably a result of a different suite of nutrients and metabolites circulating in the “stressed” pups’ blood, thanks to the altered gut flora they inherited. Indeed, when the team analysed the blood of the pups of stressed mothers, they found fewer of the molecules necessary for the formation of essential neurotransmitters – the chemicals that carry signals between nerve cells. Furthermore, there were lower levels of a molecule thought to protect the brain from harmful oxidative stress.
"These changes are significant and are likely to be important for determining how the brain initially develops and how it will respond in the future to things like stress or changes in the environment," says Tracy Bale, Howerton’s supervisor during the research and director of the University of Pennsylvania lab.
As well as changing the nutrients available, the microbiome could also affect the brain via the immune system or by innervating the nerves in the gut that connect to it. “These three mechanisms aren’t mutually exclusive. It’s likely that they all play a role,” says Howerton.
Human angle
If the same effects are seen in humans, there may be a straightforward solution. “We can easily manipulate the bacteria we have inside of us,” says Howerton. For example, if a certain cocktail of bacteria is found to be beneficial to the newborns of stressed mothers, we could give it to them right after birth, he suggests. This approach could also benefit babies born via C-section, who do not pass through their mother’s birth canal, or those born to mothers whose gut bacteria has been disrupted as a result of antibiotic use during pregnancy.
Bale is now investigating the link between bacteria and brain development in pregnant women who have been through several traumatic experiences to analyse the effects on their babies’ gut bacteria. She also intends to follow their children’s behaviour as they grow up.
Resource rationale
"This is a remarkable trans-disciplinary study in how it bridged multiple organ systems to illuminate a complex question," says Catherine Hagan from the University of Missouri in Columbia. She says that more work needs to be done to show a causal link. "Mice are not tiny people – people are not big mice – more data is needed to understand how stress in mothers affects brain development in children," she says. "That said, mice and people have enough in common that this study provides a rationale for allocating resources to address such a concern."
"At the end of the day, most of what makes you ‘you’, and what drives your quality of life, comes down to the brain," says Bale. "It’s this very important, vulnerable tissue that is susceptible to many perturbations. If the microbiome is proven to be one of these driving forces, then it’s essential we know just how factors in our environment can change it and can reprogram the brain."
A drug that mimics some effects of alcohol but lacks its harmful properties would have real benefit for public health, a leading scientist has argued.

Professor David Nutt, the Edmond J. Safra Professor of Neuropsychopharmacology at Imperial College London, has identified candidate molecules that reproduce the pleasurable effects of alcohol but are much less toxic. He is looking for investors to help develop the product and bring it to the market.
Alcohol mimics the effects of GABA, a chemical messenger produced in the brain, but it also acts on receptors for other brain chemicals. The alcohol substitute would be designed to target GABA receptors very selectively, avoiding undesirable side effects such as hangovers and loss of coordination. An antidote could also be made to block the receptor, allowing drinkers to sober up quickly.
Professor Nutt told the Today programme on BBC Radio 4 that he first tested such a compound many years ago, but even better substitutes could be developed.
“There’s no question that you can produce a whole range of effects like alcohol by manipulating this system in the brain,” he said. “In some experiments, the effect is indistinguishable from alcohol.
“What we want to do is get rid of any of the unwanted effects of inebriation, like aggression and memory impairment, and we just want to keep the pleasure and the sense of relaxation.
“We think by clever molecular modelling we can get rid of the risk of addiction as well.”
Professor Nutt hopes to make a range of cocktails containing his synthetic alcohol substitute. He has spoken to investors about taking the product to market, but many are wary that the drug might be controlled by legislation.
“I would like the government to make a recommendation that we try to improve on the health of our people by allowing these kind of substitute alcohols to be legal.”
Alcohol is responsible for 2.5 million deaths worldwide each year. Making safer alternatives available could reduce the harms significantly, Professor Nutt argued.
“I think this would be a serious revolution in health benefits, just as the e-cigarette is going to revolutionise the smoking of tobacco. I find it weird that we haven’t been talking about this before because it’s such an obvious target for health improvement.”
Nicotine withdrawal might take over your body, but it doesn’t take over your brain. The symptoms of nicotine withdrawal are driven by a very specific group of neurons within a very specific brain region, according to a report in Current Biology, a Cell Press publication, on November 14. Although caution is warranted, the researchers say, the findings in mice suggest that therapies directed at this group of neurons might one day help people quit smoking.

"We were surprised to find that one population of neurons within a single brain region could actually control physical nicotine withdrawal behaviors," says Andrew Tapper of the Brudnick Neuropsychiatric Research Institute at the University of Massachusetts Medical School.
Tapper and his colleagues first made mice dependent on nicotine by delivering the drug in their drinking water for a period of six weeks. Then they took the nicotine away. The mice started scratching and shaking in the way a dog does when it is wet. Close examination of the animals’ brains revealed abnormally increased activity in neurons within a single region known as the interpeduncular nucleus.
When the researchers artificially activated those neurons with light, animals showed behaviors that looked like nicotine withdrawal, whether they had been exposed to the drug or not. The reverse was also true: treatments that lowered activity in those neurons alleviated nicotine withdrawal symptoms.
That the interpeduncular nucleus might play such a role in withdrawal from nicotine makes sense because the region receives connections from other areas of the brain involved in nicotine use and response, as well as feelings of anxiety. The interpeduncular nucleus is also densely packed with nicotinic acetylcholine receptors that are the molecular targets of nicotine.
It is much less clear whether the findings related to nicotine will be relevant to other forms of addiction, but there are some hints that they may.
"Smoking is highly prevalent in people with other substance-use disorders, suggesting a potential interaction between nicotine and other drugs of abuse," Tapper says. "In addition, naturally occurring mutations in genes encoding the nicotinic receptor subunits that are found in the interpeduncular nucleus have been associated with drug and alcohol dependence."
A protein shed by HIV-infected brain cells alters synaptic connections between networks of nerve cells, according to new research out of the University of Minnesota. The findings could explain why nearly half of all patients infected with the AIDS virus experience some level of neurocognitive impairment.
The research was published in the current volume of the Journal of Neuroscience.
“The synaptic changes didn’t appear to be a symptom of nerve death,” said Nicholas Hargus, Ph.D., lead author on the paper and a post-doctoral fellow in the Department of Pharmacology in the University of Minnesota Medical School. “Instead, the changes appeared to be a protective response resulting from the over-excitation of the network by the HIV protein transactivator of transcription (Tat). Essentially, the neuroprotective mechanism has gone awry.”
HIV-associated neurocognitive disorders (HAND) are an indirect result of HIV, as the disease itself does not infect neurons. Tat has been shown to contribute heavily to the development of HAND in patients. Hargus and Stanley Thayer, Ph.D., professor in the Department of Pharmacology, wanted to learn more about the relationship between Tat and HAND to better understand how to treat the disorders.
Researchers replicated the impact of Tat in a rat model and tracked the changes to the synaptic proteins. They found that changes in both inhibitory and excitatory synapses were initiated by specific Tat binding activity, indicating a pharmacological change due to Tat exposure.
“We found drugs altering synaptic transmission between nerve cells reversed the synaptic changes induced by Tat,” said Thayer. “In the future, this could provide a target for the development of drugs to act upon and improve cognitive function in patients.”
Ongoing experiments are investigating the relationship between drug-induced changes in synaptic connections and the changes in cognitive function. In the future, high throughput approaches to assess synaptic function will be developed for evaluating drug candidates.
A research team at Worcester Polytechnic Institute (WPI) and The Rockefeller University in New York has developed a novel system to image brain activity in multiple awake and unconstrained worms. The technology, which makes it possible to study the genetics and neural circuitry associated with animal behavior, can also be used as a high-throughput screening tool for drug development targeting autism, anxiety, depression, schizophrenia, and other brain disorders.

Image: Neurons in the worms (marked by arrows) glow as the animals sense attractive odors.
The team details their technology and early results in the paper “High-throughput imaging of neuronal activity in Caenorhabditis elegans,” published online ahead of print by the journal Proceedings of the National Academy of Sciences.
"One of our major objectives is to understand the neural signals that direct behavior—how sensory information is processed through a network of neurons leading to specific decisions and responses," said Dirk Albrecht, PhD, assistant professor of biomedical engineering at WPI and senior author of the paper. Albrecht led the research team both at WPI and at Rockefeller, where he served previously as a postdoctoral researcher in the lab of Cori Bargmann, PhD, a Howard Hughes Medical Institute Investigator and a co-author of the new paper.
To study neuronal activity, Albrecht’s lab uses the tiny worm Caenorhabditis elegans (C. elegans), a nematode found in many environments around the world. A typical adult C. elegans is just 1 millimeter long and has 959 cells, of which 302 are neurons. Despite its small size, the worm is a complex organism able to do all of the things animals must do to survive. It can move, eat, mate, and process environmental cues that help it forage for food or react to threats. As a bonus for researchers, C. elegans is transparent. By using various imaging technologies, including optical microscopes, one can literally see into the worm and watch physiological processes in real time.
Numerous studies have been done by “worm labs” around the world exploring various neurological processes in C. elegans. These have typically been done using one worm at a time, with the animal’s body fixed in place on a slide. In their new paper, Albrecht’s team details how they imaged, recorded, and analyzed specific neurons in multiple animals as they wormed their way around a custom-designed microfluidic array, called an arena, where they were exposed to favorable or hostile sensory cues.
Specifically, the team engineered a strain of worms with neurons near the head that would glow when they sensed food odors. In experiments involving up to 23 worms at a time, Albrecht’s team infused pulses of attractive or repulsive odors into the arena and watched how the worms reacted. In general, the worms moved towards the positive odors and away from the negative odors, but the behaviors did not always follow this pattern. “We were able to show that the sensory neurons responded to the odors similarly in all the animals, but their behavioral responses differed significantly,” Albrecht said. “These animals are genetically identical, and they were raised together in the same environment, so where do their different choices come from?”
In addition to watching the head neurons light up as they picked up odor cues, the new system can trace signaling through “interneurons.” These are pathways that connect external sensors to the rest of the network (the “worm brain”) and send signals to muscle cells that adjust the worm’s movement based on the cues. Numerous brain disorders in people are believed to arise when neural networks malfunction. In some cases the malfunction is dramatic overreaction to a routine stimulus, while in others it is a lack of appropriate reactions to important cues. Since C. elegans and humans share many of the same genes, discovering genetic causes for differing neuronal responses in worms could be applicable to human physiology. Experimental compounds designed to modulate the action of nerve cells and neuronal networks could be tested first on worms using Albrecht’s new system. The compounds would be infused in the worm arena, along with other stimuli, and the reaction of the worms’ nervous systems could be imaged and analyzed.
"The basis of our work is to combine biomedical engineering and neuroscience to answer some of these fundamental questions and hopefully gain insight that would be beneficial for understanding and eventually treating human disorders," Albrecht said.
The novel compound IRL-1620 may be useful in treating Alzheimer’s disease (AD) as it has been shown to prevent cognitive impairment and oxidative stress in animal models. This research is being presented at the 2013 American Association of Pharmaceutical Scientists (AAPS) Annual Meeting and Exposition, the world’s largest pharmaceutical sciences meeting, in San Antonio, Nov. 10–14.
AD is a form of dementia that worsens over time, leading to a slow decline in cognitive functions and affecting memory, thinking, and behavior. More than 5 million Americans are living with AD, according to the Alzheimer’s Association.
Anil Gulati, M.D., Ph.D., FCP, and Seema Briyal, Ph.D., along with their colleagues from Midwestern University, administered amyloid beta (Aβ), a main component of certain deposits found in AD patients’ brains, to normal and diabetic rats on days 1, 7, and 14. Spatial learning and memory were tested in a Morris water maze. The pool was divided into four equal quadrants, and an escape platform was hidden below the surface at a fixed location in one of the quadrants.
The rats had to find the platform within 60 seconds. The average time it took on day 4 for Aβ-treated rats to locate the platform was 55.05 seconds, though a majority of this group was not able to find it in the designated time. Aβ rats treated with IRL-1620 were able to locate the platform in 26.53 seconds, nearly half the time. After five days, Aβ rats treated with IRL-1620 showed a 60 percent improvement in learning and memory.
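As a quick check of those figures, here is a minimal sketch using only the day-4 latencies reported above (the variable names are ours):

```python
# Day-4 escape latencies in the Morris water maze (seconds)
abeta_only = 55.05      # Aβ-treated rats
abeta_irl1620 = 26.53   # Aβ rats also treated with IRL-1620

fraction = abeta_irl1620 / abeta_only
print(f"Treated rats needed {fraction:.0%} of the untreated time")  # ≈ 48%
```

The ratio works out to roughly 48 percent, consistent with the "nearly half the time" description; the separate 60 percent figure refers to the learning-and-memory improvement measured after five days.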
“Our research is based on the idea of using the Endothelin (ET) system in the treatment of AD,” said Gulati. “The ET system is traditionally known to play a role in the regulation of blood flow. This is important in the potential treatment of AD since disturbances in blood flow could damage the brain’s ability to clear damaging particles, leading to a build-up of toxic substances and cognitive impairment.”
The next stage of Gulati’s research is to further investigate the endothelin receptor type B’s mechanisms of neuroprotection and to look into possible resulting tissue changes following AD.
The FDA has approved five medications to treat the symptoms of AD. Current drugs help mask the symptoms but do not treat the underlying disease. A breakthrough Alzheimer’s treatment would target the underlying disease and stop or delay the cell damage that eventually leads to the worsening of symptoms.
Why does it take longer to recognise a familiar face when seen in an unfamiliar setting, like seeing a work colleague when on holiday? A new study published today in Nature Communications has found that part of the reason comes down to the processes that our brain performs when learning and recognising faces.

During the experiment, participants were shown faces of people that they had never seen before, while lying inside an MRI scanner in the Department of Psychology at Royal Holloway. They were shown some of these faces numerous times from different angles and were asked to indicate whether they had seen that person before or not.
Participants were relatively good at recognising faces once they had seen them a few times. Using a new mathematical approach, however, the scientists found that people’s decisions about whether they recognised someone also depended on the context in which they encountered the face. If participants had recently seen lots of unfamiliar faces, they were more likely to say that the face they were looking at was unfamiliar, even if they had seen the face several times before and had previously reported that they did recognise it.
Activity in two areas of the brain matched the way in which the mathematical model predicted people’s performance.
“Our study has characterised some of the mathematical processes that are happening in our brain as we do this,” said lead author Dr Matthew Apps. “One brain area, called the fusiform face area, seems to be involved in learning new information about faces and increasing their familiarity.
“Another area, called the superior temporal sulcus, we found to have an important role in influencing our report of whether we recognise someone’s face, regardless of whether we are actually familiar with them or not. While this seems rather counter-intuitive, it may be an important mechanism for simplifying all the information that we need to process about faces.”
“Face recognition is a fundamental social skill, but we show how error prone this process can be. To recognise someone, we become familiar with their face, by learning a little more about what it looks like,” said co-author Professor Manos Tsakiris.
“At the same time, we often see people in different contexts. The recognition biases that we measured might give us an advantage in integrating information about identity and social context, two key elements of our social world.”
Massachusetts General Hospital (MGH) investigators have used a new sequencing method to identify a group of genes used by the brain’s immune cells – called microglia – to sense pathogenic organisms, toxins or damaged cells that require their response. Identifying these genes should lead to better understanding of the role of microglia both in normal brains and in neurodegenerative disorders and may lead to new ways to protect against the damage caused by conditions like Alzheimer’s and Parkinson’s diseases. The study, which has been published online in Nature Neuroscience, also finds that the activity of microglia appears to become more protective with aging, as opposed to increasingly toxic, which some previous studies had suggested.
"We’ve been able to define, for the first time, a set of genes microglia use to sense their environment, which we are calling the microglial sensome," says Joseph El Khoury, MD, of the MGH Center for Immunology and Inflammatory Diseases and Division of Infectious Diseases, senior author of the study. "Identifying these genes will allow us to specifically target them in diseases of the central nervous system by developing ways to upregulate or downregulate their expression."
A type of macrophage, microglia are known to constantly survey their environment in order to sense the presence of infection, inflammation, and injured or dying cells. Depending on the situation they encounter, microglia may react in a protective manner – engulfing pathogenic organisms, toxins or damaged cells – or release toxic substances that directly destroy microbes or infected brain cells. Since this neurotoxic response can also damage healthy cells, keeping it under control is essential, and excess neurotoxicity is known to contribute to the damage caused by several neurodegenerative disorders.
El Khoury’s team set out to define the transcriptome – the complete set of RNA molecules transcribed by a cell – of the microglia of healthy, adult mice and compared that expression profile to those of macrophages from peripheral tissues of the same animals and of whole brain tissue. Using a technique called direct RNA sequencing, which is more accurate than previous methods, they identified a set of genes uniquely expressed in the microglia and measured their expression levels, the first time such a gene expression ‘snapshot’ has been produced for any mammalian brain cell, the authors note.
Since aging is known to alter gene expression throughout the brain, the researchers then compared the sensome of young adult mice to that of aged mice. They found that – contrary to what previous studies had suggested – the expression of genes involved in potentially neurotoxic actions, such as destroying neurons, was downregulated as animals aged, while the expression of neuroprotective genes involved in sensing and removing pathogens was increased. El Khoury notes that the earlier studies suggesting increased neurotoxicity with aging did not look at the cells’ full expression profile and often were done in cultured cells, not in living animals.
"Establishing the sensome of microglia allows us to clearly understand how they interact with and respond to their environment under normal conditions," he explains. "The next step is to see what happens under pathologic conditions. We know that microglia become more neurotoxic as Alzheimer’s disease and other neurodegenerative disorders progress, and recent studies have identified two of the microglial sensome genes as contributing to Alzheimer’s risk. Our next steps should be defining the sensome of microglia and other brain cells in humans, identifying how the sensome changes in central nervous system disorders, and eventually finding ways to safely manipulate the sensome pharmacologically."
University of Adelaide researchers have taken a step forward in unravelling the causes of a commonly inherited intellectual disability, finding that a genetic mutation leads to a reduction in certain proteins in the brain.
Mutations in the ARX (Aristaless related homeobox) gene are among the top four causes of intellectual disability linked to the X-chromosome in males. So far, 115 families, including many large Australian families, have been found to carry an ARX mutation that gives rise to intellectual disability.
"There is considerable variation in the disability across families, and within families with a single mutation. Symptoms among males always include intellectual disability, as well as a range of movement disorders of the hand, and in some cases severe seizures," says Associate Professor Cheryl Shoubridge, Head of Molecular Neurogenetics with the University of Adelaide’s Robinson Institute.
ARX mutations were first discovered by the University of Adelaide’s Professor Jozef Gecz in 2002. To date, researchers have detected 52 different ARX mutations and 10 distinct clinical syndromes.
Associate Professor Shoubridge is lead author of a new paper on ARX intellectual disability published in the journal Human Molecular Genetics.
In laboratory studies, Associate Professor Shoubridge’s team has shown that mutations lead to a significant reduction in ARX proteins in the brain, but the actual causes and mechanisms involved in this remain unknown. Her team tested six genes that the ARX protein interacts with, and found that one of them - a gene likely to be important to early brain development - appears to be adversely affected by the reduction of ARX proteins.
"This plays an important role in setting up architecture and networks in the brain, which become disrupted due to the mutation", Associate Professor Shoubridge says.
"The discovery of this genetic link is an important step forward but there is still much work to be done. We’re now looking further at the mechanism of the reduction in ARX protein and what that means for the brain at a functional level."
Associate Professor Shoubridge says up to 3% of the population is affected by some kind of intellectual disability, costing $14.7 billion each year in Australia alone.
"The personal cost to families is enormous, especially in the most severe cases. Being able to unravel why and how these disabilities occur is very important to us and to the many people whose lives are affected by these conditions," she says.
A growing body of evidence shows the impact of diet on brain function, and identifies patterns of brain activity associated with eating disorders such as binge eating and purging. The findings were presented at Neuroscience 2013, the annual meeting of the Society for Neuroscience and the world’s largest source of emerging news about brain science and health.
Millions of people worldwide suffer from eating disorders such as anorexia, bulimia, and binge eating. With increased risk for psychiatric and chronic diseases, today’s studies are valuable in helping generate new strategies to treat disorders from obesity to anorexia.
“As scientists uncover the impacts of diet on brain function, the adage ‘You are what you eat,’ takes on new meaning,” said press conference moderator Fernando Gomez-Pinilla, PhD, of the University of California, Los Angeles, an expert in the impact of the environment on brain health. “We cannot separate the nutritional benefits of food for the body from that of the mind. What we put into the body also shapes the brain, for better or for worse.”
New findings show that extensive musical training affects the structure and function of different brain regions, how those regions communicate during the creation of music, and how the brain interprets and integrates sensory information. The findings were presented at Neuroscience 2013, the annual meeting of the Society for Neuroscience and the world’s largest source of emerging news about brain science and health.
These insights suggest potential new roles for musical training including fostering plasticity in the brain, an alternative tool in education, and treating a range of learning disabilities.
Today’s new findings show that some of the brain changes that occur with musical training reflect the automation of task performance (much as one would recite a multiplication table) and the acquisition of the highly specific sensorimotor and cognitive skills required for various aspects of musical expertise.
“Playing a musical instrument is a multisensory and motor experience that creates emotions and motions — from finger tapping to dancing — and engages pleasure and reward systems in the brain. It has the potential to change brain function and structure when done over a long period of time,” said press conference moderator Gottfried Schlaug, MD, PhD, of Harvard Medical School/Beth Israel Deaconess Medical Center, an expert on music, neuroimaging and brain plasticity. “As today’s findings show, intense musical training generates new processes within the brain, at different stages of life, and with a range of impacts on creativity, cognition, and learning.”
New studies released today reveal links between social status and specific brain structures and activity, particularly in the context of social stress. The findings were presented at Neuroscience 2013, the annual meeting of the Society for Neuroscience and the world’s largest source of emerging news about brain science and health.
Using human and animal models, these studies may help explain why position in social hierarchies strongly influences decision-making, motivation, and altruism, as well as physical and mental health. Understanding social decision-making and social ladders may also aid strategies to enhance cooperation and could be applied to everyday situations from the classroom to the boardroom.
“Social subordination and social instability have been associated with an increased incidence of mental illness in humans,” said press conference moderator Larry Young, PhD, of Emory University, an expert in brain functions involved with social behavior. “We now have a better picture of how these situations impact the brain. While this information could lead to new treatments, it also calls on us to evaluate how we construct social hierarchies — whether in the workplace or school — and their impacts on human well-being.”
Many of us have steeled ourselves for those ‘needle in a haystack’ tasks of finding our vehicle in an airport car park, or scouring the supermarket shelves for a favourite brand.

A new scientific study has revealed that our understanding of how the human brain prepares to perform visual search tasks of varying difficulty may now need to be revised.
When people search for a specific object, they tend to hold in mind a visual representation of it, based on key attributes like shape, size or colour. Scientists call this ‘advanced specification’. For example, we might search for a friend at a busy railway station by scanning the platform for someone who is very tall or who is wearing a green coat, or a combination of these characteristics.
Researchers from the School of Psychology at the University of Lincoln, UK, set out to better explain how these abstract visual representations are formed. They used fMRI scanners to record neural activity when volunteers prepared to search for a target object: a coloured letter amid a screen of other coloured letters.
Their findings, published in the journal ‘Brain Research’, are the first to fully isolate the different areas of the human brain involved in this ‘prepare to search’ function. Surprisingly, they show that the advanced frontal areas of the brain, usually key to advanced cognitive tasks, appear to take a back seat. Instead, it is the more basic areas at the back of the brain and the sub-cortical areas that do the work.
Dr Patrick Bourke from the University of Lincoln’s School of Psychology, who led the study, said: “Up until now, when researchers have studied visual search tasks they have also found that frontal areas of the brain were active. This has been assumed to indicate a control system: an ‘executive’ that largely resides in the advanced front of the brain which sends signals to the simpler back of the brain, activating visual memories. Here, when we isolated the ‘prepare’ part of the task from the actual search and response phase we found that this activation in the front was no longer present.”
This finding has important implications for understanding the fundamental brain processes involved. It was previously thought that the intraparietal region of the brain, which is linked to visual attention, was the central component of the supposed ‘front-back’ control network, relaying useful information (such as a shape or colour bias) from frontal areas of the brain to the back, where simple visual representations of the object are held. If the frontal areas are not activated in the preparation phase, this cannot be the case.
The study also showed that the pattern of brain activation varied depending on the anticipated difficulty of the search task, even when the target object was the same. This indicates that rather than holding in mind a single representation of an object, a new target is constructed each time, depending on the nature of the task.
Dr Bourke added: “While consistent with previous brain imaging work on visual search, these results change the interpretations and assumptions that have been applied previously. Notably, they highlight a difference between studies of animals’ brains and those of humans. Studies with monkeys convincingly show the front-back control system and we thought we understood how this worked. At the same time our findings are consistent with a growing body of brain imaging work in humans that also shows no frontal brain activation when short term memories are held.”
People in middle age who have a high blood pressure measure called pulse pressure are more likely to have biomarkers of Alzheimer’s disease in their spinal fluid than those with lower pulse pressure, according to research published in the November 13, 2013, online issue of Neurology®, the medical journal of the American Academy of Neurology.
Pulse pressure is the systolic pressure, or the top number in a blood pressure reading, minus the diastolic, or the bottom number. Pulse pressure increases with age and is an index of the aging of the vascular system.
The study involved 177 people ages 55 to 100 with no symptoms of Alzheimer’s disease. Participants had their pulse pressure taken and lumbar punctures to obtain spinal fluid.
The study found that people who have higher pulse pressure are more likely to have the Alzheimer’s biomarkers amyloid beta, or plaques, and p-tau protein, or tangles, in their cerebral spinal fluid than those with lower pulse pressure. For every 10-point rise in pulse pressure, the average level of p-tau protein in the spinal fluid rose by 1.5 picograms per milliliter. A picogram is one trillionth of a gram.
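The arithmetic behind these figures is simple. As an illustrative sketch (the function names are ours, and the linear extrapolation merely restates the reported association, not a clinical formula):

```python
def pulse_pressure(systolic: float, diastolic: float) -> float:
    """Pulse pressure: systolic minus diastolic blood pressure (mmHg)."""
    return systolic - diastolic

def ptau_rise(pp_increase: float, slope: float = 1.5, step: float = 10.0) -> float:
    """Estimated extra p-tau (pg/mL) implied by the reported association:
    ~1.5 pg/mL more p-tau per 10-point rise in pulse pressure."""
    return pp_increase / step * slope

# A reading of 140/80 mmHg gives a pulse pressure of 60 mmHg.
print(pulse_pressure(140, 80))  # 60
# A 20-point rise would correspond to ~3.0 pg/mL more p-tau.
print(ptau_rise(20))            # 3.0
```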
“These results suggest that the forces involved in blood circulation may be related to the development of the hallmark Alzheimer’s disease signs that cause loss of brain cells,” said study author Daniel A. Nation, PhD, of the VA San Diego Healthcare System.
The relationship was found in people age 55 to 70, but not in people age 70 to 100.
“This is consistent with findings indicating that high blood pressure in middle age is a better predictor of later problems with memory and thinking skills and loss of brain cells than high blood pressure in old age,” Nation said.
People who are in love are less able to focus and to perform tasks that require attention. This is the conclusion of researcher Henk van Steenbergen and colleagues from Leiden University and the University of Maryland. The article has appeared in the journal Motivation and Emotion.

The more in love, the less focused you are
Forty-three participants who had been in a relationship for less than half a year performed a number of tasks in which they had to discriminate relevant from irrelevant information as quickly as possible. It appeared that the more in love they were, the less able they were to ignore the irrelevant information. The intensity of their love was thus related to how well they could focus. There was no difference between men and women.
Cognitive control
The participants listened to music that elicited romantic feelings and thought of a romantic event to intensify their love feelings. Participants also completed a questionnaire that was used to assess the intensity of their love feelings. The results of the study by Henk van Steenbergen differed from results from previous studies. Those previous studies showed that the ability to ignore distracting information is required to maintain a long-term romantic relationship. Being able to control oneself (also called “cognitive control”) and to resist temptations that could threaten the relationship is essential in long-term love.
Thinking of your beloved
In the study by Van Steenbergen, in contrast, the participants had become involved in a romantic relationship only a few months earlier. “When you have just become involved in a romantic relationship you’ll probably find it harder to focus on other things because you spend a large part of your cognitive resources on thinking of your beloved”, Van Steenbergen says. “For long-lasting love in a long-term relationship, on the other hand, it seems crucial to have proper cognitive control.” Over time, a balance between less and more cognitive control may be critical for a successful relationship.
Why is romantic love associated with cognitive control?
Van Steenbergen emphasizes that the link between romantic love and cognitive control is a new area of research. “The reason why romantic love is associated with cognitive control is still unknown. It could be that lovers use all their cognitive resources to think about their beloved, which leaves them no resources to perform a boring task. It could also be that the association goes in the opposite direction: people who have reduced cognitive control may experience more intense love feelings than people who have higher levels of cognitive control.” Future research will have to clarify this.
Patients with traumatic brain injury (TBI) had increased deposits of β-amyloid (Aβ) plaques, a hallmark of Alzheimer disease (AD), in some areas of their brains in a study by Young T. Hong, Ph.D., of the University of Cambridge, England, and colleagues.
There may be epidemiological or pathophysiological (injury-related) links between TBI and AD, and Aβ plaques are found in as many as 30 percent of patients who die in the acute phase after a TBI. The plaques appear within hours of the injury and can occur in patients of all ages, according to the study background.
Researchers used imaging and brain tissue acquired during autopsies to examine Aβ deposition in patients with TBI. They performed positron emission tomography (PET) imaging using carbon 11-labeled Pittsburgh Compound B ([11C]PIB), a marker of brain amyloid deposition, in 15 participants with a TBI and 11 healthy controls. Autopsy-acquired brain tissue was obtained from 16 people who had a TBI, as well as from seven patients who died of nonneurological causes.
The study’s findings indicate that patients with TBI showed increases in [11C]PIB binding, which may be a marker of Aβ plaque in some areas of the brain.
“The use of [11C]PIB PET for amyloid imaging following TBI provides us with the potential for understanding the pathophysiology of TBI, for characterizing the mechanistic drivers of disease progression or suboptimal recovery in the subacute phase of TBI, for identifying patients at high risk of accelerated AD, and for evaluating the potential of antiamyloid therapies,” the authors conclude.
A polymer originally designed to help mend broken bones could be successful in delivering chemotherapy drugs directly to the brains of patients suffering from brain tumours, researchers at The University of Nottingham have discovered.

Their study, published in the journal PLOS ONE, shows that the biomaterial can be easily applied to the cavity created following brain cancer surgery and used to release chemotherapy drugs over several weeks.
The targeted nature of the therapy could also reduce the toxic effects of chemotherapy drugs on healthy parts of the body, potentially reducing the debilitating side-effects that many patients experience after cancer treatment.
Patient survival
Dr Ruman Rahman, of the University’s Children’s Brain Tumour Research Centre (CBTRC), who led the study, said: “Our system is an innovative method of drug delivery for the treatment of brain tumours and is intended to be administered immediately after surgery by the operating neurosurgeon.
“Ultimately, this method of drug delivery, in combination with existing therapies, may result in more effective treatment of brain tumours, prolonged patient survival and reduced morbidity.”
Brain tumours are the major cause of cancer-related death in children and adults up to the age of 40. Most relapses occur when surgeons are unable to remove all of the cancerous cells during surgery – something that can be particularly challenging in very young children and babies and, by the very nature of the disease, in a type of adult brain cancer called glioblastoma.
Although alternative systems for delivery of drugs directly to the brain have been developed, they are used infrequently because their success has been limited. This new drug delivery system is the first that can be moulded to the shape of the brain tumour cavity and the first to deliver several different drugs over a clinically meaningful period of time.
The Nottingham polymer formulation is made from micro-particles of two polymers, PLGA and PEG, and has been developed and patented by leading tissue engineer Professor Kevin Shakesheff, based in the University’s School of Pharmacy. A powder at room temperature, it can be mixed to a toothpaste-like consistency with the addition of water.
Unique properties
The unique property of the polymer lies in its ability to set into a rigid structure only when it reaches body temperature (37°C), a feature well suited to medical therapies. It was originally developed as a scaffold on which new bone cells could be grown to speed up the knitting back together of broken bones.
Dr Ruman Rahman at the CBTRC and Dr Cheryl Rahman from the School of Pharmacy spotted the potential for the polymer to deliver chemotherapy drugs directly to patients’ brain tumours. The work was performed at the CBTRC with neurosurgeon Mr Stuart Smith and neuro-oncologist Professor Richard Grundy. The cavity left by the removal of a tumour would be lined with the polymer while in paste form, which would start to solidify and gradually release the chemotherapy drugs after the incision has been closed. This would directly target any residual cells not initially removed during surgery.
In the lab, the Nottingham scientists were able to successfully demonstrate the slow-release properties of the material by placing paste loaded with three commonly used chemotherapy drugs into a solution of saline and measuring the quantities of the drugs given out by the material over time.
To establish whether the material itself is safe to use on patients in this form of therapy, they used it to create a 3D model onto which they were able to grow brain tumour cells and healthy brain blood vessel cells without any toxicity. They then simulated surgery on a sheep’s brain from an abattoir by moulding the paste around a brain cavity and warming the brain to human body temperature to harden the polymer.
The brain was then scanned using CT and MRI technology to demonstrate that it is still possible to distinguish the polymer from normal brain tissue on a routine brain scan, an aspect crucial for doctors when dealing with follow-up care for brain tumour patients who have undergone surgery.
Robust material
The team also addressed concerns that the material could disintegrate and release its chemotherapy contents too quickly during the subsequent radiotherapy that many cancer patients undergo after surgery. By placing the biomaterial loaded with chemotherapy drugs into a head cavity of a medical training dummy and subjecting it to the same duration and intensity of radiotherapy used for brain tumour patients, they were able to demonstrate that the structure remained intact.
Finally, they showed that a chemotherapy drug called etoposide, released from the polymer formulation, could be effective at killing brain cancer cells in a mouse. The next stage of the research will be to extend the study in mice with brain tumours to test whether animals with the drug-loaded polymers survive longer. The team are also investigating the release of other promising chemotherapeutic drugs, supported by a recent grant award from Sparks.
As the research used a biomaterial and chemotherapy drugs already approved for medical use, many of the usual ethical approval hurdles to allow further investigation have already been cleared.
The first clinical test, anticipated in three years’ time, will be a multi-centre phase 0 clinical trial in which the therapy is tested on a small number of patients for whom other clinical treatments have not been successful and who would otherwise be offered only palliative care.
“This is a very exciting development and holds considerable promise for the treatment of malignant brain tumours in the near future,” commented Professor Grundy, Co-Director of the CBTRC.
Research released today reveals new mechanisms and areas of the brain associated with anxiety and depression, presenting possible targets to understand and treat these debilitating mental illnesses. The findings were presented at Neuroscience 2013, the annual meeting of the Society for Neuroscience and the world’s largest source of emerging news about brain science and health.
More than 350 million people worldwide suffer from clinical depression and between 5 and 25 percent of adults suffer from generalized anxiety, according to the World Health Organization. The resulting emotional and financial costs to people, families, and society are significant. Further, antidepressants are not always effective and often cause severe side effects.
“Today’s findings represent our rapidly growing understanding of the individual molecules and brain circuits that may contribute to depression and anxiety,” said press conference moderator Lisa Monteggia, PhD, of the University of Texas Southwestern Medical Center, an expert on mechanisms of antidepressant action. “These exciting discoveries represent the potential for significant changes in how we diagnose and treat these illnesses that touch millions.”