Scientists Shed Light on Cause of Spastic Paraplegia
Scientists at The Scripps Research Institute (TSRI) have discovered that a gene mutation linked to hereditary spastic paraplegia, a disabling neurological disorder, interferes with the normal breakdown of triglyceride fat molecules in the brain. The TSRI researchers found large droplets of triglycerides within the neurons of mice modeling the disease.
The findings, reported this week online ahead of print by the journal Proceedings of the National Academy of Sciences, point the way to potential therapies and showcase an investigative strategy that should be useful in determining the biochemical causes of other genetic illnesses. Scientists in recent decades have linked thousands of gene mutations to human diseases, yet many of the genes in question code for proteins of unknown function.
“We often need to understand the protein function that is disrupted by a gene mutation, if we’re going to understand the mechanistic basis for the disease and move towards developing a therapy, and that is what we’ve tried to do here,” said Benjamin F. Cravatt, professor and chair of TSRI’s Department of Chemical Physiology.
There is currently no treatment for hereditary spastic paraplegia (HSP), a set of genetic illnesses whose symptoms include muscle weakness and stiffness, and in some cases cognitive impairments. About 100,000 people worldwide live with HSP.
In the new study, Cravatt and members of his laboratory, including graduate student Jordon Inloes and postdoctoral fellow Ku-Lung Hsu, focused on DDHD2, an enzyme of unclear function whose gene is mutated in a subset of HSP cases. “These cases involving DDHD2 disruption feature cognitive defects as well as spasticity and muscle wasting, so they’re among the more devastating forms of this illness,” said Cravatt.
To start, the researchers created a mouse model of DDHD2-related HSP, in which a targeted deletion from the DDHD2 gene eliminated the expression of the DDHD2 protein. “These mice showed symptoms similar to those of HSP patients, including abnormal gait and lower performance on tests of movement and cognition,” said Inloes.
Prior research had suggested that the DDHD2 enzyme is expressed in the brain and is involved somehow in lipid metabolism. One study reported elevated levels of an unknown fat molecule in the brains of DDHD2-mutant HSP patients. Cravatt’s team compared tissues from the DDHD2-knockout mice with tissues from mice carrying normal versions of the gene, and likewise found that the mutant mice had much higher levels of a type of fat molecule, principally in the brain.
Using a set of sophisticated “lipidomics” tests to analyze the accumulating fat molecules, they identified them as triglycerides—a major component of stored fat in the body, and a risk factor for obesity, atherosclerosis and type 2 diabetes.
“We were able to show as well, using both light microscopy and electron microscopy, that droplets of triglyceride-rich fat are present in the neurons of DDHD2-knockout mice, in several brain regions, but are not present in normal mice,” said Inloes.
For the next phase of the study, Cravatt’s team developed a complementary tool for studying DDHD2’s function: a specific inhibitor of the DDHD2 enzyme, one of a set of powerful enzyme-blocking compounds they had identified in a study reported last year. “After four days of treatment with this inhibitor, normal mice showed an increase in brain triglycerides,” said Inloes. “This suggests that DDHD2 normally breaks down triglycerides, and its inactivity allows triglycerides to build up.”
Finally, the team confirmed DDHD2’s role in triglyceride metabolism by showing that triglycerides are rapidly broken down into smaller fatty acids in the enzyme’s presence. “These findings give us some insight, at least, into the biochemical basis of the HSP syndrome,” said Cravatt.
Future projects in this line of inquiry, he adds, include a study of how triglyceride droplets in neurons lead to impairments of movement and cognition, and research on potential therapies to counter these effects, including the possible use of diacylglycerol acyltransferase (DGAT) inhibitors, which reduce the natural production of triglycerides.
Cravatt also notes that the same approach used in this study can be applied to other enzymes in DDHD2’s class (serine hydrolases), whose dysfunctions cause human neurological disorders.
Single-Neuron “Hub” Orchestrates Activity of an Entire Brain Circuit
The idea of mapping the brain is not new. Researchers have known for years that the key to treating, curing, and even preventing brain disorders such as Alzheimer’s disease, epilepsy, and traumatic brain injury is to understand how the brain records, processes, stores, and retrieves information.
New Tel Aviv University research published in PLOS Computational Biology makes a major contribution to efforts to navigate the brain. The study, by Prof. Eshel Ben-Jacob and Dr. Paolo Bonifazi of TAU’s School of Physics and Astronomy and Sagol School of Neuroscience, and Prof. Alessandro Torcini and Dr. Stefano Luccioli of the Istituto dei Sistemi Complessi, under the auspices of TAU’s Joint Italian-Israeli Laboratory on Integrative Network Neuroscience, offers a precise model of the organization of developing neuronal circuits.
In an earlier study of the hippocampi of newborn mice, Dr. Bonifazi discovered that a few “hub neurons” orchestrated the behavior of entire circuits. In the new study, the researchers harnessed cutting-edge technology to reproduce these findings in a computer-simulated model of neuronal circuits. “If we are able to identify the cellular type of hub neurons, we could try to reproduce them in vitro out of stem cells and transplant these into aged or damaged brain circuitries in order to recover functionality,” said Dr. Bonifazi.
Flight dynamics and brain neurons
"Imagine that only a few airports in the world are responsible for all flight dynamics on the planet," said Dr. Bonifazi. "We found this to be true of hub neurons in their orchestration of circuits’ synchronizations during development. We have reproduced these findings in a new computer model."
According to this model, one stimulated hub neuron impacts an entire circuit dynamic; similarly, just one muted neuron suppresses all coordinated activity of the circuit. “We are contributing to efforts to identify which neurons are more important to specific neuronal circuits,” said Dr. Bonifazi. “If we can identify which cells play a major role in controlling circuit dynamics, we know how to communicate with an entire circuit, as in the case of the communication between the brain and prosthetic devices.”
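To make the hub idea concrete, here is a minimal toy simulation, not the authors’ published model: phase oscillators coupled through a single hub node. With the hub active the population synchronizes; silencing that one node abolishes all coordinated activity. All parameter values and the star-shaped topology are illustrative assumptions.

```python
# Toy illustration (not the TAU model): Kuramoto phase oscillators on a
# star-shaped "circuit" in which every peripheral neuron couples only to
# one hub. With the hub intact the population synchronizes; silencing the
# hub (removing its couplings) abolishes coordinated activity.
import numpy as np

rng = np.random.default_rng(0)
n = 50                                    # hypothetical circuit size
omega = rng.normal(0.0, 0.5, n)           # intrinsic firing frequencies
adj = np.zeros((n, n))
adj[0, 1:] = adj[1:, 0] = 1.0             # node 0 is the hub

def coherence(adj, coupling=2.0, dt=0.005, steps=20000):
    """Run Kuramoto dynamics and return the final phase coherence (0..1)."""
    theta = rng.uniform(0, 2 * np.pi, n)
    for _ in range(steps):
        diff = theta[None, :] - theta[:, None]            # theta_j - theta_i
        theta = theta + dt * (omega + coupling * (adj * np.sin(diff)).sum(axis=1))
    return abs(np.exp(1j * theta).mean())

print("coherence with hub intact  :", round(coherence(adj), 2))
print("coherence with hub silenced:", round(coherence(np.zeros_like(adj)), 2))
```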
Conducting the orchestra of the brain
In the course of their research, the team found that the timely activation of cells is fundamental for the proper operation of hub neurons, which, in turn, orchestrate the entire network dynamic. In other words, a clique of hubs works in a temporally organized fashion in which “everyone has to be active at the right time,” according to Dr. Bonifazi.
Coordinated activation impacts the entire network. Just by altering the timing of the activity of one neuron, researchers were able to affect the operation of a small clique of neurons, and ultimately that of the entire network.
"Our study fits within framework of the ‘complex network theory,’ an emerging discipline that explores similar trends and properties among all kinds of networks — i.e., social networks, biological networks, even power plants," said Dr. Bonifazi. "This theoretical approach offers key insights into many systems, including the neuronal circuit network in our brains."
Parallel to their theoretical study, the researchers are conducting experiments on in vitro cultured systems to better identify electrophysiological and chemical properties of hub neurons. The joint Italy-Israel laboratory is also involved in a European project aimed at linking biological and artificial neuronal circuitries to restore lost brain functions.
Study reveals new clues to help understand brain stimulation
Findings could help guide clinicians in selecting stimulation sites and improve treatment for neurological and psychiatric disorders
Over the past several decades, brain stimulation has become an increasingly important treatment option for a number of psychiatric and neurological conditions.
Divided into two broad approaches, invasive and noninvasive, brain stimulation works by targeting specific sites to adjust brain activity. The most widely known invasive technique, deep brain stimulation (DBS), requires brain surgery to insert an electrode and is approved by the U.S. Food and Drug Administration (FDA) for the treatment of Parkinson’s disease and essential tremor. Noninvasive techniques, including transcranial magnetic stimulation (TMS), can be administered from outside the head and are currently approved for the treatment of depression. Brain stimulation can result in dramatic benefit to patients with these disorders, motivating researchers to test whether it can also help patients with other diseases.
But, in many cases, the ideal sites to administer stimulation have remained ambiguous. Exactly where in the brain is the best spot to stimulate to treat a given patient or a given disease?
Now a new study in the Proceedings of the National Academy of Sciences (PNAS) helps answer this question. Led by investigators at Beth Israel Deaconess Medical Center (BIDMC), the findings suggest that brain networks – the interconnected pathways that link brain circuits to one another – can help guide site selection for brain stimulation therapies.
"Although different types of brain stimulation are currently applied in different locations, we found that the targets used to treat the same disease are nodes in the same connected brain network," says first author Michael D. Fox, MD, PhD, an investigator in the Berenson-Allen Center for Noninvasive Brain Stimulation and in the Parkinson’s Disease and Movement Disorders Center at BIDMC.
"This may have implications for how we administer brain stimulation to treat disease. If you want to treat Parkinson’s disease or tremor with brain stimulation, you can insert an electrode deep in the brain and get a great effect. However, getting this same benefit with noninvasive stimulation is difficult, as you can’t directly stimulate the same site deep in the brain from outside the head," explains Fox, an Assistant Professor of Neurology at Harvard Medical School (HMS). "But, by looking at the brain’s own network connectivity, we can identify sites on the surface of the brain that connect with this deep site, and stimulate those sites noninvasively."
Brain networks consist of interconnected pathways linking brain circuits or loops, similar to a college campus in which paved sidewalks connect a wide variety of buildings.
In this paper, Fox led a team that first conducted a large-scale literature search to identify all neurological and psychiatric diseases where improvement had been seen with both invasive and noninvasive brain stimulation. Their analysis revealed 14 conditions: addiction, Alzheimer’s disease, anorexia, depression, dystonia, epilepsy, essential tremor, gait dysfunction, Huntington’s disease, minimally conscious state, obsessive compulsive disorder, pain, Parkinson disease and Tourette syndrome. They next listed the stimulation sites, either deep in the brain or on the surface of the brain, thought to be effective for the treatment of each of the 14 diseases.
"We wanted to test the hypothesis that these various stimulation sites are actually different spots within the same brain network," explains Fox. "To examine the connectivity from any one site to other brain regions, we used a data base of functional MRI images and a technique that enables you to see correlations in spontaneous brain activity." From these correlations, the investigators were able to create a map of connections from deep brain stimulation sites to the surface of the brain. When they compared this map to sites on the brain surface that work for noninvasive brain stimulation, the two matched.
"These results suggest that brain networks might be used to help us better understand why brain stimulation works and to improve therapy by identifying the best place to stimulate the brain for each individual patient and given disease," says senior author Alvaro Pascual-Leone, MD, PhD, the Director of the Berenson-Allen Center for Noninvasive Brain Stimulation at BIDMC and Professor of Neurology at HMS. "This study illustrates the potential of gaining fundamental insights into brain function while helping patients with debilitating diseases, and provides us with a powerful way of selecting targets based on their connectivity to other regions that can be widely applied to help guide brain stimulation therapy across multiple neurological and psychiatric disorders."
"As we’re trying different types of brain stimulation for different diseases, the question comes up, ‘How does one relate to the other?’" notes Fox. "In other words, can we use the success in one to help design a trial or inform how we apply a new type of brain stimulation? Our new findings suggest that resting-state functional connectivity may be useful for translating therapy between treatment modalities, optimizing treatment and identifying new stimulation targets."
Protein regulates neuronal communication by self-association
The protein alpha-synuclein is a well-known player in Parkinson’s disease and other related neurological conditions, such as dementia with Lewy bodies. Its normal functions, however, have long remained unknown. An enticing mystery, say researchers, who contend that understanding the normal is critical in resolving the abnormal.
Alpha-synuclein typically resides at presynaptic terminals – the communication hubs of neurons where neurotransmitters are released to other neurons. In previous studies, Subhojit Roy, MD, PhD, and colleagues at the University of California, San Diego School of Medicine had reported that alpha-synuclein diminishes neurotransmitter release, suppressing communication among neurons. The findings suggested that alpha-synuclein might be a kind of singular brake, helping to prevent unrestricted firing by neurons. Precisely how, though, was a mystery.
Then Harvard University researchers reported in a recent study that alpha-synuclein self-assembles from multiple copies of itself into larger complexes inside neurons, upending an earlier notion that the protein worked alone. And in a new paper, published this month in Current Biology, Roy, a cell biologist and neuropathologist in the departments of Pathology and Neurosciences, and co-authors put two and two together, explaining how these aggregates of alpha-synuclein, known as multimers, might actually function normally inside neurons.
First, they confirmed that alpha-synuclein multimers do in fact congregate at synapses, where they help cluster synaptic vesicles and restrict their mobility. Synaptic vesicles are essentially tiny packages created by neurons and filled with neurotransmitters to be released. By clustering these vesicles at the synapse, alpha-synuclein fundamentally restricts neurotransmission. The effect is not unlike a traffic light – slowing traffic down by bunching cars at street corners to regulate the overall flow.
“In normal doses, alpha-synuclein is not a mechanism to impair communication, but rather to manage it. However, it’s quite possible that in disease, abnormal elevations of alpha-synuclein levels lead to a heightened suppression of neurotransmission and synaptic toxicity,” said Roy.
“Though this is obviously not the only event contributing to overall disease neuropathology, it might be one of the very first triggers, nudging the synapse to a point of no return. As such, it may be a neuronal event of critical therapeutic relevance.”
Indeed, Roy noted that alpha-synuclein has become a major target for potential drug therapies attempting to reduce or modify its levels and activity.
Protein that Causes Frontotemporal Dementia also Implicated in Alzheimer’s Disease
Researchers at the Gladstone Institutes have shown that low levels of the protein progranulin in the brain can increase the formation of amyloid-beta plaques (a hallmark of Alzheimer’s disease), cause neuroinflammation, and worsen memory deficits in a mouse model of this condition. Conversely, by using a gene therapy approach to elevate progranulin levels, scientists were able to prevent these abnormalities and block cell death in this model.
Progranulin deficiency is known to cause another neurodegenerative disorder, frontotemporal dementia (FTD), but its role in Alzheimer’s disease was previously unclear. Although the two conditions are similar, FTD is associated with greater injury to cells in the frontal cortex, causing behavioral and personality changes, whereas Alzheimer’s disease predominantly affects memory centers in the hippocampus and temporal cortex.
Earlier research showed that progranulin levels were elevated near plaques in the brains of patients with Alzheimer’s disease, but it was unknown whether this effect counteracted or exacerbated neurodegeneration. The new evidence, published today in Nature Medicine, shows that a reduction of the protein can severely aggravate symptoms, while increases in progranulin may be the brain’s attempt at fighting the inflammation associated with the disease.
According to first author S. Sakura Minami, PhD, a postdoctoral fellow at the Gladstone Institutes, “This is the first study providing evidence for a protective role of progranulin in Alzheimer’s disease. Prior research had shown a link between Alzheimer’s and progranulin, but the nature of the association was unclear. Our study demonstrates that progranulin deficiency may promote Alzheimer’s disease, with decreased levels rendering the brain vulnerable to amyloid-beta toxicity.”
In the study, the researchers manipulated several different mouse models of Alzheimer’s disease, genetically raising or lowering their progranulin levels. Reducing progranulin markedly increased amyloid-beta plaque deposits in the brain as well as memory impairments. Progranulin deficiency also triggered an over-active immune response in the brain, which can contribute to neurological disorders. In contrast, increasing progranulin levels via gene therapy effectively lowered amyloid beta levels, protecting against cell toxicity and reversing the cognitive deficits typically seen in these Alzheimer’s models.
These effects appear to be linked to progranulin’s involvement in phagocytosis, a type of cellular house-keeping whereby cells “eat” other dead cells, debris, and large molecules. Low levels of progranulin can impair this process, leading to increased amyloid beta deposition. Conversely, increasing progranulin levels enhanced phagocytosis, decreasing the plaque load and preventing neuron death.
“The profound protective effects of progranulin against both amyloid-beta deposits and cell toxicity have important therapeutic implications,” said senior author Li Gan, PhD, an associate investigator at Gladstone and associate professor of neurology at the University of California, San Francisco. “The next step will be to develop progranulin-enhancing approaches that can be used as potential novel treatments, not only for frontotemporal dementia, but also for Alzheimer’s disease.”
Scientists Identify the Signature of Aging in the Brain
How the brain ages is still largely an open question – in part because this organ is mostly insulated from direct contact with other systems in the body, including the blood and immune systems. In research that was recently published in Science, Weizmann Institute researchers Prof. Michal Schwartz of the Neurobiology Department and Dr. Ido Amit of the Immunology Department found evidence of a unique “signature” that may be the “missing link” between cognitive decline and aging. The scientists believe that this discovery may lead, in the future, to treatments that can slow or reverse cognitive decline in older people.
(Image caption: Immunofluorescence microscope image of the choroid plexus. Epithelial cells are in green and chemokine proteins (CXCL10) are in red)
Until a decade ago, scientific dogma held that the blood-brain barrier prevents the blood-borne immune cells from attacking and destroying brain tissue. Yet in a long series of studies, Schwartz’s group had shown that the immune system actually plays an important role both in healing the brain after injury and in maintaining the brain’s normal functioning. They have found that this brain-immune interaction occurs across a barrier that is actually a unique interface within the brain’s territory.
This interface, known as the choroid plexus, is found in each of the brain’s four ventricles, and it separates the blood from the cerebrospinal fluid. Schwartz: “The choroid plexus acts as a ‘remote control’ for the immune system to affect brain activity. Biochemical ‘danger’ signals released from the brain are sensed through this interface; in turn, blood-borne immune cells assist by communicating with the choroid plexus. This cross-talk is important for preserving cognitive abilities and promoting the generation of new brain cells.”
This finding led Schwartz and her group to suggest that cognitive decline over the years may be connected not only to one’s “chronological age” but also to one’s “immunological age,” that is, changes in immune function over time might contribute to changes in brain function – not necessarily in step with the count of one’s years.
To test this theory, Schwartz and research students Kuti Baruch and Aleksandra Deczkowska teamed up with Amit and his research group in the Immunology Department. The researchers used next-generation sequencing technology to map changes in gene expression in 11 different organs, including the choroid plexus, in both young and aged mice, to identify and compare pathways involved in the aging process.
That is how they identified a strikingly unique “signature of aging” that exists solely in the choroid plexus – not in the other organs. They discovered that one of the main elements of this signature was interferon beta – a protein that the body normally produces to fight viral infection. This protein appears to have a negative effect on the brain: When the researchers injected an antibody that blocks interferon beta activity into the cerebrospinal fluid of the older mice, their cognitive abilities were restored, as was their ability to form new brain cells. The scientists were also able to identify this unique signature in elderly human brains. The scientists hope that this finding may, in the future, help prevent or reverse cognitive decline in old age, by finding ways to rejuvenate the “immunological age” of the brain.
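As a rough illustration of this kind of organ-by-organ comparison, the sketch below uses random numbers, not the Weizmann data: it flags genes whose aged-versus-young expression change is large in the choroid plexus but negligible in every other profiled tissue, which is what an organ-specific "signature" would look like.

```python
# Illustrative sketch (synthetic data): find genes whose aged-vs-young
# expression change is specific to the choroid plexus.
import numpy as np

rng = np.random.default_rng(0)
organs = ["choroid_plexus"] + [f"organ_{i}" for i in range(10)]   # 11 tissues
n_genes = 2000

# log2 fold-change (aged vs young) per organ; purely synthetic numbers.
log2fc = {organ: rng.normal(0, 0.3, n_genes) for organ in organs}
log2fc["choroid_plexus"][:25] += 2.5     # plant a toy choroid-plexus-only signature

signature = [
    g for g in range(n_genes)
    if abs(log2fc["choroid_plexus"][g]) > 1.0
    and all(abs(log2fc[o][g]) < 1.0 for o in organs[1:])
]
print(f"{len(signature)} genes change only in the choroid plexus")
```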
Project leader Dr Sharath Sriram, co-leader of the RMIT Functional Materials and Microsystems Research Group, said the nanometer-thin stacked structure was created using a thin film of functional oxide material more than 10,000 times thinner than a human hair.
“The thin film is specifically designed to have defects in its chemistry to demonstrate a ‘memristive’ effect – where the memory element’s behaviour is dependent on its past experiences,” Dr Sriram said.
“With flash memory rapidly approaching fundamental scaling limits, we need novel materials and architectures for creating the next generation of non-volatile memory.
“The structure we developed could be used for a range of electronic applications – from ultrafast memory devices that can be shrunk down to a few nanometers, to computer logic architectures that replicate the versatility and response time of a biological neural network.
“While more investigation needs to be done, our work advances the search for next-generation memory technology that can replicate the complex functions of the human neural system – bringing us one step closer to the bionic brain.”
The research relies on memristors, touted as a transformational replacement for current hard drive technologies such as Flash, SSD and DRAM. Memristors have potential to be fashioned into non-volatile solid-state memory and offer building blocks for computing that could be trained to mimic synaptic interfaces in the human brain.
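For readers unfamiliar with memristive behaviour, the toy simulation below implements the textbook linear ion-drift memristor model. It is a schematic idealization, not the RMIT oxide device, and all parameter values are assumptions; the point is that the element’s instantaneous resistance depends on how much charge has previously flowed through it, which is where the “memory” comes from.

```python
# Textbook linear ion-drift memristor model -- illustrative only, not the
# RMIT device. Resistance depends on the charge that has already flowed,
# i.e. on the element's past experience.
import numpy as np

R_on, R_off = 100.0, 16e3        # fully doped / undoped resistances (ohms)
D, mu = 10e-9, 1e-14             # film thickness (m), dopant mobility (m^2 s^-1 V^-1)
w = 0.1 * D                      # width of the doped region (the state variable)

dt = 1e-4
t = np.arange(0.0, 1.0, dt)
v = np.sin(2 * np.pi * 1.0 * t)  # 1 Hz sinusoidal drive, 1 V amplitude

current = np.empty_like(t)
for k, vk in enumerate(v):
    resistance = R_on * (w / D) + R_off * (1.0 - w / D)
    current[k] = vk / resistance
    w += mu * R_on / D * current[k] * dt   # state drifts with the charge flow
    w = min(max(w, 0.0), D)                # keep the doped region inside the film

# Plotting v against current gives the pinched hysteresis loop that is the
# fingerprint of a memristor: the same voltage yields different currents
# depending on the device's history.
```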
Study Identifies Unexpected Clue to Peripheral Neuropathies
New research shows that disrupting the molecular function of a tumor suppressor causes improper formation of a protective insulating sheath on peripheral nerves – leading to neuropathy and muscle wasting in mice similar to that in human diabetes and neurodegeneration.
Scientists from Cincinnati Children’s Hospital Medical Center report their findings online Sept. 26 in Nature Communications. The study suggests that normal molecular function of the tumor suppressor gene Lkb1 is essential to an important metabolic transition in cells as the fibers of peripheral nerves (axons) are coated with the protective myelin sheath by Schwann glial cells.
“This study is just the tip of the iceberg and a fundamental discovery because of the unexpected finding that a well-known tumor suppressor gene has a novel and important role in myelinating glial cells,” said Biplab Dasgupta PhD, principal investigator and a researcher at the Cincinnati Children’s Cancer and Blood Diseases Institute (CBDI). “Additional study is needed, as the function of Lkb1 may have broader implications – not only in normal development, but also in metabolic reprogramming in human pathologies. This includes functional regeneration of axons after injury and demyelinating neuropathies.”
The process of myelin sheath formation (called myelination) requires extraordinarily high levels of lipid (fat) synthesis because most of myelin is composed of lipids, according to Dasgupta. Lipids are made from citric acid, which is produced in mitochondria, the powerhouses of the cell. Success of this sheathing process depends on the cells shifting from glycolytic to mitochondrial oxidative metabolism, which generates citric acid, the authors report.
Dasgupta’s research team used Lkb1 mutant mice in the current study. Because these mice did not express Lkb1 in myelin-forming glial cells, the scientists could analyze the gene’s role in glial cell metabolism and in formation of the myelin sheath coating.
When the function of Lkb1 was disrupted in laboratory mice, it blocked the metabolic shift from glycolytic to mitochondrial metabolism, resulting in a thinner myelin sheath (hypomyelination) of the nerves. This caused muscle atrophy, hind limb dysfunction, peripheral neuropathy and even premature death of these mice, according to the authors.
Peripheral neuropathy involves damage to the peripheral nervous system – which transmits information from the brain and spinal cord (the central nervous system) to other parts of the body, according to the National Institute of Neurological Disorders and Stroke (NINDS). There are more than 100 types of peripheral neuropathy, and damage to the peripheral nervous system interferes with crucial messages from the brain to the rest of the body.
The scientists also reported that reducing Lkb1 in Schwann cells decreased the activity of the critical metabolic enzyme citrate synthase, which makes citric acid. Enhancing Lkb1 increased this activity.
They tested the effect of boosting citric acid levels in the Lkb1 mutant Schwann cells. This enhanced lipid production and partially reversed the myelin sheath formation defects in Lkb1 mutant Schwann cells. Dasgupta said this further underscores the importance of Lkb1 and of citrate synthase activity.
Dasgupta and his colleagues are currently testing whether increasing the fat content of the diet of Lkb1 mutant mice improves the hypomyelination defects. The researchers emphasized the importance of additional research into the laboratory findings to extend their relevance more directly to human disease.
Brain chemical potential new hope in controlling Tourette Syndrome tics
A chemical in the brain plays a vital role in controlling the involuntary movements and vocal tics associated with Tourette Syndrome (TS), a new study has shown.
The research by psychologists at The University of Nottingham, published in the latest edition of the journal Current Biology, could offer a potential new target for the development of more effective treatments to suppress these unwanted symptoms.
The study, led by PhD student Amelia Draper under the supervision of Professor Stephen Jackson, found that higher levels of a neurochemical called GABA in a part of the brain known as the supplementary motor area (SMA) helps to dampen down hyperactivity in the cortical areas that produce movement.
By reducing this hyperactivity, only the strongest signals would get through and produce a movement.
Amelia said: “This result is significant because new brain stimulation techniques can be used to increase or decrease GABA in targeted areas of the cortex. It may be possible that such techniques to adjust the levels of GABA in the SMA could help young people with TS gain greater control over their tics.”
Tourette Syndrome is a developmental disorder associated with these involuntary and repetitive vocal and movement tics. Although the exact cause of TS is unknown, research has shown that people with TS have alterations in their brain ‘circuitry’ that are involved in producing and controlling motor functions.
Both the primary motor cortex (M1) and the supplementary motor area (SMA) are thought to be hyperactive in the brains of those with TS, causing the tics which can be both embarrassing and disruptive, especially for children who often find it difficult to concentrate at school.
Many people with TS can partially control their tics, but this often takes enormous mental energy and can leave them exhausted by the end of the day, and their tics often become more frequent and intense when they finally ‘relax’. The majority of people diagnosed with TS in childhood gradually gain control over their tics and have only mild symptoms by early adulthood, but by then many have already had their education and social friendships disrupted.
The scientists used a technique called magnetic resonance spectroscopy (MRS) in a 7 Tesla Magnetic Resonance Imaging (MRI) scanner to measure the concentration of certain chemicals in the brain known as neurotransmitters which offer an indication of brain activity.
The chemicals were measured in the M1, the SMA and an area involved in visual processing (V1) which was used as a control (comparison) site. They tested a group of young people with TS and a matched group of typical young people with no known disorders.
They discovered that the people with TS had higher concentrations of GABA, which inhibits neuronal activity, in the SMA.
They used other neuroscience techniques to explore the result in greater detail. Using functional MRI, they found that having more GABA in the SMA meant that the people with Tourette Syndrome had less activity in the SMA when asked to perform a simple motor task, in this case tapping a finger.
Using another technique called transcranial magnetic stimulation (TMS), in which a magnetic field is passed over the brain to stimulate neuron activity, they found that those with the most GABA dampened down activity in the M1 when preparing to make a movement. In contrast, the typically developing group increased their activity during movement preparation.
Finally, they considered how GABA was related to brain structure, specifically the white matter fibre bundles that connect the two hemispheres of the brain, a structure called the corpus callosum. They discovered that those with the highest levels of GABA also had the most connecting fibres, leading them to conclude that more connecting fibres deliver more excitatory signals, which in turn require even more GABA to calm the excess hyperactivity.
The results could lead the way to more targeted approaches to controlling tics. New brain stimulation techniques such as transcranial direct-current stimulation (tDCS), a form of neurostimulation that uses a constant, low-level electrical current delivered directly to the brain via electrodes, have already been shown to be successful in increasing or decreasing GABA in targeted areas of the cortex.
Professor Stephen Jackson added: “This finding is paradoxical because prior to our finding, most scientists working on this topic would have thought that GABA levels in TS would be reduced and not increased as we show. This is because a distinction should be made between brain changes that are causes of the disorder (e.g., reduced GABA cells in some key brain areas) and secondary consequences of the disorder (e.g., increased release of GABA in key brain areas) that act to reduce the effects of the disorder.”
New tDCS devices, similar to commercially available TENS machines, could potentially be produced for young people with TS to ‘train’ their brains and help them gain control over their tics; such devices would have the benefit of being relatively cheap and usable at home while performing other tasks such as watching television.
Inattention, hyperactivity, and impulsive behavior in children with ADHD can result in social problems, and these children tend to be excluded from peer activities. They have also been found to have impaired recognition of emotional expressions in others’ faces. The research group of Professor Ryusuke Kakigi of the National Institute for Physiological Sciences, National Institutes of Natural Sciences, in collaboration with Professor Masami K. Yamaguchi and Assistant Professor Hiroko Ichikawa of Chuo University, first identified the characteristics of facial expression recognition in children with ADHD by measuring hemodynamic responses in the brain, and showed that the neural basis for recognizing facial expressions may differ from that of typically developing children. The findings are discussed in Neuropsychologia (available online Aug. 23, 2014).
The research group showed images of a happy expression or an angry expression to 13 children with ADHD and 13 typically developing children and identified which parts of the brain were activated. They used non-invasive near-infrared spectroscopy to measure brain activity: near-infrared light, which passes readily through body tissue, was projected through the skull, and the absorbed or scattered light was measured. The measured signal depends on the concentration of oxyhemoglobin, which delivers oxygen to actively working nerve cells. Typically developing children showed a significant hemodynamic response to both the happy and the angry expression in the right hemisphere of the brain. Children with ADHD, on the other hand, showed a significant hemodynamic response only to the happy expression; no brain activity specific to the angry expression was observed. This difference in the neural basis for facial expression recognition might contribute to impaired social recognition and difficulty establishing peer relationships.
Scientists Develop First Animal Model for ALS Dementia
The first animal model for ALS dementia, a form of ALS that also damages the brain, has been developed by Northwestern Medicine scientists. The advance will allow researchers to directly see the brains of living mice, under anesthesia, at the microscopic level. This will allow direct monitoring of test drugs to determine if they work.
This is one of the latest research findings since the ALS Ice Bucket Challenge heightened interest in the disease and the need for expanded research and funding.
“This new model will allow rapid testing and direct monitoring of drugs in real time,” said Northwestern scientist and study senior author Teepu Siddique, MD. “This will allow scientists to move quickly and accelerate the testing of drug therapies.”
The new mouse model has the pathological hallmarks of the disease in humans with mutations in the genes for UBQLN2 (ubiquilin 2) and SQSTM1 (P62) that Siddique and colleagues identified in 2011. That pathology was linked to all forms of ALS and ALS/dementia.
Dr. Siddique and Han-Xiang Deng, MD, the corresponding authors on the paper, said they have reproduced behavioral, neurophysiological and pathological changes in a mouse that mimic this form of dementia associated with ALS (amyotrophic lateral sclerosis).
Dr. Siddique is the Les Turner ALS Foundation/Herbert C. Wenske Professor of Neurology at Northwestern University Feinberg School of Medicine and a neurologist at Northwestern Memorial Hospital. Dr. Deng is a research professor in Neurology at Feinberg.
It’s been difficult for scientists to reproduce the genetic mutations of ALS, especially ALS/dementia in animal models, Dr. Siddique noted, which has hampered drug therapy testing.
Five percent or more of cases of ALS, which is also known as Lou Gehrig’s disease, also involve dementia.
“ALS with dementia is an even more vicious disease than ALS alone because it attacks the brain, causing changes in behavior and language as well as causing paralysis,” Dr. Siddique said.
ALS affects an estimated 350,000 people worldwide, with an average survival of three years. In this progressive neurological disorder, the degeneration of neurons leads to muscle weakness and impaired speaking, swallowing and breathing, eventually causing paralysis and death. The associated dementia affects behavior and may affect decision-making, judgment, insight and language.
Brain scans reveal ‘grey matter’ differences in media multitaskers
Simultaneously using mobile phones, laptops and other media devices could be changing the structure of our brains, according to new University of Sussex research.
A study published today (24 September) in PLOS ONE reveals that people who frequently use several media devices at the same time have lower grey-matter density in one particular region of the brain compared to those who use just one device occasionally.
The research supports earlier studies showing connections between high media-multitasking activity and poor attention in the face of distractions, along with emotional problems such as depression and anxiety.
But neuroscientists Kep Kee Loh and Dr Ryota Kanai point out that their study reveals a link rather than causality and that a long-term study needs to be carried out to understand whether high concurrent media usage leads to changes in the brain structure, or whether those with less-dense grey matter are more attracted to media multitasking.
The researchers at the University of Sussex’s Sackler Centre for Consciousness Science used magnetic resonance imaging (MRI) to look at the brain structures of 75 adults, who had all answered a questionnaire regarding their use and consumption of media devices, including mobile phones and computers, as well as television and print media.
They found that, independent of individual personality traits, people who used a higher number of media devices concurrently also had lower grey-matter density in the part of the brain known as the anterior cingulate cortex (ACC), the region notably responsible for cognitive and emotional control functions.
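A schematic of the kind of analysis this implies, using made-up numbers rather than the Sussex data: regress the personality covariates out of both the grey-matter measure and the multitasking score, then correlate the residuals (a partial correlation). Variable names and values here are purely hypothetical.

```python
# Illustrative partial-correlation sketch (synthetic data, not the study's):
# does ACC grey-matter density relate to a media-multitasking index after
# controlling for personality covariates?
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 75                                          # participants, as in the study
multitasking_index = rng.normal(size=n)         # questionnaire-derived score
personality = rng.normal(size=(n, 5))           # e.g., Big Five scores (covariates)
acc_density = -0.4 * multitasking_index + rng.normal(size=n)  # toy grey-matter values

def residualize(y, covariates):
    """Remove the variance in y explained by the covariates (plus intercept)."""
    design = np.column_stack([np.ones(len(y)), covariates])
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    return y - design @ beta

r, p = stats.pearsonr(residualize(acc_density, personality),
                      residualize(multitasking_index, personality))
print(f"partial correlation r = {r:.2f}, p = {p:.3g}")
```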
Kep Kee Loh says: “Media multitasking is becoming more prevalent in our lives today and there is increasing concern about its impacts on our cognition and social-emotional well-being. Our study was the first to reveal links between media multitasking and brain structure.”
Scientists have previously demonstrated that brain structure can be altered by prolonged exposure to novel environments and experiences. Neural pathways and synapses can change in response to our behaviours, environment and emotions; such plasticity can occur at the cellular level (as in learning and memory) or through cortical re-mapping, in which the functions of a damaged brain region are re-mapped to a remaining intact region.
Other studies have shown that training (such as learning to juggle, or taxi drivers learning the map of London) can increase grey-matter densities in certain parts of the brain.
“The exact mechanisms of these changes are still unclear,” says Kep Kee Loh. “Although it is conceivable that individuals with small ACC are more susceptible to multitasking situations due to weaker ability in cognitive control or socio-emotional regulation, it is equally plausible that higher levels of exposure to multitasking situations leads to structural changes in the ACC. A longitudinal study is required to unambiguously determine the direction of causation.”
Think You Have Alzheimer's? UK Study Suggests You May Be Right
New research by scientists at the University of Kentucky’s Sanders-Brown Center on Aging suggests that people who notice their memory is slipping may be on to something.
The research, led by Richard Kryscio, Ph.D., chair of the Department of Biostatistics and associate director of the Alzheimer’s Disease Center at UK, appears to confirm that self-reported memory complaints are strong predictors of clinical memory impairment later in life.
Kryscio and his group asked 531 people with an average age of 73 and free of dementia if they had noticed any changes in their memory in the prior year. The participants were also given annual memory and thinking tests for an average of 10 years. After death, participants’ brains were examined for evidence of Alzheimer’s disease.
During the study, 56 percent of the participants reported changes in their memory, at an average age of 82. The study found that participants who reported changes in their memory were nearly three times more likely to develop memory and thinking problems. About one in six participants developed dementia during the study, and 80 percent of them had first reported memory changes.
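To make the “nearly three times more likely” figure concrete, here is the arithmetic of a simple relative-risk comparison. The counts below are purely hypothetical and chosen only to illustrate the calculation; they are not the study’s data.

```python
# Toy relative-risk arithmetic with hypothetical counts (not the study data):
# how much more often impairment occurs among those who reported complaints.
with_complaints = {"n": 300, "impaired": 90}
without_complaints = {"n": 230, "impaired": 23}

risk_with = with_complaints["impaired"] / with_complaints["n"]           # 0.30
risk_without = without_complaints["impaired"] / without_complaints["n"]  # 0.10
print(f"relative risk: {risk_with / risk_without:.1f}x")                 # ~3x
```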
"What’s notable about our study is the time it took for the transition from self-reported memory complaint to dementia or clinical impairment — about 12 years for dementia and nine years for clinical impairment — after the memory complaints began," Kryscio said. "That suggests that there may be a significant window of opportunity for intervention before a diagnosable problem shows up."
Kryscio points out that while these findings add to a growing body of evidence that self-reported memory complaints can be predictive of cognitive impairment later in life, there isn’t cause for immediate alarm if you can’t remember where you left your keys.
"Certainly, someone with memory issues should report it to their doctor so they can be followed. Unfortunately, however, we do not yet have preventative therapies for Alzheimer’s disease or other illnesses that cause memory problems."
The research, which was supported by grants from the National Institutes of Health, the National Institute on Aging, and the National Center for Advancing Translational Sciences, was published in the Sept. 24, 2014, online issue of Neurology.
New EEG electrode set for fast and easy measurement of brain function abnormalities
A new, easy-to-use EEG electrode set for the measurement of the electrical activity of the brain was developed in a recent study completed at the University of Eastern Finland. The solutions developed in the PhD study of Pasi Lepola, MSc, make it possible to attach the electrode set on the patient quickly, resulting in reliable results without any special treatment of the skin. As EEG measurements in emergency care are often performed in challenging conditions, the design of the electrode set pays particular attention to the reduction of electromagnetic interference from external sources.
EEG measurements can be used to detect such abnormalities in the electrical activity of the brain that require immediate treatment. These abnormalities are often indications of severe brain damage, cerebral infarction, cerebral haemorrhage, poisoning, or unspecified disturbed levels of consciousness. One of the most serious brain function abnormalities is a prolonged epileptic seizure, status epilepticus, which is impossible to diagnose without an EEG measurement. In many cases, a rapidly performed EEG measurement and the start of a proper treatment significantly reduces the need for aftercare and rehabilitation. This, in turn, drastically improves the cost-effectiveness of the treatment chain.
Although the benefits of EEG measurements are indisputable, they remain underused in acute and emergency care. A significant reason for this is the fact that the electrode sets available on the market are difficult to attach to the patient, and their use requires special skills and constant training. This new type of electrode set is expected to help make EEG measurements feasible at as early a stage as possible.
The EEG electrode set was produced using screen printing technology, in which silver ink was used to print the conductors and measurement electrodes on a flexible polyester film. The set consists of 16 hydrogel-coated electrodes which, unlike in the traditional method, are placed on the hair-free areas of the patient’s head, making the set easy to attach. The new electrode set significantly speeds up the measurement process because there is no need to scrape the patient’s skin or to use any separate gels. As the electrode set is flexible and comes as a single solid piece, the electrodes settle automatically into their correct positions. Furthermore, there is no need to move the patient’s head when putting on the electrode set, which is especially important in patients possibly suffering from a neck or skull injury. Because the disposable electrode set is easy and fast to use, it is particularly well suited to emergency care, to ambulances and even to field conditions. Thanks to the materials used, the electrode set does not interfere with any magnetic resonance or computed tomography imaging the patient may undergo.
The performance of the electrode set was tested by using various electrical tests, on several volunteers, and in real patient cases. The results were compared to those obtained by traditional EEG methods.
The PhD study also focused on the use of screen printing technology solutions to protect electrodes against electromagnetic interference. The silver or graphite shielding layer printed along the outer edge of the electrode set was found to significantly reduce external interference in the EEG signal. This shielding layer can be easily and cost-efficiently added to all measurement electrodes produced with similar methods. Protecting the electrode with a shielding layer is beneficial when measuring weak signals in conditions that contain external interference.
Researchers Identify Brain Areas Activated by Itch-Relieving Drug
Areas of the brain that respond to reward and pleasure are linked to the ability of a drug known as butorphanol to relieve itch, according to new research led by Gil Yosipovitch, MD, Professor and Chair of the Department of Dermatology at Temple University School of Medicine (TUSM), and Director of the Temple Itch Center. The findings point to the involvement of the brain’s opioid receptors—widely known for their roles in pain, reward, and addiction—in itch relief, potentially opening up new avenues to the development of treatments for chronic itch.
The article, published online September 11 in the Journal of Investigative Dermatology, is the first to show precisely where in the brain butorphanol works to relieve itch. In identifying those areas, the study helps to explain why butorphanol works better for chronic itching mediated by histamine, a small molecule involved in allergic reactions, than for nonhistamine-related types of itch.
"The research allows us to assess butorphanol’s effects," Dr. Yosipovitch said. "We can now identify better targets in the brain that drugs can work on to relieve itch."
The research marks an important step toward the development of itch-specific agents. As Dr. Yosipovitch explained, chronic itching, which affects roughly 12 percent of the population, comprises not just one disease, but many—ranging from atopic eczema and psoriasis to systemic diseases such as lymphoma and chronic liver failure. Biochemically, each of those diseases induces itching via one of two main pathways: one that is mediated by histamine and one that is not. Most pathological itching originates along nonhistaminergic pathways.
Working with Alexandru D. P. Papoiu, MD, PhD, at Wake Forest University School of Medicine, Dr. Yosipovitch experimentally induced itch in human volunteers using either histamine or cowhage, which incites nonhistaminergic itching. Study volunteers were then treated with either butorphanol or a placebo and subjected to functional magnetic resonance imaging (fMRI) to analyze brain activity and assess the effects of butorphanol (or placebo). When volunteers returned seven days later, they received the other treatment and again underwent fMRI.
Butorphanol suppressed histamine itching in all cases and reduced cowhage itching in 35 percent of subjects. The drug’s suppression of histamine itching was associated specifically with the activation of brain areas known as the nucleus accumbens and septal nuclei—areas located deep at the base of the forebrain. The regions are notably rich in so-called kappa (κ)-opioid receptors, on which butorphanol acts. By contrast, the relief of cowhage itch by butorphanol was linked to effects in other brain areas.
The findings suggest that butorphanol works primarily on κ-opioid receptors to suppress the itch sensation induced by histamine. But the drug also has important effects on an itch pathway that does not involve histamine, where the demand for new treatments is greatest.
How nonhistaminergic itching is reduced through the involvement of opioid receptors remains unclear. Opioid receptors modulate the transmission of information about itch in the brain and occur in high levels in the areas of the brain that house neural pathways associated with reward. Reward pathways are known particularly for their response to pleasurable stimuli. Dr. Yosipovitch and Dr. Papoiu have shown in previous work that the activation of reward circuits is correlated with pleasurability and the degree of itch relief derived from self-scratching.
The new study, which Yosipovitch carried out at Wake Forest University prior to joining the TUSM faculty in 2013, further illustrates the power of applying imaging technologies to basic questions in itch research. At Temple’s Itch Center, Yosipovitch is continuing to explore those applications.
"We are in a position now to better understand the itch-scratch cycle," he said. "To break the cycle from the top down, knowing where to target receptors in the brain, would be a major achievement."
Compound from hops aids cognitive function in young animals
Xanthohumol, a type of flavonoid found in hops and beer, has been shown in a new study to improve cognitive function in young mice, but not in older animals.
The research was just published in Behavioural Brain Research by scientists from the Linus Pauling Institute and College of Veterinary Medicine at Oregon State University. It’s another step toward understanding, and ultimately reducing, the degradation of memory that happens with age in many mammalian species, including humans.
Flavonoids are compounds found in plants that often give them their color. The study of them – whether in blueberries, dark chocolate or red wine – has increased in recent years due to their apparent nutritional benefits on issues ranging from cancer to inflammation and cardiovascular disease. Several have also been shown to be important in cognition.
Xanthohumol has been of particular interest because of possible value in treating metabolic syndrome, a condition associated with obesity, high blood pressure and other concerns, including age-related deficits in memory. The compound has been used successfully to lower body weight and blood sugar in a rat model of obesity.
The new research studied use of xanthohumol in high dosages, far beyond what could be obtained just by diet. At least in young animals, it appeared to enhance their ability to adapt to changes in the environment. This cognitive flexibility was tested with a special type of maze designed for that purpose.
“Our goal was to determine whether xanthohumol could affect a process we call palmitoylation, which is a normal biological process but in older animals may become harmful,” said Daniel Zamzow, a former OSU doctoral student and now a lecturer at the University of Wisconsin/Rock County.
“Xanthohumol can speed the metabolism, reduce fatty acids in the liver and, at least with young mice, appeared to improve their cognitive flexibility, or higher level thinking,” Zamzow said. “Unfortunately it did not reduce palmitoylation in older mice, or improve their learning or cognitive performance, at least in the amounts of the compound we gave them.”
Kathy Magnusson, a professor in the OSU Department of Biomedical Sciences, principal investigator with the Linus Pauling Institute and corresponding author on this study, said that xanthohumol continues to be of significant interest for its biological properties, as are many other flavonoids.
“This flavonoid and others may have a function in the optimal ability to form memories,” Magnusson said. “Part of what this study seems to be suggesting is that it’s important to begin early in life to gain the full benefits of healthy nutrition.”
It’s also important to note, Magnusson said, that the levels of xanthohumol used in this study were only possible with supplements. As a fairly rare micronutrient, the only normal dietary source of it would be through the hops used in making beer, and “a human would have to drink 2000 liters of beer a day to reach the xanthohumol levels we used in this research.”
In this and other research, Magnusson has primarily focused on two subunits of the NMDA receptor, called GluN1 and GluN2B. Their decline with age appears to be related to the decreased ability to form and quickly recall memories.
In humans, many adults start to experience deficits in memory around the age of 50, and some aspects of cognition begin to decline around age 40, the researchers noted in their report.
Brain Wave May Be Used to Detect What People Have Seen, Recognize
Brain activity can be used to tell whether someone recognizes details they encountered in normal, daily life, which may have implications for criminal investigations and use in courtrooms, new research shows.
The findings, published in Psychological Science, a journal of the Association for Psychological Science, suggest that a particular brain wave, known as P300, could serve as a marker that identifies places, objects, or other details that a person has seen and recognizes from everyday life.
Research using EEG recordings of brain activity has shown that the P300 brain wave tends to be large when a person recognizes a meaningful item among a list of nonmeaningful items. Using P300, researchers can give a subject a test called the Concealed Information Test (CIT) to try to determine whether they recognize information that is related to a crime or other event.
Most studies investigating P300 and recognition have been conducted in lab settings that are far removed from the kinds of information a real witness or suspect might be exposed to. This new study marks an important advance, says lead researcher John B. Meixner of Northwestern University, because it draws on details from activities in participants’ normal, daily lives.
“Much like a real crime, our participants made their own decisions and were exposed to all of the distracting information in the world,” he explains.
“Perhaps the most surprising finding was the extent to which we could detect very trivial details from a subject’s day, such as the color of an umbrella that the participant had used,” says Meixner. “This precision is exciting for the future because it indicates that relatively peripheral crime details, such as physical features of the crime scene, might be usable in a real-world CIT — though we still need to do much more work to learn about this.”
To achieve a more realistic CIT, Meixner and co-author J. Peter Rosenfeld outfitted 24 college student participants with small cameras that recorded both video and sound — the students wore the cameras clipped to their clothes for 4 hours as they went about their day.
For half of the students, the researchers used the recordings to identify details specific to each person’s day, which became “probe” items for that person. The researchers also came up with corresponding, “irrelevant” items that the student had not encountered — if the probe item was a specific grocery store, for example, the irrelevant items might include other grocery stores.
For the other half of the students, the “probe” items related to details or items they had not encountered, but which were instead drawn from the recordings of other participants. The researchers wanted to simulate a real investigation, in which a suspect with knowledge of a crime would be shown the same crime-related details as a suspect who may have no crime-related knowledge.
The next day, all of the students returned to the lab and were shown a series of words that described different details or items (i.e., the probe and irrelevant items), while their brain activity was recorded via EEG.
The results showed that the P300 was larger for probe items than for irrelevant items, but only for the students who had actually seen or encountered the probe.
Further analyses revealed that P300 responses effectively distinguished probe items from irrelevant items on the level of each individual participant, suggesting that it is a robust and reliable marker of recognition.
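To make the individual-level comparison concrete, here is a minimal, purely illustrative sketch of one common way such a test can be run: a bootstrapped check of whether a participant’s average P300 amplitude for probe items reliably exceeds the average for irrelevant items. The amplitudes, trial counts, and decision threshold below are hypothetical placeholders, and the analysis actually used by Meixner and Rosenfeld may differ.

```python
# Illustrative only: per-participant bootstrap comparison of P300 amplitudes
# for "probe" (recognized) vs. "irrelevant" (unfamiliar) items.
# All values are simulated placeholders, not data from the study.
import numpy as np

rng = np.random.default_rng(0)

def recognizes_probe(probe_amps, irrelevant_amps, n_boot=1000, threshold=0.9):
    """Return True if the mean probe amplitude exceeds the mean irrelevant
    amplitude in at least `threshold` of bootstrap resamples."""
    wins = 0
    for _ in range(n_boot):
        probe_mean = rng.choice(probe_amps, size=len(probe_amps), replace=True).mean()
        irrel_mean = rng.choice(irrelevant_amps, size=len(irrelevant_amps), replace=True).mean()
        if probe_mean > irrel_mean:
            wins += 1
    return wins / n_boot >= threshold

# Hypothetical single-trial P300 amplitudes (in microvolts) for one participant.
probe_trials = rng.normal(loc=8.0, scale=3.0, size=30)        # detail the person saw
irrelevant_trials = rng.normal(loc=4.0, scale=3.0, size=120)  # details the person never saw

print("Recognition detected:", recognizes_probe(probe_trials, irrelevant_trials))
```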
These findings have implications for memory research, but they may also have real-world application in the domain of criminal law given that some countries, like Japan and Israel, use the CIT in criminal investigations.
“One reason that the CIT has not been used in the US is that the test may not meet the criteria to be admissible in a courtroom,” says Meixner. “Our work may help move the P300-based CIT one step closer to admissibility by demonstrating the test’s validity and reliability in a more realistic context.”
Meixner, Rosenfeld, and colleagues plan on investigating additional factors that may impact detection, including whether images from the recordings may be even more effective at eliciting recognition than descriptive words – preliminary data suggest this may be the case.
Have you ever eaten something totally new that made you sick? Don’t give up; if you try the same food in a different place, your brain will be more “forgiving” of the new attempt. In a new study conducted by the Sagol Department of Neurobiology at the University of Haifa, researchers found for the first time that there is a link between the areas of the brain responsible for taste memory in a negative context and the areas responsible for processing the memory of the time and location of the sensory experience. When we experience a new taste without a negative context, this link does not exist.
The area of the brain responsible for storing memories of new tastes is the taste cortex, located within a region of the human brain known as the insular cortex. The area responsible for forming a memory of the place and time of the experience (the episode) is the hippocampus. Until now, researchers assumed that there was no direct connection between these areas – i.e., that the processing of information about a taste is not related to the time or place one experiences it. The accepted thinking was that a negative experience – for example, being exposed to a bad taste – would be negative in the same way anywhere, and the brain would create a memory of the taste itself, divorced from the time or place.
But in this new study, conducted by doctoral student Adaikkan Chinnakkaruppan in the laboratory of Prof. Kobi Rosenblum of the Sagol Department of Neurobiology at the University of Haifa, in cooperation with the Riken Institute, the leading brain research institute in Tokyo, the researchers demonstrate for the first time that there is a functional link between the two brain regions.
In the study, the researchers sought to examine the relationship between the taste cortex (which is responsible for taste memory) and three areas of the hippocampus: CA1, which encodes space (where we are located); DG, which encodes the time relationship between events; and CA3, which fills in missing information. To do this, the researchers compared ordinary mice with mice that had been genetically engineered by their Japanese colleagues so that these three areas functioned normally but lacked plasticity, preventing new memories that rely on them from being formed.
“In brain research, the manipulation we do must be very delicate and precise, otherwise the changes can make the entire experiment irrelevant to proving or refuting the research hypothesis,” said Prof. Rosenblum.
The mice were exposed to two new tastes, one that caused stomach pains (to mimic exposure to toxic food) and one that did not. Comparing the two groups showed that when the new taste was not associated with toxic food, there was no difference between the normal mice and those whose hippocampal areas lacked plasticity. But when the taste was followed by a negative feeling, the CA1 area, which encodes spatial location, was clearly involved.
“The significance of this is that the moment we return to the place where we experienced the taste associated with a bad feeling, the negative memory will, subconsciously, be much stronger than if we taste the same thing in a completely different place,” explained Prof. Rosenblum. Similarly, the DG area, which encodes the time between events, became involved as more time passed between the new taste and the stomach discomfort. “This means that even during a simple associative taste experience, the brain engages the hippocampus to produce an integrated experience that includes general information about the time between events and their location,” he said.
The findings, which were recently published in the Journal of Neuroscience, expose the complexity and richness of the simple sensory experiences that are engraved in our brains, most of which we aren’t even aware of. Moreover, the study can help explain behavioral findings and the difficulty in forming memories when certain areas of the brain become dysfunctional following an illness or accident. The better we understand how simple sensory experiences are encoded in the brain, and how the feeling, time, and place of those experiences are linked, the better we will understand the complex process of creating and storing memories in our brains.
Statin Use Following Hemorrhagic Stroke Associated with Improved Survival
Patients who were treated with a statin in the hospital after suffering from a hemorrhagic stroke were significantly more likely to survive than those who were not, according to a study published today in JAMA Neurology. This study was conducted by the same researchers who recently discovered that the use of cholesterol-lowering statins can improve survival in victims of ischemic stroke.
Ischemic stroke is caused by a constriction or obstruction of a blood vessel that blocks blood from reaching areas of the brain, while hemorrhagic stroke, also known as intracerebral hemorrhage, is bleeding in the brain.
“Some previous research has suggested that treating patients with statins after they suffer hemorrhagic stroke may increase their long-term risk of continued bleeding,” said lead author Alexander Flint, MD, PhD, of the Kaiser Permanente Department of Neuroscience in Redwood City, Calif. “Yet the findings of our study suggest that stopping statin treatments for these patients may carry substantial risks.”
The study included 3,481 individuals who were admitted to any of 20 Kaiser Permanente hospitals in Northern California with a hemorrhagic stroke over a 10-year period. Researchers looked at patient survival and discharge 30 days after the stroke.
Patients treated with a statin while in the hospital were more likely to be alive 30 days after suffering a hemorrhagic stroke than those who were not treated with a statin — 81.6 percent versus 61.3 percent. Patients treated with a statin while in the hospital were also more likely to be discharged to home or an acute rehabilitation facility than those who were not — 51.1 percent compared to 35.0 percent.
Patients whose statin therapy was discontinued — that is, patients taking a statin as an outpatient prior to experiencing a hemorrhagic stroke who did not receive a statin as an inpatient — had a mortality rate of 57.8 percent compared with a mortality rate of 18.9 percent for patients using a statin before and during hospitalization.
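For a rough sense of the size of that gap, the unadjusted contrast can be computed directly from the percentages quoted above; this is back-of-the-envelope arithmetic only, since the study itself reported adjusted estimates, and the variable names below are illustrative.

```python
# Back-of-the-envelope arithmetic on the 30-day mortality rates quoted above.
# Unadjusted and for illustration only; the study's analyses were adjusted
# for patient and clinical characteristics.
mortality_statin_stopped = 0.578     # statin used before stroke, stopped in hospital
mortality_statin_continued = 0.189   # statin used before and during hospitalization

relative_risk = mortality_statin_stopped / mortality_statin_continued
risk_difference = mortality_statin_stopped - mortality_statin_continued

print(f"Unadjusted relative risk of death: {relative_risk:.1f}x")  # roughly 3.1x
print(f"Absolute risk difference: {risk_difference:.1%}")          # roughly 38.9 percentage points
```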
The researchers concluded that statin use is strongly associated with improved outcomes after hemorrhagic stroke, and that discontinuing statin use is strongly associated with worsened outcomes after hemorrhagic stroke.
Evidence Supports Deep Brain Stimulation for Obsessive-Compulsive Disorder
Available research evidence supports the use of deep brain stimulation (DBS) for patients with obsessive-compulsive disorder (OCD) who don’t respond to other treatments, concludes a review in the October issue of Neurosurgery, official journal of the Congress of Neurological Surgeons (CNS). The journal is published by Lippincott Williams & Wilkins, a part of Wolters Kluwer Health.
Based on evidence, two specific bilateral DBS techniques are recommended for treatment of carefully selected patients with OCD, according to a new clinical practice guideline endorsed by the CNS and the American Association of Neurological Surgeons. While calling for further research in key areas, Dr. Clement Hamani of Toronto Western Hospital and coauthors emphasize that patients with OCD symptoms that don’t respond to other treatments should continue to have access to DBS.
Deep Brain Stimulation for OCD—What’s the Evidence?
Dr. Hamani led a multispecialty expert group in performing a systematic review of research on the effectiveness of DBS for OCD. Deep brain stimulation—placement of electrodes in specific areas of the brain, followed by electrical stimulation of those areas—has become an important treatment for patients with Parkinson’s disease and other movement disorders.
Although many patients with OCD respond well to medications and/or psychotherapy, 40 to 60 percent continue to experience symptoms despite treatment. Over the past decade, a growing number of reports have suggested that DBS may be an effective alternative in these “medically refractory” cases.
Dr. Hamani and colleagues were tasked with analyzing the supporting evidence and developing an initial clinical practice guideline for the use of DBS for patients with OCD. The review and guideline development process was sponsored by the American Society of Stereotactic and Functional Neurosurgery and the CNS. Out of more than 350 papers, the reviewers identified seven high-quality studies evaluating DBS for OCD.
Based on that evidence, they conclude that bilateral stimulation (on both sides of the brain) of two brain “targets”—areas called the subthalamic nucleus and the nucleus accumbens—can be regarded as effective treatments for OCD. In controlled clinical trials, both techniques improved OCD symptoms by around 30 percent on a standard rating scale.
While Research Proceeds, Well-Selected Patients with Severe, Treatment-Resistant OCD Should Have Access to DBS
That evidence forms the basis for a clinical guideline stating that bilateral DBS is a “reasonable therapeutic option” for patients with severe OCD that does not respond to other treatments. The guideline also notes that there is “insufficient evidence” supporting the use of any type of unilateral DBS target (one side of the brain) for OCD.
The review highlights the difficulties of studying the effectiveness of DBS for OCD—because most patients respond to medical treatment, studies of this highly specialized treatment typically include only small numbers of patients. Dr. Hamani and coauthors identify priorities for future research, particularly identifying the most effective brain targets and the subgroups of patients most likely to benefit.
Despite the limited evidence base, DBS therapy for OCD has been approved by the Food and Drug Administration under a humanitarian device exemption. Dr. Hamani and coauthors note that various safeguards are in place to ensure appropriate use, and prevent overuse, of DBS for OCD.
While research continues, they believe that functional neurosurgeons should continue to work with other specialists to ensure that patients with severe, medically refractory OCD continue to have access to potentially beneficial DBS therapy.
An international, interdisciplinary group of researchers led by Gabor G. Kovacs from the Clinical Institute of Neurology at the MedUni Vienna has demonstrated, through the use of a new antibody, how Parkinson’s disease spreads from cell to cell in the human brain. Until now, this mechanism has only been observed in experimental models, but has now been demonstrated for the first time in humans too.
At the focus of the study, recently published in the highly respected journal “Neurobiology of Disease”, is the protein α-synuclein. This protein is present in the human brain but develops into a pathologically modified form in Parkinson’s disease and in a common type of age-related dementia (known as Lewy body dementia, responsible for up to a quarter of all cases of dementia).
This study, which was carried out by a team from the MedUni Vienna in collaboration with researchers from the USA, Germany and Hungary, demonstrates for the first time that human nerve cells take up the pathological α-synuclein and thereby transfer the disease from one cell to the next. “This explains why patients with Parkinson’s disease deteriorate more and more from a clinical perspective and develop new symptoms, because the disease is able to spread to other parts of the brain through this infection process,” says Gabor G. Kovacs, commenting on the central finding of the study.
New antibody achieved major breakthrough
The researchers demonstrated this mechanism using an antibody that scientists from the MedUni Vienna helped develop in collaboration with the German biotech firm Roboscreen. As the study shows, this antibody is the first to distinguish between the physiologically present form of α-synuclein and the disease-associated form, reacting exclusively with the pathological one.
Mechanism of spread, demonstrated for the first time, could provide a basis for new Parkinson’s treatments
“For patients with Parkinson’s disease, this means that α-synuclein’s mechanism of spread from cell to cell could serve as a point of therapeutic attack if we are able to block this cell-to-cell transfer mechanism,” continues Kovacs. In diagnostic terms, the antibody also represents a major breakthrough, since the antibodies used previously could not distinguish between the physiological and disease-associated forms and therefore could not easily be used for diagnostic purposes, e.g. in body fluids.
New antibody improves diagnosis
That this is now possible for the first time has been demonstrated by a further study, also recently published in the specialist journal “Clinical Neuropathology”. According to this study, the new antibody can be used to detect disease-associated α-synuclein in the cerebrospinal fluid of patients with α-synuclein-associated brain disease. This is of major importance for clinical practice, because it means it will be possible to determine clinically whether a dementia is caused by Lewy bodies or not. The study arose through close collaboration between the Clinical Institute of Neurology (Gabor G. Kovacs) and the University Department of Neurology (Walter Pirker) at the MedUni Vienna.
Cooling of Dialysis Fluids Protects Against Brain Damage
While dialysis can cause blood pressure changes that damage the brain, cooling dialysis fluids can protect against such effects. The findings come from a study appearing in an upcoming issue of the Journal of the American Society of Nephrology (JASN). The cooling intervention can be delivered without additional cost and is simple to perform.
While dialysis is an essential treatment for many patients with kidney disease, it can cause damage to multiple organs, including the brain and heart, due to the sudden removal of bodily fluids.
To characterize dialysis-induced brain injury and to see whether cooled dialysis fluids (called dialysate) might help reduce such injury, Christopher McIntyre, DM, and his colleagues randomized 73 new dialysis patients to dialyze with body-temperature dialysate or dialysate cooled to 0.5°C below body temperature for 1 year.
The study demonstrated that dialysis drives progressive white matter brain injury due to blood pressure instability; however, patients who dialyzed at 0.5°C below body temperature were completely protected against such white matter changes.
“This study demonstrates that paying attention to improving the tolerability of dialysis treatment—in this case by the simple and safe intervention of reducing the temperature of dialysate—does not just make patients feel better, but also can completely protect the brain from progressive damage,” said Dr. McIntyre.
Down Syndrome Helps Researchers Understand Alzheimer’s Disease
The link between a protein typically associated with Alzheimer’s disease and its impact on memory and cognition may not be as clear as once thought, according to a new study from the University of Wisconsin-Madison’s Waisman Center. The findings are revealing more information about the earliest stages of the neurodegenerative disease.
The researchers — including lead study author Sigan Hartley, UW-Madison assistant professor of human development and family studies, and Brad Christian, UW-Madison associate professor of medical physics and psychiatry and director of PET Physics in the Waisman Laboratory for Brain Imaging and Behavior — looked at the role of the brain protein amyloid-β in adults living with Down syndrome, a genetic condition that leaves people more susceptible to developing Alzheimer’s. They published their findings in the September issue of the journal Brain.
"Our hope is to better understand the role of this protein in memory and cognitive function," says Hartley. "With this information we hope to better understand the earliest stages in the development of this disease and gain information to guide prevention and treatment efforts."
The findings of their study may not only help scientists better understand the condition as it affects people living with Down syndrome, but are also relevant to adults without the genetic syndrome.
"There are many unanswered questions about at what point amyloid-β, together with other brain changes, begins to take a toll on memory and cognition and why certain individuals may be more resistant than others," says Hartley.
The UW-Madison scientists, along with collaborators at the University of Pittsburgh, studied 63 healthy adults with Down syndrome, aged 30 to 53, who did not exhibit clinical signs of Alzheimer’s or other forms of dementia. They found that many adults with Down syndrome had high levels of amyloid-β protein but did not suffer the expected negative consequences of the elevated protein.
Alzheimer’s disease is the sixth leading cause of death in the U.S. People with Down syndrome are born with an extra copy of chromosome 21, which carries the gene encoding the precursor of the amyloid-β protein.
For the study, which was conducted over the course of two days, researchers used magnetic resonance imaging (MRI) and positron emission tomography (PET) scans to capture images of the participants’ brains. Twenty-two of the 63 participants had elevated levels of amyloid-β but showed no evidence of diminished memory or cognitive function when compared to those without elevated levels of the protein. The researchers controlled for differences in age and intellectual level.
Similarly, when assessed as a continuous measure, amyloid-β levels were not tied to differences in memory or cognitive ability, such as changes in visual and verbal memory, attention and language.
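As a purely illustrative sketch of the kind of covariate-adjusted comparison described above (not the study’s actual analysis), the snippet below simulates placeholder data and estimates the difference in a memory score between amyloid-elevated and non-elevated participants while adjusting for age and intellectual level:

```python
# Illustrative sketch only: comparing a memory score between amyloid-elevated
# and non-elevated groups while adjusting for age and intellectual level,
# using ordinary least squares. All data below are simulated placeholders,
# not data from the Waisman Center study.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 63
df = pd.DataFrame({
    "memory_score": rng.normal(50, 10, n),
    "amyloid_elevated": rng.integers(0, 2, n),  # 0 = not elevated, 1 = elevated
    "age": rng.uniform(30, 53, n),
    "iq": rng.normal(55, 8, n),
})

# The coefficient on amyloid_elevated is the group difference after adjustment.
model = smf.ols("memory_score ~ amyloid_elevated + age + iq", data=df).fit()
print(model.summary().tables[1])
```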