New Danish/Italian research shows how medicine for the brain can be absorbed through the nose. This paves the way for more effective treatment of neurological diseases such as Alzheimer’s and brain tumors.
A major challenge in medical science is getting medicine into the brain when treating patients with neurological diseases. The brain does everything it can to keep foreign substances out, so the brains of neurological patients fight a constant, daily battle to expel the very medicine prescribed to help them.
The problem is the so-called blood-brain barrier, which prevents the active substances in medicine from travelling from the blood into the brain.
"The barrier is created because there is extremely little space between the cells in the brain’s capillar walls. Only very small molecules can enter through these openings and become active in the brain. And for the substances which finally get in, a new problem arises: The brain will do anything to throw them out again", explains assistant professor, Massimiliano di Cagno from in the Department of Physics, Chemistry and Pharmacy.
Against this background, researchers are looking for alternative pathways to the brain - and the nose is a candidate receiving much attention. It is well known from cocaine abusers that a substance absorbed through the nose can reach the brain extremely effectively.
"It is very interesting to investigate if medical drugs can do the same", says di Cagno.
In recent years research has shown that it can be a very good idea to send medicine to the brain via the nose. The medicine can be sprayed into the nose and absorbed through the olfactory bulb, which is positioned at the front of the underside of the brain. Once the medicine passes the olfactory bulb there is direct access to the brain.
But there are many challenges to be solved before patients can be prescribed medication to be taken nasally.
"One of the biggest challenges is getting the olfactory bulb to absorb the substances aimed for the brain", explains di Cagno.
Together with Barbara Luppi from the University of Bologna in Italy, he has therefore investigated how to improve access to the olfactory bulb.
"It’s all done at nano-level, and the challenge is to find the vehicles that can transport the required medicine to the brain. In our attempts to come up with efficient vehicles we now point at some special liposomes and polymers that can bring an active substance to the olfactory bulb more than 2-3 times more efficiently than when using the standard techniques", explains di Cagno.
Liposomes are small spheres of fat which are often used to protect active substances and carry them into the body. Polymers are long molecules that can be attached to the liposomes to make them look like water to the body, so that they are not rejected by the immune system.
The improved efficiency is very important for the development of future medicines for neurological diseases. Because the blood-brain barrier is so effective, and the brain so good at throwing foreign substances out, a pill today has to contain millions of times more active ingredient than the brain actually needs to fight the disease.
"In a pill patients receive extremely more medicine than they need, and when we talk about medicines with severe and unpleasant side effects, it is not good. It is therefore very important that we get better at delivering exactly the amount of active substances needed - and no more", says di Cagno.
The new liposomes and polymers from his and Barbara Luppi’s work not only carry the active ingredients efficiently through the nasal mucosa so that they reach the olfactory bulb; they can also do it over a longer period of time.
"We want to develop a vehicle that can release the active ingredients over a long time, over many hours, so the patients do not have to spray their nose too many times a day. In our experiments we still saw active substances being released after three hours, and we are very happy with that. One must remember that the nasal mucosa is constantly working to remove foreign objects and substances", says di Cagno.
The researchers performed their tests on the mucous membranes (mucosa) of sheep, whose nasal mucosa and the mucinous secretions it produces are very similar to those of humans. The sheep mucosa was cleaned, spread on a tissue and then stretched over a container. In the container the researchers placed an active substance, hydrocortisone, loaded into different kinds of vehicles. They then observed how effectively, and for how long, the various vehicles transported the hydrocortisone through the mucosa.
In the era of globalization, bilingualism is becoming more and more frequent, and it is considered a plus. But can this skill turn into a disadvantage when someone acquires aphasia? More precisely, if a bilingual person suffers brain damage (e.g., stroke, head trauma, dementia) that results in the language impairment called aphasia, both languages can be disrupted, increasing the challenge of language rehabilitation. According to Dr. Ana Inés Ansaldo, researcher at the Research Centre of the Institut universitaire de gériatrie de Montréal (IUGM) and professor at the School of Speech Therapy and Audiology at Université de Montréal, research evidence suggests that bilingualism can be a lever, and not an obstacle, to aphasia recovery. A recent critical literature review conducted by Ana Inés Ansaldo and Ph.D. student Ladan Ghazi Saidi points to three interventional avenues to promote cross-linguistic effects of language therapy (the natural transfer effects that relearning one language has on the other).

It is important for speech-language pathologists to clearly identify a patient’s mastery of either language before and after aphasia onset, in order to decide which language to stimulate to achieve better results. Overall, the studies reviewed show that training the less proficient language (before or after aphasia onset)—and not the dominant language—results in bigger transfer effects on the untreated language.
Moreover, similarities between the two languages, at the levels of syntax, phonology, vocabulary, and meaning, will also facilitate language transfer. Specifically, working on “cognates,” or similar words in both languages, facilitates cross-linguistic transfer of therapy effects. For example, stimulating the word “table” in French will also help the retrieval of the word “table” in English, as these words have the same meaning and similar sounds in French and English. However, training “false cognates” (words that sound alike but do not share the same meaning) can be confusing for the bilingual person with aphasia.
In general, semantic therapy approaches, based on stimulating word meanings, facilitate transfer of therapy effects from the treated language to the untreated one. In other words, drilling based on a word’s semantic properties can help recovery of both the target word and its cross-linguistic equivalent. For example, when the speech-language pathologist cues the patient to associate the word “dog” with the ideas of “pet,” “four legs” and “bark,” the French word “chien” is also activated, and will be retrieved more easily than by simply repeating the word “dog.”
“In the past, therapists would ask patients to repress or stifle one of their two languages, and focus on the target language. Today, we have a better understanding of how to use both languages, as one can support the other. This is a more complex approach, but it gives better results and respects the inherent abilities of bilingual people. Considering that bilinguals may soon represent the majority of our clients, this is definitely a therapeutic avenue we need to pursue,” explained Ana Inés Ansaldo, who herself is quadrilingual.
People with mental illness smoke at much higher rates than the overall population. But the popular belief that they are self-medicating is most likely wrong, according to researchers at the Indiana University School of Medicine. Instead, they report, research indicates that psychiatric disease makes the brain more susceptible to addiction.
As smoking rates in the general population have fallen below 25 percent, smoking among the mentally ill has remained pervasive, encompassing an estimated half of all cigarettes sold. Despite the well-known health dangers of tobacco consumption, smoking among the mentally ill has long been widely viewed as “self-medication,” reducing the incentive among health care professionals to encourage such patients to quit.
"This is really a devastating problem for people with mental illness because of the broad health consequences of nicotine addiction," said R. Andrew Chambers, M.D., associate professor of psychiatry at the IU School of Medicine. "Nicotine addiction is the number one cause of premature illness and death in the United States, and most of that morbidity and mortality is concentrated in people with mental illness."
In a report published recently in the journal Addiction Biology, the research team led by Dr. Chambers reported the results of experiments using an established animal model of schizophrenia in which rats display a neuropsychiatric syndrome that closely resembles the disease.
Both the schizophrenia-model rats and normal rats were given access to intravenous self-administration of nicotine.
"The mentally ill rats acquired nicotine use faster and consumed more nicotine," Dr. Chambers said. "Then when we cut them off from access to nicotine, they worked much harder to restore access to nicotine than did the normal ‘control’ rats."
In additional testing, the researchers found that administration of nicotine provided equal, but minimal, cognitive benefits to both groups of rats when performing a memory test. When the nicotine was withdrawn, however, both groups of rats were more cognitively impaired, so that any cognitive benefits to nicotine administration were “paid for” by cognitive impairments later.
“These results strongly suggest that what has changed in mental illness to cause smoking at such high rates results in a co-morbid addiction to which the mentally ill are highly biologically vulnerable. The evidence suggests that the vulnerability is an involuntary biological result of the way the brain is designed and how it develops after birth, rather than it being about a rational choice to use nicotine as a medicine,” Dr. Chambers said.
The data, he said, point to neuro-developmental mechanisms that increase the risk of addiction. Better understanding of those mechanisms could lead to better prevention and treatment strategies, especially among mentally ill smokers, Dr. Chambers said.
New research from the University of Sheffield could point to ways of slowing down the progression of motor neurone disease (MND).

Scientists from the University of Sheffield’s Institute for Translational Neuroscience (SITraN) conducted pioneering research assessing how the devastating, debilitating disease affects individual patients.
MND is an incurable disease that destroys the cells controlling movement in the body, causing progressive disability. Present treatment options for those with MND have only a modest effect in improving the patient’s quality of life.
Professor Pamela Shaw, Director of SITraN, and her research team worked in collaboration with fellow world-leading MND scientist Dr Caterina Bendotti and her group at the Mario Negri Institute for Pharmacological Research in Milan, Italy. Together they investigated why the progression of MND following the onset of symptoms varies in speed, even in the presence of a known genetic cause of the condition.
The research, published in the scientific journal Brain, investigated two mouse models of MND caused by an alteration in the SOD1 gene, a known cause of MND in humans. One of the strains had a rapidly progressing disease course and the other a much slower change in the symptoms of MND. The teams from Sheffield and Milan looked at the factors which might explain the differences observed in speed and severity in the progression of the disease. They used a scientific technique known as gene expression profiling to identify factors within motor neurones that control vulnerability or resistance to MND in order to shed light on the factors important for the speed of motor neurone injury in human patients.
The study, funded by the Motor Neurone Disease Association, revealed key differences between the profiles of the rapidly progressing and slowly progressing mouse models at the point of disease onset, before muscle weakness was observed, both in major molecular pathways and in the way the body’s protective systems responded. In the rapidly progressing model, the motor neurones showed reduced functioning of the cellular systems for energy production, disposal of waste proteins and neuroprotection. Motor neurones from the more slowly progressing model showed an increase in protective inflammation and immune responses, and increased function of the mechanisms that protect motor neurones from damage.
The research provides valuable clues about mechanisms that have the effect of slowing down the progression of disabling symptoms in MND.
Professor Shaw said that the state-of-the-art Functional Genomics laboratory in SITraN had enabled the research team to use a cutting edge technique called gene expression profiling.
“This enables us to ‘get inside’ the motor neurones in health and disease and understand better what is happening to cause motor neurone injury in MND,” she said.
“This project was a wonderful collaboration, supported by the MND Association, between research teams in Sheffield and Milan. We are very excited about the results which have given us some new ideas for treatment strategies which may help to slow disease progression in human MND.”
Dr Caterina Bendotti said: “MND is a clinically heterogeneous disease with a highly variable course, which makes assessment of potential therapies difficult. Thanks to the recent evidence in our laboratory of a difference in the speed of symptom progression in two MND models carrying the same gene mutation, and the successful collaboration with Professor Pamela Shaw and her team, we have identified some mechanisms that may help to predict the disease duration and eventually to slow it down.
“I strongly believe that the new hypotheses generated by this study and our ongoing collaboration are the prerequisites to be able to fight this disease.”
Brian Dickie from MND Association added: “These new and important findings in mice open up the possibility for new treatment approaches in man. It is heartening to see such a productive collaboration between two of the leading MND research labs in Europe, combining their unique specialist knowledge and technical expertise in the fight against this devastating disease.”
MND affects more than 6,000 people in the UK. The majority of cases are sporadic, but approximately five per cent are familial, or inherited, with an identifiable genetic cause. Sufferers may lose their ability to walk, talk, eat and breathe.
A diagnosis of multiple sclerosis (MS) is a hard lot. Patients typically get the diagnosis around age 30 after experiencing a series of neurological problems such as blurry vision, wobbly gait or a numb foot. From there, this neurodegenerative disease follows an unforgiving course.
Many people with MS start using some kind of mobility aid — cane, walker, scooter or wheelchair — by 45 or 50, and those with the most severe cases are typically bed-bound by 60. The medications that are currently available don’t do much to slow the relentless march of the disease.
In search of a better option for MS patients, a team of UW-Madison biochemists has discovered a promising vitamin D-based treatment that can halt — and even reverse — the course of the disease in a mouse model of MS. The treatment involves giving mice that exhibit MS symptoms a single dose of calcitriol, the active hormone form of vitamin D, followed by ongoing vitamin D supplements through the diet. The protocol is described in a scientific article that was published online in August in the Journal of Neuroimmunology.
"All of the animals just got better and better, and the longer we watched them, the more neurological function they regained," says biochemistry professor Colleen Hayes, who led the study.
MS afflicts around 400,000 people nationwide, with 200 new cases diagnosed each week. Early on, this debilitating autoimmune disease, in which the immune system attacks the myelin coating that protects the brain’s nerve cells, causes symptoms including weakness, loss of dexterity and balance, disturbances to vision, and difficulty thinking and remembering. As it progresses, people can lose the ability to walk, sit, see, eat, speak and think clearly.
Current FDA-approved treatments only work for some MS patients and, even among them, the benefits are modest. “And in the long term they don’t halt the disease process that relentlessly eats away at the neurons,” Hayes adds. “So there’s an unmet need for better treatments.”
While scientists don’t fully understand what triggers MS, some studies have linked low levels of vitamin D with a higher risk of developing the disease. Hayes has been studying this “vitamin D hypothesis” for the past 25 years with the long-term goal of uncovering novel preventive measures and treatments. Over the years, she and her researchers have revealed some of the molecular mechanisms involved in vitamin D’s protective actions, and also explained how vitamin D interactions with estrogen may influence MS disease risk and progression in women.
In the current study, which was funded by the National Multiple Sclerosis Society, Hayes’ team compared various vitamin D-based treatments to standard MS drugs. In each case, vitamin D-based treatments won out. Mice that received them showed fewer physical symptoms and cellular signs of disease.
First, Hayes’ team compared the effectiveness of a single dose of calcitriol to that of a comparable dose of a glucocorticoid, a drug now administered to MS patients who experience a bad neurological episode. Calcitriol came out ahead, inducing a nine-day remission in 92 percent of mice on average, versus a six-day remission in 58 percent of mice that received the glucocorticoid.
"So, at least in the animal model, calcitriol is more effective than what’s being used in the clinic right now," says Hayes.
Next, Hayes’ team tried a weekly dose of calcitriol. They found that a weekly dose reversed the disease and sustained remission indefinitely.
But calcitriol can carry some strong side effects — it’s a “biological sledgehammer” that can raise blood calcium levels in people, Hayes says — so she tried a third regimen: a single dose of calcitriol, followed by ongoing vitamin D supplements in the diet. This one-two punch “was a runaway success,” she says. “One hundred percent of mice responded.”
Hayes believes that the calcitriol may cause the autoimmune cells attacking the nerve cells’ myelin coating to die, while the vitamin D prevents new autoimmune cells from taking their place.
While she is excited about the prospect of her research helping MS patients someday, Hayes is quick to point out that it’s based on a mouse model of disease, not the real thing. Also, while rodents are genetically homogeneous, people are genetically diverse.
"So it’s not certain we’ll be able to translate (this discovery to humans)," says Hayes. "But I think the chances are good because we have such a broad foundation of data showing protective effects of vitamin D in humans."
The next step is human clinical trials, a step that must be taken by a medical doctor, a neurologist. If the treatment works in people, patients with early symptoms of MS may never need to receive an official diagnosis.
"It’s my hope that one day doctors will be able to say, ‘We’re going to give you an oral calcitriol dose and ramp up the vitamin D in your diet, and then we’re going to follow you closely over the next few months. You’re just going to have this one neurological episode and that will be the end of it,’" says Hayes. "That’s my dream."
Understanding RNA biology in dendrites may inform neurological and psychiatric illness therapeutics
Protein synthesis in the extensions of nerve cells, called dendrites, underlies long-term memory formation in the brain, among other functions. “Thousands of messenger RNAs reside in dendrites, yet the dynamics of how multiple dendrite messenger RNAs translate into their final proteins remain elusive,” says James Eberwine, PhD, professor of Pharmacology, Perelman School of Medicine at the University of Pennsylvania, and co-director of the Penn Genome Frontiers Institute.

Dendrites, which branch from the cell body of the neuron, play a key role in the communication between cells of the nervous system, allowing for many neurons to connect with each other. Dendrites detect the electrical and chemical signals transmitted to the neuron by the axons of other neurons. The synapse is the neuronal structure where this chemical connection is formed, and investigators surmise that it is here where learning and memory occur.
These structural and chemical changes, called synaptic plasticity, require rapid, new synthesis of proteins. Cells may use different rates of translation for different types of mRNA to produce the right amounts and ratios of required proteins.
Knowing how proteins are made to order, as it were, at the synapse can help researchers better understand how memories are made. However, the role of this “local” environment in regulating which messenger RNAs are translated into proteins in a neuron’s periphery is still a mystery.
Eberwine, first author Tae Kyung Kim, PhD, a postdoc in the Eberwine lab, and colleagues including Jai Yoon Sul, PhD, assistant professor in Pharmacology, showed that protein translation of two dendrite mRNAs is complex in space and time, as reported online in Cell Reports this week.
“We needed to look at more than one RNA at the same time to get a better handle on real-world processes, and this is the first study to do that in a live neuron,” Eberwine explains.
At Home in the Hippocampus
“It’s not always one particular RNA that dominates at a translation hotspot versus another type of RNA,” says Eberwine. “Since there are 1,000 to 3,000 different mRNA types present in the dendrite overall, but not 1,000 to 3,000 different translational hot spots, do the mRNAs ‘take turns’ being translated in space and time at the ribosomes at the hotspots?”
The researchers engineered the glutamate receptor RNAs to contain different fluorescent proteins that are independently detectable, as well as a photo-switchable protein to determine when new proteins were being made. In the case of the photo-switchable protein studies, when an mRNA for the glutamate receptor protein is marked green, it means it has already been translated.
When a laser is passed over the green protein, it changes to red, tagging it as already translated; new proteins synthesized at that hotspot are green, so fresh translation shows up as yellow fluorescence (green + red, as measured by light on the visible spectrum). These tricks of the light allow the team to keep track of newly made proteins over time and space.
“This is the first time this method of protein labeling has been used to measure the act of translation of multiple proteins over space and time in a quantitative way,” says Eberwine. “We call it quantitative functional genomics of live cell translation.”
“Our results suggest that the location of the translational hotspot is a regulator of the simultaneous translation of multiple messenger RNAs in nerve cell dendrites and therefore synaptic plasticity,” says Sul.
Laying the Groundwork
Almost 10 years ago, the Eberwine lab discovered that nerve-cell dendrites have the capacity to splice messenger RNA, a process once believed to take place only in the nucleus of cells. Here, a gene is copied into mRNA, which possesses both exons (mature mRNA regions that code for proteins) and introns (non-coding regions). mRNA splicing works by cutting out introns and merging the remaining exon pieces, resulting in an mRNA capable of being translated into a specific protein.
The vast array of proteins within the human body arises in part from the many ways that mRNAs can be spliced and reconnected. Specifically, splicing removes pieces of intron and exon regions from the RNA. The resulting spliced RNA is made into protein.
If the RNA has different exons spliced in and out of it, then different proteins can be made from it. The Eberwine lab was able to show that splicing can occur in dendrites thanks to sensitive technologies developed in their lab, which permit them to detect and quantify RNA splicing, as well as the translated protein, in single isolated dendrites.
Understanding the dynamics of RNA biology and protein translation in dendrites promises to provide insight into regulatory mechanisms that may be modulated for therapeutic purposes in neurological and psychiatric illnesses. The directed development of therapeutics requires this detailed knowledge, says Eberwine.
Activating a mother’s immune system during her pregnancy disrupts the development of neural cells in the brain of her offspring and damages the cells’ ability to transmit signals and communicate with one another, researchers with the UC Davis Center for Neuroscience and Department of Neurology have found. They said the finding suggests how maternal viral infection might increase the risk of having a child with autism spectrum disorder or schizophrenia.

The research, “MHCI Requires MEF2 Transcription Factors to Negatively Regulate Synapse Density during Development and in Disease,” is published in the Journal of Neuroscience.
The study’s senior author is Kimberley McAllister, professor in the Center for Neuroscience with appointments in the departments of Neurology and Neurobiology, Physiology and Behavior, and a researcher with the UC Davis MIND Institute.
“This is the first evidence that neurons in the developing brain of newborn offspring are altered by maternal immune activation,” McAllister said. “Until now, very little has been known about how maternal immune activation leads to autism spectrum disorder and schizophrenia-like pathophysiology and behaviors in the offspring.”
The study was conducted in mice and rats and compared the brains of the offspring of rodents whose immune systems had been activated and those of animals whose immune systems had not been activated. The pups of animals that were exposed to viral infection had much higher brain levels of immune molecules known as the major histocompatibility complex I (MHCI) molecules.
“This is the first evidence that MHCI levels on the surface of young cortical neurons in offspring are altered by maternal immune activation,” McAllister said.
The researchers found that the high MHCI levels impaired the ability of the neurons from the newborn mice’s brains to form synapses, the tiny gaps separating brain cells through which signals are transmitted. Earlier research has suggested that ASD and schizophrenia may be caused by changes in the development of connections in the brain, especially the cerebral cortex.
The researchers experimentally reduced MHCI to normal levels in neurons from offspring following maternal immune activation.
“Remarkably, synapse density returned to normal levels in those neurons,” McAllister said.
“These results indicate that maternal immune activation does indeed alter connectivity during prenatal development, causing a profound deficit in the ability of cortical neurons to form synapses that is caused by changes in levels of MHCI on the neurons,” she said.
MHCI did not work alone to limit the development of synapses. In a series of experiments, the UC Davis researchers determined that MHCI interacted with calcineurin and myocyte enhancer factor-2 (Mef2), a protein that is a critical determinant of neuronal specialization.
MHCI, calcineurin and Mef2 form a biological signaling pathway that had not been previously identified. McAllister’s team showed that in the offspring of the maternal immune activation mothers, this novel signaling pathway was much more active than it was in the offspring of non-MIA animals.
“This finding provides a potential mechanism linking maternal immune activation to disease-linked behaviors,” McAllister said.
It also is a mechanism that may help McAllister and other scientists to develop diagnostic tests and eventually therapies to improve the lives of individuals with these neurodevelopmental disorders.
A new experimental approach to treating a type of brain cancer called medulloblastoma has been developed by researchers at Sanford-Burnham. The method targets cancer stem cells—the cells that are critical for maintaining tumor growth—and halts their ability to proliferate by inhibiting enzymes that are essential for tumor progression. The process destroys the ability of the cancer cells to grow and divide, paving the way for a new type of treatment for patients with this disease.

The research team, led by Robert Wechsler-Reya, Ph.D., professor in Sanford-Burnham’s NCI-Designated Cancer Center and director of the Tumor Initiation and Maintenance Program, discovered that the medulloblastoma cancer cells responsible for tumor growth and progression (called cancer stem cells or tumor-propagating cells—TPCs) divide more quickly than normal cells. Correspondingly, they have higher levels of certain enzymes that regulate the cell cycle (Aurora and Polo-like kinases). By using small-molecule inhibitors to stop the action of these enzymes, the researchers were able to block the growth of tumor cells from mice as well as humans. The research findings are described in an online paper published today by Cancer Research.
“One tumor can have many different types of cells in it, and they can grow at different rates. By targeting fast-growing TPCs with cell-cycle inhibitors, we have developed a new route to assault medulloblastoma. In this study, we have shown that cell-cycle inhibitors essentially block medulloblastoma tumor progression by halting TPC expansion, and have opened the window to preventing cancer recurrence,” said Wechsler-Reya.
The team’s first set of experiments used a mouse model for medulloblastoma. In-vitro studies of mouse tumor cells showed that cell-cycle inhibitors caused tumor cell death. In vivo, mice that were treated with the inhibitor had smaller tumors that weighed less compared to mice that were not treated, essentially halting the progression of the tumor.
The second set of experiments used human medulloblastoma cells. When the researchers treated these human tumor cells with cell-cycle inhibitors, they also observed a significant reduction in tumor growth and progression.
Finally, when the scientists combined cell-cycle inhibitors with treatments currently used for medulloblastoma, they found that the combination worked together to produce results that were greater than either inhibitor alone.
“These results strongly support an approach to treatment that combines current therapies with cell-cycle inhibitors to treat medulloblastoma. Our hope is that the combination of these inhibitors will prevent tumor progression and drug resistance, and improve the overall effectiveness of current treatment options. We look forward to clinical studies in human medulloblastoma patients as well as other cancers that are suitable for this approach,” Wechsler-Reya said.
Proteins play important roles in the human body, particularly neuroproteins that maintain proper brain function.
Brain diseases such as ALS, Alzheimer’s, and Parkinson’s are known as “tangle diseases” because they are characterized by misfolded and tangled proteins which accumulate in the brain.
A team of Australian and American scientists discovered that an unusual amino acid called BMAA can be inserted into neuroproteins, causing them to misfold and aggregate. BMAA is produced by cyanobacteria, photosynthetic bacteria that form scums or mats in polluted lakes or estuaries.
BMAA has been detected in the brain tissues of ALS patients.
In an article published in PLOS ONE, scientists at the University of Technology Sydney and the Institute for Ethnomedicine in Jackson Hole, Wyoming, report that BMAA mimics a dietary amino acid, L-serine, and is mistakenly incorporated into neuroproteins, causing the proteins to misfold. The misfolded proteins build up in cells, eventually killing them.
"We found that BMAA inserts itself by seizing the transfer RNA for L-Serine. This, in essence, puts a kink in the protein causing it to misfold," says lead author Dr. Rachael Dunlop, a cell biologist in Sydney working in the laboratory of Dr. Ken Rodgers.
"The cells then begin programmed cell death, called apoptosis." Even more importantly, the scientists found that extra L-serine added to the cell culture can prevent the insertion of BMAA into neuroproteins. The possibility that L-serine could be used to prevent or slow ALS is now being studied.
“Even though L-serine occurs in our diet, its safety and efficacy for ALS patients should be properly determined through FDA-approved clinical trials before anyone advocates its use,” says American co-author Dr. Paul Cox.
In ALS, motor neurons in the brain and spinal cord die, progressively paralyzing the body until even swallowing and breathing become impossible.
The disease is relatively rare but has affected a number of high-profile people including Professor Stephen Hawking and Yankee baseball player Lou Gehrig.
"For many years scientists have linked BMAA to an increased risk of motor neuron disease but the missing pieces of the puzzle relate to how this might occur. Finally, we have one of those pieces," said Dr Sandra Banack, a co-author on the paper.
Bad experiences enhance memory formation about places, scientists at The University of Queensland have found.

Dr Oliver Baumann from the Queensland Brain Institute found that associating negative imagery with specific locations activates a part of the brain responsible for forming memory of places during navigation – the parahippocampal cortex.
“This heightened recall occurs automatically, without people even being aware that the negative imagery is affecting their memories,” said Dr Baumann, who worked on the study in the QBI’s Mattingley lab.
“It could serve as a cue for avoiding potential threats,” Dr Baumann said.
“Our findings show that emotions can exert a powerful influence on spatial and navigational memory for places.
“In future we might be able to boost memory functions by triggering the positive side-effects of emotional arousal, while avoiding the need for negative experiences.”
For the research, Professor Jason Mattingley built a “virtual house” and staged events in each room that were unrelated to the subject’s task of navigating the house.
The events were designed to elicit an emotional response – positive, negative, or neutral – and varied in their rate of occurrence.
“The events were illustrated using images from the International Affective Picture System library and included dramatic scenes of attack and threat, as well as more pleasant imagery,” Dr Baumann said.
The day after navigating through the house, participants viewed static images of the house without the emotional imagery, while their neural activity was recorded using an MRI scanner.
“The results showed that emotional arousal exerted a powerful influence on memory by enhancing parahippocampal activity,” Dr Baumann said.
The study was published in the Journal of Cognitive Neuroscience.
Several studies have shown that expecting a reward or punishment can affect brain activity in areas responsible for processing different senses, including sight or touch. For example, research shows that these brain regions light up on brain scans when humans are expecting a treat. However, researchers know less about what happens when the reward is actually received—or an expected reward is denied. Insight on these scenarios can help researchers better understand how we learn in general.

To get a better grasp on how the brain behaves when people who are expecting a reward actually receive it, or conversely, are denied it, Tina Weis of Carl-von-Ossietzky University and her colleagues monitored the auditory cortex—the part of the brain that processes and interprets sounds—while volunteers solved a task in which they had a chance of winning 50 Euro cents with each round, signaled by a specific sound. Their findings show that auditory cortex activity increased both when participants were expecting a reward and received it, and when their expectation of receiving no reward was confirmed.
The article is entitled “Feedback that Confirms Reward Expectation Triggers Auditory Cortex Activity.” It appears in the Articles in Press section of the Journal of Neurophysiology, published by the American Physiological Society.
Methodology
The researchers worked with 105 healthy adult volunteers with normal hearing. While each volunteer received a functional MRI (fMRI)—a brain scan that measures brain activity during tasks—the researchers had them solve a task with sounds where they had the chance of winning money at the end of each round. At the beginning of a round participants heard a sound and had to learn if this sound signified that they could win a 50 Euro cents reward or not. They then saw a number on a screen and had to press a button to indicate whether the number was greater or smaller than 5. If the sound before indicated that they could receive a reward and they solved the number task quickly and correctly, an image of a 50 Euro cents coin appeared on the screen. The researchers monitored brain activity in the subjects’ auditory cortex throughout the task, paying special attention to what happened when they received the reward, or not, at the end of the round.
Results
The study authors found that when the volunteers were expecting a reward and finally received it, their auditory cortex was activated. Similarly, there was an increase in brain activity in this area when the subjects weren’t expecting a reward and didn’t get one. There was no additional activity when they were expecting a reward and didn’t get one.
Importance of the Findings
These findings add to accumulating evidence that the auditory cortex performs a role beyond just processing sound. Rather, this area of the brain appears to be activated during other activities that require learning and thought, such as confirming expectations of receiving a reward.
"Our findings thus support the view of a highly cognitive role of the auditory cortex," the study authors say.
New research that looked at whether two commonly prescribed statin medicines, used to lower low-density lipoprotein (LDL) or ‘bad cholesterol’ levels in the blood, can adversely affect cognitive function has found that one of the drugs tested caused memory impairment in rats.

Between six and seven million people in the UK take statins daily and the findings follow anecdotal evidence of people reporting that they feel that their newly prescribed statin is affecting their memory. Last year, the US Food and Drug Administration (FDA) required all manufacturers to list possible effects on cognitive function among statins’ side effects.
The study, led by scientists at the University of Bristol and published in the journal PLOS ONE, tested pravastatin and atorvastatin (two commonly prescribed statins) in rat learning and memory models. The findings show that while atorvastatin produced no adverse effects on the rats’ performance of simple learning and memory tasks, pravastatin impaired their performance.
Rats were treated daily with pravastatin (brand name Pravachol) or atorvastatin (brand name Lipitor) for 18 days. The rodents were tested in a simple learning task before, during and after treatment, where they had to learn where to find a food reward. On the last day of treatment and following one week of withdrawal, the rats were also tested in a task which measures their ability to recognise a previously encountered object (recognition memory).
The study’s findings showed that pravastatin tended to impair learning over the last few days of treatment, although this effect was fully reversed once treatment ceased. In the novel object discrimination task, however, pravastatin impaired object recognition memory, while no effects were observed for atorvastatin in either task.
The results suggest that chronic treatment with pravastatin impairs working and recognition memory in rodents. The reversibility of the effects on stopping treatment is similar to what has been observed in patients, but the lack of effect of atorvastatin suggests that some types of statin may be more likely to cause cognitive impairment than others.
Neil Marrion, Professor of Neuroscience at Bristol’s School of Physiology and Pharmacology in the Faculty of Medical and Veterinary Sciences and the study’s lead author, said: “This finding is novel and likely reflects both the anecdotal reports and FDA advice. What is most interesting is that it is not a feature of all statins. However, in order to better understand the relationship between statin treatment and cognitive function, further studies are needed.”

Greg Noack was 24 when he moved from Ontario to Victoria, B.C. He had just graduated from college and was looking forward to a fresh start.
One early morning in 1996, as he was returning home from his graveyard shift at the hotel, Noack was attacked from behind by a group of men.
He doesn’t remember being struck on the head. He does remember waking from a 15-day coma to learn he had suffered a traumatic brain injury (TBI).
Noack, through the care of his health-care team, relearned how to walk, write, and feel particular emotions.
“I was enamoured by what my therapists were able to do for me,” said Noack. “I was lucky that I got back most of my function.”
Three years post-injury, Noack enrolled in Sault College’s Occupational Therapist Assistant/Physical Therapist Assistant Program and graduated with honours.
Shortly after, Noack was hired by the Toronto Rehab Acquired Brain Injury Rehab team as an occupational therapist assistant and later became a rehab therapist.
Most recently, he was seconded to Dr. Robin Green’s traumatic brain injury research team.
Dr. Green, Senior Scientist and Neuropsychologist, Toronto Rehab and Canada Research Chair in Traumatic Brain Injury, and her Toronto Rehab team have been studying impediments to brain injury recovery as well as treatments to offset the impediments.
Dr. Green’s work suggests that moderate-severe TBI may be a progressive neurological disorder – a whole new way of perceiving the condition.
“What may be occurring after a serious brain injury,” said Dr. Green, “is that damaged tissue is leaving healthy areas of the brain disconnected and understimulated. Over time, healthy areas may deteriorate.”
Importantly, they discovered that in people with chronic moderate-severe TBI, environmental enrichment – increased physical, social and cognitive stimulation – can offset this deterioration.
Her research paper, entitled “Environmental enrichment may protect against hippocampal atrophy in the chronic stages of traumatic brain injury,” was published September 24 in Frontiers in Human Neuroscience.
In their study of 25 patients with moderate-severe TBI, her team found a positive reaction to environmental enrichment.
Those who reported greater amounts of environmental enrichment – for example, reading, problem solving exercises, puzzles, physical activity, socializing – at 5 months after their injury showed less shrinkage of the hippocampus (associated with memory functioning) from 5 to 28 months post-injury.
“People with moderate-severe TBI are commonly unable to return to the same level of engagement in their work, school or social lives as before the injury,” said Dr. Green. “However, those with greater environmental enrichment may be keeping vulnerable areas stimulated. Environmental enrichment is also known to increase production of neurons in the hippocampus and to promote their integration into existing brain networks.”
Based on the findings from their study, Green’s team is now engaged in research designed to proactively offset deterioration, which includes the delivery of environmental enrichment to patients. Noack is instrumental in delivering enriched therapy for TBI patients who are enrolled in one of Dr. Green’s research studies.
“One thing I loved about this study is that it facilitated greater customization of a patient’s care,” said Noack. “I could see how my patients benefited from the increased amount of stimulation through extended therapy.”
“Although the brains of patients are showing negative changes, patients are still showing recovery of their functioning in spite of it,” said Dr. Green. “If we are able to offset the negative brain changes through the treatments we are developing, we may be able to very significantly improve patients’ recovery and the quality of their aging with a brain injury.”
Researchers from Penn Medicine and University of Oviedo Identify Molecular Pathway Linking ICU Ventilation to Brain Damage
At least 30 percent of patients in intensive care units (ICUs) suffer some form of mental dysfunction as reflected in anxiety, depression, and especially delirium. In mechanically ventilated ICU patients, the incidence of delirium is particularly high, about 80 percent, and may be due in part to damage in the hippocampus, though how ventilation increases the risk of damage and mental impairment has remained elusive.
Now, a new study published in the American Journal of Respiratory and Critical Care Medicine from researchers at the University of Oviedo in Spain, St. Michael’s Hospital in Toronto, Canada, and the Perelman School of Medicine at the University of Pennsylvania has found a molecular mechanism that may explain the connection between mechanical ventilation and hippocampal damage in ICU patients.
The investigators, including Adrian González-López, PhD, in the laboratory of Guillermo M. Albaiceta, MD, PhD at the University of Oviedo, and co-authored by Konrad Talbot, PhD, an assistant research professor in Neurobiology in the Department of Psychiatry at Penn Medicine, began by studying the hippocampus in control mice and in mice on low or high-pressure mechanical ventilation for 90 minutes. Compared to the controls, those on either low- or high-pressure ventilation showed evidence of neuronal cell death in the hippocampus, as a result of a cell suicide program called apoptosis.
Searching for the molecular cause of the ventilation-induced apoptosis, the team discovered that a well-known apoptosis trigger had been set off in the hippocampus of the ventilated animals. That trigger is dopamine-induced suppression of a molecule known as Akt, which normally acts to prevent neuronal apoptosis. Akt suppression was clearly evident in the hippocampus of the ventilated mice and was associated with a hyperdopaminergic state (increased levels of dopamine) in that brain area. The ventilated mice had elevated gene expression of the enzyme tyrosine hydroxylase, which is critical in synthesizing dopamine. The resulting rise in dopamine increases the strength of dopamine receptor activation in the hippocampus.
The investigators hypothesized that ventilation-induced apoptosis in the hippocampus was at least partly mediated by elevated activation of dopamine receptors in that brain area. This was confirmed by showing that pretreatment of mice with type 2 (D2) dopamine receptor blockers injected into the ventricles of the brain significantly reduced ventilation-induced apoptosis in the hippocampus.
How mechanical ventilation manages to affect the hippocampus was answered by experiments on mice in which the vagus cranial nerve connecting the lungs with the brain was severed. In these mice, mechanical ventilation had virtually no effect on levels of the dopamine-synthesizing enzyme or on apoptosis in the hippocampus.
The investigators then studied the consequences of ventilation and elevated hippocampal dopamine on dysbindin-1, a protein known to affect levels of cell surface D2 dopamine receptors, cognition, and possibly the risk of psychosis. High-pressure ventilation in mice caused an increase in gene expression of dysbindin-1C, and later, in protein levels of dysbindin-1C. Dopamine alone had similar effects on dysbindin-1C in hippocampal slice preparations, effects that were inhibited by D2 receptor blockers.
Since dysbindin-1 can lower cell-surface D2 receptors and protect against apoptosis, the authors speculate that increased dysbindin-1C expression in the ventilated mice may reflect compensatory responses to ventilation-induced hippocampal apoptosis. That possibility applies to ICU cases given the additional finding by the authors that total dysbindin-1 was increased in hippocampal neurons of ventilated compared to non-ventilated humans who died in the ICU.
The findings could lead to new therapeutic uses of established drugs and targets for new drugs that activate a molecular pathway mediating adverse effects of ICU ventilation on brain function.
“The results prove the existence of a pathogenic mechanism of lung stretch-induced hippocampal apoptosis that could explain the development of neurobehavioral disorders in patients exposed to mechanical ventilation,” the authors write. One of the coauthors, Dr. Talbot, adds: “The study indicates the need to reevaluate use of D2 receptor antagonists in minimizing the negative cognitive effects of mechanical ventilation in ICU patients and to evaluate the novel possibility that elevation in dysbindin-1C expression can also reduce those effects.”
The corresponding author, Dr. Albaiceta, offered a look at future research on this topic: “Now that we have established the mouse model, we are mainly looking for therapeutic approaches aimed at avoiding the vagal activation caused by mechanical ventilation and therefore prevent the deleterious effects observed in the hippocampus,” he said. “We are also interested in studying the relationship between the different described gene polymorphisms of dysbindin, Akt, and type 2 dopamine receptor versus the incidence of neurological disorders in patients on ventilation in ICUs. This could help us to identify susceptible individuals in whom a preventive treatment could be effective.”
For years, scientists have attempted to understand how Alzheimer’s disease harms the brain before memory loss and dementia are clinically detectable. Most researchers think this preclinical stage, which can last a decade or more before symptoms appear, is the critical phase when the disease might be controlled or stopped, possibly preventing the failure of memory and thinking abilities in the first place.

Important progress in this effort is reported in October in Lancet Neurology. Scientists at the Charles F. and Joanne Knight Alzheimer Disease Research Center at Washington University School of Medicine in St. Louis, working in collaboration with investigators at the University of Maastricht in the Netherlands, helped to validate a proposed new system for identifying and classifying individuals with preclinical Alzheimer’s disease.
Their findings indicate that preclinical Alzheimer’s disease can be detected during a person’s life, is common in cognitively normal elderly people and is associated with future mental decline and mortality. According to the scientists, this suggests that preclinical Alzheimer’s disease could be an important target for therapeutic intervention.
A panel of Alzheimer’s experts, convened by the National Institute on Aging in association with the Alzheimer’s Association, proposed the classification system two years ago. It is based on earlier efforts to define and track biomarker changes during preclinical disease.
According to the Washington University researchers, the new findings offer reason for encouragement, showing, for example, that the system can help predict which cognitively normal individuals will develop symptoms of Alzheimer’s and how rapidly their brain function will decline. But they also highlight additional questions that must be answered before the classification system can be adapted for use in clinical care.
“For new treatments, knowing where individuals are on the path to Alzheimer’s dementia will help us improve the design and assessment of clinical trials,” said senior author Anne Fagan, PhD, research professor of neurology. “There are many steps left before we can apply this system in the clinic, including standardizing how we gather and assess data in individuals, and determining which of our indicators of preclinical disease are the most accurate. But the research data are compelling and very encouraging.”
The classification system divides preclinical Alzheimer’s into three stages: stage 1, in which levels of amyloid beta, the protein that forms Alzheimer’s plaques, begin to change in the brain; stage 2, in which amyloid changes are accompanied by markers of neurodegeneration; and stage 3, in which these biomarker changes are joined by subtle cognitive decline.
The researchers applied these criteria to research participants studied from 1998 through 2011 at the Knight Alzheimer Disease Research Center. The center annually collects extensive cognitive, biomarker and other health data on normal and cognitively impaired volunteers for use in Alzheimer’s studies.
The scientists analyzed information on 311 individuals age 65 or older who were cognitively normal when first evaluated. Each participant was evaluated annually at the center at least twice; the participant in this study with the most data had been followed for 15 years.
At the initial testing, 41 percent of the participants had no indicators of Alzheimer’s disease (stage 0); 15 percent were in stage 1 of preclinical disease; 12 percent were in stage 2; and 4 percent were in stage 3. The remaining participants were classified as having cognitive impairments caused by conditions other than Alzheimer’s (23 percent) or did not meet any of the proposed criteria (5 percent).
“A total of 31 percent of our participants had preclinical disease,” said Fagan. “This percentage matches findings from autopsy studies of the brains of older individuals, which have shown that about 30 percent of people who were cognitively normal had preclinical Alzheimer’s pathology in their brain.”
Scientists believe the rate of cognitive decline increases as people move through the stages of preclinical Alzheimer’s. The new data support this idea. Five years after their initial evaluation, 11 percent of the stage 1 group, 26 percent of the stage 2 group, and 52 percent of the stage 3 group had been diagnosed with symptomatic Alzheimer’s.
Individuals with preclinical Alzheimer’s disease were six times more likely to die over the next decade than older adults without preclinical Alzheimer’s disease, but researchers don’t know why.
“Risk factors for Alzheimer’s disease might also be associated with other life-threatening illnesses,” Fagan said. “It’s also possible that the presence of Alzheimer’s hampers the diagnosis and treatment of other conditions or contributes to health problems elsewhere in the body. We don’t have enough data yet to say, but it’s an issue we’re continuing to investigate.”
Real-time Imaging Technique Provides Essential Molecular Picture of Protective Nerve Sheath
Researchers have made an exciting breakthrough, developing a first-of-its-kind imaging tool to examine myelin damage in multiple sclerosis (MS). Because MS is an extremely difficult disease to diagnose, the tool will help physicians diagnose patients earlier, monitor the disease’s progression, and evaluate therapy efficacy.

Case Western Reserve University School of Medicine scientists have developed a novel molecular probe detectable by positron emission tomography (PET) imaging. The new molecular marker, MeDAS, offers the first non-invasive visualization of myelin integrity of the entire spinal cord at the same time, as published today in an article in the Annals of Neurology.
“While MS originates in the immune system, the damage occurs to the myelin structure of the central nervous system. Our discovery brings new hope to clinicians who may be able to make an accurate diagnosis and prognosis in as little as a few hours compared to months or even years,” said Yanming Wang, PhD, senior author of the study and associate professor of radiology at Case Western Reserve University School of Medicine. “Because of its shape and size, it is particularly difficult to directly detect myelin damage in the spinal cord; this is the first time we have been able to image its function at the molecular level.”
As the most common acquired autoimmune disease, currently affecting more than two million people worldwide, MS is characterized by destruction of myelin, the membrane that protects nerves. Once myelin is damaged, it inhibits the nerves’ ability to transmit electrical impulses, causing cognitive impairment and mobility dysfunction. So far there is no cure for MS; the available therapies only modify the symptoms.
In addition to its role in monitoring the effects of myelin-repair drugs currently under development, the new imaging tool offers a real-time quantitative clinical diagnosis of MS. A long lag exists between the onset of disease, physical symptoms in the patient and diagnosis via behavioral testing and magnetic resonance imaging (MRI). The lesions, or plaques, as detected by MRI in the brain and spinal cord are not myelin-specific and thus poorly associated with a patient’s disease severity or progression. There is an urgent need to find a new imaging marker that correlates with a patient’s pathology.
“This discovery has opened the door to developing new drugs that can truly restore nerve function, not just modify the symptoms,” said Robert Miller, PhD, co-author on the study, vice president for research for Case Western Reserve and the Allen C. Holmes Professor of Neurological Diseases at the School of Medicine. “A cure for MS requires both repairing myelin and a tool to measure the mechanism.”
For the past 20 years, Miller’s lab has been working tirelessly to create new myelin-repair therapies that would restore nerve function. Successful translation of new drugs from animal studies to human clinical trials is contingent upon researchers’ ability to measure and evaluate the effectiveness of a therapy.
Created by Wang’s laboratory, the MeDAS molecular probe works like a homing device. Injected into the body intravenously, it is programmed to seek out and bind only to myelin in the central nervous system, i.e., the brain, spinal cord and optic nerves. A positron-emitting radioisotope label on the molecule allows a PET scanner to detect the targets and quantify their intensity and location. The data can then be reconstructed into an image as shown in the article: http://onlinelibrary.wiley.com/doi/10.1002/ana.23965/abstract.
“This is an indispensable tool to help find a new way to treat MS down the road,” said Chunying Wu, PhD, first author of the study and instructor of radiology at Case Western Reserve. “It can also be used as a platform technology to unlock the mysteries of other myelin-related diseases such as spinal cord injury.”
Our brains give us the remarkable ability to make sense of situations we’ve never encountered before—a familiar person in an unfamiliar place, for example, or a coworker in a different job role—but the mechanism our brains use to accomplish this has been a longstanding mystery of neuroscience.

Now, researchers at the University of Colorado Boulder have demonstrated that our brains could process these new situations by relying on a method similar to the “pointer” system used by computers. “Pointers” are used to tell a computer where to look for information stored elsewhere in the system to replace a variable.
For the study, published today in the Proceedings of the National Academy of Sciences, the research team relied on sentences with words used in unique ways to test the brain’s ability to understand the role familiar words play in a sentence even when those words are used in unfamiliar, and even nonsensical, ways.
For example, in the sentence, “I want to desk you,” we understand the word “desk” is being used as a verb even though our past experience with the word “desk” is as a noun.
“The fact that you understand that the sentence is grammatically well formed means you can process these completely novel inputs,” said Randall O’Reilly, a professor in CU-Boulder’s Department of Psychology and Neuroscience and co-author of the study. “But in the past when we’ve tried to get computer models of a brain to do that, we haven’t been successful.”
This shows that human brains are able to understand the sentence as a structure with variables—a subject, a verb and often, an object—and that the brain can assign a wide variety of words to those variables and still understand the sentence structure. But the way the brain does this has not been understood.
Computers routinely complete similar tasks. In computer science, for example, a computer program could create an email form letter that has a pointer in the greeting line. The pointer would then draw the name information for each individual recipient into the greeting being sent to that person.
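The pointer idea the researchers invoke can be sketched in a few lines of Python. This is only an illustrative analogy, not the authors' neural model: the role names ("subject", "verb", "object") and the sentence frame below are invented for illustration. A small table maps each role to whatever value is currently bound to it, so the same sentence frame can be filled with novel words, including a noun like "desk" pressed into service as a verb.

```python
# A tiny "pointer" table: each grammatical role points at whatever
# value is currently stored for it, so one sentence frame can be
# reused with arbitrary, even novel, fillers.
memory = {}

def bind(role, value):
    """Re-bind a role; the role now 'points at' the new value."""
    memory[role] = value

def read_sentence():
    # The frame names roles, not words; words are looked up indirectly.
    return f'{memory["subject"]} want to {memory["verb"]} the {memory["object"]}'

bind("subject", "I")
bind("verb", "desk")      # a noun used as a verb, as in the study's stimuli
bind("object", "report")
print(read_sentence())    # the roles resolve through the pointer table
```

Re-binding a single role changes the sentence without touching the frame, which is the kind of flexible variable substitution the study attributes to prefrontal-basal ganglia circuits.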
In the new study, led by Trenton Kriete, a postdoctoral researcher in O’Reilly’s lab, the scientists show that the connections in the brain between the prefrontal cortex and the basal ganglia could play a similar role to the pointers used in computer science. The researchers added new information about how the connections between those two regions of the brain could work into their model.
The result was that the model could be trained to understand simple sentences using a select group of words. After the training period, the researchers fed the model new sentences using familiar words in novel ways and found that the model could still comprehend the sentence structure.
While the results show that a pointer-like system could be at play in the brain, the function is not identical to the system used in computer science, the scientists said. It’s similar to comparing an airplane’s wing and a bird’s wing, O’Reilly said. They’re both used for flying but they work differently.
In the brain, for example, the pointer-like system must still be learned. The brain has to be trained, in this case, to understand sentences while a computer can be programmed to understand sentences immediately.
“As your brain learns, it gets better and better at processing these novel kinds of information,” O’Reilly said.
Working with mice, Johns Hopkins researchers have discovered that weeks of treatment with a repurposed FDA-approved drug halted the growth of — and ultimately left no detectable trace of — brain tumor cells taken from adult human patients.
The scientists targeted a mutation in the IDH1 gene first identified in human brain tumors called gliomas by a team of Johns Hopkins cancer researchers in 2008. This mutation was found in 70 to 80 percent of lower-grade and progressive forms of the brain cancer. The change occurs within a single spot along a string of thousands of genetic coding letters, and is disruptive enough to keep the seemingly innocuous protein from playing its role in converting glucose into energy. Instead, the mutation hijacks the protein to make a new molecule not normally found in the cell, which is apparently a linchpin in the process of forming and maintaining cancer cells.
Encouraged by the new findings, described online Sept. 16 in the open-access journal Oncotarget, the Johns Hopkins researchers say they want to work quickly to design a clinical trial to bring what they learned in mice to humans with gliomas. Despite the growing understanding of IDH1 mutant gliomas, the development of effective therapies has proven challenging, they say.
"Usually in the lab, we’re happy to see a drug slow down tumor growth," says Alexandra Borodovsky, a graduate student in the Cellular and Molecular Medicine Program at the Johns Hopkins University School of Medicine who performed the experiments. "We never expect tumors to regress, but that is exactly what happened here."
"This therapy has worked amazingly well in these mice," says study leader Gregory J. Riggins, M.D., Ph.D., a professor of neurosurgery and oncology at the Johns Hopkins University School of Medicine. "We have spoken with neurosurgeons here, and as soon as possible, we want to start discussing the parameters of a clinical trial to see if this will work in our patients as a follow-up to surgery."
The researchers caution that many treatments have cured cancers in mice, and then failed in humans.
The IDH1 gene, whose name stands for isocitrate dehydrogenase 1, produces an enzyme that regulates cell metabolism. Mutations, or changes in the DNA code, force the IDH1 gene to increase production of a flawed version of the enzyme. The flawed enzyme produces large amounts of an entirely new molecule, called 2-hydroxyglutarate. This molecule is believed to cause groups of atoms called methyl groups to latch onto the DNA strand.
Although methylation is a normal cellular process, when too many methyl groups glom onto the DNA, Riggins says, this can interfere with normal cell biology and eventually contribute to cancer formation and growth.
Borodovsky, Riggins and their colleagues — including Timothy A. Chan, M.D., Ph.D., of Memorial Sloan-Kettering Cancer Center in New York — thought that a drug that could strip those methyl groups might be able to reverse the cancer process in those cancers with IDH1 mutations. They chose 5-azacytidine, which is approved to treat a pre-leukemia condition called myelodysplastic syndrome and is being tested on lung and other cancers at Johns Hopkins and elsewhere.
Riggins notes that one of the difficulties in developing treatments for IDH1 mutant brain cancers is finding a model in which to study them. Cell lines containing the IDH1 mutation are difficult to grow in the laboratory, for example. Borodovsky worked with Johns Hopkins neurosurgeons to obtain tumor cells from glioma patients likely to have IDH1 mutations and injected them under the skin of mice. She did this for months before finally getting the tumor cells to grow.
Once the tumors grew, the researchers injected the mice with 5-azacytidine for 14 weeks and saw a dramatic reduction in growth and what appeared to be complete regression. Then they withdrew therapy. Seven weeks later, the tumors had not regrown. The researchers, however, said they do expect the tumors to regrow at some point, and are still monitoring the mice.
The type of tumor targeted by the researchers eventually progresses to a subtype of glioblastoma multiforme — the deadliest form of brain cancer — known as progressive or secondary glioblastoma. These tumors arise as lower-grade gliomas and are initially treated with surgery alone, but eventually they progress to the more lethal form. Survival is longer than with primary glioblastoma, and the progressive form tends to occur in younger patients, those under the age of 50. While both types of tumor look the same at the end stage, they look very different at the molecular level, Riggins says, leading researchers to believe they may have a better chance at targeting the progressive tumors, which are more likely to carry the IDH1 mutation.
Chan’s team at Sloan-Kettering simultaneously published a paper in Oncotarget, along with Borodovsky and Riggins, which describes similar results in a different animal model using a similar drug. This is further evidence that the strategy is a sound one, Riggins says.
Humans and other mammals show particularly intensive sleeping patterns during puberty. The brain also matures fastest in this period. But when pubescent rats are administered caffeine, the maturing processes in their brains are delayed. This is the result of a study supported by the Swiss National Science Foundation (SNSF).

Children’s and young adults’ average caffeine consumption has increased by more than 70 per cent over the past 30 years, and an end to this rise is not in sight: the drinks industry is posting its fastest-growing sales in the segment of caffeine-laden energy drinks. Not everybody is pleased about this development. Some people are worried about possible health risks caused in young consumers by the pick-me-up.
Researchers led by Reto Huber of the University Children’s Hospital Zurich are now adding new arguments to the debate. Their recently published study, conducted on rats, calls for caution: in pubescent rodents, caffeine intake equivalent to three to four cups of coffee per day in humans results in reduced deep sleep and delayed brain development.
Peak level during puberty
Both in humans and in rats, the duration and intensity of deep sleep as well as the number of synapses or connections in the brain increase during childhood, reaching their highest level during puberty and dropping again in adult age. “The brain of children is extremely plastic due to the many connections,” says Huber. When the brain then begins to mature during puberty, a large number of these connections are lost. “This optimisation presumably occurs during deep sleep. Key synapses extend, others are reduced; this makes the network more efficient and the brain more powerful,” says Huber.
Timid instead of curious
Huber’s group of researchers administered moderate quantities of caffeine to 30-day-old rats over five days and recorded the electrical activity of their brains. The deep-sleep periods, which are characterised by slow waves, were reduced from day 31 until day 42, i.e. well beyond the end of caffeine administration. Compared with rats given pure drinking water, the caffeine-drinking animals had far more neural connections in their brains at the end of the study. The slower maturing process in the brain also had an impact on behaviour: rats normally become more curious with age, but the rats consuming caffeine remained timid and cautious.
The brain goes through a delicate maturing phase in puberty, during which many mental diseases can break out. And even if the rat brain differs clearly from that of humans, the many parallels in how the brains develop raise the question as to whether children’s and young adults’ caffeine intake really is harmless or whether it might be wiser to abstain from consuming the pick-me-up. “There is still need for research in this area,” says Huber.
In a landmark discovery, the final piece in the puzzle of understanding how the brain circuitry vital to normal fertility in humans and other mammals operates has been put together by researchers at New Zealand’s University of Otago.
Their new findings, which appear in the leading international journal Nature Communications, will be critical to enabling the design of novel therapies for infertile couples as well as new forms of contraception.
The research team, led by Otago neuroscientist Professor Allan Herbison, have discovered the key cellular location of signalling between a small protein known as kisspeptin and its receptor, called Gpr54. Kisspeptin had earlier been found to be crucial for fertility in humans, and in a subsequent major breakthrough Professor Herbison showed that this molecule was also vital for ovulation to occur.
In the latest research, Professor Herbison and colleagues at Otago and Heidelberg University, Germany, provide conclusive evidence that the kisspeptin-Gpr54 signalling occurs in a small population of nerve cells in the brain called gonadotropin-releasing hormone (GnRH) neurons.
Using state-of-the-art techniques, the researchers studied mice that lacked Gpr54 receptors in only their GnRH neurons and found that these mice did not undergo puberty and were infertile. They then showed that the infertile mice could be restored to completely normal fertility by inserting the Gpr54 gene into just the GnRH neurons.
Professor Herbison says the findings represent a substantial step forward in enabling new treatments for infertility and new classes of contraceptives to be developed.
"Infertility is a major issue affecting millions of people worldwide. It’s currently estimated that up to 20 per cent of New Zealand couples are infertile, and it is thought that up to one-third of all cases of infertility in women involve disorders in the area of brain circuitry we are studying.
"Our new understanding of the exact mechanism by which kisspeptin acts as a master controller of reproduction is an exciting breakthrough which opens up avenues for tackling what is often a very heart-breaking health issue. Through detailing this mechanism we now have a key chemical switch to which drugs can be precisely targeted," Professor Herbison says.
As well as the findings’ benefits for advancing new therapies for infertility and approaches to controlling fertility, they suggest that targeting kisspeptin may be valuable in treating diseases such as prostate cancer that are influenced by sex steroid hormone levels in the blood, he says.
Professor Herbison noted that the research findings represent a long-standing collaborative effort with the laboratory of Professor Gunther Schutz at Heidelberg University, Germany.
Professor Herbison is Director of the University’s Centre for Neuroendocrinology, which is the world-leading research centre investigating how the brain controls fertility.
"We are delighted to have published this work in one of the top scientific journals and also to be able to maintain the leading role of New Zealand researchers in understanding fertility control," he says.
The toxoplasma parasite can be deadly, causing spontaneous abortion in pregnant women or killing immune-compromised patients, but it has even stranger effects in mice.

Infected mice lose their fear of cats, which is good for both cats and the parasite, because the cat gets an easy meal and the parasite gets into the cat’s intestinal tract, the only place it can sexually reproduce and continue its cycle of infection.
New research by graduate student Wendy Ingram at the University of California, Berkeley, reveals a scary twist to this scenario: the parasite’s effects seem to be permanent. The fearless behavior in mice persists long after the mouse recovers from the flu-like symptoms of toxoplasmosis, and for months after the parasitic infection is cleared from the body, according to research published today (Sept. 18) in the journal PLoS ONE.
“Even when the parasite is cleared and it’s no longer in the brains of the animals, some kind of permanent long-term behavior change has occurred, even though we don’t know what the actual mechanism is,” Ingram said. She speculated that the parasite could damage the smell center of the brain so that the odor of cat urine can’t be detected. The parasite could also directly alter neurons involved in memory and learning, or it could trigger a damaging host response, as in many human autoimmune diseases.
Ingram became interested in the protozoan parasite, Toxoplasma gondii, after reading about its behavior-altering effects in mice and rats and possible implications for its common host, the domesticated cat, and even humans. One-third of people around the world have been infected with toxoplasma and probably have dormant cysts in their brains. Kept in check by the body’s immune system, these cysts sometimes revive in immune-compromised people, leading to death, and some preliminary studies suggest that chronic infection may be linked to schizophrenia or suicidal behavior.
Pregnant women are already warned to steer clear of kitty litter, since the parasite is passed through cat feces and can cause blindness or death in the fetus. One main source of spread is undercooked pork, Ingram said.
With the help of Michael Eisen and Ellen Robey, UC Berkeley professors of molecular and cell biology, Ingram set out three years ago to discover how toxoplasma affects mice’s hard-wired fear of cats. She tested mice by seeing whether they avoided bobcat urine, which is normal behavior, versus rabbit urine, to which mice don’t react. While earlier studies showed that mice lose their fear of bobcat urine for a few weeks after infection, Ingram showed that the three most common strains of Toxoplasma gondii make mice less fearful of cats for at least four months.
Using a genetically altered strain of toxoplasma that is not able to form cysts and thus is unable to cause chronic infections in the brain, she demonstrated that the effect persisted for four months even after the mice completely cleared the microbe from their bodies. She is now looking at how the mouse immune system attacks the parasite to see whether the host’s response to the infection is the culprit.
“This would seem to refute – or at least make less likely – models in which the behavior effects are the result of direct physical action of parasites on specific parts of the brain,” Eisen wrote in a blog post about the research.
“The idea that this parasite knows more about our brains than we do, and has the ability to exert desired change in complicated rodent behavior, is absolutely fascinating,” Ingram said. “Toxoplasma has done a phenomenal job of figuring out mammalian brains in order to enhance its transmission through a complicated life cycle.”
Brain regions associated with memory shrink as adults age, and this size decrease is more pronounced in those who go on to develop neurodegenerative disease, reports a new study published Sept. 18 in the Journal of Neuroscience. The volume reduction is linked with an overall decline in cognitive ability and with increased genetic risk for Alzheimer’s disease, the authors say.

Image: Network of brain regions, highlighted in red and yellow, show atrophy in both healthy aging and neurodegenerative disease. The regions highlighted are susceptible to normal aging and dementia.
“Our results identify a specific pattern of structural brain changes that may provide a possible brain marker for the onset of Alzheimer’s disease,” said Nathan Spreng, assistant professor of human development and the Rebecca Q. and James C. Morgan Sesquicentennial Faculty Fellow in Cornell’s College of Human Ecology.
The study is one of the first to measure structural changes in a collection of brain regions – not just one single area – over the adult life course and from normal aging to neurodegenerative disease, said Spreng, who co-authored the study with Gary R. Turner of York University in Toronto.
Overall, they studied brain data from 848 individuals spanning the adult lifespan, using data from the Open Access Series of Imaging Studies and the Alzheimer’s Disease Neuroimaging Initiative (ADNI). About half of the ADNI sample was assessed multiple times over several years, allowing the researchers to measure brain changes over time and determine who did and did not progress to dementia.
The researchers found that brain volume in the default network (a set of brain regions associated with internally generated thoughts such as memory) declined in both healthy and pathological aging. The decline was greatest in Alzheimer’s patients and in those who progressed from mild cognitive impairment to Alzheimer’s disease. Reduced brain volumes in these regions were associated with declines in cognitive ability, with the presence of known biological markers of Alzheimer’s disease, and with carrying the APOE4 variant of the APOE gene, a known risk factor for Alzheimer’s.
“While elements of the default network have previously been implicated in aging and neurodegenerative disease, few studies have examined broad network changes over the full adult life course with such large participant samples and including both behavioral and genetic data,” said Spreng. “Our findings provide evidence for a network-based model of neurodegenerative disease, in which progressive brain changes spread through networks of connected brain regions.”
Scientists at the University of Alabama at Birmingham have identified a molecular pathway that seems to contribute to the ability of malignant glioma cells in a brain tumor to spread and invade previously healthy brain tissue. Researchers said the findings, published Sept. 19, 2013, in the journal PLOS ONE, provide new drug-discovery targets to rein in the ability of these cells to move.

Gliomas account for about a third of brain tumors, and survival rates are poor; only about half of the 10,000 Americans diagnosed with malignant glioma survive the first year, and only about one quarter survive for two years.
“Malignant gliomas are notorious, not only because of their resistance to conventional chemotherapy and radiation therapy, but also for their ability to invade the surrounding brain, thus causing neurological impairment and death,” said Hassan Fathallah-Shaykh, M.D., Ph.D., associate professor in the UAB Department of Neurology. “Brain invasion, a hallmark of gliomas, also helps glioma cells evade therapeutic strategies.”
Fathallah-Shaykh said there is a great deal of interest among scientists in the idea that a low-oxygen environment induces glioma cells to react with aggressive movement, migration and brain invasion. A relatively new cancer strategy to shrink tumors is to cut off the tumor’s blood supply – and thus its oxygen source – through the use of anti-angiogenesis drugs. Angiogenesis is the process of making new blood vessels.
“Stop angiogenesis and you shut off a tumor’s blood and oxygen supply, denying it the components it needs to grow,” said Fathallah-Shaykh. “Drugs that stop angiogenesis are believed to create a kind of killing field. This study identified four glioma cell lines that dramatically increased their motility when subjected to a low-oxygen environment – in effect escaping the killing field to create a new colony elsewhere in the brain.”
Fathallah-Shaykh and his team then identified two proteins that form a pathway linking low oxygen, or hypoxia, to increased motility.
“We identified a signaling protein that is activated by hypoxia called Src,” said Fathallah-Shaykh. “We also identified a downstream protein called neural Wiskott-Aldrich syndrome protein (N-WASP), which is regulated by Src in the cell lines with increased motility.”
The researchers then used protein inhibitors to shut off Src and N-WASP. When either protein was inhibited, low oxygen lost its ability to augment cell movement.
“These findings indicate that Src, N-WASP and the linkage between them – which is something we don’t fully understand yet – are key targets for drugs that would interfere with the ability of a cell to move,” said Fathallah-Shaykh. “If we can stop them from moving, then techniques such as anti-angiogenesis should be much more effective. Anti-motility drugs could be a key component in treating gliomas in the years to come.”
Researchers at UT Southwestern Medical Center have identified a cellular switch that potentially can be turned off and on to slow down, and eventually inhibit the growth of the most commonly diagnosed and aggressive malignant brain tumor.

Findings of their investigation show that the protein RIP1 acts as a mediator of brain tumor cell survival, either protecting or destroying cells. Researchers believe that the protein, found in most glioblastomas, can be targeted to develop a drug treatment for these highly malignant brain tumors. The study was published online Aug. 22 in Cell Reports.
"Our study identifies a new mechanism involving RIP1 that regulates cell division and death in glioblastomas," said senior author Dr. Amyn Habib, associate professor of neurology and neurotherapeutics at UT Southwestern, and staff neurologist at VA North Texas Health Care System. "For individuals with glioblastomas, this finding identified a target for the development of a drug treatment option that currently does not exist."
In the study, researchers used animal models to examine the interactions of the cell receptor EGFRvIII and RIP1. Both are used to activate NFκB, a family of proteins that is important to the growth of cancerous tumor cells. When RIP1 is switched off in the experimental model, NFκB and the signaling that promotes tumor growth is also inhibited. Furthermore, the findings show that RIP1 can be activated to divert cancer cells into a death mode so that they self-destruct.
According to the American Cancer Society, about 30 percent of brain tumors are gliomas, a fast-growing, treatment-resistant type of tumor that includes glioblastomas, astrocytomas, oligodendrogliomas, and ependymomas. In many cases, survival is tied to novel clinical trial treatments and research that will lead to drug development.
Research on synapse stabilization could aid understanding of autism, schizophrenia, intellectual disability

When we’re born, our brains aren’t very organized. Every brain cell talks to lots of other nearby cells, sending and receiving signals across connections called synapses.
But as we grow and learn, things get a bit more stable. The brain pathways that will serve us our whole lives start to organize, and less-active, inefficient synapses shut down.
But why and how does this happen? And what happens when it doesn’t go normally? New research from the University of Michigan Medical School may help explain.
In a new paper in Nature Neuroscience, a team of U-M neuroscientists reports important findings about how brain cells called neurons keep their most active connections with other cells, while letting other synapses lapse.
Specifically, they show that SIRP alpha, a protein found on the surface of various cells throughout the body, appears to play a key role in the process of cementing the most active synaptic connections between brain cells. The research, done in mouse brains, was funded by the National Institutes of Health and several foundations.
The findings boost understanding of basic brain development – and may aid research on conditions like autism, schizophrenia, epilepsy and intellectual disability, all of which have some basis in abnormal synapse function.
“For the brain to be really functional, we need to keep the most active and most efficient connections,” says senior author Hisashi Umemori, M.D., Ph.D., a research assistant professor at U-M’s Molecular and Behavioral Neuroscience Institute and assistant professor of biological chemistry in the Medical School. “So, during development it’s crucial to establish efficient connections, and to eliminate inactive ones. We have identified a key molecular mechanism that the brain uses to stabilize and maturate the most active connections.”
Umemori says the new findings on SIRP alpha grew directly out of previous work on competition between neurons, which enables the most active ones to become part of pathways and circuits.
The team suspected that there must be some sort of signal between the two cells on either side of each synapse — something that causes the most active synapses to stabilize. So they set out to find out what it was.
SIRP-rise findings
The group had previously shown that SIRP-alpha was involved in some way in a neuron’s ability to form a presynaptic nerve terminal – an extension of the cell that reaches out toward a neighboring cell, and can send the chemical signals that brain cells use to talk to one another.
SIRP-alpha is also already known to serve an important function in the rest of the body – essentially, helping normal cells tell the immune system not to attack them. It may also help cancer cells evade detection by the immune system’s watchdogs.
In the new study, the team studied SIRP alpha function in the brain – and started to understand its role in synapse stabilization. They focused on the hippocampus, a region of the brain very important to learning and memory.
Through a range of experiments, they showed that when a brain cell receives signals from a neighboring cell across a synapse, it actually releases SIRP-alpha into the space between the cells. It does this through the action of molecules inside the cell – called CaMK and MMP – that act like molecular scissors, cutting a SIRP-alpha protein in half so that it can float freely away from the cell.
The part of the SIRP-alpha protein that floats into the synapse “gap” latches on to a receptor on the other side, called a CD47 receptor. This binding, in turn, appears to tell the cell that the signal it sent earlier was indeed received – and that the synapse is a good one. So, the cell brings more chemical signaling molecules down that way, and releases them into the synapse.
As more and more nerve messages travel between the “sending” and “receiving” cells on either side of that synapse, more SIRP-alpha gets cleaved, released into the synapse, and bound to CD47.
The researchers believe this repeated process is what helps the cells determine which synapses to keep – and which to let wither.
Umemori says the team next wants to look at what happens when SIRP-alpha doesn’t get cleaved as it should – and at what’s happening in cells when a synapse gets eliminated.
“This step of shedding SIRP-alpha must be critical to developing a functional neural network,” he says. “And if it’s not done well, disease or disorders may result. Perhaps we can use this knowledge to treat diseases caused by defects in synapse formation.”
He notes that the gene for the CD47 receptor is found in the same general area of our DNA as several genes that are suspected to be involved in schizophrenia.
If the development of our nervous system is disturbed, we risk developing serious neurological diseases, impairing our sensory systems, movement control or cognitive functions. This is true for all organisms with a well-developed nervous system, from man to worm. New research from BRIC, University of Copenhagen reveals how a tiny molecule called mir-79 regulates neural development in roundworms. The molecule is required for correct migration of specific nerve cells during development and malfunction causes defects in the nervous system of the worm. The research has just been published in the journal Science.
Hundreds of worms lie in a small plastic plate under the laboratory microscope. Over the last three years, the group of Associate Professor Roger Pocock has used the roundworm C. elegans to study the development of the nervous system. They have just made an important discovery.

“Our new results show that a small molecule called mir-79 is indispensable for development of the worm’s nervous system. mir-79 acts by equipping special signal molecules with a transmitter, which tells the nerve cells how they should migrate during development of the worm. If we remove mir-79 with gene technology, development of the worm nervous system goes awry”, says postdoc Mikael Egebjerg Pedersen, who is responsible for the experimental studies.
mir-79 adds just the right combination of sugar
The research shows that mir-79 acts by controlling the addition of certain groups of sugars to selected signaling molecules. In the world of cells, sugar molecules act as transmitters.

When the nerve cells come into contact with the sugar-transmitters, they are informed where to locate themselves during neural development. If the researchers remove mir-79, the migration of the nerve cells is misguided causing neuronal defects in the worms.
“It has earlier been shown that signaling molecules guide nerve migration, but our research shows that mir-79 regulates nerve cell migration by controlling the correct balance of sugar-transmitters on signaling molecules. If mir-79 does not function, the worm nervous system is malformed. In the wild, such defects would be harmful for worm survival”, explains Roger Pocock who leads the research group behind the finding.
Worm studies reveal important clues for neuronal repair
A version of mir-79 called mir-9 is found in humans. Therefore, these results are important for understanding how our nervous system develops during fetal development. In addition, the results add to the understanding of how nerve cells may be stimulated to repair damage in our brain or spinal cord.
“Our nervous system is a tissue which is not easily repaired after damage. So, understanding how certain molecular cues can stimulate nerve cells to migrate is an important piece of the puzzle. This will enable us to understand how nerve tissue can be regenerated after, for example, a stroke or an accident. If we can use such knowledge to mimic the signals, we may be able to stimulate nerve cells to migrate into a damaged area”, says Roger Pocock.
Worms are a fantastic model to study how the nervous system develops and how nerve cells form neuronal circuits. Most of the genes that control nervous system development in the worm are also found in humans. At the same time, the reduced complexity of the worm nervous system allows researchers to investigate central biological mechanisms. With new technologies they can mark single cells or molecules, and as worms are transparent, the researchers can track the marked molecules or cells live during worm development.
The next step for the researchers is to investigate how the regulatory pathway they have revealed is regulated in cultures of human cells.
NIH-funded discovery began with asking how the brain learns to see
A class of proteins that controls visual system development in the young brain also appears to affect vulnerability to Alzheimer’s disease in the aging brain. The proteins, which are found in humans and mice, join a limited roster of molecules that scientists are studying in hopes of finding an effective drug to slow the disease process.

Image: PirB (red) is heavily concentrated on the surface of growing nerve cells. Courtesy of Dr. Carla Shatz, Stanford.
"People are just beginning to look at what these proteins do in the brain. While more research is needed, these proteins may be a brand new target for Alzheimer’s drugs," said Carla Shatz, Ph.D., the study’s lead investigator. Dr. Shatz is a professor of biology and neurobiology at Stanford University in California, and the director of Stanford’s interdisciplinary biosciences program, BioX.
She and her colleagues report that LilrB2 (pronounced “leer-bee-2”) in humans and PirB (“peer-bee”) in mice can physically partner with beta-amyloid, a protein fragment that accumulates in the brain during Alzheimer’s disease. This in turn triggers a harmful chain reaction in brain cells. In a mouse model of Alzheimer’s, depleting PirB in the brain prevented the chain reaction and reduced memory loss.
The research was funded in part by the National Eye Institute, the National Institute on Aging (NIA), and the National Institute of Neurological Disorders and Stroke (NINDS), all part of the National Institutes of Health. It is reported in the Sept. 20 issue of Science.
"These findings provide valuable insight into Alzheimer’s, a complex disorder involving the abnormal build-up of proteins, inflammation and a host of other cellular changes," said Neil Buckholtz, Ph.D., director of the neuroscience division at NIA. "Our understanding of the various proteins involved, and how these proteins interact with each other, may one day result in effective interventions that delay, treat or even prevent this dreaded disease."
Alzheimer’s disease is the most common cause of dementia in older adults, and affects as many as 5 million Americans. Large clumps—or plaques—of beta-amyloid and other proteins accumulate in the brain during Alzheimer’s, but many researchers believe the disease process starts long before the plaques appear. Even in the absence of plaques, beta-amyloid has been shown to cause damage to brain cells and the delicate connections between them.
Dr. Shatz’s discovery took a unique path. She is a renowned neuroscientist, but Alzheimer’s disease is not her focus area. For decades, she has studied plasticity—the brain’s capacity to learn and adapt—focusing mostly on the visual system.
"Dr. Shatz has always been a leader in the field of plasticity, and now she’s taken yet another innovative step—giving us new insights into the abnormal plasticity that occurs in Alzheimer’s disease," said Michael Steinmetz, Ph.D., a program director at NEI. "These findings rest squarely on basic research into the development of the visual system." NEI has funded Dr. Shatz for more than 35 years.
During development, the eyes compete to connect within a limited territory of the brain—a process known as ocular dominance plasticity. The competition takes place during a limited time in early life. If visual experience through one eye is impaired during that time—for example, by a congenital cataract (present from birth)—it can permanently lose territory to the other eye.
"Ocular dominance is a classic example of how a brain circuit can change with experience," Dr. Shatz said. "We’ve been trying to understand it at a molecular level for a long time."
Her search eventually led to PirB, a protein on the surface of nerve cells in the mouse brain. She discovered that mice without the gene for PirB have an increase in ocular dominance plasticity. In adulthood, when the visual parts of their brains should be mature, the connections there are still flexible. This established PirB as a “brake on plasticity” in the healthy brain, Dr. Shatz said.
It wasn’t long before she began to wonder if PirB might also put a brake on plasticity in Alzheimer’s disease. In the current study, she pursued that question with Taeho Kim, Ph.D., a postdoctoral fellow in her lab, and Christopher M. William, M.D., Ph.D., a neuropathology fellow at Massachusetts General Hospital in Boston. Bradley Hyman, M.D., Ph.D., a professor of neurology at Mass General, was a collaborator on the project.
First, the team repeated the genetic experiment that Dr. Shatz had done in normal mice—but this time, they deleted the PirB gene in the Alzheimer’s mice. By about nine months of age, these mice typically develop learning and memory problems. But that didn’t happen in the absence of PirB.
Next, the researchers began thinking about how PirB might fit into the Alzheimer’s disease process, and particularly how it might interact with beta-amyloid. Dr. Kim theorized that since PirB resides on the surface of nerve cells, it might act as a binding site—or receptor—for beta-amyloid. Indeed, he found that PirB binds tightly to beta-amyloid, especially to tiny clumps of it that are believed to ultimately grow into plaques.
Beta-amyloid is known to weaken synapses—the connections between nerve cells. The researchers found that PirB appears to be an accomplice in this process. Without PirB, synapses in the mouse brain were resistant to the effects of beta-amyloid. Other experiments showed that binding between PirB and beta-amyloid can trigger a cascade of harmful reactions that can lead to the breakdown of synapses.
Although PirB is a mouse protein, humans have a closely related protein called LilrB2. The researchers found that this protein also binds tightly to beta-amyloid. By examining brain tissue from people with Alzheimer’s disease, they also found evidence that LilrB2 may trigger the same harmful reactions that PirB can trigger in the mouse brain.
"These are novel results, and direct interaction between beta-amyloid and PirB-related proteins opens up welcome avenues for investigating new drug targets for Alzheimer’s disease," said Roderick Corriveau, Ph.D., a program director at NINDS.
Dr. Shatz said she hopes to interest other researchers in developing drugs to block PirB and LilrB2. Currently, no drugs treat the underlying causes of Alzheimer’s disease. Most of the interventions that have reached clinical testing are designed to clear away beta-amyloid. To date, only two other beta-amyloid receptors (PrP-C and EphB2) have been found and are being pursued as drug targets.
Clock’s rhythm ensures steady energy supply to cells during times of fasting
Each of our cells contains energy furnaces called mitochondria. A Northwestern University-led research team now has identified a new mode of timekeeping that involves priming the cell’s furnace to properly use stored fuel when we are not eating.
The interdisciplinary team has identified the “match” and “flint” responsible for lighting this tiny furnace. And the match is only available when the circadian clock says so, underscoring the importance of the biological timing system to metabolism.
“Circadian clocks are with us on Earth because they have everything to do with energy,” said Joe Bass, M.D., who led the research. “If an organism burns its energy efficiently, it has a better chance of survival. Our results tell us how the circadian clock triggers the cell’s energy-burning process. Cells are most capable of using fuel when the clock is working properly.”
Bass is the Charles F. Kettering Professor and chief of the division of endocrinology, metabolism and molecular medicine at Northwestern University Feinberg School of Medicine and an endocrinologist at Northwestern Memorial Hospital.
Mitochondria regulate the supply of energy to cells when we are at rest, with no glucose available from food. In a study of mice, the researchers found that the circadian clock supplies the match to light the furnace and on the match tip is a critical compound called NAD+. It combines with an enzyme in mitochondria called Sirtuin 3, which acts as the flint, to light the furnace. When the clock in an animal isn’t working, the animal can’t metabolize stored energy and the process doesn’t ignite.
This pathway through which the body clock controls activities within the mitochondria shows how energy generation is tied tightly to the light-dark/activity-rest cycle each day.
The findings, which could be useful in the development of therapies to treat metabolic disorders related to circadian disruption, are published today (Sept. 19) by the journal Science.
The results demonstrate that the circadian clock, a genetic timekeeper that evolved early in life’s history to enable organisms to track the daily transition from light to darkness, generates oscillations in mitochondrial energy capacity through rhythmic regulation of NAD+ biosynthesis.
The clock facilitates oxidative rhythms that anticipate an animal’s fasting/feeding cycle, which occurs during the daily transition from light to darkness and wakefulness to sleep, and, in so doing, prevents the cell from “starving” during the night.
To understand how mitochondria are affected by circadian clock disruption, the researchers genetically removed the clocks in laboratory mice and compared them to controls. Both groups of mice were studied in a state of fasting; this “stress” test enabled the researchers to pinpoint just how the clock maintains “energy reserves” (akin to stress testing of a bank).
Bass and his research group worked together with Navdeep S. Chandel, a colleague of Bass’ at Feinberg, and John M. Denu, at the University of Wisconsin-Madison. They found the mice lacking clocks had defects in their mitochondria: the mitochondria could not metabolize stored energy and had no reserve to prevent depletion of the main currency, ATP. (Adenosine triphosphate is an energy-bearing molecule found in all living cells.)
Working with Northwestern colleague Milan Mrksich, they went on to show that removal of the clock depletes the necessary ingredient to turn on an enzyme within mitochondria, Sirtuin 3, which activates energy burning during fasting.
The researchers also showed that when the circadian clock was disrupted, resulting in a lack of NAD+, they could provide NAD+ supplements and restore function to the mitochondrion.
The findings expand the understanding of the molecular pathways linking the circadian clock with metabolism and show that the clock provides an essential buffer to stabilize the cell as organisms transition between eating and fasting each day. This knowledge has implications for disease intervention and prevention, including diabetes, and potentially for states of increased cellular demand for metabolism (including inflammation and cancer).
“We have established the chain of events that couples the clock’s control switch with the machinery of the mitochondria,” said Bass, who also is a member of the department of neurobiology at the Weinberg College of Arts and Sciences. “We now have identified an additional link in the supply chain that provides energy to the cell at different phases of our daily sleep-wake cycle. These findings establish a key role for the NAD+ biosynthetic cycle in this process.”
Major senior authors from Northwestern include Chandel, a professor in medicine-pulmonary and cell and molecular biology at Feinberg, and Mrksich, the Henry Wade Rogers Professor of Biomedical Engineering, Chemistry and Cell and Molecular Biology at Feinberg, Weinberg and the McCormick School of Engineering and Applied Science. Chandel and Mrksich are members of the Robert H. Lurie Comprehensive Cancer Center of Northwestern University.
The co-first authors are Clara Bien Peek, a postdoctoral fellow, and Alison H. Affinati, an M.D./Ph.D. candidate, both working in Bass’ lab. They have literally worked around the clock on the research, which builds on the earlier work of co-author Kathryn Moynihan Ramsey. In 2009, she and colleagues reported in Science that the compound NAD, together with the enzyme SIRT1, functions as a molecular “switch” to coordinate the internal clock with metabolic systems.
The current research team combined Northwestern expertise in basic circadian clock research, chemistry and physiology with outside collaborators who were able to verify the Northwestern findings.
Co-author Eric Goetzman, from the University of Pittsburgh School of Medicine, an expert in the rare children’s disease called metabolic myopathy, was able to confirm that the pattern the researchers observed in mice was the same as that seen in these children. Fasting can be life-threatening for children with this disorder because they can’t metabolize stored energy due to defects in their mitochondria.
Analyses by co-author Christopher B. Newgard at Duke University Medical Center identified a signature profile of the metabolic myopathy in mice with altered circadian clock genes.