Johns Hopkins scientists have developed new drugs that — at least in a laboratory dish — appear to halt the brain-destroying impact of a genetic mutation at work in some forms of two incurable diseases, amyotrophic lateral sclerosis (ALS) and dementia.
They made the finding using neurons created from induced pluripotent stem cells (iPS cells), which are derived from the skin of people with ALS who carry a gene mutation that interferes with the production of proteins needed for normal neuron function.
“Efforts to treat neurodegenerative diseases have the highest failure rate for all clinical trials,” says Jeffrey D. Rothstein, M.D., Ph.D., a professor of neurology and neuroscience at the Johns Hopkins University School of Medicine and leader of the research described online in the journal Neuron. “But with this iPS technology, we think we can target an exact subset of patients with a specific mutation and succeed. It’s individualized brain therapy, just the sort of thing that has been done in cancer, but not yet in neurology.”
Scientists in 2011 discovered that more than 40 percent of patients with an inherited form of ALS and at least 10 percent of patients with the non-inherited sporadic form have a mutation in the C9ORF72 gene. The mutation also occurs frequently in people with frontotemporal dementia, the second-most-common form of dementia after Alzheimer’s disease. The same research appeared to explain why some people develop both ALS and dementia simultaneously, and why, in some families, one sibling might develop ALS while another develops dementia.
In the C9ORF72 gene of a normal person, there are up to 30 repeats of a series of six DNA letters (GGGGCC); but in people with the genetic glitch, the string can be repeated thousands of times. Rothstein, who is also director of the Johns Hopkins Brain Science Institute and the Robert Packard Center for ALS Research, used his large bank of iPS cell lines from ALS patients to identify several with the C9ORF72 mutation, then experimented with them to figure out the mechanism by which the “repeats” were causing the brain cell death characteristic of ALS.
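The difference between a normal and a mutant allele is essentially one of repeat count, which is easy to picture computationally. The sketch below is purely illustrative (the function name is invented for this example, and real repeat detection from sequencing data is far more involved): it counts the longest run of consecutive GGGGCC units in a DNA string, contrasting a short, normal-range run with a massively expanded one.

```python
def longest_repeat_run(seq, unit="GGGGCC"):
    """Return the longest run of back-to-back `unit` repeats in a DNA string."""
    best = run = 0
    i = 0
    while i < len(seq):
        if seq.startswith(unit, i):
            run += 1            # extend the current run of repeats
            i += len(unit)
            best = max(best, run)
        else:
            run = 0             # run broken; resume scanning one base later
            i += 1
    return best

# Toy alleles: flanking sequence around a repeat tract.
normal = "ATCG" + "GGGGCC" * 8 + "TTAA"       # within the normal range (up to ~30)
expanded = "ATCG" + "GGGGCC" * 1500 + "TTAA"  # expanded into the thousands

print(longest_repeat_run(normal))    # 8
print(longest_repeat_run(expanded))  # 1500
```

The single pass keeps the scan linear in sequence length, which matters when the repeat tract itself runs to thousands of copies.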
In a series of experiments, Rothstein says, they discovered that in iPS neurons with the mutation, the process of using the DNA blueprint to make RNA and then produce protein is disrupted. Normally, RNA-binding proteins facilitate the production of RNA. But in the iPS neurons with the C9ORF72 mutation, the RNA made from the repeating GGGGCC strings bunched up, gumming up the works by acting like flypaper and trapping important RNA-binding proteins, including one known as ADARB2, that are needed for the proper production of many other cellular RNAs. Overall, the C9ORF72 mutation caused the cells to produce abnormal amounts of many other normal RNAs and made them highly sensitive to stress.
To counter this effect, the researchers developed a number of chemical compounds targeting the problem. These compounds behaved like a coating that matches up to the GGGGCC repeats like Velcro, keeping the flypaper-like repeats from attracting their bait and allowing the RNA-binding proteins to do their jobs properly.
Rothstein says Isis Pharmaceuticals helped develop many of the studied compounds and, by working closely with the Johns Hopkins team, could begin testing them in human ALS patients with the C9ORF72 mutation within the next several years. In collaboration with the National Institutes of Health, plans are already underway to identify a group of patients with the C9ORF72 mutation for future research.
Rita Sattler, Ph.D., an assistant professor of neurology at Johns Hopkins and the co-investigator of the study, says without iPS technology, the team would have had a difficult time studying the C9ORF72 mutation. “Typically, researchers engineer rodents with mutations that mimic the human glitches they are trying to research and then study them,” she says. “But the nature of the multiple repeats made that nearly impossible.” The iPS cells did the job just as well or even better than an animal model, Sattler says, in part because the experiments could be done using human cells.
“An iPS cell line can be used effectively and rapidly to understand disease mechanisms and as a tool for therapy development,” Rothstein adds. “Now we need to see if our findings translate into a valuable treatment for humans.”
The researchers also analyzed brain tissue from people with the C9ORF72 mutation who died of ALS. They saw evidence of this bunching up and found that the many genes that were altered as a consequence of this mutation in the iPS cells were also abnormal in the brain tissue, thereby showing that iPS cells can be a faithful tool to study the human disease and discover effective therapies.
In the future, the scientists will look at cerebral spinal fluid from ALS patients with the C9ORF72 mutation, searching for proteins that were found both in the fluid and the iPS cells. These may pave the way to develop markers that can be studied by clinicians to see if the treatment is working once the drug therapy is moved to clinical trials.
ALS, sometimes known as Lou Gehrig’s disease, named for the Yankee baseball great who died from it, destroys nerve cells in the brain and spinal cord that control voluntary muscle movement. The nerve cells waste away or die, and can no longer send messages to muscles, eventually leading to muscle weakening, twitching and an inability to move the arms, legs and body. Onset is typically around age 50 and death often occurs within three to five years of diagnosis. Some 10 percent of cases are hereditary. There is no cure for ALS and there is only one FDA-approved drug treatment, which has just a small effect in slowing disease progression and increasing survival, Rothstein notes.
Research in mouse whiskers reveals signal pathway from touch neuron to brain

Human fingertips have several types of sensory neurons that are responsible for relaying touch signals to the central nervous system. Scientists have long believed these neurons followed a linear path to the brain with a “labeled-lines” structure.
But new research on mouse whiskers from Duke University reveals a surprise — at the fine scale, the sensory system’s wiring diagram doesn’t have a set pattern. And it’s probably the case that no two people’s touch sensory systems are wired exactly the same at the detailed level, according to Fan Wang, Ph.D., an associate professor of neurobiology in the Duke Medical School.
The results, which appear online in Cell Reports, highlight a “one-to-many, many-to-one” nerve connectivity strategy. Single neurons send signals to multiple potential secondary neurons, just as signals from many neurons can converge onto a secondary neuron. Many such connections are likely formed by chance, Wang said. This connectivity scheme allows the touch system to have many possible combinations to encode a large repertoire of textures and forms.
"We take our sense of touch for granted," Wang said. "When you speak, you are not aware of the constant tactile feedback from your tongue and teeth. Without such feedback, you won’t be able to say the words correctly. When you write with a pen, you’re mostly unaware of the sensors telling you how to move it."
It’s not feasible to visualize the touch pathways in the human brain at high resolution. So, Wang and her collaborators from the University of Tsukuba in Japan and the Friedrich Miescher Institute for Biomedical Research in Switzerland used the whiskers of laboratory mice to map how distinct sensory neurons, presumably detecting different mechanical stimuli, are connected to signal the brain. When the sensory neurons are activated, they send the signal along an axon, a long, slender nerve fiber that conducts electric impulses to the brain. The researchers traced signals running the long path from the mouse’s whiskers to the brain.
Wang’s group used a combination of genetic engineering and fluorescent tags delivered by viruses to color-code four different kinds of neurons and map their connections.
Earlier work by Wang and others had found that all of the 100 to 200 sensors associated with a single whisker project their axons to a large structure representing that whisker in the brain. Each whisker has its own neural representation structure.
"People have thought that within the large whisker-representing structure, there will be finer-scale, labeled lines," Wang said. "In other words, different touch sensors would send information through separate parallel pathways, into that large structure. But surprisingly, we did not find such organized pathways. Instead, we found a completely unorganized mosaic pattern of connections within the large structure. Information from different sensors is intermixed already at the first relay station inside the brain."
Wang said the next step will be to stimulate the labeled circuits in different ways to see how impulses travel the network.
"We want to figure out the exact functions and signals transmitted by different sensors during natural tactile behaviors and determine their exact roles in the perception of textures," she said.
Anyone who has suffered through sleepless nights due to uncontrollable itching knows that not all itching is the same. New research at Washington University School of Medicine in St. Louis explains why.
Working in mice, the scientists have shown that chronic itching, which can occur in many medical conditions, from eczema and psoriasis to kidney failure and liver disease, is different from the fleeting urge to scratch a mosquito bite.
That’s because chronic itching appears to incorporate more than just the nerve cells, or neurons, that normally transmit itch signals. The researchers found that in chronic itching, neurons that send itch signals also co-opt pain neurons to intensify the itch sensation.

The new discovery may lead to more effective treatments for chronic itching that target activity in neurons involved in both pain and itch. The research is reported online Oct. 15 in The Journal of Clinical Investigation and will appear in the November print issue.
“In normal itching, there’s a fixed pathway that transmits the itch signal,” said senior investigator Zhou-Feng Chen, PhD, who directs Washington University’s Center for the Study of Itch. “But with chronic itching, many neurons can be turned into itch neurons, including those that typically transmit pain signals. That helps explain why chronic itching can be so excruciating.”
Chen, a professor of anesthesiology, and his colleagues generated mice in which a protein called BRAF is always active and continually sends signals inside itch neurons. The BRAF gene and the protein it makes are involved in the body’s pain response, but scientists didn’t know whether the gene also played a role in itch.
“We thought the animals might be prone to feeling pain rather than itching,” Chen explained. “To our great surprise, the mice scratched spontaneously. At first, we didn’t know why they were scratching, but it turns out we developed a mouse model of chronic itch.”
Further studies discovered that the BRAF protein could turn on many itch genes, and the researchers showed similar changes in gene expression in mice with chronic itch induced by dry skin and in mice with allergic contact dermatitis, two skin conditions that frequently cause people to scratch incessantly.
The findings suggest that targeting proteins in the BRAF pathway may open new avenues for treating chronic itch, a condition in which few therapies are effective. One possibility includes using drugs that are prescribed to treat pain.
“Certain drugs are used to inhibit some of the same targets in patients with chronic pain, and those medications also may quiet down itch,” Chen said.
In earlier studies, Chen identified gastrin-releasing peptide (GRP), a substance that carries itch signals to a gene called GRPR (gastrin-releasing peptide receptor) in the spinal cord. In the new study, GRP and GRPR activity was doubled in the genetically altered mice, which could account for some of the increase in the intensity of itching. But other genes that normally are activated by pain also were turned on in the itch pathway, further intensifying the itch sensation.
Surprisingly, however, the mice had a normal response to pain, indicating that the pain and itch pathways are very different.
Unlike scratching a mosquito bite, which usually is only a temporary sensation, chronic itch can persist much longer, according to Chen, also a professor of psychiatry and of developmental biology. His team found that the mice in this study not only scratched spontaneously but also had more severe responses when exposed to substances that normally would induce acute itching.
“In people, chronic itching can last for weeks, months or even years,” Chen said. “These mice are helping us to understand the pathways that can be involved in transmitting itch signals and the many contributors to chronic itching. There are many pathways leading from BRAF, and all of these could be potential targets for anti-itch therapies.”
About a dozen years ago, scientists discovered that a hormone called ghrelin enhances appetite. Dubbed the “hunger hormone,” ghrelin was quickly targeted by drug companies seeking treatments for obesity — none of which have yet panned out.

MIT neuroscientists have now discovered that ghrelin’s role goes far beyond controlling hunger. The researchers found that ghrelin released during chronic stress makes the brain more vulnerable to traumatic events, suggesting that it may predispose people to posttraumatic stress disorder (PTSD).
Drugs that reduce ghrelin levels, originally developed to try to combat obesity, could help protect people who are at high risk for PTSD, such as soldiers serving in war, says Ki Goosens, an assistant professor of brain and cognitive sciences at MIT, and senior author of a paper describing the findings in the Oct. 15 online edition of Molecular Psychiatry.
“Perhaps we could give people who are going to be deployed into an active combat zone a ghrelin vaccine before they go, so they will have a lower incidence of PTSD. That’s exciting because right now there’s nothing given to people to prevent PTSD,” says Goosens, who is also a member of MIT’s McGovern Institute for Brain Research.
Lead author of the paper is Retsina Meyer, a recent MIT PhD recipient. Other authors are McGovern postdoc Anthony Burgos-Robles, graduate student Elizabeth Liu, and McGovern research scientist Susana Correia.
Stress and fear
Stress is a useful response to dangerous situations because it provokes action to escape or fight back. However, when stress is chronic, it can produce anxiety, depression and other mental illnesses.
At MIT, Goosens discovered that one brain structure that is especially critical for generating fear, the amygdala, has a special response to chronic stress. The amygdala produces large amounts of growth hormone during stress, a change that seems not to occur in other brain regions.
In the new paper, Goosens and her colleagues found that the release of growth hormone in the amygdala is controlled by ghrelin, which is produced primarily in the stomach and travels throughout the body, including to the brain.
Ghrelin levels are elevated by chronic stress, which in humans might be produced by factors such as unemployment, bullying, or the loss of a family member. Ghrelin stimulates the secretion of growth hormone from the brain; the effects of growth hormone released from the pituitary gland on organs such as the liver and bones have been extensively studied. However, the role of growth hormone in the brain, particularly the amygdala, is not well understood.
The researchers found that when rats were given either a drug to stimulate the ghrelin receptor or gene therapy to overexpress growth hormone over a prolonged period, they became much more susceptible to fear than normal rats. Fear was measured by training all of the rats to fear an innocuous, novel tone. While all rats learned to fear the tone, the rats with prolonged increased activity of the ghrelin receptor or overexpression of growth hormone were the most fearful, assessed by how long they froze after hearing the tone. Blocking the cell receptors that interact with ghrelin or growth hormone reduced fear to normal levels in chronically stressed rats.
When rats were exposed to chronic stress over a prolonged period, their circulating ghrelin and amygdalar growth hormone levels also went up, and fearful memories were encoded more strongly. This is similar to what the researchers believe happens in people who suffer from PTSD.
“When you have people with a history of stress who encounter a traumatic event, they are more likely to develop PTSD because that history of stress has altered something about their biology. They have an excessively strong memory of the traumatic event, and that is one of the things that drives their PTSD symptoms,” Goosens says.
New drugs, new targets
Over the last century, scientists have described the hypothalamic-pituitary-adrenal (HPA) axis, which produces adrenaline, cortisol (corticosterone in rats), and other hormones that stimulate “fight or flight” behavior. Since then, stress research has focused almost exclusively on the HPA axis.
After discovering ghrelin’s role in stress, the MIT researchers suspected that ghrelin was also linked to the HPA axis. However, they were surprised to find that when the rats’ adrenal glands — the source of corticosterone, adrenaline, and noradrenaline — were removed, the animals still became overly fearful when chronically stressed. The authors also showed that repeated ghrelin-receptor stimulation did not trigger release of HPA hormones, and that blockade of the ghrelin receptor did not blunt release of HPA stress hormones. Therefore, the ghrelin-initiated stress pathway appears to act independently of the HPA axis. “That’s important because it gives us a whole new target for stress therapies,” Goosens says.
Pharmaceutical companies have developed at least a dozen possible drug compounds that interfere with ghrelin. Many of these drugs have been found safe for humans, but have not been shown to help people lose weight. The researchers believe these drugs could offer a way to vaccinate people entering stressful situations, or even to treat people who already suffer from PTSD, because ghrelin levels remain high long after the chronic stress ends.
PTSD affects about 7.7 million American adults, including soldiers and victims of crimes, accidents, or natural disasters. About 40 to 50 percent of patients recover within five years, Meyer says, but the rest never get better.
The researchers hypothesize that the persistent elevation of ghrelin following trauma exposure could be one of the factors that maintain PTSD. “So, could you immediately reverse PTSD? Maybe not, but maybe the ghrelin could get damped down and these people could go through cognitive behavioral therapy, and over time, maybe we can reverse it,” Meyer says.
Working with researchers at Massachusetts General Hospital, Goosens’ lab is now planning to study ghrelin levels in human patients suffering from anxiety and fear disorders. They are also planning a clinical trial of a drug that blocks ghrelin to see if it can prevent relapse of depression.
Scientists at the University of Washington have used genetic engineering to identify a population of neurons that tell the brain to shut off appetite. Their study, “Genetic identification of a neural circuit that suppresses appetite,” was published Oct. 13 in Nature.
To identify these neurons, or cells that process and transmit information in the brain, researchers first considered what makes an animal lose its appetite. There are a number of natural reasons, including infection, nausea, pain or simply having eaten too much already.
Nerves within the gut that are distressed or insulted send information to the brain through the vagus nerve. Appetite is suppressed when these messages activate specific neurons, ones that contain CGRP (calcitonin gene-related peptide), in a region of the brain called the parabrachial nucleus.
In mouse trials, researchers used genetic techniques and viruses to introduce light-activatable proteins into CGRP neurons. Activation of these proteins excites the cells to transmit chemical signals to other regions of the brain. When they activated the CGRP neurons with a laser, the hungry mice immediately lost their appetite and walked away from their liquid diet (Ensure); when the laser was turned off, the mice resumed drinking the liquid diet.
"These results demonstrate that activation of the CGRP-expressing neurons regulates appetite. This is a nice example of how the brain responds to unfavorable conditions in the body, such as nausea caused by food poisoning," said Richard Palmiter, UW professor of biochemistry and investigator of the Howard Hughes Medical Institute.
Using a similar approach, neurons in other brain regions have been identified that can stimulate the appetite of mice that are not hungry. Researchers hope to identify the complete neural circuit (wiring diagram) in the brain that regulates feeding behavior. By identifying these neural circuits, scientists may be able to design therapies that promote or decrease appetite.
The brain is plastic, adapting to the hundreds of experiences in our daily lives by reorganizing pathways and making new connections between nerve cells. This plasticity requires that memories of new information and experiences be formed fast, so fast that the body has a special mechanism, unique to nerve cells, that enables memories to be made rapidly. In a new study from The Montreal Neurological Institute and Hospital (The Neuro) at McGill University, conducted with colleagues at the Université de Montréal, researchers have discovered that nerve cells use a special ‘pre-assembly’ technique to expedite the manufacture of proteins at nerve cell connections (synapses), enabling the brain to rapidly form memories and be plastic.

Making a memory requires the production of proteins at synapses. These proteins then change the strength of the connection or pathway. In nerve cells, the production process for memory proteins is pre-assembled at the synapse but stalled just before completion, awaiting the proper signals to finish, thereby speeding up the entire process. When it comes time to make the memory, the process is switched on and the protein is made in a flash. The mechanism is analogous to a pre-fab home, or pre-made pancake batter, that is assembled in advance and then quickly completed in the correct location at the correct time.
“It’s not only important to make proteins in the right place, but it’s also important not to make them at the wrong time,” says Dr. Wayne Sossin, neuroscientist at The Neuro and senior investigator on the paper. “This is especially important for nerve cells in the brain, because you only want the brain to make precise connections. If this process is indiscriminate, it leads to neurological disease. This mechanism to control memory protein synthesis solves two problems: how to make proteins only at the right time, and how to make them as quickly as possible in order to tightly associate the synaptic change with the experience or memory.”
Making proteins from genetic material involves two major steps [a Nobel prize was awarded for the identification of the cell’s protein-making process]. In the first step, called transcription, the information in DNA that is stored and protected within the centre of the cell is copied to a messenger RNA (mRNA) – this copy is then moved to where it is needed in the cell. In the second step, called translation, the mRNA is used as a template of genetic information and ‘read’ by little machines called ribosomes, which decode the mRNA sequence and stitch together the correct amino acids to form the protein.
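Those two steps can be illustrated with a toy sketch. This is a drastic simplification, not how cells or bioinformatics tools actually work, and the four-entry codon table below covers only the codons used in the example (the assignments themselves are standard: AUG codes for methionine, UUU for phenylalanine, GGC for glycine, and UGA is a stop codon).

```python
# Minimal codon table covering only the codons in this toy example.
CODON_TABLE = {
    "AUG": "Met", "UUU": "Phe", "GGC": "Gly", "UGA": "STOP",
}

def transcribe(dna):
    """Transcription: copy the DNA coding strand into mRNA (T becomes U)."""
    return dna.replace("T", "U")

def translate(mrna):
    """Translation: read the mRNA three letters (one codon) at a time."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        amino_acid = CODON_TABLE[mrna[i:i + 3]]
        if amino_acid == "STOP":
            break               # stop codon: release the finished chain
        protein.append(amino_acid)
    return "-".join(protein)

mrna = transcribe("ATGTTTGGCTGA")  # step 1: DNA -> mRNA
print(mrna)                        # AUGUUUGGCUGA
print(translate(mrna))             # Met-Phe-Gly
```

In the mechanism the study describes, it is the second function, translation, that nerve cells hold paused at the synapse until the right signal arrives.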
Dr. Sossin’s group at The Neuro has discovered that the mRNA travels to the synapse already attached to the ribosome, with protein production stopped just before completion of the product (at the elongation/termination step of translation, where amino acids are being assembled into protein). The ‘pre-assembly’ process then waits for synaptic signals before reactivating to produce many proteins quickly and form a memory. “Our results reveal a new mechanism underlying translation-dependent synaptic plasticity, which is dysregulated in neurodevelopmental and psychiatric pathologies,” added Dr. Sossin. “Understanding the pathways involved may provide new therapeutic targets for neurodevelopmental disorders.”
A brain region activated when people are asked to perform mathematical calculations in an experimental setting is similarly activated when they use numbers — or even imprecise quantitative terms, such as “more than”— in everyday conversation, according to a study by Stanford University School of Medicine scientists.

Using a novel method, the researchers collected the first solid evidence that the pattern of brain activity seen in someone performing a mathematical exercise under experimentally controlled conditions is very similar to that observed when the person engages in quantitative thought in the course of daily life.
“We’re now able to eavesdrop on the brain in real life,” said Josef Parvizi, MD, PhD, associate professor of neurology and neurological sciences and director of Stanford’s Human Intracranial Cognitive Electrophysiology Program. Parvizi is the senior author of the study, published Oct. 15 in Nature Communications. The study’s lead authors are postdoctoral scholar Mohammad Dastjerdi, MD, PhD, and graduate student Muge Ozker.
The finding could lead to “mind-reading” applications that, for example, would allow a patient who is rendered mute by a stroke to communicate via passive thinking. Conceivably, it could also lead to more dystopian outcomes: chip implants that spy on or even control people’s thoughts.
“This is exciting, and a little scary,” said Henry Greely, JD, the Deane F. and Kate Edelman Johnson Professor of Law and steering committee chair of the Stanford Center for Biomedical Ethics, who played no role in the study but is familiar with its contents and described himself as “very impressed” by the findings. “It demonstrates, first, that we can see when someone’s dealing with numbers and, second, that we may conceivably someday be able to manipulate the brain to affect how someone deals with numbers.”
The researchers monitored electrical activity in a region of the brain called the intraparietal sulcus, known to be important in attention and eye and hand motion. Previous studies have hinted that some nerve-cell clusters in this area are also involved in numerosity, the mathematical equivalent of literacy.
However, the techniques that previous studies have used, such as functional magnetic resonance imaging, are limited in their ability to study brain activity in real-life settings and to pinpoint the precise timing of nerve cells’ firing patterns. These studies have focused on testing just one specific function in one specific brain region, and have tried to eliminate or otherwise account for every possible confounding factor. In addition, the experimental subjects would have to lie more or less motionless inside a dark, tubular chamber whose silence would be punctuated by constant, loud, mechanical, banging noises while images flashed on a computer screen.
“This is not real life,” said Parvizi. “You’re not in your room, having a cup of tea and experiencing life’s events spontaneously.” A profoundly important question, he said, is: “How does a population of nerve cells that has been shown experimentally to be important in a particular function work in real life?”
His team’s method, called intracranial recording, provided exquisite anatomical and temporal precision and allowed the scientists to monitor brain activity when people were immersed in real-life situations. Parvizi and his associates tapped into the brains of three volunteers who were being evaluated for possible surgical treatment of their recurring, drug-resistant epileptic seizures.
The procedure involves temporarily removing a portion of a patient’s skull and positioning packets of electrodes against the exposed brain surface. For up to a week, patients remain hooked up to the monitoring apparatus while the electrodes pick up electrical activity within the brain. This monitoring continues uninterrupted for patients’ entire hospital stay, capturing their inevitable repeated seizures and enabling neurologists to determine the exact spot in each patient’s brain where the seizures are originating.
During this whole time, patients remain tethered to the monitoring apparatus and mostly confined to their beds. But otherwise, except for the typical intrusions of a hospital setting, they are comfortable, free of pain and free to eat, drink, think, talk to friends and family in person or on the phone, or watch videos.
The electrodes implanted in patients’ heads are like wiretaps, each eavesdropping on a population of several hundred thousand nerve cells and reporting back to a computer.
In the study, participants’ actions were also monitored by video cameras throughout their stay. This allowed the researchers later to correlate patients’ voluntary activities in a real-life setting with nerve-cell behavior in the monitored brain region.
As part of the study, volunteers answered true/false questions that popped up on a laptop screen, one after another. Some questions required calculation — for instance, is it true or false that 2+4=5? — while others demanded what scientists call episodic memory — true or false: I had coffee at breakfast this morning. In other instances, patients were simply asked to stare at the crosshairs at the center of an otherwise blank screen to capture the brain’s so-called “resting state.”
Consistent with other studies, Parvizi’s team found that electrical activity in a particular group of nerve cells in the intraparietal sulcus spiked when, and only when, volunteers were performing calculations.
Afterward, Parvizi and his colleagues analyzed each volunteer’s daily electrode record, identified many spikes in intraparietal-sulcus activity that occurred outside experimental settings, and turned to the recorded video footage to see exactly what the volunteer had been doing when such spikes occurred.
They found that when a patient mentioned a number — or even a quantitative reference, such as “some more,” “many” or “bigger than the other one” — there was a spike of electrical activity in the same nerve-cell population of the intraparietal sulcus that was activated when the patient was doing calculations under experimental conditions.
That was an unexpected finding. “We found that this region is activated not only when reading numbers or thinking about them, but also when patients were referring more obliquely to quantities,” said Parvizi.
“These nerve cells are not firing chaotically,” he said. “They’re very specialized, active only when the subject starts thinking about numbers. When the subject is reminiscing, laughing or talking, they’re not activated.” Thus, it was possible to know, simply by consulting the electronic record of participants’ brain activity, whether they were engaged in quantitative thought during nonexperimental conditions.
Any fears of impending mind control are, at a minimum, premature, said Greely. “Practically speaking, it’s not the simplest thing in the world to go around implanting electrodes in people’s brains. It will not be done tomorrow, or easily, or surreptitiously.”
Parvizi agreed. “We’re still in early days with this,” he said. “If this is a baseball game, we’re not even in the first inning. We just got a ticket to enter the stadium.”
Neurons that process sensory information such as touch and vision are arranged in precise, well-characterized maps that are crucial for translating perception into understanding. A study published by Cell Press on October 14 in the journal Developmental Cell reveals that the actual act of birth in mice causes a reduction in a brain chemical called serotonin in the newborn mice, triggering sensory maps to form. The findings shed light on the key role of a dramatic environmental event in the development of neural circuits and reveal that birth itself is one of the triggers that prepares the newborn for survival outside the womb.

"Our results clearly demonstrate that birth has active roles in brain formation and maturation," says senior study author Hiroshi Kawasaki of Kanazawa University in Japan. "We found that birth regulates neuronal circuit formation not only in the somatosensory system but also in the visual system. Therefore, it seems reasonable to speculate that birth actually plays a wider role in various brain regions."
Mammals ranging from mice to humans have brain maps that represent various types of sensory information. In a region of the rodent brain known as the barrel cortex, neurons that process tactile information from whiskers are arranged in a map corresponding to the spatial pattern of whiskers on the snout, with neighboring columns of neurons responding to stimulation of adjacent whiskers. Although previous studies have shown that the neurotransmitter serotonin influences the development of sensory maps, its specific role during normal development has not been clear until now.
In this new study, Kawasaki and his team find that the birth of mouse pups leads to a drop in serotonin levels in the newborn’s brain, triggering the formation of neural circuits in the barrel cortex and in the lateral geniculate nucleus (LGN), a brain region that processes visual information. When mice were treated with drugs that either induced preterm birth or decreased serotonin signaling, neural circuits in the barrel cortex as well as in the LGN formed more quickly. Conversely, neural circuits in the barrel cortex failed to form when the mice were treated with a drug that increased serotonin signaling, suggesting that a reduction in levels of this neurotransmitter is crucial for sensory map formation.
Because serotonin also plays a key role in mental disorders, it is possible that abnormalities in birth processes and the effects on subsequent serotonin signaling and brain development could increase the risk of psychiatric diseases. “Uncovering the entire picture of the downstream signaling pathways of birth may lead to the development of new therapeutic methods to control the risk of psychiatric diseases induced by abnormal birth,” Kawasaki says.
Faced with news of suicides and brain damage in former professional football players, geneticist Barry Ganetzky bemoaned the lack of model systems for studying the insidious and often delayed consequences linked to head injuries.
Then he remembered an unexplored observation from nearly 40 years ago: a sharp strike to a vial of fruit flies left them temporarily stunned, only to recover a short time later. At the time he had marked it only as a curiosity.

Now a professor of genetics at UW–Madison, Ganetzky is turning his accidental discovery into a way to study traumatic brain injury (TBI). He and David Wassarman, a UW professor of cell and regenerative biology, report this week (Oct. 14) in the Proceedings of the National Academy of Sciences on the first glimpses of the genetic underpinnings of susceptibility to brain injuries and links to human TBI.
TBIs occur when a force on the body jostles the brain inside the head, causing it to strike the inside of the skull. More than 1.7 million TBIs occur each year in the United States, about one-third due to falls and the rest mainly caused by car crashes, workplace accidents, and sports injuries. TBIs are also a growing issue in combat veterans exposed to explosions.
In many cases, the immediate effects of TBI are temporary and may seem mild — confusion, dizziness or loss of coordination, headaches, vision problems. But over time, impacts may lead to neurodegeneration and related symptoms, including memory loss, cognitive problems, severe depression, or Alzheimer’s-like dementia. Together TBIs cost tens of billions of dollars annually in medical expenses and indirect costs such as lost productivity.
Though TBIs can be classified from “mild” to “severe” based on symptoms, there is a poor understanding of the underlying medical causes.
“Unlike many important medical problems — high blood pressure, cancer, diabetes, heart disease — where we know something about the biology, we know almost nothing about TBI,” Ganetzky says. “Why does a blow to the head cause epilepsy? Or how does it lead down the road to neurodegeneration? Nobody has answers to those questions — in part, because it’s really hard to study in humans.”
Enter the fruit fly. The fly brain is encased in a hard cuticle analogous to the skull, and the basic mechanisms affecting nervous system function are the same in flies and mammals. In the new study, Ganetzky and Wassarman describe a way to reproducibly inflict traumas that seem to mimic the injuries and symptoms of human TBI.
“Now we have a system where we can look at the variables that are the inputs into TBI and determine the relative contributions of each to the pathological outcomes. That’s the real power of the flies,” says Wassarman.
As with humans, few flies die from the immediate impact. Afterward, though, the treated flies show many of the same physical consequences as humans who sustain concussions or other TBIs, including temporary incapacitation, loss of coordination and activation of the innate immune response in the short term, followed by neurodegeneration and sometimes an early death.
The researchers, led by Rebeccah Katzenberger, senior research specialist in the UW Department of Cell and Regenerative Biology, also found that age seems to play an important role. Older flies are more susceptible than younger ones to the effects of the impact, and, Wassarman says, many of the outcomes of TBI are very similar to normal aging processes.
With this model, the researchers say, they can now draw on the vast collection of genetic tools and techniques available for fruit flies to probe the underlying drivers of damage.
“What we really want is to understand the immediate and long-term consequences in cellular and molecular terms,” says Ganetzky. “From that understanding we can proceed in a more directed way to diagnostics and therapeutics.”
One of the key things they have already identified is the crucial role genetics plays in determining the outcome of an injury, revealed by the high degree of variability seen among different strains of flies. This finding may explain why all potential TBI drugs to date have failed in clinical trials despite showing promise in individual rodent models.
As Wassarman explains, “The heart of the problem of solving traumatic brain injury is that we’re all different.”
They are continuing to develop the model through large-scale genetic analysis and have already found that different sets of genes correlate with susceptibility in flies of different ages. With their system, they can also examine the effects of repeated injuries.
Ganetzky sees tremendous potential for developing applications from the fly-based approach and the Wisconsin Alumni Research Foundation (WARF) has filed for patent protection on the discovery.
“These exciting findings that we can study traumatic brain injury — a disorder of growing concern for athletes, the military, and parents — in the elegantly simple model of fruit flies is sure to interest those researchers and companies looking to address this concern,” says Jennifer Gottwald, WARF licensing manager. “The use of this model can accelerate the work of the medical research community in finding treatments and therapies to help patients.”
Recent scientific findings have raised the fear that young athletes may fare worse after sustaining a sports-related concussion than older athletes.
Researchers in the Vanderbilt Sports Concussion Center compared symptoms associated with concussion in middle- and high-school aged athletes with those in college-age athletes and found no significant differences between the two age groups.
The study, “Does age affect symptom recovery after sports-related concussion? A study of high school and college athletes,” was published online Sept. 24 ahead of print in the Journal of Neurosurgery: Pediatrics.
Lead authors were Vanderbilt University School of Medicine students Young Lee and Mitchell Odom. Other researchers were Scott Zuckerman, M.D., Gary Solomon, Ph.D., and Allen Sills, M.D.
In this retrospective study, the researchers reviewed a database containing information on pre-concussion and post-concussion symptoms in two different age groups: younger (13-16 years old) and older (18-22 years old). Athletes (92 in each group) were evenly matched with respect to gender, number of previous concussions, and time to the first post-concussion test.
Each athlete completed individual pre- and post-concussion questionnaires that covered a variety of symptoms associated with concussion, some of which were headache, nausea, dizziness, fatigue, sleep problems, irritability and difficulties with concentration or memory. Each athlete’s post-concussion scores were compared to his or her own individual baseline scores.
The number or severity of symptoms cited at baseline and post-concussion showed no significant difference between the two age groups. Symptoms returned to baseline levels within 30 days after concussion in 95.7 percent of the younger athletes and in 96.7 percent of the older athletes.
“In the evaluation of sports-related concussion, it is imperative to parse out different ways of assessing outcomes: neurocognitive scores versus symptom endorsement versus balance issues, school performance, etc,” Zuckerman said.
“It appears that symptoms may not be a prominent driver when assessing outcomes of younger versus older athletes. We hope that our study can add insight into the evaluation of youth athletes after sports-related concussion.”
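The core comparison in the Vanderbilt study (each athlete's post-concussion symptom scores judged against his or her own baseline, then contrasted across the two matched age groups) can be sketched roughly in code. The data and function names below are purely illustrative and are not the study's actual numbers or methods:

```python
# Hypothetical sketch of a baseline-referenced symptom comparison.
# Each athlete's post-concussion symptom total is compared to that same
# athlete's pre-season baseline, and the fraction of each age group whose
# symptoms returned to baseline is then computed.

def change_from_baseline(baseline, post):
    """Per-athlete symptom change: positive means more symptoms than baseline."""
    return [p - b for b, p in zip(baseline, post)]

def pct_recovered(changes):
    """Percentage of athletes whose symptoms returned to (or below) baseline."""
    recovered = sum(1 for c in changes if c <= 0)
    return 100.0 * recovered / len(changes)

# Toy symptom totals (e.g., summed severity across questionnaire items)
younger_baseline = [2, 0, 1, 3]
younger_post30d  = [2, 0, 1, 5]   # one athlete still above baseline at 30 days
older_baseline   = [1, 2, 0, 2]
older_post30d    = [1, 2, 0, 2]   # all back to baseline

younger_changes = change_from_baseline(younger_baseline, younger_post30d)
older_changes   = change_from_baseline(older_baseline, older_post30d)

print(pct_recovered(younger_changes))  # 75.0
print(pct_recovered(older_changes))    # 100.0
```

Comparing each athlete to his or her own baseline, rather than to a population norm, is what lets the study control for individual differences in pre-existing symptoms.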
Researchers at the University of Toronto discover how the body’s muscles accidentally fall asleep while awake
Normally muscles contract in order to support the body, but in a rare condition known as cataplexy the body’s muscles “fall asleep” and become involuntarily paralyzed. Cataplexy is incapacitating because it leaves the affected individual awake, but either fully or partially paralyzed. It is one of the bizarre symptoms of the sleep disorder called narcolepsy.
“Cataplexy is characterized by muscle paralysis during cognitive awareness, but we didn’t understand how this happened until now,” said John Peever of the University of Toronto’s Department of Cell & Systems Biology. “We have shown that the neurodegeneration of the brain cells that synthesize the chemical hypocretin causes the noradrenaline system to malfunction. When the noradrenaline system stops working properly, it fails to keep the motor and cognitive systems coupled. This results in cataplexy – the muscles fall asleep but the brain stays awake.”
Peever and Christian Burgess, also of Cell & Systems Biology, used hypocretin-knockout mice (mice that experience cataplexy) to demonstrate that a dysfunctional relationship between the noradrenaline system and the hypocretin-producing system is behind cataplexy. The research was recently published in the journal Current Biology.
The scientists first established that the mice experienced sudden loss of muscle tone during cataplectic episodes. They then administered drugs to systematically inhibit or activate a particular subset of adrenergic receptors, the targets of noradrenaline. Activating noradrenaline receptors reduced the incidence of cataplexy by 90 per cent; in contrast, inhibiting the same receptors increased the incidence of cataplexy by 92 per cent. Their next step was to link these changes to the brain cells that directly control muscles.
They found that noradrenaline is responsible for keeping the brain cells that control muscles (motoneurons), and the muscles themselves, active. During cataplexy, however, noradrenaline levels drop, forcing the muscles to relax and producing paralysis. Peever and Burgess found that restoring noradrenaline pre-empted cataplexy, confirming that the noradrenaline system plays a key role.
A protein that is increased by endurance exercise has been isolated and given to non-exercising mice, in which it turned on genes that promote brain health and encourage the growth of new nerves involved in learning and memory, report scientists from Dana-Farber Cancer Institute and Harvard Medical School.
The findings, reported in the journal Cell Metabolism, help explain the well-known capacity of endurance exercise to improve cognitive function, particularly in older people. If the protein can be made in a stable form and developed into a drug, it might lead to improved therapies for cognitive decline in older people and slow the toll of neurodegenerative diseases such as Alzheimer’s and Parkinson’s, according to the investigators.
“What is exciting is that a natural substance can be given in the bloodstream that can mimic some of the effects of endurance exercise on the brain,” said Bruce Spiegelman, PhD, of Dana-Farber and HMS. He is co-senior author of the publication with Michael E. Greenberg, PhD, chair of neurobiology at HMS.
The Spiegelman group previously reported that the protein, called FNDC5, is produced by muscular exertion and is released into the bloodstream as a variant called irisin. In the new research, endurance exercise – mice voluntarily running on a wheel for 30 days – increased the activity of a metabolic regulatory molecule, PGC-1α, in muscles, which spurred a rise in FNDC5 protein. The increase of FNDC5 in turn boosted the expression of a brain-health protein, BDNF (brain-derived neurotrophic factor), in the dentate gyrus of the hippocampus, a part of the brain involved in learning and memory.
It has been found that exercise stimulates BDNF in the hippocampus, one of only two areas of the adult brain that can generate new nerve cells. BDNF promotes development of new nerves and synapses – connections between nerves that allow learning and memory to be stored – and helps preserve the survival of brain cells.
How exercise raises BDNF activity in the brain wasn’t known; the new findings linking exercise, PGC-1α, FNDC5 and BDNF provide a molecular pathway for the effect, although Spiegelman and his colleagues suggest there are probably others.
Having shown that FNDC5 is a molecular link between exercise and increased BDNF in the brain, the scientists asked whether artificially increasing FNDC5 in the absence of exercise would have the same effect. They used a harmless virus to deliver the protein to mice through the bloodstream, in hopes the FNDC5 could reach the brain and raise BDNF activity. Seven days later, they examined the mouse brains and observed a significant increase in BDNF in the hippocampus.
“Perhaps the most exciting result overall is that peripheral delivery of FNDC5 with adenoviral vectors is sufficient to induce central expression of Bdnf and other genes with potential neuroprotective functions or those involved in learning and memory,” the authors said. Spiegelman cautioned that further research is needed to determine whether giving FNDC5 actually improves cognitive function in the animals. The scientists also aren’t sure whether the protein that got into the brain is FNDC5 itself, or irisin, or perhaps another variant of the protein.
Spiegelman said that development of irisin as a drug will require creating a more stable form of the protein.
Project delves deeply into the genomics of 599 glioblastoma multiforme cases to better target the disease
When The Cancer Genome Atlas launched its massively collaborative approach to organ-by-organ genomic analysis of cancers, the brain had both the benefit, and the challenge, of going first.
TCGA ganged up on glioblastoma multiforme (GBM), the most common and lethal of brain tumors, with more than 100 scientists from 14 institutions tracking down the genomic abnormalities that drive GBM.
Five years later, older and wiser, TCGA revisited glioblastoma, producing a broader, deeper picture of the drivers – and potential therapeutic targets – of the disease published in the Oct. 10 issue of Cell.
“The first paper in 2008 characterized glioblastoma in important new ways and illuminated the path for all TCGA organ studies that have followed,” said senior author Lynda Chin, M.D., professor and chair of Genomic Medicine and scientific director of the Institute for Applied Cancer Science at The University of Texas MD Anderson Cancer Center.
“Our new study reflects major improvements in technology applied to many more tumor samples to more completely characterize the landscape of genomic alterations in glioblastoma,” said Chin, who was also co-senior author of the first paper while she was on the faculty of Dana-Farber Cancer Institute in Boston.
“Information generated by this unbiased, data-driven analysis presents new opportunities to discover genomics-based biomarkers, understand disease mechanisms and generate new hypotheses to develop better, targeted therapies,” Chin said.
About 23,000 new cases of GBM are predicted in the United States during 2013, and more than 14,000 people are expected to die of the disease. Most patients die within 15 months of diagnosis.
Well of rich, detailed data will nurture better treatment
New information about genetic mutations, deletions and amplifications; gene expression and epigenetic regulation; structural changes due to chromosomal alterations; and the proteomic effects and molecular networks that drive GBM makes for a deep, broad dataset that will underpin research and clinical advances for years to come.
“Our main contribution is this tremendous resource for the GBM research community, which is already heavily relying on the earlier TCGA study,” said co-lead author Roeland Verhaak, Ph.D., assistant professor of Bioinformatics and Computational Biology at MD Anderson. “Whatever new treatments people come up with for GBM, I’m very confident that their discovery and development will in some way have benefited from this rich and detailed data set,” he said.
The Cell paper describes analysis of tumor samples and molecular data from 599 patients at 17 study sites. Detailed clinical information, including treatment and survival, was available for almost all cases.
New targetable mutations
In addition to confirming significantly mutated genes discovered earlier, such as the tumor suppressors TP53, PTEN and RB1 and the oncogene PIK3CA, the analysis identified 61 new mutated genes. The most frequent of these mutations occurred in 1.7 to 9 percent of cases.
Two of these, BRAF and FGFR, might have more immediate clinical relevance, Verhaak noted. MD Anderson neuro-oncologists are checking to see whether patients have these mutations, and drugs are available now to address those variations, Verhaak said. The BRAF point mutation in GBM is the same one commonly found in melanoma, which is treated with a new class of drugs.
More twists and turns for EGFR
The larger data set and an improved analytical algorithm allowed major refinement of gene amplification and deletion information. For example, common amplification events were found to occur more frequently than previously known, including amplification of the epidermal growth factor receptor (EGFR) on chromosome 7.
EGFR is both amplified and mutated frequently in GBM; yet therapeutic efforts targeting EGFR so far have failed. “We found EGFR is more frequently altered than we already thought,” Verhaak said.
Overall, the EGFR gene was mutated, rearranged, amplified or otherwise altered in 57 percent of tumors. Increased EGFR protein levels in GBM cells correlated with the many mechanisms of EGFR alteration, Verhaak said.
A treatment based on EGFR still has great potential, he noted. But strategies to target EGFR will need to address the likelihood that different alterations of EGFR might be present in the same tumor and affect the impact of targeted drugs.
Breaking GBM into molecular subtypes
Verhaak and other researchers in recent years have begun to classify GBM tumors by gene expression. Four such subgroups — neural, proneural, mesenchymal and classical — were further characterized by DNA methylation pattern, signaling pathway activity and by clinical measures such as survival and treatment response. Methylation of a gene turns it off.
Understanding the subgroups could establish biomarkers to guide treatment and identify new therapeutic targets.
The team found, for example, that the survival advantage of the proneural subtype depends on a specific DNA methylation pattern known as G-CIMP and that DNA methylation of the MGMT gene may serve as a biomarker of treatment response in the classical subtype.
A study at Texas Christian University in Fort Worth has found that the attractiveness of others can affect how much we lie or misrepresent ourselves, and the extent to which we come to believe those lies and misrepresentations.
For example, Harry gets a call from a political polling organization and is asked for his opinion of the Patient Protection and Affordable Care Act. He gives it the lowest possible rating. A few weeks later, Harry meets an attractive woman named Sally online. During their conversation, Sally mentions that she answered the same question by the same polling organization and expressed high approval of Obamacare. She then asks “What approval rating did you give Obamacare when they asked you?”
This question poses a dilemma for Harry. Should he tell the truth or should he shade the truth? To the extent that Harry finds Sally very attractive and is motivated to create a positive impression, he might shade the truth about his past behavior by claiming to have expressed at least moderate approval of Obamacare. What, if any, effect would this misrepresentation have on Harry’s memory for how he actually answered on the day he was contacted by the polling organization?
“What we know is that people will embellish or distort facts when telling stories, which causes them to oftentimes remember the lies more so than the truth,” said Charles Lord, professor of psychology at Texas Christian University in Fort Worth. “Research has also shown us that people tell others what they want to hear. In this case, Harry will lie to impress Sally, and he is also more likely to fool himself into believing the lie.”
Researchers asked single individuals whether they agreed or disagreed with instituting “comprehensive mandatory exams” for graduating seniors, using a 1-10 scale. A total of 44 individuals did not want to institute mandatory exams. Those respondents were then led to believe they would be meeting a member of the opposite sex who wanted to institute mandatory exams, having scored the idea a nine on the survey. They also were shown a photo of this person and asked to report, on a 1-7 scale, whether they found their partner “physically attractive and wanted to get along with and make a good impression on this partner.”
Participants were then asked to complete a profile, to be sent to their partner before an in-person meeting, answering the same question about “comprehensive mandatory exams.” Researchers found a correlation between the attractiveness of the partner and respondents warming to the idea of “comprehensive mandatory exams.”
Researchers then retested students with some of the same questions they had taken two weeks earlier by asking respondents to remember what they had said in the initial survey.
“Participants with relatively attractive potential partners remembered giving more positive initial survey responses than participants with relatively unattractive potential partners,” said Lord.
Researchers then tested 117 additional undergraduate students, showing them profile pictures of prospective partners along with advance knowledge of how those partners had responded. They were told they would be partnered with these individuals later in the course. Findings showed that people with perceived “attractive partners” aligned their views more closely with the partner than those with unattractive partners.
“In both experiments we found that knowing the other person’s positive evaluation in advance led participants to misrepresent their own previous evaluations, and this misrepresentation, in turn, altered memories for participants’ own actual past actions,” said Lord.
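The kind of analysis the researchers describe, relating each participant's rating of a partner's attractiveness to how far the participant's reported attitude shifted toward the partner's position, can be sketched as a simple Pearson correlation. All data below are invented for illustration and are not the study's actual measurements:

```python
# Minimal sketch: correlate partner attractiveness ratings (1-7 scale)
# with attitude shift toward the partner (profile answer minus initial
# survey answer on the 1-10 scale). Data are made up for illustration.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

attractiveness = [2, 3, 5, 6, 7]   # rating of the potential partner
shift_toward   = [0, 1, 2, 3, 4]   # how far the profile answer moved

print(round(pearson_r(attractiveness, shift_toward), 3))  # 0.991
```

A positive r here would mirror the reported pattern: the more attractive the partner, the further the restated opinion drifted from the original one.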
These findings appear in the forthcoming edition of the Journal of Social Cognition.
Finding that the opioid system can act to ease social pain, not just physical pain, may aid understanding of depression and social anxiety

A brain image showing, in orange/red, one area of the brain where the natural painkiller (opioid) system was highly active in research volunteers who were experiencing social rejection. This region, called the amygdala, was one of several where the U-M team recorded the first images of this system responding to social pain, not just physical pain. Studying this response, and the variation between people, could aid understanding of depression and anxiety. Image credit: U-M Health.
“Sticks and stones may break my bones, but words will never hurt me,” goes the playground rhyme that’s supposed to help children endure taunts from classmates. But a new study suggests that there’s more going on inside our brains when someone snubs us – and that the brain may have its own way of easing social pain.
The findings, recently published in Molecular Psychiatry by a University of Michigan Medical School team, show that the brain’s natural painkiller system responds to social rejection – not just physical injury.
What’s more, people who score high on a personality trait called resilience – the ability to adjust to environmental change – had the highest amount of natural painkiller activation.
Ever wondered how your brain controls movement or creates memories? The wonders and complexities of the human brain are being explained at a free festival of neuroscience, organised by the University of Bristol to give a unique insight into the power of our cleverest organ.

The Bristol Neuroscience Festival, held on 11 and 12 October and open to the public, will celebrate pioneering brain research in the city, with the chance to hear from world-leading academics in the field while taking part in hands-on activities and experiments for all ages.
Professor David Nutt, one of the UK’s leading neuroscientists and the government’s former chief drugs adviser, will give an insight into his career and his work at a public lecture on Friday, 11 October at 7pm.
There will also be a series of free shorter talks throughout both days, with experts from the University talking about the research and major discoveries they have been involved with – from how the brain controls hormones and stress responses to how to improve the memory of people with dementia.
Other topics include obesity, blood pressure, pregnancy and emotions, pain, caffeine, sleep, addictions, tumours and depression.
Local schools, both primary and secondary, will learn more about the brain and how it works through a series of interactive exhibits, including a Scalextric racing car game that’s controlled by people’s brainwaves thanks to an electroencephalography (EEG) brainwave monitor.
At-Bristol will be running their Boggling Brain Show for eight to 12-year-olds, with innovative demonstrations showing what this complex organ is made from, before different areas of the brain are explored using a special MRI scanner.
The festival marks the 10th anniversary of Bristol Neuroscience (BN), which was founded by the University of Bristol in 2003 to ensure that all neuroscientists in Bristol could benefit from the wide cross-disciplinary expertise and facilities in the University and its partner hospitals. It has since become a model for other cities across the UK.
Expertise within BN ranges from molecular and cellular neuroscience to clinical, patient-based research, with areas of interest including human cognition, synaptic plasticity, stress and dementia.
Professor Richard Apps, Director of Bristol Neuroscience, said: “The festival will showcase the fantastic range of world-class neuroscience research that occurs at Bristol and is an ideal opportunity to celebrate that work and share it with the public.
“We’ve planned a range of activities to appeal to all ages, from fun hands-on tests to serious talks and the opportunity to learn about complex neuroscience from experts in the field.”
There will also be a SciArt photographic exhibition, featuring a collection of 36 beautiful neuroscience images, and the opportunity to knit-a-neurone whilst listening to bespoke music inspired by neuroscience.
Exhibits at the festival have been developed in collaboration with external partners including At-Bristol, Happy City, Bristol City Council, BRACE, Glenside Hospital Museum and UWE.
The Bristol Neuroscience Festival runs from 10am to 6pm on 11 and 12 October in the Wills Memorial Building. All events are free, but tickets are required for the talks. For more information, please see the festival website.
Although problems with memory become increasingly common as people age, in some people memories last a long time, even a lifetime. On the other hand, some people experience mild to substantial memory problems even at an earlier age.

Although there are several risk factors for dementia, abnormal fat metabolism is known to pose a risk to memory and learning. People with high amounts of abdominal fat in middle age are 3.6 times as likely to develop memory loss and dementia later in life.
Neurological scientists at Rush University Medical Center, in collaboration with the National Institutes of Health, have discovered that the same protein that controls fat metabolism in the liver resides in the memory center of the brain (the hippocampus) and controls memory and learning.
Results from the study funded by the Alzheimer’s Association and the National Institutes of Health were recently published in Cell Reports.
“We need to better understand how fat is connected to memory and learning so that we can develop an effective approach to protect memory and learning,” said Kalipada Pahan, PhD, the Floyd A. Davis professor of neurology at Rush University Medical Center.
The liver is the body’s major fat metabolizing organ. Peroxisome proliferator-activated receptor alpha (PPARalpha) is known to control fat metabolism in the liver. Accordingly, PPARalpha is highly expressed in the liver.
“We were surprised to find such high levels of PPARalpha in the hippocampus of animal models,” said Pahan.
“While PPARalpha-deficient mice are poor in learning and memory, injection of PPARalpha into the hippocampus of PPARalpha-deficient mice improves learning and memory,” said Pahan.
Because PPARalpha directly controls fat metabolism, people with high levels of abdominal fat have depleted PPARalpha in the liver and abnormal lipid metabolism. At first, these individuals lose PPARalpha from the liver and then eventually from the whole body, including the brain. Therefore, abdominal fat may be an early indication of some kind of dementia later in life, according to Pahan.
Using a bone marrow chimera technique, the researchers were able to create mice with normal PPARalpha in the liver but depleted PPARalpha in the brain. These mice were poor in memory and learning. On the other hand, mice with normal PPARalpha in the brain and depleted PPARalpha in the liver showed normal memory.
“Our study indicates that people may suffer from memory-related problems only when they lose PPARalpha in the hippocampus,” said Pahan.
CREB (cyclic AMP response element-binding protein) is called the master regulator of memory, as it controls different memory-related proteins. “Our study shows that PPARalpha directly stimulates CREB and thereby increases memory-related proteins,” said Pahan.
“Further research must be conducted to see how we could potentially maintain normal PPARalpha in the brain in order to be resistant to memory loss,” said Pahan.
Case Western Reserve University researchers today published findings that point to a promising discovery for the treatment and prevention of prion diseases, rare neurodegenerative disorders that are always fatal. The researchers discovered that recombinant human prion protein stops the propagation of prions, the infectious pathogens that cause the diseases.
“This is the very first time recombinant protein has been shown to inhibit diseased human prions,” said Wen-Quan Zou, MD, PhD, senior author of the study and associate professor of pathology and neurology at Case Western Reserve School of Medicine.
Recombinant human prion protein is generated in E. coli bacteria and has the same protein sequence as the normal human brain protein, but differs in that it lacks attached sugars and lipids. In the study, published online in Scientific Reports, researchers used a method called protein misfolding cyclic amplification, which mimics, in a test tube, the prions’ replication within the human brain. The propagation of human prions was completely inhibited when the recombinant protein was added to the test tube. The researchers found that the inhibition was dose-dependent and highly specific to the human form of the recombinant protein, as compared with recombinant mouse and bovine prion proteins. They demonstrated that the recombinant protein works not only in the cell-free model but also in cultured cells, which are the first steps of translational research. Further, since the recombinant protein has a sequence identical to the brain protein, its application is less likely to cause side effects.
Prion diseases are a group of fatal transmissible brain diseases affecting both humans and animals. Prions are formed through a structural change of a normal prion protein that resides in all humans. Once formed, they continue to recruit other normal prion protein and finally cause spongiform-like damage in the brain. Currently, the diseases have no cure.
Previous outbreaks of mad cow disease and subsequent occurrences of the human form, variant Creutzfeldt–Jakob disease, have garnered a great deal of public attention. The fear of future outbreaks makes the search for successful interventions all the more urgent.
A research team, headed by Theodore Friedmann, MD, professor of pediatrics at the University of California, San Diego School of Medicine, says a gene mutation that causes a rare but devastating neurological disorder known as Lesch-Nyhan syndrome appears to offer clues to the developmental and neuronal defects found in other, diverse neurological disorders like Alzheimer’s, Parkinson’s and Huntington’s diseases.
The findings, published in the October 9, 2013 issue of the journal PLOS ONE, provide the first experimental picture of how gene expression errors impair the ability of stem cells to produce normal neurons, resulting instead in neurological disease. More broadly, they indicate that at least some distinctly different neurodevelopmental and neurodegenerative disorders share basic, causative defects.
The scientists say that understanding defects in Lesch-Nyhan could help identify errant processes in other, more common neurological disorders, perhaps pointing the way to new kinds of therapies.
Lesch-Nyhan syndrome is caused by defects in the HPRT1 gene (short for hypoxanthine guanine phosphoribosyltransferase, the enzyme it encodes), a gene that is well-known for its essential “housekeeping duties,” among them helping generate purine nucleotides – the building blocks of DNA and RNA.
Mutations in the gene result in deficiencies in the HPRT enzyme, leading to defective expression of the neurotransmitter dopamine and subsequent abnormal neuron function. HPRT mutation is known to be the specific cause of Lesch-Nyhan, an inherited neurodevelopmental disorder characterized by uncontrollable repetitive body movements, cognitive defects and compulsive self-mutilating behaviors. The disorder was first described in 1964 by medical student Michael Lesch and his mentor, William Nyhan, MD, professor emeritus at UC San Diego School of Medicine.
Eating disorders like anorexia nervosa and bulimia often run in families, but identifying specific genes that increase a person’s risk for these complex disorders has proved difficult.
Now scientists from the University of Iowa and University of Texas Southwestern Medical Center have discovered—by studying the genetics of two families severely affected by eating disorders—two gene mutations, one in each family, that are associated with increased risk of developing eating disorders.
Moreover, the new study shows that the two genes interact in the same signaling pathway in the brain, and that the two mutations produce the same biological effect. The findings suggest that this pathway might represent a new target for understanding and potentially treating eating disorders.
"If you’re considering two randomly discovered genes, the chance that they will interact is small. But what really sealed the deal for us, convincing us that the association was real, was that the mutations have the same effect," says Michael Lutter, UI assistant professor of psychiatry and senior author of the study.
Overall, the study, published Oct. 8 in the Journal of Clinical Investigation, suggests that mutations that decrease the activity of a transcription factor—a protein that turns on the expression of other genes—called estrogen-related receptor alpha (ESRRA) increase the risk of eating disorders.
The challenge of finding genes for complex diseases
Anorexia nervosa and bulimia nervosa are fairly common, affecting between 1 and 3 percent of women. They also are among the most lethal of all psychiatric diseases; about 1 in 1,000 women will die from anorexia.
Finding genes associated with complex diseases like eating disorders is challenging. Scientists can analyze the genetics of thousands of people and use statistics to find common, low-risk gene variations, the accumulation of which causes complex disorders from psychiatric conditions like eating disorders to conditions like heart disease or obesity.
On the other end of the spectrum are very rare gene variants, which confer an almost 100 percent risk of getting the disease. To track down these variants, researchers turn to large families that are severely affected by an illness.
Lutter and his colleagues were able to work with two such families to identify the two new genes associated with eating disorders.
"It’s basically a matter of finding out what the people with the disorder share in common that people without the disease don’t have," Lutter explains. "From a theoretical perspective, it’s straightforward. But the difficulty comes in having a large enough group to find these rare genes. You have to have large families to get the statistical power."
In the new study, 20 members from three generations of one family (10 affected individuals and 10 unaffected), and eight members of a second family (six affected and two unaffected) were analyzed.
Two genes, one pathway
The gene discovered in the larger family was ESRRA, a transcription factor that turns on the expression of other genes. The mutation associated with eating disorders decreases ESRRA activity.
The gene found in the second family is a transcriptional repressor called histone deacetylase 4 (HDAC4), which turns off transcription factors, including ESRRA. This mutation is unusual in the sense that it increases the gene’s activity—most mutations decrease or destroy a gene’s activity.
Importantly, the team also found that the two affected proteins interacted with one another; HDAC4 binds to ESRRA and inhibits it.
"The fact that the HDAC4 mutation happens to increase the gene’s activity and happens to increase its ability to repress the ESRRA protein we found in the other family was just beyond coincidence," Lutter says.
The two genes are already known to be involved in metabolic pathways in muscle and fat tissue. They also are both regulated by exercise.
In the brain, HDAC4 is very important for regulating genes that form connections between neurons. However, there’s almost nothing known about ESRRA in the brain, although it is expressed in many brain regions that are disrupted in anorexia.
Lutter and his colleagues plan to study the role of these genes in mice and in cultured neurons to find out exactly what they are doing in the brain. They will also look for ways to modify the genes’ activity, with the long-term goal of finding small molecules that might be developed into therapies for eating disorders.
They also plan to study patients with eating disorders to see if other genes associated with the ESRRA/HDAC4 brain pathway are affected in humans.
Scientists at the Montreal Neurological Institute and Hospital-The Neuro, McGill University, have made important discoveries about a cellular process that occurs during normal brain development and may play an important role in neurodegenerative diseases. The study’s findings, published in Cell Reports, a leading scientific journal, point to new pathways and targets for novel therapies for Alzheimer’s, Parkinson’s, ALS and other neurodegenerative diseases that affect millions of people world-wide.

Research into neurodegenerative disease has traditionally concentrated on the death of nerve cell bodies. However, it is now clear that in most cases nerve cell body death represents the final event of an extended disease process. Studies have shown that protecting cell bodies from death has no impact on disease progression, whereas blocking preceding axon breakdown has a significant benefit. The new study by researchers at The Neuro shifts the focus to the loss or degeneration of axons, the nerve-cell ‘branches’ that receive and distribute neurochemical signals among neurons.
During early development, axons are pruned to ensure normal growth of the nervous system. Emerging evidence suggests that this pruning process becomes reactivated in neurodegenerative disease, leading to the aberrant loss of axons and dendrites. Axonal pruning in development is significantly influenced by proteins called caspases. “The idea that caspases are even involved in axonal degeneration during development is very recent,” said Dr. Philip Barker, a principal investigator at The Neuro and senior author of the study.
Dr. Barker and his colleagues show that the activity of certain ‘executioner’ caspases (caspase-3 and caspase-9) induces axonal degeneration and that their action is suppressed by a protein termed XIAP (X-linked inhibitor of apoptosis). “We found that caspase-3 and caspase-9 play crucial roles in axonal degeneration and that their activities are regulated by XIAP. XIAP acts as a brake on caspase activity and must be removed for degeneration to proceed,” added Dr. Barker.
This balancing act between caspases and XIAP ensures that caspases do not cause unnecessary or excessive destruction. However, this balance may shift during neurodegenerative disease. “If we understand the pathways that regulate XIAP levels, we may be able to develop therapies that reduce caspase-dependent degeneration during neurodegenerative disease,” said Dr. Barker.
In a breakthrough for understanding brain evolution, neuroscientists have shown that differences between primate brains, from the tiny marmoset to the human, can be largely explained as consequences of the same genetic program.

In research published in the Journal of Neuroscience, Professor Marcello Rosa and his team at Monash University’s School of Biomedical Sciences and colleagues at Universidade Federal do Rio de Janeiro, in Brazil, used computer modelling to demonstrate that the substantial enlargement of some areas of the human brain, vital to advanced cognition, reflected a consistent pattern that is seen across primate species of all sizes.
This finding suggests how the neural circuits responsible for traits that we consider uniquely human – such as the ability to plan, make complex decisions and speak – could have emerged simply as a natural consequence of the evolution of larger brains.
“We have known for a long time that certain areas of the human brain are much larger than one would expect based on how monkey brains are organised,” Professor Rosa said.
“What no one had realised is that this selective enlargement is part of a trend that has been present since the dawn of primates.”
Using publicly available brain maps, MRI data and modelling software, the neuroscientists compared the sizes of different brain areas in humans and three monkey species: marmosets, capuchins and macaques. They found that two regions, the lateral prefrontal cortex and the temporal parietal junction, expand disproportionately to the rest of the brain.
The prefrontal cortex is related to long-term planning, personality expression, decision-making, and behaviour modification. The temporal parietal junction is related to self-awareness and self-other distinction.
Lead author Tristan Chaplin, from the Department of Physiology, will commence his PhD next year. He said the findings showed that those areas of the brain grew disproportionately in a predictable way.
“We found that the larger the brain is, the larger these areas get,” Tristan said.
“When you go from a small to big monkey - the marmoset to macaque - the prefrontal cortex and temporal parietal junction get larger relative to the rest of the cortex, and we see the same thing again when you compare macaques to humans.”
“This trend argues against the view that specific human mutations gave us these larger areas and advanced cognition and behaviour; instead, they are a consequence of what happens in development when you grow a larger brain,” Tristan said.
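The scaling trend described above can be illustrated with a toy allometric fit. This sketch uses entirely hypothetical numbers, not the study’s data; it only shows the general principle that when a region grows disproportionately, the slope of log(region size) against log(total cortex size) across species exceeds 1.

```python
import numpy as np

# Hypothetical sizes across four primate species (arbitrary units);
# species order: marmoset, capuchin, macaque, human.
total_cortex = np.array([1.0, 5.0, 10.0, 100.0])   # whole-cortex size
region       = np.array([0.05, 0.35, 0.90, 15.0])  # made-up region sizes

# Fit a line in log-log space: region ~ cortex ** slope.
slope, intercept = np.polyfit(np.log(total_cortex), np.log(region), 1)

# A slope (allometric exponent) above 1 means the region takes up an
# ever-larger share of the cortex as overall brain size increases.
print(f"allometric exponent ~ {slope:.2f}")
```

With these invented numbers the fitted exponent comes out above 1, mirroring the paper’s qualitative claim that the prefrontal cortex and temporal parietal junction enlarge faster than the cortex as a whole.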
Professor Rosa said the pattern held for primate species that evolved completely separately.
"If you compare the capuchin of South America and the macaque of Asia, their brains are almost identical, although they developed on opposite sides of the world. They both reflect the genetic plan of how a primate brain grows," Professor Rosa said.
This is the first computational comparative study conducted across several primate species. Tristan now hopes, in collaboration with zoos, to check if our closest primate relatives, the chimpanzees and gorillas, also have brain areas organised as his theory predicts.
In animal study, inflammation stops cells from accessing iron needed for brain development
Researchers exploring the link between newborn infections and later behavior and movement problems have found that inflammation in the brain keeps cells from accessing iron that they need to perform a critical role in brain development.
Specific cells in the brain need iron to produce the white matter that ensures efficient communication among cells in the central nervous system. White matter refers to white-colored bundles of myelin, a protective coating on the axons that project from the main body of a brain cell.
The scientists induced a mild E. coli infection in 3-day-old mice. This caused a transient inflammatory response in their brains that was resolved within 72 hours. This brain inflammation, though fleeting, interfered with storage and release of iron, temporarily resulting in reduced iron availability in the brain. When the iron was needed most, it was unavailable, researchers say.
“What’s important is that the timing of the inflammation during brain development switches the brain’s gears from development to trying to deal with inflammation,” said Jonathan Godbout, associate professor of neuroscience at The Ohio State University and senior author of the study. “The consequence of that is this abnormal iron storage by neurons that limits access of iron to the rest of the brain.”
The research is published in the Oct. 9, 2013, issue of The Journal of Neuroscience.
The cells that need iron during this critical period of development are called oligodendrocytes, which produce myelin and wrap it around axons. In the current study, neonatal infection caused neurons to increase their storage of iron, depriving oligodendrocytes of the iron they need.
In other mice, the scientists confirmed that neonatal E. coli infection was associated with motor coordination problems and hyperactivity two months later – the equivalent of young adulthood in humans. The brains of these same mice contained lower levels of myelin and fewer oligodendrocytes, suggesting that brief reductions in brain-iron availability during early development have long-lasting effects on brain myelination.
The timing of infection in newborn mice generally coincides with the late stages of the third trimester of pregnancy in humans. The myelination process begins during fetal development and continues after birth.
Though other researchers have observed links between newborn infections and effects on myelin and behavior, scientists had not figured out why those associations exist. Godbout’s group focuses on understanding how immune system activation can trigger unexpected interactions between the central nervous system and other parts of the body.
“We’re not the first to show early inflammatory events can change the brain and behavior, but we’re the first to propose a detailed mechanism connecting neonatal inflammation to physiological changes in the central nervous system,” said Daniel McKim, a lead author on the paper and a student in Ohio State’s Neuroscience Graduate Studies Program.
The neonatal infection caused several changes in brain physiology. For example, infected mice had increased inflammatory markers, altered neuronal iron storage, and reduced oligodendrocytes and myelin in their brains. Importantly, the impairments in brain myelination corresponded with behavioral and motor impairments two months after infection.
Though it’s unknown if these movement problems would last a lifetime, McKim noted that “since these impairments lasted into what would be young adulthood in humans, it seems likely to be relatively permanent.”
The reduced myelination linked to movement and behavior issues in this study has also been associated with schizophrenia and autism spectrum disorders in previous work by other scientists, said Godbout, also an investigator in Ohio State’s Institute for Behavioral Medicine Research (IBMR).
“More research in this area could confirm that human behavioral complications can arise from inflammation changing the myelin pattern. Schizophrenia and autism disorders are part of that,” he said.
This current study did not identify potential interventions to prevent these effects of early-life infection. Godbout and colleagues theorize that maternal nutrition – a diet high in antioxidants, for example – might help lower the inflammation in the brain that follows a neonatal infection.
“The prenatal and neonatal period is such an active time of development,” Godbout said. “That’s really the key – these inflammatory challenges during critical points in development seem to have profound effects. We might just want to think more about that clinically.”
Subtle body cues allow people to identify others with surprising accuracy when faces are difficult to differentiate. This skill may help researchers improve person-recognition software and expand their understanding of how humans recognize each other.
A study published in Psychological Science by researchers at The University of Texas at Dallas demonstrates that humans rely on non-facial cues, such as body shape and build, to identify people in challenging viewing conditions, such as poor lighting.
“Psychologists and computer scientists have concentrated almost exclusively on the role of the face in person recognition,” explains lead researcher Allyson Rice. “Our results show that the body can also provide important and sometimes sufficient identity information for person recognition.”
During several experiments, researchers asked college-age participants to look at images of two people side-by-side and identify whether the images showed the same person. Some pairs looked similar despite showing different people, while other image pairs showed the same person with a different appearance. The researchers used computer face recognition systems to find pairs of pictures in which facial characteristics were difficult to use for identity.
Overall, participants accurately discerned whether the images showed the same person when they were provided complete images that showed both the face and body. Participants were just as accurate in identifying people in the image pairs when the faces were blocked out and only the bodies were shown. But, like the computer-based face recognition systems, participants had trouble identifying the subjects when only the faces were shown, without the bodies.

Image: Above are pairs of photographs that face-recognition software failed to identify correctly. The top two photos are of the same person, while the bottom two photos are of different people.
When asked, participants thought they were using primarily facial features to identify the subjects. To unravel the paradox, the researchers used eye-tracking equipment to determine where participants were actually looking. They found participants spent more time looking at the body whenever the face did not provide enough information to identify the subjects.
“People’s recognition strategies were inaccessible to their conscious awareness,” Rice said. “This provides a cautionary tale in ascribing credibility to people’s subjective reports of how they came to an identity decision.”
Dr. Alice O’Toole, Aage and Margareta Møller Professor in the School of Behavioral and Brain Sciences, has worked on facial recognition for over 15 years and supervised the project.
“Given the widespread use of face recognition systems in security settings, it is important for these systems to make use of all potentially helpful information,” O’Toole said. “Our work shows that the body can be surprisingly useful for identification, especially when the face fails to provide the necessary identity information.”