Findings point to potential biomarkers for early detection of at-risk youth
Researchers at the University of California, San Diego School of Medicine have discovered impaired neuronal activity in the parts of the brain associated with anticipatory functioning among occasional 18- to 24-year-old users of stimulant drugs, such as cocaine, amphetamines and prescription drugs such as Adderall.
The brain differences, detected using functional magnetic resonance imaging (fMRI), are believed to represent an internal hard wiring that may make some people more prone to drug addiction later in life.
Among the study’s main implications is the possibility of being able to use brain activity patterns as a means of identifying at-risk youth long before they have any obvious outward signs of addictive behaviors.
The study is published in the March 26 issue of the Journal of Neuroscience.
“If you show me 100 college students and tell me which ones have taken stimulants a dozen times, I can tell you those students’ brains are different,” said Martin Paulus, MD, professor of psychiatry and a co-senior author with Angela Yu, PhD, professor of cognitive science at UC San Diego. “Our study is telling us, it’s not ‘this is your brain on drugs,’ it’s ‘this is the brain that does drugs.’”
In the study, 18- to 24-year-old college students were shown either an X or an O on a screen and instructed to press, as quickly as possible, a left button if an X appeared or a right button if an O appeared. If a tone was heard, they were instructed not to press a button. Each participant’s reaction times and errors were measured for 288 trials, while their brain activity was recorded via fMRI.
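The task described above is a variant of a classic stop-signal paradigm. As a rough sketch of the trial logic only (the probabilities, timings, and stop-trial fraction below are invented for illustration, not taken from the study), it might be simulated like this:

```python
import random

def run_trial(stop_signal=False, stop_delay_ms=0):
    """Simulate one trial of a simplified go/no-go task with a stop tone.

    Press 'left' for X, 'right' for O, but withhold any press when a
    tone sounds. A later tone (larger stop_delay_ms) is harder to obey.
    """
    stimulus = random.choice(["X", "O"])
    go_response = "left" if stimulus == "X" else "right"
    if stop_signal:
        # Crude assumption: the later the tone, the more likely the
        # prepared button press "escapes" before it can be cancelled.
        p_fail_to_stop = min(0.9, stop_delay_ms / 500)
        if random.random() < p_fail_to_stop:
            return {"stimulus": stimulus, "response": go_response, "correct": False}
        return {"stimulus": stimulus, "response": None, "correct": True}
    return {"stimulus": stimulus, "response": go_response, "correct": True}

# A 288-trial session; every fourth trial carries a stop tone,
# with an early (easy) or late (hard) onset chosen at random.
trials = [run_trial(stop_signal=(i % 4 == 0),
                    stop_delay_ms=random.choice([100, 300]))
          for i in range(288)]
stop_errors = sum(1 for t in trials if not t["correct"])
```

In this toy model, errors occur only on stop trials, and late tones produce more of them, mirroring the pattern the researchers observed as the task became harder.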
Occasional users were characterized as having taken stimulants an average of 12 to 15 times. The “stimulant naïve” control group included students who had never taken stimulants. Both groups were screened for factors, such as alcohol dependency and mental health disorders, that might have confounded the study’s results.
The outcomes from the trials showed that occasional users have slightly faster reaction times, suggesting a tendency toward impulsivity. The most striking difference, however, occurred during the “stop” trials. Here, the occasional users made more mistakes, and their performance worsened, relative to the control group, as the task became harder (i.e., when the tone occurred later in the trial).
The brain images of the occasional users showed consistent patterns of diminished neuronal activity in the parts of the brain associated with anticipatory functioning and updating anticipation based on past trials.
“We used to think that drug addicts just did not hold themselves back but this work suggests that the root of this is an impaired ability to anticipate a situation and to detect trends in when they need to stop,” said Katia Harlé, PhD, a postdoctoral researcher in the Paulus laboratory and the study’s lead author.
The next step will be to examine the degree to which these brain activity patterns are permanent or can be recalibrated. The researchers said it may be possible to “exercise” weak areas of the brain, where attenuated neuronal activity is associated with a higher tendency toward addiction.
“Right now there are no treatments for stimulant addiction and the relapse rate is upward of 50 percent,” Paulus said. “Early intervention is our best option.”
The same gene family that may have helped the human brain become larger and more complex than in any other animal also is linked to the severity of autism, according to new research from the University of Colorado Anschutz Medical Campus.

The gene family is made up of over 270 copies of a segment of DNA called DUF1220. DUF1220 codes for a protein domain – a specific, functionally important segment within a protein. The more copies of a specific DUF1220 subtype a person with autism has, the more severe the symptoms, according to a paper published in PLOS Genetics.
This association of increasing copy number (dosage) of a gene-coding segment of DNA with increasing severity of autism is a first, and it suggests a focus for future research into autism spectrum disorder (ASD). ASD is a common, behaviorally defined condition whose symptoms can vary widely – that is why the word “spectrum” is part of the name. One federal study found that ASD affects one in 88 children.
“Previously, we linked increasing DUF1220 dosage with the evolutionary expansion of the human brain,” says James Sikela, PhD, a professor in the Department of Biochemistry and Molecular Genetics, University of Colorado School of Medicine. Sikela led the autism study which also involved other members of his laboratory.
“One of the most well-established characteristics of autism is an abnormally rapid brain growth that occurs over the first few years of life. That feature fits very well with our previous work linking more copies of DUF1220 with increasing brain size. This suggests that more copies of DUF1220 may be helpful in certain situations but harmful in others.”
The research team found not only that DUF1220 copy number was linked to overall autism severity, but also that as copy number increased, each of the disorder’s three main symptoms — social deficits, communicative impairments and repetitive behaviors — became progressively worse.
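The dosage effect described here is, at its core, a monotonic association between copy number and symptom scores. As a purely illustrative sketch (the copy numbers and severity scores below are invented, not the study’s data, and the authors’ actual statistics were more involved), such an association can be quantified with a simple correlation:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical DUF1220 copy numbers and symptom-severity scores
copies   = [240, 250, 255, 260, 270, 275, 285, 290]
severity = [10,  12,  11,  15,  18,  17,  22,  24]

r = pearson(copies, severity)  # strongly positive for this toy data
```

A value of r near +1 would mean severity rises almost in lockstep with copy number, which is the qualitative pattern the paper reports.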
In 2012, Sikela was the lead scientist of a multi-university team whose research established the link between DUF1220 and the rapid evolutionary expansion of the human brain. The work also implicated DUF1220 copy number in brain size both in normal populations as well as in microcephaly and macrocephaly (diseases involving brain size abnormalities).
Jack Davis, PhD, who contributed to the project while a postdoctoral fellow in the Sikela lab, has a son with autism and thus had a very personal motivation to seek out the genetic factors that cause autism.
The research by Sikela, Davis and colleagues at the Anschutz campus in Aurora, Colo., focused on the presence of DUF1220 in 170 people with autism.
Strikingly, Davis says, DUF1220 is as common in people who do not have ASD as in people who do. So the link with severity is only in people who have the disorder.
“Something else is at work here, a contributing factor that is needed for ASD to manifest itself,” Davis says. “We were only able to look at one of the six different subtypes of DUF1220 in this study, so we are eager to look at whether the other subtypes are playing a role in ASD.”
Because of the high number of copies of DUF1220 in the human genome, the domain has been difficult to measure. As Sikela says, “To our knowledge DUF1220 copy number has not been directly examined in previous studies of the genetics of autism and other complex human diseases. So the linking of DUF1220 with ASD is also confirmation that there are key parts of the human genome that are still unexamined but are important to human disease.”
New technique classifies retinal neurons into 15 categories, including some previously unknown types.

As we scan a scene, many types of neurons in our retinas interact to analyze different aspects of what we see and form a cohesive image. Each type is specialized to respond to a particular variety of visual input — for example, light or darkness, the edges of an object, or movement in a certain direction.
Neuroscientists believe there are 20 to 30 types of these specialized neurons, known as retinal ganglion cells, but they have yet to come up with a definitive classification system.
A new study from MIT neuroscientists has made some headway on this daunting task. Using a computer algorithm that traces the shapes of neurons and groups them based on structural similarity, the researchers sorted more than 350 mouse retinal neurons into 15 types, including six that were previously unidentified.
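The core idea, grouping cells whose shapes are mutually similar, can be loosely illustrated with agglomerative clustering over shape descriptors. This is not the authors’ method (which compared full 3-D arbors); the two-number descriptors and the merge threshold below are invented for the sketch:

```python
def euclidean(a, b):
    """Straight-line distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def cluster(features, threshold):
    """Greedy single-linkage clustering: repeatedly merge any two
    clusters whose closest members are within `threshold`."""
    clusters = [[i] for i in range(len(features))]
    merged = True
    while merged:
        merged = False
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(euclidean(features[a], features[b])
                        for a in clusters[i] for b in clusters[j])
                if d < threshold:
                    clusters[i] += clusters[j]
                    del clusters[j]
                    merged = True
                    break
            if merged:
                break
    return clusters

# Hypothetical descriptors: (dendritic-field radius, stratification depth)
neurons = [(1.0, 0.2), (1.1, 0.25), (3.0, 0.8), (3.2, 0.75), (5.0, 0.1)]
groups = cluster(neurons, threshold=0.5)  # three structural "types" here
```

Cells 0 and 1 merge, cells 2 and 3 merge, and cell 4 stands alone, giving three putative types; the real algorithm did the analogous grouping over hundreds of reconstructed neurons.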
This technique, described in the March 24 online edition of Nature Communications, could also be deployed to help identify the huge array of neurons found in the brain’s cortex, says Uygar Sumbul, an MIT postdoc and one of the lead authors of the paper. “This delineates a program that we should be doing for the rest of the retina, and elsewhere in the brain, to robustly and precisely know the cell types,” he says.
The paper’s other lead author is former MIT postdoc Sen Song. Sebastian Seung, a former MIT professor of brain and cognitive sciences and physics who is now at Princeton University, is the paper’s senior author.
TAU researcher uses DNA therapy in lab mice to improve cochlear implant functionality
One in a thousand children in the United States is deaf, and one in three adults will experience significant hearing loss after the age of 65. Whether the result of genetic or environmental factors, hearing loss costs billions of dollars in healthcare expenses every year, making the search for a cure critical.

Now a team of researchers led by Karen B. Avraham of the Department of Human Molecular Genetics and Biochemistry at Tel Aviv University’s Sackler Faculty of Medicine and Yehoash Raphael of the Department of Otolaryngology–Head and Neck Surgery at University of Michigan’s Kresge Hearing Research Institute has discovered that using DNA as a drug — commonly called gene therapy — in laboratory mice may protect the inner ear nerve cells of humans suffering from certain types of progressive hearing loss.
In the study, doctoral student Shaked Shivatzki created a mouse population carrying the mutation behind the most prevalent form of hereditary hearing loss in humans: a mutated connexin 26 gene. Some 30 percent of American children born deaf have this form of the gene. Because of its prevalence and the inexpensive tests available to identify it, there is a great desire to find a cure or therapy to treat it.
"Regenerating" neurons
Prof. Avraham’s team set out to prove that gene therapy could be used to preserve the inner ear nerve cells of the mice. Mice with the mutated connexin 26 gene exhibit deterioration of the nerve cells that send a sound signal to the brain. The researchers found that a protein growth factor used to protect and maintain neurons, otherwise known as brain-derived neurotrophic factor (BDNF), could be used to block this degeneration. They then engineered a virus that could be tolerated by the body without causing disease, and inserted the growth factor into the virus. Finally, they surgically injected the virus into the ears of the mice. This factor was able to “rescue” the neurons in the inner ear by blocking their degeneration.
"A wide spectrum of people are affected by hearing loss, and the way each person deals with it is highly variable," said Prof. Avraham. "That said, there is an almost unanimous interest in finding the genes responsible for hearing loss. We tried to figure out why the mouse was losing cells that enable it to hear. Why did it lose its hearing? The collaborative work allowed us to provide gene therapy to reverse the loss of nerve cells in the ears of these deaf mice."
Although this approach falls short of restoring hearing in these mice, it has important implications for enhancing sound perception with cochlear implants, which are used by many people whose connexin 26 mutation has impaired their hearing.
Embryonic hearing?
Inner ear nerve cells facilitate the optimal functioning of cochlear implants. Prof. Avraham’s research suggests a possible new strategy for improving implant function, particularly in people whose hearing loss gets progressively worse with time, such as those with profound hearing loss as well as those with the connexin gene mutation. Combining gene therapy with the implant could help to protect vital nerve cells, thus preserving and improving the performance of the implant.
More research remains. “Safety is the main question. And what about timing? Although over 80 percent of human and mouse genes are similar, which makes mice the perfect lab model for human hearing, there’s still a big difference. Humans start hearing as embryos, but mice don’t start to hear until two weeks after birth. So we wondered, do we need to start the corrective process in utero, in infants, or later in life?” said Prof. Avraham.
"Practically speaking, we are a long way off from treating hearing loss during embryogenesis. But we proved what we set out to do: that we can help preserve nerve cells in the inner ears of the mouse," Prof. Avraham continued. "This already looks very promising."
Anesthesia may have lingering side effects on the brain, even years after an operation

Two and a half years ago Susan Baker spent three hours under general anesthesia as surgeons fused several vertebrae in her spine. Everything went smoothly, and for the first six hours after her operation, Baker, then an 81-year-old professor at the Johns Hopkins Bloomberg School of Public Health, was recovering well. That night, however, she hallucinated a fire raging through the hospital toward her room. Petrified, she repeatedly buzzed the nurses’ station, pleading for help. The next day she was back to her usual self. “It was the most terrifying experience I have ever had,” she says.
Baker’s waking nightmare was a symptom of postoperative delirium, a state of serious confusion and memory loss that sometimes follows anesthesia. In addition to hallucinations, delirious patients may forget why they are in the hospital, have trouble responding to questions and speak in nonsensical sentences. Such bewilderment—which is far more severe than the temporary mental fog one might expect after any major operation that requires general anesthesia—usually resolves after a day or two.
Although physicians have known about the possibility of such confusion since at least the 1980s, they had decided, based on the then available evidence, that the drugs used to anesthetize a patient in the first place were unlikely to be responsible. Instead, they concluded, the condition occurred more often because of the stress of surgery, which might in turn unmask an underlying brain defect or the early stages of dementia. Studies in the past four years have cast doubt on that assumption, however, and suggest that a high enough dose of anesthesia can in fact raise the risk of delirium after surgery. Recent studies also indicate that the condition may be more pernicious than previously realized: even if the confusion dissipates, attention and memory can languish for months and, in some cases, years.
Researchers from The University of Manchester have discovered a new mechanism that governs how body clocks react to changes in the environment.

And the discovery, which is being published in Current Biology, could provide a solution for alleviating the detrimental effects of chronic shift work and jet-lag.
The team’s findings reveal that the enzyme casein kinase 1epsilon (CK1epsilon) controls how easily the body’s clockwork can be adjusted or reset by environmental cues such as light and temperature.
Internal biological timers (circadian clocks) are found in almost every species on the planet. In mammals including humans, circadian clocks are found in most cells and tissues of the body, and orchestrate daily rhythms in our physiology, including our sleep/wake patterns and metabolism.
Dr David Bechtold, who led The University of Manchester’s research team, said: “At the heart of these clocks are a complex set of molecules whose interaction provides robust and precise 24 hour timing. Importantly, our clocks are kept in synchrony with the environment by being responsive to light and dark information.”
This work, funded by the Biotechnology and Biological Sciences Research Council, was undertaken by a team from The University of Manchester in collaboration with scientists from Pfizer led by Dr Travis Wager.
The research identifies a new mechanism through which our clocks respond to these light inputs. During the study, mice lacking CK1epsilon, a component of the clock, adjusted to a new light-dark environment (much like the shift experienced in shift work or long-haul air travel) much faster than normal mice.
The research team went on to show that drugs that inhibit CK1epsilon were able to speed up the shift responses of normal mice and, critically, that faster adaptation to the new environment minimised the metabolic disturbances caused by the time shift.
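The finding, that the clock closes the gap to a new light-dark cycle gradually and that removing or inhibiting CK1epsilon speeds this up, can be caricatured as a simple proportional-resetting model. The rates and the entrainment criterion below are invented for illustration, not measured values:

```python
def days_to_entrain(shift_hours, rate):
    """Toy model: each day the clock closes a fixed fraction `rate` of
    the remaining phase difference to the new light-dark cycle."""
    phase, days = float(shift_hours), 0
    while phase > 0.5:  # within 30 minutes counts as entrained
        phase *= (1 - rate)
        days += 1
    return days

wild_type = days_to_entrain(8, rate=0.2)  # slow resetting, as in normal mice
ck1e_null = days_to_entrain(8, rate=0.5)  # faster resetting, as in mice lacking CK1epsilon
```

With these made-up rates, the faster-resetting clock re-entrains to an 8-hour shift in a few days while the slower one takes roughly two weeks, echoing the qualitative gap the team observed.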
Dr Bechtold said: “We already know that modern society poses many challenges to our health and wellbeing - things that are viewed as commonplace, such as shift-work, sleep deprivation, and jet lag disrupt our body’s clocks. It is now becoming clear that clock disruption is increasing the incidence and severity of diseases including obesity and diabetes.
“We are not genetically pre-disposed to quickly adapt to shift-work or long-haul flights, and so our bodies’ clocks are built to resist such rapid changes. Unfortunately, we must deal with these issues today, and there is very clear evidence that disruption of our body clocks has real and negative consequences for our health.”
He continues: “As this work progresses in clinical terms, we may be able to enhance the clock’s ability to deal with shift work, and importantly understand how maladaptation of the clock contributes to diseases such as diabetes and chronic inflammation.”
University of Bonn psychologists prove genetic variation is underlying factor in higher incidence of forgetfulness
Misplaced your keys? Can’t remember someone’s name? Didn’t notice the stop sign? Those who frequently experience such cognitive lapses now have an explanation. Psychologists from the University of Bonn have found a connection between these everyday lapses and the DRD2 gene. Those who carry a certain variant of this gene are more easily distracted and experience a significantly higher incidence of lapses due to a lack of attention. The team reports its results in the May issue of “Neuroscience Letters”; the article is already available online.

Most of us are familiar with such everyday lapses: you can’t find your keys, again! Or you walk into another room and forget what you actually went there for. Or you are on the phone with someone and cannot remember their name. “Such short-term memory lapses are very common, but some people experience them particularly often,” said Prof. Dr. Martin Reuter from the Department for Differential and Biological Psychology at the University of Bonn. Mistakes arising from such short-term lapses can become a hazard, for example when a person overlooks a stop sign at an intersection. And in the workplace, a lack of attention can also become a problem, for example when it results in forgetting to save essential data.
A gene “directing” your brain
"A familial clustering of such lapses suggests that they are subject to genetic effects," explained Dr. Sebastian Markett, the principal author and a member of Prof. Reuter’s team. In lab experiments, the group of scientists had already found indications earlier that the so-called dopamine D2 receptor gene (DRD2) plays a part in forgetfulness. DRD2 has an essential function in signal transmission within the frontal lobes. "This structure can be compared to a director coordinating the brain like an orchestra," Dr. Markett added. In this simile, the DRD2 gene would correspond to the baton, because it plays a part in dopamine transmission in the brain. If the baton skips a beat, the orchestra gets confused.
The psychologists from the University of Bonn tested a total of 500 women and men by taking saliva samples and examining them with molecular biology methods. All humans carry the DRD2 gene, which comes in two variants distinguished by a single letter in the genetic code: one variant has a cytosine (C) at a particular locus, while the other has a thymine (T) in its place. According to the research team’s analyses, about a quarter of the subjects carried only the cytosine variant, while three quarters had a genotype with at least one thymine base.
The scientists then wanted to find out whether this difference in the genetic code also had an effect on everyday behavior. By means of a self-assessment survey, they asked the subjects how frequently they experienced such lapses: forgetting names, misplacing keys. The survey also asked about impulsivity-related factors, such as how easily a subject was distracted from the task at hand and how long they could maintain concentration.
Lapses can clearly be tied to the gene variant
The scientists used statistical methods to check whether the forgetfulness reported in the surveys could be associated with one of the DRD2 gene variants. The results showed that functions such as attention and memory are less robust in carriers of the thymine variant than in the cytosine type. “The connection is obvious: such lapses can partially be attributed to this gene variant,” reported Dr. Markett. By their own accounts, subjects with the thymine DRD2 variant more frequently “fall victim” to forgetfulness or attention deficits; conversely, the cytosine type seems to be protected. “This result matches the results of other studies very well,” added Dr. Markett.
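At its simplest, an analysis like this compares self-reported lapse scores between the two genotype groups. A bare-bones sketch (the scores are invented, and the authors’ actual statistics were more thorough than a single t statistic):

```python
def mean(xs):
    return sum(xs) / len(xs)

def var(xs):
    """Sample variance (n - 1 denominator)."""
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def welch_t(a, b):
    """Welch's t statistic for a two-sample comparison with unequal variances."""
    return (mean(a) - mean(b)) / (var(a) / len(a) + var(b) / len(b)) ** 0.5

# Invented lapse-questionnaire scores by DRD2 genotype
cc_scores = [12, 14, 11, 13, 12, 10]          # C/C carriers (about a quarter of subjects)
t_scores  = [16, 18, 15, 17, 19, 14, 16, 18]  # carriers of at least one thymine base

t_stat = welch_t(t_scores, cc_scores)  # positive: T carriers report more lapses
```

A large positive t statistic for these toy numbers corresponds to the paper’s direction of effect, with thymine carriers reporting more everyday lapses.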
Carriers of the gene variant linked to forgetfulness might take solace in the idea that their genes, not they, are responsible, and that this is simply their fate. Dr. Markett doesn’t agree. “There are things you can do to compensate for forgetfulness: writing yourself notes or making more of an effort to put your keys down in a specific location, not just anywhere.” Those who develop such strategies for the different areas of their lives are better able to handle the deficit.
You wouldn’t hear the mating song of the male fruit fly as you reached for the infested bananas in your kitchen. Yet, the neural activity behind the insect’s amorous call could help scientists understand how you made the quick decision to pull your hand back from the tiny swarm.

Male fruit flies base the pitch and tempo of their mating song on the movement and behavior of their desired female, Princeton University researchers have discovered. In the animal kingdom, lusty warblers such as birds typically have a mating song with a stereotyped pattern. A fruit fly’s song, however, is an unordered series of loud purrs and soft drones made by wing vibrations, the researchers reported in the journal Nature. A male adjusts his song in reaction to his specific environment, which in this case is the distance and speed of a female — the faster and farther away she’s moving, the louder he “sings.”
While the actors are small, the implications of these findings could be substantial for understanding rapid decision-making, explained corresponding author Mala Murthy, a Princeton assistant professor of molecular biology and the Princeton Neuroscience Institute. Fruit flies are a common model for studying the systems of more advanced beings such as humans, and have the basic components of more complex nervous systems, she said.
The researchers have provided a possible tool for studying the neural pathways behind how an organism engaged in a task adjusts its behavior to sudden changes, be it a leopard chasing a zigzagging gazelle, or a commuter navigating stop-and-go traffic, Murthy said. She and her co-authors created a model that could predict a fly’s choice of song in response to its changing environment, and identified the neural pathways involved in these decisions.
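The model maps sensory variables (the female’s distance and speed) onto the male’s song choice. As a crude stand-in for the published model (the logistic form, coefficient values, and units here are assumptions for illustration, not the authors’ fitted parameters):

```python
import math

def p_loud_song(distance_mm, speed_mm_s, w_d=0.8, w_s=0.5, bias=-2.0):
    """Toy logistic model: probability the male produces the loud song
    mode rather than the soft mode, increasing with the female's
    distance and speed. Coefficients are illustrative only."""
    z = bias + w_d * distance_mm + w_s * speed_mm_s
    return 1 / (1 + math.exp(-z))

near_slow = p_loud_song(1.0, 0.5)  # close, slow-moving female: soft song favored
far_fast  = p_loud_song(6.0, 4.0)  # distant, fast-moving female: loud song favored
```

The point of even this caricature is that song choice is a moment-to-moment function of the sensory scene rather than a fixed, stereotyped pattern.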
"Here we have natural courtship behavior and we have this discovery that males are using information about their sensory environment in real time to shape their song. That makes the fly system a unique model to study decision-making in a natural context," Murthy said.
"You can imagine that if a fly can integrate visual information quickly to modulate his song, the way in which it does that is probably a very basic equivalent of how a more complicated animal solves a similar problem," she said. "To figure out at the level of individual neurons how flies perform sensory-motor integration will give us insight into how a mammalian brain does it and, ultimately, maybe how a human brain does it."
Why do neurodegenerative diseases such as Alzheimer’s affect only the elderly? Why do some people live to be over 100 with intact cognitive function while others develop dementia decades earlier?

Image: A new study shows that a gene regulator called REST, dormant in the brains of young people (left), switches on in normal aging brains (center) to protect against various stresses, including abnormal proteins associated with neurodegenerative diseases. REST is lost in critical brain regions of people with Alzheimer’s (right). Credit: Yankner Lab
More than a century of research into the causes of dementia has focused on the clumps and tangles of abnormal proteins that appear in the brains of people with neurodegenerative diseases. However, scientists know that at least one piece of the puzzle has been missing because some people with these abnormal protein clumps show few or no signs of cognitive decline.
A new study offers an explanation for these longstanding mysteries. Researchers have discovered that a gene regulator active during fetal brain development, called REST, switches back on later in life to protect aging neurons from various stresses, including the toxic effects of abnormal proteins. The researchers also showed that REST is lost in critical brain regions of people with Alzheimer’s and mild cognitive impairment.
A new study in animals shows that using a compound to block the body’s immune response greatly reduces disability after a stroke.

The study by scientists from the University of Wisconsin School of Medicine and Public Health also showed that particular immune cells, CD4+ T-cells, produce a mediator called interleukin-21 (IL-21) that can cause further damage in stroke-injured tissue.
In the study, normal mice, which are ordinarily killed or disabled by an ischemic stroke, were given a shot of a compound that blocks the action of IL-21. Brain scans and brain sections showed that the treated mice suffered little or no stroke damage.
“This is very exciting because we haven’t had a new drug for stroke in decades, and this suggests a target for such a drug,” says lead author Dr. Zsuzsanna Fabry, professor of pathology and laboratory medicine.
Stroke is the fourth-leading killer in the world and an important cause of permanent disability. In an ischemic stroke, a clot blocks the flow of oxygen-rich blood to the brain. But Fabry explains that much of the damage to brain cells occurs after the clot is removed or dissolved by medicine. Blood rushes back into the brain tissue, bringing with it immune cells called T-cells, which flock to the source of an injury.
The study shows that after a stroke, the injured brain cells provoke CD4+ T-cells to produce a substance, IL-21, that kills neurons in the blood-deprived tissue of the brain. The study gives new insight into how stroke induces neural injury.
Similar Findings in Humans
Fabry’s co-author Dr. Matyas Sandor, professor of pathology and laboratory medicine, says that the final part of the study looked at brain tissue from people who had died following ischemic strokes. It found that CD4+ T-cells and their protein, IL-21, are present in high concentrations in areas of the brain damaged by the stroke.
Sandor says the similarity suggests that the protein that blocks IL-21 could become a treatment for stroke, and would likely be administered at the same time as the current blood-clot dissolving drugs.
“We don’t have proof that it will work in humans,” he says, “but similar accumulation of IL-21 producing cells suggests that it might.”
The paper was published this week in the Journal of Experimental Medicine.
A novel protein may explain how biological clocks regulate human sleep cycles

In a series of experiments sparked by fruit flies that couldn’t sleep, Johns Hopkins researchers say they have identified a mutant gene — dubbed “Wide Awake” — that sabotages how the biological clock sets the timing for sleep. The finding also led them to the protein made by a normal copy of the gene that promotes sleep early in the night and properly regulates sleep cycles.
Because genes and the proteins they code for are often highly conserved across species, the researchers suspect their discoveries — boosted by preliminary studies in mice — could lead to new treatments for people whose insomnia or off-hours work schedules keep them awake long after their heads hit the pillow.
“We know that the timing of sleep is regulated by the body’s internal biological clock, but just how this occurs has been a mystery,” says study leader Mark N. Wu, M.D., Ph.D., an assistant professor of neurology, medicine, genetic medicine and neuroscience at the Johns Hopkins University School of Medicine. “We have now found the first protein ever identified that translates timing information from the body’s circadian clock and uses it to regulate sleep.”
A report on the work was published online March 13 in the journal Neuron.
In their hunt for the molecular roots of sleep regulation, Wu and his colleagues studied thousands of fruit fly colonies, each with a different set of genetic mutations, and analyzed their sleep patterns. They found that one group of flies, with a mutation in the gene they would later call Wide Awake (or Wake for short), had trouble falling asleep at night, a malady that looked a lot like sleep-onset insomnia in humans. The investigators say Wake appears to be the messenger from the circadian clock to the brain, telling it that it’s time to shut down and sleep.
After isolating the gene, Wu’s team determined that when working properly, Wake helps shut down clock neurons of the brain that control arousal by making them more responsive to signals from the inhibitory neurotransmitter called GABA. Wake does this specifically in the early evening, thus promoting sleep at the right time. Levels of Wake cycle during the day, peaking near dusk in good sleepers.
Flies with the mutated Wake gene were not getting enough GABA signal to quiet their arousal circuits at night, which kept them agitated and unable to fall asleep.
The researchers found the same gene in every animal they studied: humans, mice, rabbits, chickens, even worms.
Importantly, when Wu’s team looked to see where Wake was located in the mouse brain, they found that it was expressed in the suprachiasmatic nucleus (SCN), the master clock in mammals. Wu says the fact that the Wake protein was expressed in high concentrations in the SCN of mice is significant.
“Sometimes we discover things in flies that have no direct relevance in higher order animals,” Wu says. “In this case, because we found the protein in a location where it likely plays a role in circadian rhythms and sleep, we are encouraged that this protein may do the same thing in mice and people.”
The hope is that someday, by manipulating Wake, possibly with a medication, shift workers, military personnel and sleep-onset insomniacs could sleep better.
“This novel pathway may be a place where we can intervene,” Wu says.