Johns Hopkins researchers have begun to connect the dots between a schizophrenia-linked genetic variation and its effect on the developing brain. As they report July 3 in the journal Cell Stem Cell, their experiments show that the loss of a particular gene alters the skeletons of developing brain cells, which in turn disrupts the orderly layers those cells would normally form.

(Image caption: Left, human neural stem cells form rosettes as they grow into different cell types, with ringlike patterns of PKCλ protein in the center. A neural rosette with a 15q11.2 microdeletion, a risk factor for schizophrenia, appears disorganized and lacks the ringlike PKCλ protein structure, right, suggesting that this risk factor acts early in the neurodevelopmental process. Credit: Ki-Jun Yoon/Johns Hopkins Medicine)
“This is an important step toward understanding what physically happens in the developing brain that puts people at risk of schizophrenia,” says Guo-li Ming, M.D., Ph.D., a professor of neurology and neuroscience in the Johns Hopkins University School of Medicine’s Institute for Cell Engineering.
While no single genetic mutation is known to cause schizophrenia, so-called genome-wide association studies have identified variations that are more common in people with the condition than in the general population. One of these is a missing piece from an area of the genome labeled 15q11.2. “While the deletion is linked to schizophrenia, having extra copies of this part of the genome raises the risk of autism,” notes Ming.
For the new study, Ming’s research group, along with that of her husband and collaborator, neurology and neuroscience professor Hongjun Song, Ph.D., used skin cells from people with schizophrenia who were missing part of 15q11.2 on one of their chromosomes. (Because everyone carries two copies of their genome, the patients each had an intact copy of 15q11.2 as well.)
The researchers grew the human skin cells in a dish and coaxed them to become induced pluripotent stem cells, and then to form neural progenitor cells, a kind of stem cell found in the developing brain.
“Normally, neural progenitors will form orderly rings when grown in a dish, but those with the deletion didn’t,” Ming says. To find out which of the four known genes in the missing piece of the genome were responsible for the change, the researchers engineered groups of progenitors that each produced less protein than normal from one of the suspect genes. The crucial ingredient in ring formation turned out to be a gene called CYFIP1.
The team then altered the genomes of neural progenitors in mouse embryos so that they made less of the protein created by CYFIP1. The brain cells of the fetal mice turned out to have similar defects in structure to those in the dish-grown human cells. The reason, the team found, is that CYFIP1 plays a role in building the skeleton that gives shape to each cell, and its loss affects spots called adherens junctions where the skeletons of two neighboring cells connect.
Having less CYFIP1 protein also caused some neurons in the developing mice to end up in the wrong layer within the brain. “During development, new neurons get in place by ‘climbing’ the tendrils of neural progenitor cells,” Ming says. “We think that disrupted adherens junctions don’t provide a stable enough anchor for neural progenitors, so the ‘rope’ they form doesn’t quite get new neurons to the right place.”
The researchers say they also found that CYFIP1 is part of a complex of proteins called WAVE, which is key to building the cellular skeleton.
Many people with a CYFIP1 deletion do not get schizophrenia, so the team suspected the condition was more likely to arise in people with a second defect in the WAVE complex.
Analyzing data from genome-wide association studies, they found a variation in the WAVE complex signaling gene ACTR2/Arp2 that, combined with the CYFIP1 deletion, increased the risk of schizophrenia more than either genetic change by itself.
In adding to science’s understanding of schizophrenia, the study also shows how other mental illnesses might be similarly investigated, the researchers say. “Using induced pluripotent stem cells from people with schizophrenia allowed us to see how their genes affected brain development,” says Song. “Next, we’d like to investigate what effects remain in the mature brain.”
Imagine feeling a slimy jellyfish, a prickly cactus or map directions on your iPad mini Retina display, because that’s where tactile technology is headed. But you’ll need more than just an index finger to feel your way around.

New research at UC Berkeley has found that people are better and faster at navigating tactile technology when using both hands and several fingers. Moreover, blind people in the study outmaneuvered their sighted counterparts – especially when using both hands and several fingers – possibly because they’ve developed superior cognitive strategies for finding their way around.
Bottom line: Two hands are better than one in the brave new world of tactile or “haptic” technology, and the visually impaired can lead the way.
“Most sighted people will explore these types of displays with a single finger. But our research shows that this is a bad decision. No matter what the task, people perform better using multiple fingers and hands,” said Valerie Morash, a doctoral student in psychology at UC Berkeley and lead author of the study just published in the online issue of the journal Perception.
“We can learn from blind people how to effectively use multiple fingers, and then teach these strategies to sighted individuals who have recently lost vision or are using tactile displays in high-stakes applications like controlling surgical robots,” she added.
For decades, scientists have studied how receptors on the fingertips relay information to the brain. Now, researchers at Disney and other media companies are implementing more tactile interfaces, which use vibrations and electrostatic or magnetic feedback to let users find their way around or experience how something feels.
In this latest study, Morash and fellow researchers at UC Berkeley and the Smith-Kettlewell Eye Research Institute in San Francisco tested 14 blind adults and 14 blindfolded sighted adults on several tasks using a tactile map. Using various hand and finger combinations, they were tasked with such challenges as finding a landmark or figuring out if a road looped around.
Overall, both blind and sighted participants performed better when using both hands and several fingers, although blind participants were, on average, 50 percent faster at completing the tasks, and even faster when they used both hands and all their fingers.
“As we move forward with integrating tactile feedback into displays, these technologies absolutely need to support multiple fingers,” Morash said. “This will promote the best tactile performance in applications such as the remote control of robotics used in space and high-risk situations, among other things.”
Neuroscientists leading the largest longitudinal adolescent brain imaging study to date have learned that predicting teenage binge-drinking is possible. In fact, say the researchers in the group’s latest publication, a number of factors – genetics, brain function and about 40 different variables – can help scientists predict with about 70 percent accuracy which teens will become binge drinkers. The study appears online July 3, 2014 as an Advance Online Publication in the journal Nature.

First author Robert Whelan, Ph.D., a former University of Vermont (UVM) postdoctoral fellow in psychiatry and current lecturer at University College Dublin, and senior author Hugh Garavan, Ph.D., UVM associate professor of psychiatry, and colleagues conducted 10 hours of comprehensive assessments – these included neuroimaging to assess brain activity and brain structure, along with other measures such as IQ, cognitive task performance, personality and blood tests – on each of 2,400 14-year-old adolescents at eight different sites across Europe.
“Our goal was to develop a model to better understand the relative roles of brain structure and function, personality, environmental influences and genetics in the development of adolescent abuse of alcohol,” says Whelan. “This multidimensional risk profile of genes, brain function and environmental influences can help in the prediction of binge drinking at age 16 years.”
A 2012 Nature Neuroscience paper by the same researchers identified brain networks that predisposed some teens to higher-risk behaviors like experimentation with drugs and alcohol. The new study builds on that earlier work by following those adolescents for years (the participants are now 19 years old) and identifying those who developed a pattern of binge-drinking. The 2014 Nature study aimed to predict who would go on to drink heavily at age 16 using only data collected at age 14. The researchers applied a broad range of measures, developing a unique analytic method to predict which individuals would become binge-drinkers. The reliability of the results was confirmed by showing the same accuracy when the model was tested on a new, separate group of teenagers. The result was a list of predictors ranging from brain and genetic factors to personality and personal history.
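The validation step described above (fit a predictive model on one cohort, then confirm its accuracy on a new, separate group) can be sketched in a few lines. This is a hypothetical illustration on entirely synthetic data, not the authors' actual analytic method; only the general idea of roughly 40 mixed predictors and a held-out confirmation group echoes the study.

```python
# Hypothetical sketch: predict a binary outcome from ~40 mixed variables
# measured at one time point, then validate on a separate group.
# All data and the model choice here are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

n, n_features = 400, 40
X = rng.normal(size=(n, n_features))               # 40 candidate predictors
w = rng.normal(scale=0.4, size=n_features)         # many weak effects rather
y = (X @ w + rng.normal(size=n) > 0).astype(int)   # than one dominant variable

# Fit on the first cohort...
model = LogisticRegression(max_iter=1000).fit(X[:200], y[:200])

# ...then confirm the model generalises to a new, separate group
accuracy = model.score(X[200:], y[200:])
print(f"held-out accuracy: {accuracy:.2f}")
```

The study's reported figure of about 70 percent accuracy corresponds to this kind of held-out score: the model is judged only on individuals it never saw during fitting.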
“Notably, it’s not the case that there’s a single one or two or three variables that are critical,” says Garavan. “The final model was very broad – it suggests that a wide mixture of reasons underlie teenage drinking.”
Some of the best predictors, shares Garavan, include variables like personality, sensation-seeking traits, lack of conscientiousness, and a family history of drug use. Having even a single drink at age 14 was also a powerful predictor. That type of risk-taking behavior – and the impulsivity that often accompanies it – was a critical predictor. In addition, those teens who had experienced several stressful life events were among those at greater risk for binge-drinking.
One interesting finding, says Garavan, was that bigger brains were also predictive. Adolescents undergo significant brain changes, and alongside forming personalities and social networks, it is normal for their brains to prune down to a smaller, more efficient size.
“There’s refining and sculpting of the brain, and most of the gray matter – the neurons and the connections between them – is getting smaller and the white matter is getting larger,” he explains. “Kids with more immature brains – those that are still larger – are more likely to drink.”
Garavan, Whelan and colleagues believe that by better understanding the probable causal factors for binge-drinking, targeted interventions for those most at risk could be applied.
Gunter Schumann, M.D., professor of biological psychiatry and head of the section at the Social, Genetic and Developmental Psychiatry Centre, Institute of Psychiatry, King’s College London, is the principal investigator of the IMAGEN study, which is the source of this latest paper. “We aimed to develop a ‘gold standard’ model for predicting teenage behavior, which can be used as a benchmark for the development of simpler, widely applicable prediction models,” says Schumann. “This work will inform the development of specific early interventions in carriers of the risk profile to reduce the incidence of adolescent substance abuse. We now propose to extend analysis of the IMAGEN data in order to investigate the development of substance use patterns in the context of moderating environmental factors, such as exposure to nicotine or drugs as well as psychosocial stress.”
In the future, the researchers hope to perform more in-depth analyses of the brain factors involved and to determine whether there are different predictors for abuse of other drugs. A similar analysis using the same dataset to examine predictors of cannabis use is planned for the near future.
Researchers at Duke-NUS Graduate Medical School Singapore (Duke-NUS) have found evidence that the less older adults sleep, the faster their brains age. These findings, relevant in the context of Singapore’s rapidly ageing society, pave the way for future work on sleep loss and its contribution to cognitive decline, including dementia.

Past research has examined the impact of sleep duration on cognitive functions in older adults. Though faster brain ventricle enlargement is a marker for cognitive decline and the development of neurodegenerative diseases such as Alzheimer’s, the effects of sleep on this marker have never been measured.
The Duke-NUS study examined data from 66 older Chinese adults enrolled in the Singapore-Longitudinal Aging Brain Study. Every two years, participants underwent structural MRI brain scans measuring brain volume and neuropsychological assessments testing cognitive function. Additionally, their sleep duration was recorded through a questionnaire. Those who slept fewer hours showed evidence of faster ventricle enlargement and decline in cognitive performance.
“Our findings relate short sleep to a marker of brain aging,” said Dr. June Lo, the lead author and a Duke-NUS research fellow. “Work done elsewhere suggests that seven hours a day for adults seems to be the sweet spot for optimal performance on computer-based cognitive tests. In coming years we hope to determine what’s good for cardio-metabolic and long-term brain health too,” added Professor Michael Chee, senior author and Director of the Centre for Cognitive Neuroscience at Duke-NUS.
Research suggests that people at increased risk for developing addiction share many of the same neurobiological signatures as people who have already developed addiction. This similarity is to be expected, as individuals with family members who have struggled with addiction are over-represented in the population of addicted people.
However, a generation of animal research supports the hypothesis that the addiction process changes the brain in ways that converge with the distinctive neurobiology of the heritable risk for addiction. In other words, the more one uses addictive substances, the more one’s brain acquires the profile of someone who has inherited a risk for addiction.
One such change is a reduction in striatal dopamine release. Dopamine is a key brain chemical messenger involved in reward-related behaviors. Disturbances in dopamine signaling appear to contribute to reward processing that biases people to seek drug-like rewards and to develop drug-taking habits.
In the current issue of Biological Psychiatry, researchers at McGill University report that individuals at high risk for addiction show the same reduced dopamine response often observed in addicted individuals, identifying a new link between addiction risk and addiction in humans.
Dr. Marco Leyton and his colleagues recruited young adults, aged 18 to 25, who were classified into three groups: 1) a high-risk group of occasional stimulant users with an extensive family history of substance abuse; 2) a comparison group of occasional stimulant users with no family history; and 3) a second comparison group of individuals with no history of stimulant use and no known risk factors for addiction. Volunteers underwent a positron emission tomography (PET) scan involving the administration of amphetamine, which enabled the researchers to measure their dopamine response.
The authors found that the high-risk group of non-dependent young adults with extensive family histories of addiction displayed markedly reduced dopamine responses in comparison with both stimulant-naïve subjects and non-dependent users with no family history.
“This interesting new parallel between addiction risk and addiction may help to focus our attention on reward-related processes that contribute to the development of addiction, perhaps informing prevention strategies,” said Dr. John Krystal, Editor of Biological Psychiatry.
Leyton, a Professor at McGill University, said, “Young adults at risk of addictions have a strikingly disturbed brain dopamine reward system response when they are administered amphetamine. Past drug use also seemed to aggravate the dopamine response, but this was not a sufficient explanation. Instead, the disturbance may be a heritable biological marker that could identify those at highest risk.”
This finding suggests that there are common brain mechanisms that promote the use of addictive substances in vulnerable people and in people who have long-standing habitual substance use.
Better understanding this biology may help to advance our understanding of how people develop addiction problems, as well as providing hints related to biological mechanisms that might be targeted for prevention and treatment.
Patients with Alzheimer’s disease run a high risk of seizures. While the amyloid-beta protein involved in the development and progression of Alzheimer’s seems the most likely cause for this neuronal hyperactivity, how and why this elevated activity takes place hasn’t yet been explained — until now.

A new study by Tel Aviv University researchers, published in Cell Reports, pinpoints the precise molecular mechanism that may trigger an enhancement of neuronal activity in Alzheimer’s patients, which subsequently damages memory and learning functions. The research team, led by Dr. Inna Slutsky of TAU’s Sackler Faculty of Medicine and Sagol School of Neuroscience, discovered that the amyloid precursor protein (APP), in addition to its well-known role in producing amyloid-beta, also constitutes the receptor for amyloid-beta. According to the study, the binding of amyloid-beta to pairs of APP molecules triggers a signalling cascade, which causes elevated neuronal activity.
Elevated activity in the hippocampus — the area of the brain that controls learning and memory — has been observed in patients with mild cognitive impairment and early stages of Alzheimer’s disease. Hyperactive hippocampal neurons, which precede amyloid plaque formation, have also been observed in mouse models with early onset Alzheimer’s disease. “These are truly exciting results,” said Dr. Slutsky. “Our work suggests that APP molecules, like many other known cell surface receptors, may modulate the transfer of information between neurons.”
With the understanding of this mechanism, the potential for restoring memory and protecting the brain is greatly increased.
Building on earlier research
The research project was launched five years ago, following the researchers’ discovery of the physiological role played by amyloid-beta, previously thought to be an exclusively toxic molecule. The team found that amyloid-beta is essential for the normal day-to-day transfer of information through the nerve cell networks. If the level of amyloid-beta is even slightly increased, it causes neuronal hyperactivity and greatly impairs the effective transfer of information between neurons.
In the search for the underlying cause of neuronal hyperactivity, TAU doctoral student Hilla Fogel and postdoctoral fellow Samuel Frere found that while unaffected “normal” neurons became hyperactive following a rise in amyloid-beta concentration, neurons lacking APP did not respond to amyloid-beta. “This finding was the starting point of a long journey toward decoding the mechanism of APP-mediated hyperactivity,” said Dr. Slutsky.
The researchers, collaborating with Prof. Joel Hirsch of TAU’s Faculty of Life Sciences, Prof. Dominic Walsh of Harvard University, and Prof. Ehud Isacoff of the University of California, Berkeley, harnessed a combination of cutting-edge high-resolution optical imaging, biophysical methods and molecular biology to examine APP-dependent signalling in neural cultures, brain slices, and mouse models. Using highly sensitive biophysical techniques based on fluorescence resonance energy transfer (FRET) between fluorescent proteins in close proximity, they discovered that APP exists as a dimer at presynaptic contacts, and that the binding of amyloid-beta triggers a change in the APP-APP interactions, leading to an increase in calcium flux and higher glutamate release — in other words, brain hyperactivity.
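FRET works as a molecular ruler because transfer efficiency falls off with the sixth power of the donor–acceptor distance, so a signal appears only when the two tagged proteins (here, the two APP molecules of a dimer) sit within a few nanometers of each other. A minimal sketch of the standard Förster relation, assuming a typical Förster radius of about 5 nm for fluorescent-protein pairs (an illustrative value, not a number from the study):

```python
# Förster (FRET) efficiency: E = 1 / (1 + (r / R0)^6)
# R0 ~5 nm is a typical Förster radius for fluorescent-protein pairs;
# it is assumed here for illustration, not taken from the study.

def fret_efficiency(r_nm, r0_nm=5.0):
    """Fraction of donor excitations transferred to the acceptor at distance r."""
    return 1.0 / (1.0 + (r_nm / r0_nm) ** 6)

# Efficiency collapses quickly beyond R0, which is why FRET reports
# close molecular contact rather than mere co-localization
for r in (2.5, 5.0, 10.0):
    print(f"r = {r:4.1f} nm -> E = {fret_efficiency(r):.3f}")
```

At half the Förster radius the efficiency is near 1, at R0 it is exactly 0.5, and at twice R0 it is nearly zero, so a FRET signal between two APP-tagged fluorophores implies the molecules are paired.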
A new approach to protecting the brain
"We have now identified the molecular players in hyperactivity," said Dr. Slutsky. "TAU postdoctoral fellow Oshik Segev is now working to identify the exact spot where the amyloid-beta binds to APP and how it modifies the structure of the APP molecule. If we can change the APP structure and engineer molecules that interfere with the binding of amyloid-beta to APP, then we can break up the process leading to hippocampal hyperactivity. This may help to restore memory and protect the brain."
Previous studies by Prof. Lennart Mucke’s laboratory strongly suggest that a reduction in the expression level of “tau” (microtubule-associated protein), another key player in Alzheimer’s pathogenesis, rescues synaptic deficits and decreases abnormal brain activity in animal models. “It will be crucial to understand the missing link between APP and ‘tau’-mediated signalling pathways leading to hyperactivity of hippocampal circuits. If we can find a way to disrupt the positive signalling loop between amyloid-beta and neuronal activity, it may rescue cognitive decline and the conversion to Alzheimer’s disease,” said Dr. Slutsky.
A drug that blocks the action of the enzyme Cdk5 could substantially reduce brain damage if administered shortly after a stroke, UT Southwestern Medical Center research suggests.
The findings, reported in the June 11 issue of the Journal of Neuroscience, determined in rodent models that aberrant Cdk5 activity causes nerve cell death during stroke.
“If you inhibit Cdk5, then the vast majority of brain tissue stays alive without oxygen for up to one hour,” said Dr. James Bibb, Associate Professor of Psychiatry and Neurology and Neurotherapeutics at UT Southwestern and senior author of the study. “This result tells us that Cdk5 is a central player in nerve cell death.”
More importantly, development of a Cdk5 inhibitor as an acute neuroprotective therapy has the potential to reduce stroke injury.
“If we could block Cdk5 in patients who have just suffered a stroke, we may be able to reduce the number of patients in our hospitals who become disabled or die from stroke. Doing so would have a major impact on health care,” Dr. Bibb said.
While several pharmaceutical companies worked to develop Cdk5 inhibitors years ago, these efforts were largely abandoned since research indicated blocking Cdk5 long-term could have detrimental effects. At the time, many scientists thought aberrant Cdk5 activity played a major role in the development of Alzheimer’s disease and that Cdk5 inhibition might be beneficial as a treatment.
Based on Dr. Bibb’s research and that of others, Cdk5 has both good and bad effects. When working normally, Cdk5 adds phosphates to other proteins that are important to healthy brain function. On the flip side, researchers have found that aberrant Cdk5 activity contributes to nerve cell death following brain injury and can lead to cancer.
“Cdk5 regulates communication between nerve cells and is essential for proper brain function. Therefore, blocking Cdk5 long-term may not be beneficial,” Dr. Bibb said. “Until now, the connection between Cdk5 and stroke injury was unknown, as was the potential benefit of acute Cdk5 inhibition as a therapy.”
In this study, researchers administered a Cdk5 inhibitor directly into dissected brain slices after adult rodents suffered a stroke, in addition to measuring the post-stroke effects in Cdk5 knockout mice.
“We are not yet at a point where this new treatment can be given for stroke. Nevertheless, this research brings us a step closer to developing the right kinds of drugs,” Dr. Bibb said. “We first need to know what mechanisms underlie the disease before targeted treatments can be developed that will be effective. As no Cdk5 blocker exists that works in a pill form, the next step will be to develop a systemic drug that could be used to confirm the study’s results and lead to a clinical trial at later stages.”
Currently, there is only one FDA-approved drug for acute treatment of stroke, the clot-busting drug tPA. Other treatment options include neurosurgical procedures to help minimize brain damage.
Without a steady supply of blood, neurons can’t work. That’s why one of the culprits behind Alzheimer’s disease is believed to be the persistent blood clots that often form in the brains of Alzheimer’s patients, contributing to the condition’s hallmark memory loss, confusion and cognitive decline.

New experiments in Sidney Strickland’s Laboratory of Neurobiology and Genetics at Rockefeller University have identified a compound that might halt the progression of Alzheimer’s by interfering with the role amyloid-β, a small protein that forms plaques in Alzheimer’s brains, plays in the formation of blood clots. This work is highlighted in the July issue of Nature Reviews Drug Discovery.
For more than a decade, potential Alzheimer’s drugs have targeted amyloid-β, but, in clinical trials, they have either failed to slow the progression of the disease or caused serious side effects. However, by targeting the protein’s ability to bind to a clotting agent in blood, the work in the Strickland lab offers a promising new strategy, according to the highlight published in print on July 1.
This latest study builds on previous work in Strickland’s lab showing amyloid-β can interact with fibrinogen, the clotting agent, to form difficult-to-break-down clots that alter blood flow, cause inflammation and choke neurons.
“Our experiments in test tubes and in mouse models of Alzheimer’s showed the compound, known as RU-505, helped restore normal clotting and cerebral blood flow. But the big pay-off came with behavioral tests in which the Alzheimer’s mice treated with RU-505 exhibited better memories than their untreated counterparts,” Strickland says. “These results suggest we have found a new strategy with which to treat Alzheimer’s disease.”
RU-505 emerged from a pack of 93,716 candidates selected from libraries of compounds, the researchers write in the June issue of the Journal of Experimental Medicine. Hyung Jin Ahn, a research associate in the lab, examined these candidates with a specific goal in mind: Find one that interferes with the interaction between fibrinogen and amyloid-β. In a series of tests that began with a massive, automated screening effort at Rockefeller’s High Throughput Resource Center, Ahn and colleagues winnowed the 93,000 contenders to five. Then, test tube experiments whittled the list down to one contender: RU-505, a small, synthetic compound. Because RU-505 binds to amyloid-β, it prevents only abnormal blood clot formation and does not interfere with normal clotting. It is also capable of passing through the blood-brain barrier.
“We tested RU-505 in mouse models of Alzheimer’s disease that over-express amyloid-β and have a relatively early onset of disease. Because Alzheimer’s disease is a long-term, progressive disease, these treatments lasted for three months,” Ahn says. “Afterward, we found evidence of improvement both at the cellular and the behavioral levels.”
The brains of the treated mice had less of the chronic and harmful inflammation associated with the disease, and blood flow in their brains was closer to normal than that of untreated Alzheimer’s mice. The RU-505-treated mice also did better when placed in a maze. Mice naturally want to escape the maze, and are trained to recognize visual cues to find the exit quickly. Even after training, Alzheimer’s mice have difficulty in exiting the maze. After these mice were treated with RU-505, they performed much better.
“While the behavior and the brains of the Alzheimer’s mice did not fully recover, the three-month treatment with RU-505 prevented much of the decline associated with the disease,” Strickland says.
The researchers have begun the next steps toward developing a human treatment. Refinements to the compound are being supported by the Robertson Therapeutic Development Fund and the Tri-Institutional Therapeutic Discovery Institute. As part of a goal to help bridge critical gaps in drug discovery, these initiatives support the early stages of drug development, as is being done with RU-505.
“At very high doses, RU-505 is toxic to mice and even at lower doses it caused some inflammation at the injection site, so we are hoping to find ways to reduce this toxicity, while also increasing RU-505’s efficacy so smaller doses can accomplish similar results,” Ahn says.
NYU Langone Medical Center is now using a novel technology that serves as a “flight simulator” for neurosurgeons, allowing them to rehearse complicated brain surgeries before making an actual incision on a patient.

The new simulator, called the Surgical Rehearsal Platform (SRP), creates an individualized walkthrough for neurosurgeons based on 3D imaging taken from the patient’s CT and MRI scans. Surgeons then plan and rehearse the surgeries using the unique software, which combines life-like tissue reaction with accurate modeling of surgical tools and clamps, to enable them to navigate multiple-angled models of a patient’s brain and vasculature.
The SRP was developed by Surgical Theater of Cleveland, Ohio. This augmented reality technology may help improve safety and efficiency during surgeries for conditions including pituitary tumors, skull base tumors, intrinsic brain tumors, aneurysms, and arteriovenous malformations (AVMs), and could potentially allow surgeons from around the world to simultaneously collaborate on a patient’s case in real-time.
“We are excited to partner with Surgical Theater to bring their Surgical Rehearsal Platform to our institution,” said John G. Golfinos, MD, chair of the Department of Neurosurgery at NYU Langone Medical Center and associate professor of neurosurgery at NYU School of Medicine. “The reaction of tissue in these 3D images is incredibly life-like and modeling of surgical tools is equally impressive. The SRP also will enhance the training of medical students, residents and fellows and help them hone their skills in new and more meaningful ways.”
When using the SRP, surgeons can rehearse a specific patient’s case on computer monitors connected to controllers that simulate surgical tools. For example, when rehearsing a surgery for an aneurysm, the SRP reacts realistically when the surgeon virtually applies a clip to the blood vessel. The surgeon then can assess the tissue’s mechanical properties and view realistic microscopic characteristics including shadowing and texture to plan approaches, so that when the real surgery is being performed, doctors have rehearsed and already have a mental picture of what is being seen in the OR.
The SRP obtained clearance from the U.S. Food and Drug Administration (FDA) in February 2013 as pre-operative software for simulating and evaluating surgical treatment options.
In addition, a newer generation of this technology from Surgical Theater, the Surgical Navigation Advanced Platform (SNAP), has an application pending with the FDA to allow the tool to be taken into the operating room, so surgeons can see behind arteries and other critical structures in real-time.
Researchers believe they have learned how mutations in the gene that causes Huntington’s disease kill brain cells, a finding that could open new opportunities for treating the fatal disorder. Scientists first linked the gene to the inherited disease more than 20 years ago.

Huntington’s disease affects five to seven people out of every 100,000. Symptoms, which typically begin in middle age, include involuntary jerking movements, disrupted coordination and cognitive problems such as dementia. Drugs cannot slow or stop the progressive decline caused by the disorder, which leaves patients unable to walk, talk or eat.
Lead author Hiroko Yano, PhD, of Washington University School of Medicine in St. Louis, found in mice and in mouse brain cell cultures that the disease impairs the transfer of proteins to energy-making factories inside brain cells. The factories, known as mitochondria, need these proteins to maintain their function. When disruption of the supply line disables the mitochondria, brain cells die.
“We showed the problem could be fixed by making cells overproduce the proteins that make this transfer possible,” said Yano, assistant professor of neurological surgery, neurology and genetics. “We don’t know if this will work in humans, but it’s exciting to have a solid new lead on how this condition kills brain cells.”
The findings are available online in Nature Neuroscience.
Huntington’s disease is caused by a defect in the huntingtin gene, which makes the huntingtin protein. Life expectancy after initial onset is about 20 years.
Scientists have known for some time that the mutated form of the huntingtin protein impairs mitochondria and that this disruption kills brain cells. But they have had difficulty understanding specifically how the gene harms the mitochondria.
For the new study, Yano and collaborators at the University of Pittsburgh worked with mice that were genetically modified to simulate the early stages of the disorder.
Yano and her colleagues found that the mutated huntingtin protein binds to a group of proteins called TIM23. This protein complex normally helps transfer essential proteins and other supplies to the mitochondria. The researchers discovered that the mutated huntingtin protein impairs that process.
The problem seems to be specific to brain cells early in the disease. At the same point in the disease process, the scientists found no evidence of impairment in liver cells, which also produce the mutated huntingtin protein.
The researchers speculated that brain cells might be particularly reliant on their mitochondria to power the production and recycling of the chemical signals they use to transmit information. This reliance could make the cells vulnerable to disruption of the mitochondria.
Other neurodegenerative conditions, including Alzheimer’s disease and amyotrophic lateral sclerosis, also known as Lou Gehrig’s disease, have been linked to problems with mitochondria. Scientists may be able to build upon these new findings to better understand these disorders.
A specific preparation of cocoa extract called Lavado may reduce damage to nerve pathways seen in Alzheimer’s disease patients’ brains long before they develop symptoms, according to a study conducted at the Icahn School of Medicine at Mount Sinai and published June 20 in the Journal of Alzheimer’s Disease (JAD).

Specifically, the study results, using mice genetically engineered to mimic Alzheimer’s disease, suggest that Lavado cocoa extract prevents the protein β-amyloid (Aβ) from gradually forming sticky clumps in the brain, which are known to damage nerve cells as Alzheimer’s disease progresses.
Lavado cocoa is primarily composed of polyphenols, antioxidants also found in fruits and vegetables, with past studies suggesting that they prevent degenerative diseases of the brain.
The Mount Sinai study results revolve around synapses, the gaps between nerve cells. Within healthy nerve pathways, each nerve cell sends an electric pulse down itself until it reaches a synapse where it triggers the release of chemicals called neurotransmitters that float across the gap and cause the downstream nerve cell to “fire” and pass on the message.
Disease-causing Aβ oligomers – groups of molecules loosely attracted to each other – build up around synapses. The theory is that these sticky clumps physically interfere with synaptic structures and disrupt mechanisms that maintain memory circuits’ fitness. In addition, Aβ triggers immune inflammatory responses, as an infection would, bringing on a rush of chemicals and cells meant to destroy invaders but that damage our own cells instead.
“Our data suggest that Lavado cocoa extract prevents the abnormal formation of Aβ into clumped oligomeric structures, to prevent synaptic insult and eventually cognitive decline,” says lead investigator Giulio Maria Pasinetti, MD, PhD, Saunders Family Chair and Professor of Neurology at the Icahn School of Medicine at Mount Sinai. “Given that cognitive decline in Alzheimer’s disease is thought to start decades before symptoms appear, we believe our results have broad implications for the prevention of Alzheimer’s disease and dementia.”
The current study is the first to provide evidence that adequate quantities of specific cocoa polyphenols in the diet over time may prevent the clumping of Aβ into oligomers that damage the brain, as a means to prevent Alzheimer’s disease.
The research team led by Dr. Pasinetti tested the effects of extracts from Dutched, Natural, and Lavado cocoa, which contain different levels of polyphenols. Each cocoa type was evaluated for its ability to reduce the formation of Aβ oligomers and to rescue synaptic function. Lavado extract, which has the highest polyphenol content and anti-inflammatory activity among the three, was also the most effective in both reducing formation of Aβ oligomers and reversing damage to synapses in the study mice.
“There have been some inconsistencies in medical literature regarding the potential benefit of cocoa polyphenols on cognitive function,” says Dr. Pasinetti. “Our finding of protection against synaptic deficits by Lavado cocoa extract, but not Dutched cocoa extract, strongly suggests that polyphenols are the active component that rescue synaptic transmission, since much of the polyphenol content is lost by the high alkalinity in the Dutching process.”
Because loss of synaptic function may have a greater role in memory loss than the loss of nerve cells, rescue of synaptic function may serve as a more reliable target for an effective Alzheimer’s disease drug, said Dr. Pasinetti.
The new study provides experimental evidence that Lavado cocoa extract may influence Alzheimer’s disease mechanisms by modifying the physical structure of Aβ oligomers. It also strongly supports further studies to identify the metabolites of Lavado cocoa extract that are active in the brain and identify potential drug targets.
In addition, turning Lavado cocoa into a dietary supplement may provide a safe, inexpensive and easily accessible means to prevent Alzheimer’s disease, even in its earliest, asymptomatic stages.
It has become increasingly common to hear reports that big brains are not necessary, or are even an evolutionary fluke. However, the new article found that increases in the size of brain areas, such as the visual cortex, are an essential element of evolution.

As part of the study, the researchers found that an increase in the size of the visual part of the brain in different primate species, including humans, apes, and monkeys, is associated with enhanced visual processing.
It is controversial whether overall brain size can predict intelligence. However, the size of specialised areas within the brain is associated with specific changes in behaviour, such as reduced susceptibility to visual illusions and greater visual acuity, the fineness of detail that can be seen.
First author, Dr Alexandra de Sousa explained: “Primates with a bigger visual cortex have better visual resolution, the precision of vision, and reduced visual illusion strength. In essence, the bigger the brain area, the better the visual processing ability.
“The size of brain areas predicts not only the number of neurons (brain cells) in that area, but also the likelihood of connections between neurons. These connections allow for increasingly complex computations to be made that allow for more accurate, and more difficult, visual perception.”
Co-author, Dr Michael Proulx, Senior Lecturer (Associate Professor) in Psychology, added: “This paper is a novel attempt to bring together the micro and macro anatomy of the brain with behaviour. We link visual abilities, the size of brain areas, and the number of neurons that make up those brain areas to provide a framework that ties brain structure and function together.
“The theory of brain size that we discuss can be tested in the future with more behavioural tests of other species, gathering more comparative neuroanatomical data, and by testing other senses and multi-sensory perception, too. We might be able to even predict how well extinct species could sense the world based on fossil data.”
For the study, Dr Alexandra de Sousa, an expert in brain evolution, provided brain size measurements from her own and others’ neuroanatomical research. Dr Michael Proulx, an expert in perception, found psychological studies of visual illusions and visual acuity in the same species or genera of animals.
The paper ‘What can volumes reveal about human brain evolution? A framework for bridging behavioral, histometric and volumetric perspectives’ is published today in Frontiers in Neuroanatomy – an online, open access journal.
Although deep brain stimulation can be an effective therapy for dystonia – a potentially crippling movement disorder – the treatment isn’t always effective, or benefits may not be immediate. Precise placement of DBS electrodes is one of several factors that can affect results, but few studies have attempted to identify the “sweet spot,” where electrode placement yields the best results.

Researchers led by investigators at Cedars-Sinai, using a complex set of data from records and imaging scans of patients who have undergone successful DBS implantation, have created 3-D, computerized models that map the brain region involved in dystonia. The models identify an anatomical target for further study and provide information for neurologists and neurosurgeons to consider when planning surgery and making device programming decisions.
“We know DBS works as a treatment for dystonia, but we don’t know exactly how it works or why some patients have better, quicker results than others. Patient age, disease duration and other underlying factors have a role, and we believe electrode positioning and device programming are critical, but there is no consensus on ideal device placement and optimal programming strategies,” said Michele Tagliati, MD, director of the Movement Disorders Program in the Department of Neurology at Cedars-Sinai.
“This modeling paves the way for the construction of practical therapeutic and investigational targets,” added Tagliati, senior author of an article now available on the online edition of Annals of Neurology.
Medications usually are the first line of treatment for dystonia and several other movement disorders, but if drugs fail – as frequently happens – or side effects are excessive, neurologists and neurosurgeons may supplement them with deep brain stimulation. Electrical leads are implanted deep in the brain, and a pulse generator is placed near the collarbone. The device is later programmed with a remote, hand-held controller.
To calm the disorganized muscle contractions of dystonia, doctors generally target a brain structure called the globus pallidus, but studies on precise positioning of electrode contacts and the best programming parameters – such as the intensity and frequency of electrical stimulation – are rare and conflicting. Finding the most effective settings can take months of fine-tuning.
In this retrospective study, investigators examined a database of 94 patients with the most common genetic form of dystonia, DYT1, who had been treated with DBS for at least a year. They selected 21 patients who had good responses to treatment, compiled their demographic and treatment information, and used magnetic resonance imaging scans to create 3-D anatomical models with a fine grid to show exact location of relevant brain structures.
The investigators then simulated the placement of electrodes as they were positioned in the patients’ brains and input the actual stimulation parameters into a computer program – a “volume of tissue activation” model – which calculated detailed information specific to each patient and each electrode. The model draws on principles of neurophysiology – the way nerve cells respond to DBS – the biophysics of voltage distribution from electrodes, and the anatomy of the globus pallidus and surrounding structures.
“We found that clinicians were applying relatively large amounts of energy to wide swaths of the globus pallidus, but the area in common among most individuals was much smaller. We interpret this as being the potential ‘target within the target,’ and if our results are validated in further research and clinical practice, computer modeling may offer a physiologically-based, data-driven, visualized approach to clinical decision-making,” Tagliati said.
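The “volume of tissue activation” analysis can be illustrated with a deliberately simplified sketch. This is not the authors’ model, which draws on the voltage field and axonal response; here each patient’s activated tissue is just a sphere on a shared voxel grid, with a radius that grows with stimulation amplitude, and the volumes are intersected across responders to find a shared region. All coordinates, amplitudes and the scaling constant below are made up for illustration.

```python
import numpy as np

def activated_voxels(grid, center, amplitude, k=0.5):
    """Toy volume-of-tissue-activation: voxels within a radius that
    grows with stimulation amplitude (radius = k * sqrt(amplitude)).
    A real VTA model would solve for the voltage distribution and
    the neural response, not a simple sphere."""
    radius = k * np.sqrt(amplitude)
    dist = np.linalg.norm(grid - center, axis=-1)
    return dist <= radius

# A shared 3-D grid of voxel coordinates (assumed 1 mm spacing).
xs = np.arange(-5, 6)
grid = np.stack(np.meshgrid(xs, xs, xs, indexing="ij"), axis=-1).astype(float)

# Hypothetical electrode positions (mm) and stimulation amplitudes
# for three good responders.
patients = [((0, 0, 0), 9.0), ((1, 0, 0), 16.0), ((0, 1, 0), 9.0)]

# Per-patient activation volumes, then their intersection:
# the candidate "target within the target".
volumes = [activated_voxels(grid, np.array(c), a) for c, a in patients]
common = np.logical_and.reduce(volumes)
print(common.sum(), "voxels in the common target region")
```

The point the sketch preserves is the one in the quote above: each individual volume is comparatively large, but the region common to all responders is much smaller.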
There are new clues about malfunctions in brain cells that contribute to intellectual disability and possibly other developmental brain disorders.

(Image caption: False color image of a mouse hippocampal neuron (cell body is at lower right) with branchlike dendrites that provide surfaces at which projections from other neurons can connect, by forming synapses. Van Aelst and colleagues have shown that when the OPHN1 protein is mutated, interfering with its ability to interact with another protein called Homer1b/c, AMPA receptors don’t recycle to the surface at synapses at the rate they normally do. This adversely impacts synaptic plasticity, the process by which neurons adjust the strength of their connections. Such pathology may play a role in X-linked mental retardation.)
Professor Linda Van Aelst of Cold Spring Harbor Laboratory (CSHL) has been scrutinizing how the normal version of a protein called OPHN1 helps enable excitatory nerve transmission in the brain, particularly at nerve-cell docking ports containing AMPA receptors (AMPARs). Her team’s new work, published June 24 in the Journal of Neuroscience, provides new mechanistic insight into how OPHN1 defects can lead to impairments in the maturation and adjustment of synaptic strength of AMPAR-expressing neurons, which are ubiquitous in the brain and respond to the excitatory neurotransmitter glutamate.
Mutations in a gene called oligophrenin-1 (OPHN1) – located on the X chromosome – have previously been linked to X-linked intellectual disability (also known as X-linked mental retardation), a condition that affects boys disproportionately and could account for as much as one-fifth of all intellectual disability among males.
Several different mutations in the OPHN1 gene have been identified to date, all of which perturb nerve cells’ manufacture of OPHN1 protein. Previously, Van Aelst and colleagues demonstrated that OPHN1 has a vital role in synaptic plasticity, the process through which adjacent nerve cells adjust the strength of their connections. Cells in the brain are constantly adjusting connection strength as they respond to streams of stimuli.
The new discovery shows how OPHN1 is involved in the trafficking of AMPARs, an essential feature of plasticity in neurons. Neurons move receptors away from synapses into their interior and then back to the surface of synapses to control connection strength. At the synaptic surface, receptors provide an opportunity for the docking of neurotransmitters, in this case glutamate molecules. After a cell has fired, surface receptors are typically brought back into the interior, where they are recycled for future use.
When OPHN1 is misshapen or missing due to genetic mutation, the CSHL team demonstrated, it can no longer properly perform its role in receptor recycling, thus also impairing neurons’ ability to maintain strong long-term connections with their neighbors, called long-term potentiation.
Van Aelst’s new experiments explain how OPHN1 in complex with another protein called Homer1b/c should normally interact with an area called the endocytic zone (EZ) to provide a pool of AMPARs to be brought to the synapse at a location called the post-synaptic density (PSD). When OPHN1 is mutated, the pool does not form and receptors needed for strengthening synapses are not available. Long-term potentiation is impaired.
“This suggests a previously unknown way in which genetic defects in OPHN1 can lead to dysfunctions in the glutamate system,” says Dr. Van Aelst. “Our earlier studies had already shown that OPHN1 is essential in stabilizing AMPA receptors at the synapse. Together, these two essential roles suggest how defective OPHN1 protein may contribute to pathology that underlies X-linked intellectual disability.”
Pregnant women who lived in close proximity to fields and farms where chemical pesticides were applied experienced a two-thirds increased risk of having a child with autism spectrum disorder or other developmental delay, a study by researchers with the UC Davis MIND Institute has found. The associations were stronger when the exposures occurred during the second and third trimesters of the women’s pregnancies.

The large, multisite California-based study examined associations between specific classes of pesticides, including organophosphates, pyrethroids and carbamates, applied during the study participants’ pregnancies and later diagnoses of autism and developmental delay in their offspring. It is published online today in Environmental Health Perspectives.
“This study validates the results of earlier research that has reported associations between having a child with autism and prenatal exposure to agricultural chemicals in California,” said lead study author Janie F. Shelton, a UC Davis graduate student who now consults with the United Nations. “While we still must investigate whether certain sub-groups are more vulnerable to exposures to these compounds than others, the message is very clear: Women who are pregnant should take special care to avoid contact with agricultural chemicals whenever possible.”
California is the top agricultural producing state in the nation, grossing $38 billion in revenue from farm crops in 2010. Statewide, approximately 200 million pounds of active pesticides are applied each year, most of it in the Central Valley, north to the Sacramento Valley and south to the Imperial Valley on the California-Mexico border. While pesticides are critical for the modern agriculture industry, certain commonly used pesticides are neurotoxic and may pose threats to brain development during gestation, potentially resulting in developmental delay or autism.
The study was conducted by examining commercial pesticide application using the California Pesticide Use Report and linking the data to the residential addresses of approximately 1,000 participants in the Northern California-based Childhood Risk of Autism from Genetics and the Environment (CHARGE) Study. The study includes families with children between 2 and 5 diagnosed with autism or developmental delay or with typical development. It is led by principal investigator Irva Hertz-Picciotto, a MIND Institute researcher and professor and vice chair of the Department of Public Health Sciences at UC Davis. The majority of study participants live in the Sacramento Valley, Central Valley and the greater San Francisco Bay Area.
Twenty-one chemical compounds were identified in the organophosphate class, including chlorpyrifos, acephate and diazinon. The second most commonly applied class of pesticides was pyrethroids, one quarter of which was esfenvalerate, followed by lambda-cyhalothrin, permethrin, cypermethrin and tau-fluvalinate. Eighty percent of the carbamates were methomyl and carbaryl.
For the study, researchers used questionnaires to obtain study participants’ residential addresses during the pre-conception and pregnancy periods. The addresses then were overlaid on maps with the locations of agricultural chemical application sites based on the pesticide-use reports to determine residential proximity. The study also examined which participants were exposed to which agricultural chemicals.
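The proximity step described above can be sketched in a few lines. This is an illustration only, not the study’s actual GIS method: compute the great-circle distance from each residence to each reported application site and keep the pesticide classes applied within a buffer distance. The coordinates and records below are hypothetical.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def exposures_near(residence, applications, buffer_km=1.5):
    """Return the pesticide classes applied within buffer_km of a residence.
    `applications` is a list of (lat, lon, pesticide_class) records."""
    return {cls for lat, lon, cls in applications
            if haversine_km(residence[0], residence[1], lat, lon) <= buffer_km}

# Hypothetical residence and application records in the Central Valley.
home = (36.75, -119.77)
apps = [(36.755, -119.772, "organophosphate"),  # well under 1.5 km away
        (36.90, -119.90, "pyrethroid")]         # roughly 20 km away
print(exposures_near(home, apps))
```

With the default 1.5 km buffer (in the middle of the study’s 1.25 to 1.75 km range), only the nearby organophosphate application counts as an exposure; widening the buffer pulls in the distant pyrethroid record as well.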
“We mapped where our study participants’ lived during pregnancy and around the time of birth. In California, pesticide applicators must report what they’re applying, where they’re applying it, dates when the applications were made and how much was applied,” Hertz-Picciotto said. “What we saw were several classes of pesticides more commonly applied near residences of mothers whose children developed autism or had delayed cognitive or other skills.”
The researchers found that during the study period approximately one-third of CHARGE Study participants lived in close proximity – within 1.25 to 1.75 kilometers – to commercial pesticide application sites. Some associations were strongest among mothers living closest to the application sites and weakened as distance from the sites increased, the researchers found.
Organophosphates applied over the course of pregnancy were associated with an elevated risk of autism spectrum disorder, particularly for chlorpyrifos applications in the second trimester. Pyrethroids were moderately associated with autism spectrum disorder immediately prior to conception and in the third trimester. Carbamates applied during pregnancy were associated with developmental delay.
Exposures to insecticides for those living near agricultural areas may be problematic, especially during gestation, because the developing fetal brain may be more vulnerable than it is in adults. Because these pesticides are neurotoxic, in utero exposures during early development may distort the complex processes of structural development and neuronal signaling, producing alterations to the excitation and inhibition mechanisms that govern mood, learning, social interactions and behavior.
“In that early developmental gestational period, the brain is developing synapses, the spaces between neurons, where electrical impulses are turned into neurotransmitting chemicals that leap from one neuron to another to pass messages along. The formation of these junctions is really important and may well be where these pesticides are operating and affecting neurotransmission,” Hertz-Picciotto said.
Research from the CHARGE Study has emphasized the importance of maternal nutrition during pregnancy, particularly the use of prenatal vitamins to reduce the risk of having a child with autism. While it’s impossible to entirely eliminate risks due to environmental exposures, Hertz-Picciotto said that finding ways to reduce exposures to chemical pesticides, particularly for the very young, is important.
“We need to open up a dialogue about how this can be done, at both a societal and individual level,” she said. “If it were my family, I wouldn’t want to live close to where heavy pesticides are being applied.”
Genes that increase the risk of developing schizophrenia may also increase the likelihood of using cannabis, according to a new study led by King’s College London, published today in Molecular Psychiatry.
Previous studies have identified a link between cannabis use and schizophrenia, but it has remained unclear whether this association is due to cannabis directly increasing the risk of the disorder.

The new results suggest that part of this association is due to common genes, but do not rule out a causal relationship between cannabis use and schizophrenia risk.
The study is a collaboration between King’s and the Queensland Institute of Medical Research in Australia, partly funded by the UK Medical Research Council (MRC).
Mr Robert Power, lead author from the MRC Social, Genetic and Developmental Psychiatry (SGDP) Centre at the Institute of Psychiatry at King’s, says: “Studies have consistently shown a link between cannabis use and schizophrenia. We wanted to explore whether this is because of a direct cause and effect, or whether there may be shared genes which predispose individuals to both cannabis use and schizophrenia.”
Cannabis is the most widely used illicit drug in the world, and its use is higher amongst people with schizophrenia than in the general population. Schizophrenia affects approximately 1 in 100 people and people who use cannabis are about twice as likely to develop the disorder. The most common symptoms of schizophrenia are delusions (false beliefs) and auditory hallucinations (hearing voices). Whilst the exact cause is unknown, a combination of physical, genetic, psychological and environmental factors can make people more likely to develop the disorder.
Previous studies have identified a number of genetic risk variants associated with schizophrenia, each of these slightly increasing an individual’s risk of developing the disorder.
The new study included 2,082 healthy individuals of whom 1,011 had used cannabis. Each individual’s ‘genetic risk profile’ was measured – that is, the number of genes related to schizophrenia each individual carried.
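A “genetic risk profile” of this kind is essentially a polygenic score: a count, possibly weighted by effect size, of the risk alleles an individual carries across known schizophrenia-associated variants. The sketch below is an illustration only; the variant names and weights are invented, not taken from the study.

```python
def risk_profile_score(genotype, risk_weights):
    """Toy polygenic risk profile: weighted count of risk alleles.
    `genotype` maps variant id -> number of risk alleles (0, 1 or 2);
    `risk_weights` maps variant id -> per-allele effect size (log odds).
    Variant ids and weights here are hypothetical."""
    return sum(risk_weights[v] * n
               for v, n in genotype.items() if v in risk_weights)

weights = {"rsA": 0.05, "rsB": 0.02, "rsC": 0.08}  # hypothetical variants
person1 = {"rsA": 2, "rsB": 0, "rsC": 1}           # higher genetic loading
person2 = {"rsA": 0, "rsB": 1, "rsC": 0}           # lower genetic loading
print(risk_profile_score(person1, weights))
print(risk_profile_score(person2, weights))
```

The study’s analysis then asks whether scores like these predict cannabis use and quantity of use among healthy individuals, rather than predicting schizophrenia itself.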
The researchers found that people genetically pre-disposed to schizophrenia were more likely to use cannabis, and use it in greater quantities than those who did not possess schizophrenia risk genes.
Power says: “We know that cannabis increases the risk of schizophrenia. Our study certainly does not rule this out, but it suggests that there is likely to be an association in the other direction as well – that a pre-disposition to schizophrenia also increases your likelihood of cannabis use.”
“Our study highlights the complex interactions between genes and environments when we talk about cannabis as a risk factor for schizophrenia. Certain environmental risks, such as cannabis use, may be more likely given an individual’s innate behaviour and personality, itself influenced by their genetic make-up. This is an important finding to consider when calculating the economic and health impact of cannabis.”