Posts tagged neuroscience

Major dopamine system helps restore consciousness after general anesthesia
Researchers may be one step closer to better understanding how anesthesia works. A study in the August issue of Anesthesiology, the official medical journal of the American Society of Anesthesiologists® (ASA®), found that stimulating a major dopamine-producing region of the brain, the ventral tegmental area (VTA), caused rats to wake from general anesthesia, suggesting that this region plays a key role in restoring consciousness. Activating this region at the end of surgery could provide a novel way to actively restore consciousness in surgical patients, researchers say.
"While generally safe, it is well known that patients should not be under general anesthesia longer than necessary," said Ken Solt, M.D., lead author, Massachusetts General Hospital Department of Anesthesia, Critical Care and Pain Medicine and assistant professor of anesthesia, Harvard Medical School, Boston. "Currently, there are no treatments to reverse the effects of general anesthesia. We must wait for the anesthetics to wear off. Having the ability to control the process of arousal from general anesthesia would be advantageous as it might speed recovery to normal cognition after surgery and enhance operating room (O.R.) efficiencies."
Although the brain circuits that drive emergence from general anesthesia are not well understood, recent studies suggest that specific arousal pathways in the brain can be activated by drugs to promote consciousness. The authors previously reported that methylphenidate (Ritalin), a drug used to treat attention deficit hyperactivity disorder, awakened rats from general anesthesia by activating dopamine-releasing pathways.
In the current study, rats were given the general anesthetics isoflurane or propofol. Once unconscious, researchers performed targeted electrical stimulation, through implanted steel electrodes, on the two major regions of the rats’ brains that contain dopamine-releasing cells – the VTA (the area of the brain that controls cognition, motivation and reward in humans) and the substantia nigra, which controls movement.
Researchers found that electrical stimulation of the VTA caused the rats to regain consciousness, suggesting that dopamine released from cells in this area of the brain is likely involved in arousal. Interestingly, electrical stimulation of the VTA had an effect similar to that of the drug methylphenidate in restoring consciousness after anesthesia.
"We now have evidence that dopamine released by cells in the VTA is mainly responsible for the awakening effect seen with methylphenidate," said Dr. Solt. "Because dopamine-releasing cells in the VTA are important for cognition, we may be able to use drugs that act on this region not only to induce consciousness in anesthetized patients, but to potentially treat common postoperative emergence-related problems such as delirium and restore cognitive function."
Recent published research in the Journal of Clinical Investigation demonstrates how changes in dopamine signaling and dopamine transporter function are linked to neurological and psychiatric diseases, including early-onset Parkinsonism and attention deficit hyperactivity disorder (ADHD).
"The present findings should provide a critical basis for further exploration of how dopamine dysfunction and altered dopamine transporter function contribute to brain disorders," said Michelle Sahai, a postdoctoral associate at the Weill Cornell Medical College of Cornell University, adding, "it also contributes to research efforts developing new ways to help the millions of people suffering."
Sahai is also studying the effects of cocaine, a widely abused substance with psychostimulant effects that targets the dopamine transporter. She and her colleagues expect to release these specific findings within the next year.
Losing Control
Dopamine is a neurotransmitter that plays an important role in our cognitive, emotional, and behavioral functioning. When activated by outside stimuli, nerve cells in the brain release dopamine, causing a chain reaction that releases even more of this chemical messenger.
To ensure that this doesn’t result in an infinite loop of dopamine production, a protein called the dopamine transporter reabsorbs the dopamine back into the cell to terminate the process. As dopamine binds to its transporter, it is returned to the nerve cells for future use.
However, cocaine and other drugs, such as amphetamine, hijack this well-balanced system.
"When cocaine enters the bloodstream, it does not allow dopamine to bind to its transporter, which results in a rapid increase in dopamine levels," Sahai explained.
The competitive binding and the resulting excess dopamine are what cause euphoria, increased energy, and alertness. They also contribute to drug abuse and addiction.
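The release-and-reuptake balance described above can be illustrated with a toy kinetic sketch. The model and all of its parameters are invented for illustration only, not measured values from any study: dopamine is released at a constant rate and cleared by the transporter with Michaelis-Menten kinetics, and competitive blockade of the transporter is modeled crudely as an increase in the apparent Km.

```python
# Toy kinetic sketch of synaptic dopamine with and without transporter
# blockade. All numbers are illustrative, not physiological measurements.

def simulate(release_rate, vmax, km, steps=10000, dt=0.001):
    """Euler integration of d[DA]/dt = release - Vmax*[DA]/(Km + [DA])."""
    da = 0.0
    for _ in range(steps):
        reuptake = vmax * da / (km + da)  # Michaelis-Menten clearance by the transporter
        da += dt * (release_rate - reuptake)
    return da

baseline = simulate(release_rate=1.0, vmax=4.0, km=0.2)
# A competitive blocker keeps dopamine from binding the transporter;
# modeled here as a large increase in the apparent Km (hypothetical value).
blocked = simulate(release_rate=1.0, vmax=4.0, km=2.0)

print(f"steady-state dopamine, normal reuptake:  {baseline:.3f}")
print(f"steady-state dopamine, blocked reuptake: {blocked:.3f}")
```

Even this crude sketch reproduces the qualitative effect Sahai describes: when dopamine can no longer bind its transporter efficiently, synaptic dopamine settles at a much higher steady-state level.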
To further understand the effects of drug abuse, Sahai and other researchers in the Harel Weinstein Lab at Cornell are delving into drug interactions on a molecular level.
Using supercomputer resources, she can observe the binding of dopamine and various drugs to a 3D model of the dopamine transporter at the molecular level. According to Sahai, the work requires very long simulations, spanning microseconds to seconds, to capture how drugs interact with the transporter.
Through the Extreme Science and Engineering Discovery Environment (XSEDE), a virtual cyberinfrastructure that provides researchers access to computing resources, Sahai performs these simulations on Stampede, the world’s 7th fastest supercomputer, at the Texas Advanced Computing Center (TACC).
"XSEDE-allocated resources are fundamental to helping us understand how drugs work. There’s no way we could perform these simulations on the machines we have in house. Through TACC as an XSEDE service provider, we can also expect an exponential increase in computational results, and good customer service and feedback."
Ultimately, Sahai’s research will contribute to an existing body of work that is attempting to develop an inhibitor that blocks cocaine binding without suppressing the dopamine transporter’s normal function.
"If we can understand how drugs bind to the dopamine transporter, then we can better understand drug abuse and add information on what’s really important in designing therapeutic strategies to combat addiction," Sahai said.
A Common Link in the Research
While Sahai is still working to understand drug abuse, her simulations of the dopamine transporter have contributed to published research on Parkinson’s disease and other neurological disorders.
In a collaborative study with the University of Copenhagen, Copenhagen University Hospital, and other research groups in the U.S. and Europe, researchers revealed the first known link between de novo mutations in the dopamine transporter and Parkinsonism in adults.
The study found that the mutations can produce typical Parkinsonism effects, including debilitating tremors, major loss of motor control, and depression. The study also provides additional support for the idea that dopamine transporter mutations are a risk factor for attention deficit hyperactivity disorder (ADHD).
After identifying the dopamine transporter as the mutated gene linked to Parkinsonism, researchers once again turned to the Harel Weinstein Lab due to its long-standing interest and investment in studying the human dopamine transporter.
Sahai’s simulations using XSEDE and TACC’s Stampede supercomputer supported clinical trials by offering greater insight into how the dopamine transporter is involved in neurological disorders.
"This research is very important to me," Sahai said. "I was able to look at the structure of the dopamine transporter on behalf of experimentalists and understand how irregularities in this protein are harming an actual person, instead of just looking at something isolated on a computer screen."
While there is currently no cure for Parkinson’s disease, a deeper understanding of the specific mechanisms behind it will help the seven to ten million people afflicted with the disease.
"Like my work on drug abuse, the end goal is thinking about how we can help people. And it all comes back to drug design," Sahai said.
Funded by a $1 million award from the Keck Foundation, biomedical researchers at UCSB will strive to find out who could be more vulnerable to addiction
We’ve all heard the term “addictive personality,” and many of us know individuals who are consistently more likely to take the extra drink or pill that puts them over the edge. But the specific balance of neurochemicals in the brain that spurs them to overdo it is still something of a mystery.
“There’s not really a lot we know about specific molecules that are linked to vulnerability to addiction,” said Tod Kippin, a neuroscientist at UC Santa Barbara who studies cocaine addiction. In a general sense, it is understood that animals — humans included — take substances to derive that pleasurable rush of dopamine, the neurochemical linked with the reward center of the brain. But, according to Kippin, that dopamine rush underlies virtually any type of reward animals seek, including the kinds of urges we need to have in order to survive or propagate, such as food, sex or water. Therefore, therapies that deal with that reward system have not been particularly successful in treating addiction.
However, thanks to a collaboration between UCSB researchers Kippin; Tom Soh, professor of mechanical engineering and of materials; and Kevin Plaxco, professor of chemistry and biochemistry — and funding from a $1 million grant from the W. M. Keck Foundation — the neurochemistry of addiction could become a lot less mysterious and a lot more specific. Their study, “Continuous, Real-Time Measurement of Psychoactive Molecules in the Brain,” could, in time, lead to more effective therapies for those who are particularly inclined toward addictive behaviors.
“The main purpose is to try to identify individuals that would be vulnerable to drug addiction based on their initial neurochemistry,” said Kippin. “The idea is that if we can identify phenotypes — observable characteristics — that are vulnerable to addiction and then understand how drugs change the neurochemistry related to that phenotype, we’ll be in a better position to develop therapeutics to help people with that addiction.”
To identify these addiction-prone neurochemical profiles, the researchers will rely on technology they recently developed, a biosensor that can track the concentration of specific molecules in vivo, in real time. One early incarnation of this device was called MEDIC (Microfluidic Electrochemical Detector for In vivo Concentrations). Through artificial DNA strands called aptamers, MEDIC could indicate the concentration of target molecules in the bloodstream.
“Specifically, the DNA molecules are modified so that when they bind their specific target molecule they begin to transfer electrons to an underlying electrode, producing an easily measurable current,” said Plaxco. Prior to the Keck award, the team had shown that this technology could be used to measure specific drugs continuously and in real time in blood drawn from a subject via a catheter. With Keck funding, “the team is hoping to make the leap to measurements performed directly in vivo. That is, directly in the brains of test subjects,” said Plaxco.
For this study, the technology would be modified for use in the brain tissue of awake, ambulatory animals, whose neurochemical profiles would be measured continuously and in real time. The subjects would then be allowed to self-dose with cocaine, while the levels of the drug in their brain are monitored. Also monitored are concomitant changes in the animal’s neurochemistry or drug-seeking (or other) behaviors.
“The key aspect of it is understanding the timing of the neurochemical release,” said Kippin. “What are the changes in neurochemistry that cause the animals to take the drug versus those that immediately follow consumption of the drug?”
Currently, only one existing technology allows scientists to monitor more than one target molecule at a time (e.g., a drug, a metabolite, and a neurotransmitter). However, Kippin noted, it provides on average one data point every 20 minutes, which is far slower than the time course of drug-taking behaviors and nowhere near the sub-second timescale over which the brain responds to drugs. With the implantable biosensor the team has proposed, it would be possible not only to track in real time how the concentrations of neurochemicals shift in relation to addictive behavior, but also to monitor the concentrations of several different molecules simultaneously.
“One of our hypotheses about what makes someone vulnerable to addiction is the metabolism of a drug to other active molecules so that they may end up with a more powerful, more rewarding pharmacological state than someone with a different metabolic profile,” Kippin said. “It’s not enough to understand the levels of the compound that is administered; we have to understand all the other compounds that are produced and how they’re working together.”
The implantable biosensor technology also has the potential to go beyond cocaine and shed light on addictions to other substances such as methamphetamines or alcohol. It could also explore the behavioral impulses behind obesity, or investigate how memory works, which could lead to further understanding of diseases such as Alzheimer’s.
Study Links Enzyme to Alzheimer’s Disease
Unclogging the body’s protein disposal system may improve memory in patients with Alzheimer’s disease (AD), according to a study from scientists at Kyungpook National University in Korea published in The Journal of Experimental Medicine.
In AD, various biochemical functions of brain cells go awry, leading to progressive neuronal damage and eventual memory loss. One example is the cellular disposal system, called autophagy, which is disrupted in patients with AD, causing the accumulation of toxic protein plaques characteristic of the disease. Jae-sung Bae and colleagues had earlier noted that the brains of AD patients have elevated levels of an enzyme called acid sphingomyelinase (ASM), which breaks down cell membrane lipids prevalent in the myelin sheath that coats nerve endings. But whether increased ASM directly contributes to AD (and if so, how) was unclear.
The group now finds that these two defects are linked. In mice with AD-like disease, elevated ASM activity clogged up the autophagy machinery, resulting in the accumulation of undigested cellular waste. Reducing levels of ASM restored autophagy, lessened brain pathology, and improved learning and memory in the mice. Provided these results hold true in humans, interfering with ASM activity might prove to be an effective way to slow, and possibly reverse, neurodegeneration in patients with AD.
New research links bad diet to loss of smell
Could stuffing yourself full of high-fat foods cause you to lose your sense of smell?
A new study from Florida State University neuroscientists says so, and it has researchers taking a closer look at how our diets could impact a whole range of human functions that were not traditionally considered when examining the impact of obesity.
"This opens up a lot of possibilities for obesity research," said Florida State University post-doctoral researcher Nicolas Thiebaud, who led the study examining how high-fat foods impacted smell.
Thiebaud led the study in the lab of Biological Science Professor Debra Ann Fadool. Their work is published in the Journal of Neuroscience and shows that a high-fat diet is linked to major structural and functional changes in the olfactory system, which gives us our sense of smell.
It was the first time researchers had been able to demonstrate a solid link between a bad diet and a loss of smell.
The research was conducted over a six-month period in which mice were fed a high-fat daily diet while being taught to associate a particular odor with a reward (water).
Mice that were fed the high-fat diets were slower to learn the association than the control population. And when researchers introduced a new odor to monitor their adjustment, the mice with the high-fat diets could not rapidly adapt, demonstrating reduced smell capabilities.
"Moreover, when high-fat-reared mice were placed on a diet of control chow during which they returned to normal body weight and blood chemistry, mice still had reduced olfactory capacities," Fadool said. "Mice exposed to high-fat diets only had 50 percent of the neurons that could operate to encode odor signals."
For Thiebaud and his colleagues, the results are opening up a whole new line of research. They will begin looking at whether exercise could slow down a high-fat diet’s impact on smell and whether a high-sugar diet would also yield the same negative results on smell as a high-fat diet.
Funded by the National Institutes of Health (NIH), the study comes at an important time with obesity rates at all-time highs throughout the world. According to the NIH, more than two in three adults in the United States are considered to be overweight or obese. Additionally, about one-third of children and adolescents ages 6 to 19 are considered to be overweight or obese.
Dysfunction in dopamine signaling profoundly changes the activity level of about 2,000 genes in the brain’s prefrontal cortex and may be an underlying cause of certain complex neuropsychiatric disorders, such as schizophrenia, according to UC Irvine scientists.
This epigenetic alteration of gene activity in brain cells that receive this neurotransmitter showed for the first time that dopamine deficiencies can affect a variety of behavioral and physiological functions regulated in the prefrontal cortex.
The study, led by Emiliana Borrelli, a UCI professor of microbiology & molecular genetics, appears online in the journal Molecular Psychiatry.
“Our work presents new leads to understanding neuropsychiatric disorders,” Borrelli said. “Genes previously linked to schizophrenia seem to be dependent on the controlled release of dopamine at specific locations in the brain. Interestingly, this study shows that altered dopamine levels can modify gene activity through epigenetic mechanisms despite the absence of genetic mutations of the DNA.”
Dopamine is a neurotransmitter that acts within certain brain circuitries to help manage functions ranging from movement to emotion. Changes in the dopaminergic system are correlated with cognitive, motor, hormonal and emotional impairment. Excesses in dopamine signaling, for example, have been identified as a trigger for neuropsychiatric disorder symptoms.
Borrelli and her team wanted to understand what would happen if dopamine signaling were hindered. To do this, they used mice that lacked dopamine receptors in midbrain neurons, which disrupted the regulated synthesis and release of dopamine.
The researchers discovered that this receptor mutation profoundly altered gene expression in neurons receiving dopamine at distal sites in the brain, specifically in the prefrontal cortex. Borrelli said they observed a remarkable decrease in expression levels of some 2,000 genes in this area, coupled with a widespread increase in modifications of basic DNA proteins called histones – particularly those associated with reduced gene activity.
Borrelli further noted that the dopamine receptor-induced reprogramming led to psychotic-like behaviors in the mutant mice and that prolonged treatment with a dopamine activator restored regular signaling, pointing to one possible therapeutic approach.
The researchers are continuing their work to gain more insights into the genes altered by this dysfunctional dopamine signaling.
(Source: news.uci.edu)
When it comes to learning languages, adults and children have different strengths. Adults excel at absorbing the vocabulary needed to navigate a grocery store or order food in a restaurant, but children have an uncanny ability to pick up on subtle nuances of language that often elude adults. Within months of living in a foreign country, a young child may speak a second language like a native speaker.
Brain structure plays an important role in this “sensitive period” for learning language, which is believed to end around adolescence. The young brain is equipped with neural circuits that can analyze sounds and build a coherent set of rules for constructing words and sentences out of those sounds. Once these language structures are established, it’s difficult to build another one for a new language.
In a new study, a team of neuroscientists and psychologists led by Amy Finn, a postdoc at MIT’s McGovern Institute for Brain Research, has found evidence for another factor that contributes to adults’ language difficulties: When learning certain elements of language, adults’ more highly developed cognitive skills actually get in the way. The researchers discovered that the harder adults tried to learn an artificial language, the worse they were at deciphering the language’s morphology — the structure and deployment of linguistic units such as root words, suffixes, and prefixes.
“We found that effort helps you in most situations, for things like figuring out what the units of language that you need to know are, and basic ordering of elements. But when trying to learn morphology, at least in this artificial language we created, it’s actually worse when you try,” Finn says.
Finn and colleagues from the University of California at Santa Barbara, Stanford University, and the University of British Columbia describe their findings in the July 21 issue of PLoS One. Carla Hudson Kam, an associate professor of linguistics at British Columbia, is the paper’s senior author.
Too much brainpower
Linguists have known for decades that children are skilled at absorbing certain tricky elements of language, such as irregular past participles (examples of which, in English, include “gone” and “been”) or complicated verb tenses like the subjunctive.
“Children will ultimately perform better than adults in terms of their command of the grammar and the structural components of language — some of the more idiosyncratic, difficult-to-articulate aspects of language that even most native speakers don’t have conscious awareness of,” Finn says.
In 1990, linguist Elissa Newport hypothesized that adults have trouble learning those nuances because they try to analyze too much information at once. Adults have a much more highly developed prefrontal cortex than children, and they tend to throw all of that brainpower at learning a second language. This high-powered processing may actually interfere with certain elements of learning language.
“It’s an idea that’s been around for a long time, but there hasn’t been any data that experimentally show that it’s true,” Finn says.
Finn and her colleagues designed an experiment to test whether exerting more effort would help or hinder success. First, they created nine nonsense words, each with two syllables. Each word fell into one of three categories (A, B, and C), defined by the order of consonant and vowel sounds.
Study subjects listened to the artificial language for about 10 minutes. One group of subjects was told not to overanalyze what they heard, but not to tune it out either. To help them not overthink the language, they were given the option of completing a puzzle or coloring while they listened. The other group was told to try to identify the words they were hearing.
Each group heard the same recording, which was a series of three-word sequences — first a word from category A, then one from category B, then category C — with no pauses between words. Previous studies have shown that adults, babies, and even monkeys can parse this kind of information into word units, a task known as word segmentation.
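The word-segmentation ability described above is commonly explained by sensitivity to transitional probabilities: syllable-to-syllable transitions are predictable inside a word and unpredictable across word boundaries. The sketch below illustrates that idea with invented two-syllable "words" (they are hypothetical examples, not the actual stimuli from the study): it builds a pauseless stream, estimates transitional probabilities, and places boundaries wherever the probability dips.

```python
# Illustrative statistical word segmentation: place word boundaries where
# the syllable-to-syllable transitional probability drops. The "words"
# below are invented for this example, not the study's stimuli.
import random
from collections import Counter

random.seed(0)
words = ["tupi", "roga", "bidu"]  # hypothetical two-syllable nonsense words
syllabify = lambda w: [w[i:i + 2] for i in range(0, len(w), 2)]

# Build a continuous stream with no pauses between words, as in the experiment.
stream = []
for _ in range(500):
    stream.extend(syllabify(random.choice(words)))

# Estimate transitional probability P(next syllable | current syllable).
pair_counts = Counter(zip(stream, stream[1:]))
first_counts = Counter(stream[:-1])
tp = {pair: n / first_counts[pair[0]] for pair, n in pair_counts.items()}

# Segment: a dip in transitional probability suggests a word boundary.
segments, current = [], [stream[0]]
for a, b in zip(stream, stream[1:]):
    if tp[(a, b)] < 0.5:
        segments.append("".join(current))
        current = []
    current.append(b)
segments.append("".join(current))

print(Counter(segments).most_common(3))
```

Within a word the transitional probability is 1.0, while across a boundary it hovers near 1/3 (any of three words can follow), so thresholding recovers the original words from the unbroken stream.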
Subjects from both groups were successful at word segmentation, although the group that tried harder performed a little better. Both groups also performed well in a task called word ordering, which required subjects to choose between a correct word sequence (ABC) and an incorrect sequence (such as ACB) of words they had previously heard.
The final test measured skill in identifying the language’s morphology. The researchers played a three-word sequence that included a word the subjects had not heard before, but which fit into one of the three categories. When asked to judge whether this new word was in the correct location, the subjects who had been asked to pay closer attention to the original word stream performed much worse than those who had listened more passively.
“This research is exciting because it provides evidence indicating that effortful learning leads to different results depending upon the kind of information learners are trying to master,” says Michael Ramscar, a professor of linguistics at the University of Tübingen who was not part of the research team. “The results indicate that learning to identify relatively simple parts of language, such as words, is facilitated by effortful learning, whereas learning more complex aspects of language, such as grammatical features, is impeded by effortful learning.”
Turning off effort
The findings support a theory of language acquisition that suggests that some parts of language are learned through procedural memory, while others are learned through declarative memory. Under this theory, declarative memory, which stores knowledge and facts, would be more useful for learning vocabulary and certain rules of grammar. Procedural memory, which guides tasks we perform without conscious awareness of how we learned them, would be more useful for learning subtle rules related to language morphology.
“It’s likely to be the procedural memory system that’s really important for learning these difficult morphological aspects of language. In fact, when you use the declarative memory system, it doesn’t help you, it harms you,” Finn says.
Still unresolved is the question of whether adults can overcome this language-learning obstacle. Finn says she does not have a good answer yet but she is now testing the effects of “turning off” the adult prefrontal cortex using a technique called transcranial magnetic stimulation. Other interventions she plans to study include distracting the prefrontal cortex by forcing it to perform other tasks while language is heard, and treating subjects with drugs that impair activity in that brain region.
A study of 473 sets of twins followed since birth found that 47 percent of 24-month-old identical twins had language delay, compared with 31 percent of nonidentical twins. Overall, twins had twice the rate of late language emergence of single-born children. None of the children had disabilities affecting language acquisition.

The results of the study were published in the June 2014 Journal of Speech, Language, and Hearing Research.
University of Kansas Distinguished Professor Mabel Rice, lead author, said that all of the language traits analyzed in the study—vocabulary, combining words and grammar—were significantly heritable with genes accounting for about 43 percent of the overall twins’ deficit.
The “twinning effect” — a lower level of language performance for twins than single-born children — was expected to be comparable for both kinds of twins, but was greater for identical twins, said Rice, strengthening the case for the heritability of language development.
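The logic of comparing identical and nonidentical twins can be made concrete with Falconer's classic formula, which estimates heritability as twice the difference between identical (MZ) and nonidentical (DZ) twin-pair correlations. The correlations below are hypothetical, chosen only to show how an estimate near the reported 43 percent could arise; they are not data from this study.

```python
# Falconer's formula: h^2 = 2 * (r_MZ - r_DZ).
# MZ twins share ~100% of genes, DZ twins ~50%, so the gap between the
# two pair-correlations indexes the genetic contribution to the trait.
def falconer_h2(r_mz, r_dz):
    return 2 * (r_mz - r_dz)

# Hypothetical twin-pair correlations for a language measure, picked so
# the estimate lands near the 43 percent reported in the article.
r_mz, r_dz = 0.80, 0.585
h2 = falconer_h2(r_mz, r_dz)
print(f"estimated heritability: {h2:.2f}")  # → 0.43
```

The same reasoning underlies the "twinning effect" comparison in the study: a larger deficit in identical than in nonidentical twins points toward genes rather than shared environment.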
“This finding disputes hypotheses that attribute delays in early language acquisition of twins to mothers whose attention is reduced due to the demands of caring for two toddlers,” Rice said. “This should reassure busy parents who worry about giving sufficient individual attention to each child.”
However, said Rice, prematurity and birth complications, more common in identical twins, could also affect their higher rates of language delay. A study of pregnancy and birth risks for late talking in twins is currently under way by the study authors.
Further, the study will continue at least until 2017, following the twins through their preschool and school years up to adolescence, to answer the question of whether late-talking twins catch up to their peers.
“Twin studies provide unique opportunities to study inherited and environmental contributions to language acquisition,” Rice said. “The outcomes inform our understanding of how these influences contribute to language acquisition in single-born children as well.”
Late language emergence means that a child’s language is below age and gender expectations, both in the number of words spoken and in combining two or more words into sentences. In this study, 71 percent of 2-year-old twins were not combining words, compared with 17 percent of single-born children.
While previous behavioral genetics studies of toddlers have largely focused on vocabulary, the researchers introduced an innovative measure of early grammatical ability based on the correct use of the past tense and the “to be” and “to do” verbs. The measure was inspired by the Rice/Wexler Test of Early Grammar Impairment, developed in 2001 by Rice and Kenneth Wexler, Massachusetts Institute of Technology professor. It was the first test to detect the subtle but common language disorder known as specific language impairment.
Rice’s collaborators in the international longitudinal project that began in 2002 are Professors Cate Taylor and Stephen Zubrick from the Telethon Kids Institute in Perth, Western Australia, and Professor Shelley Smith at the University of Nebraska Medical Center.
The study population is located in the vicinity of Perth, Western Australia, because it is demographically similar to Kansas City and several other U.S. Midwestern states. But in Australia, health records are available, and the Western Australia Twin Registry is a unique resource for researchers since it records all multiple births, Rice said.
The research group has followed the development of 1,000 sets of Western Australian twins from their first words. In 2012, the group was granted $2.8 million by the National Institute for Deafness and Other Communication Disorders for a fourth five-year-cycle that will enable researchers to continue to monitor the twins as they develop through adolescence. In addition to formal language tests, researchers have collected genetic and environmental data as well as assessments with the twins’ siblings.
(Source: news.ku.edu)
(Image caption: These scans show atrophy of the cerebellum in a boy with Christianson Syndrome. This symptom was observed in some, but not all, boys with the condition. Credit: Eric Morrow/Brown University)
Diagnostic criteria for Christianson Syndrome
Because the severe autism-like condition Christianson Syndrome was only first reported in 1999 and some symptoms take more than a decade to appear, families and doctors urgently need fundamental information about it. A new study that doubles the number of cases now documented in the scientific literature provides the most definitive characterization of CS to date. The authors therefore propose the first diagnostic criteria for the condition.
"We’re hoping that clinicians will use these criteria and that there will be more awareness among clinicians and the community about Christianson Syndrome," said Brown University biology and psychiatry Assistant Professor Dr. Eric Morrow, senior author of the study in press in the Annals of Neurology. “We’re also hoping this study will impart an opportunity for families to predict what to expect for their child and what’s a part of the syndrome.”
In conducting their study, which includes detailed behavioral, medical and genetic observations of 14 boys with CS from 12 families, the team of scientists and physicians worked closely with families of the small but fast-growing Christianson Syndrome Association, including hosting the group’s inaugural conference at Brown’s Alpert Medical School last summer.
In their study, Morrow’s team was able to quantify the most frequent symptoms specific to CS. These include moderate to severe intellectual disability, epilepsy, difficulty with or inability to walk and talk, attenuated head and brain growth, and hyperactivity. Boys sometimes exhibit other specific symptoms – including autism-like behaviors, low height and weight, acid reflux, and regressions in speech and motor skills after age 10 – that the researchers include as secondary proposed diagnostic criteria. A third of the boys also had potentially neurodegenerative problems such as atrophy of the cerebellum.
What’s still not clear is whether the disease limits the eventual lifespan of patients.
Distinct genetic cause
Many CS traits, including a very happy disposition, appear similar to those of another autism-like condition, Angelman Syndrome, but the study defines important differences.
Among the most important ones is that the two syndromes have distinct genetic underpinnings. In all CS cases, said Morrow, who treats autism patients at the E. P. Bradley Hospital in East Providence, boys have a mutation on the SLC9A6 gene on the X chromosome that disables production of a protein called NHE6 that is important for neurological development.
Girls, who have two X chromosomes, can also be carriers of CS mutations, but they appear to be affected differently and less severely or not at all, the study reports.
The connection to the SLC9A6 gene was first discovered in 2008. In analyzing the genomes of each patient and their parents in the new study, lead authors Matthew Pescosolido, a graduate student, and David Stein, a former undergraduate, found that each boy had only one mutation, but there were many different ones across the entire group. More often than not, they determined, the mutation was not inherited, but an unlucky “de novo” change that occurred in the affected boy. In two situations, boys in unrelated families happened to share the same mutation. These recurrent mutations suggest that there may be hotspots in the DNA for mutation at these sites, Morrow said, although further research will be necessary to sort this out.
Morrow said there is evidence that SLC9A6 mutations – and therefore CS – may be a relatively common source of X-linked intellectual disability. One study, for example, found SLC9A6 mutations in two of 200 people suspected of having X-linked ID. Another found that 1 in 19 families with a case of ID exhibited a mutation that truncated the NHE6 protein.
"If we assume that between 1-3 percent of the world’s population is diagnosed with an intellectual disability and approximately 10-20 percent of the causes are due to X-linked genes, then we can estimate that CS may affect between 1 in 16,000 and 1 in 100,000 people," Morrow and his co-authors wrote. Worldwide, that frequency would add up to more than 70,000 cases.
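The quoted estimate can be reproduced with back-of-the-envelope arithmetic. A minimal sketch, assuming the ~1 percent figure for CS among X-linked ID cases (two of 200 from the study cited above) and the population ranges quoted by the authors:

```python
# Rough reproduction of the CS prevalence estimate.
# Assumption: ~1% of X-linked ID cases are CS (two of 200 in the cited study).
id_prevalence = (0.01, 0.03)        # 1-3% of the population has an intellectual disability
x_linked_fraction = (0.10, 0.20)    # 10-20% of ID causes are X-linked genes
cs_fraction_of_x_linked = 0.01      # ~2 of 200 suspected X-linked ID cases (assumption)

low = id_prevalence[0] * x_linked_fraction[0] * cs_fraction_of_x_linked
high = id_prevalence[1] * x_linked_fraction[1] * cs_fraction_of_x_linked

print(f"1 in {1 / high:,.0f} to 1 in {1 / low:,.0f}")  # roughly 1 in 16,667 to 1 in 100,000

# Even the conservative (low) rate applied to ~7 billion people
# gives the "more than 70,000 cases" figure in the article.
world_population = 7e9
print(f"Worldwide cases, lower bound: {world_population * low:,.0f}")
```

Multiplying the three fractions at their low and high ends brackets the 1-in-16,000-to-100,000 range the authors report.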
Relevance to autism, epilepsy
In a paper published last year, Morrow’s research group found that NHE6 is underexpressed in the brains of many children with more general forms of autism. This potential connection suggests that learning about CS can help doctors and scientists learn about autism.
Similarly, by studying the regression of walking and verbal skills among Christianson boys, Morrow said, researchers could learn more about regressions in autism.
"Christianson syndrome, I hope, will be a model," Morrow said. "If we could understand the biological mechanism that leads to that loss, and we can prevent it by developing a treatment, then these kids will remain further ahead."
Such advances will require much more study, but Morrow said that by uncovering a variety of mutations that all lead to the disease, the study provides a wealth of new information for that work.
"We can now study these different mutations and learn how this protein works by how it gets inactivated," he said. "All the different ways it gets inactivated can actually inform us about the different components of the protein that have an important function."

How does the cerebellum work?
Nothing says “don’t mess with me” like a deeply fissured cortex. Even the sharpest jaws and claws in the animal kingdom are worthless without some serious thought muscle under the hood. But beneath the highly convoluted membrane covering the brains of the evolutionary upper crust hides the original crumpled processor—the cerebellum. How this organ might actually work is the subject of a review published in Frontiers in Systems Neuroscience by researchers at the University of Minnesota.