Producing brightly speckled red and green snapshots of many different tissues, Johns Hopkins researchers have color-coded cells in female mice to display which of their two X chromosomes has been made inactive, or “silenced.”

(Image caption: Patterns of X chromosome silencing in cells of the cornea, skin, cartilage and inner ear of mice (clockwise). Cells are red or green depending on whether they have inactivated their maternal or paternal X chromosome, respectively. Hao Wu, courtesy of Neuron)
Scientists have long known that the silencing of one X chromosome in females — who have two X chromosomes in every cell — is a normal occurrence whose consequences can be significant, especially if one X chromosome carries a normal copy of a gene and the other X chromosome carries a mutated copy.
By genetically tagging different X chromosomes with genes that code for red or green fluorescent proteins, scientists say they can now peer into different tissue types to analyze genetic diversity within and between individual females at a new level of detail.
Published on Jan. 8 in the journal Neuron, a summary of the research shows wide-ranging variation in patterns of so-called X chromosome inactivation at every level: within tissues, on the left or right sides of a centrally located tissue (like the brain), among different tissue types, between paired organs (like the eyes) and among individuals.
"Calico cats, which are only ever female, have mottled coat colors. They have two different versions of a gene for coat color, which is located on the X chromosome: one version from their mother and the other from their father," explains Jeremy Nathans, M.D., Ph.D., professor of molecular biology and genetics at the Johns Hopkins University and a Howard Hughes Medical Institute investigator. "Their fur is orange or black depending on which X chromosome is silenced in a particular patch of skin cells. X chromosome inactivation actually occurs in all cells in female mammals, including humans, and it affects most of the genes on the X chromosome. Although this phenomenon has been known for over 50 years, it couldn’t be clearly visualized in internal organs and tissues until now."
Nathans adds that early in the development of most mammals, when a female embryo has only about 1,000 cells, each cell makes a “decision” to inactivate one of the two X chromosomes, a process that silences most of the genes on that chromosome. The choice of which X chromosome to inactivate appears to be random, but when those cells divide, their descendants maintain that initial decision.
In the new research, the Johns Hopkins team mated female mice carrying two copies of the gene for green fluorescent protein — one on each of the two X chromosomes — with male mice whose single X chromosome carried the gene for a red fluorescent protein. The female offspring from this mating had cells that glowed red or green based on which X chromosome was silenced. Additionally, the team engineered the mice so that not all of their cells were color-coded, since that would make it hard to distinguish one cell from another. Instead, they designed a system that allowed a single cell type in each mouse, such as heart muscle cells, to be color-coded. Their genetic trick resulted in red and green color maps with distinctive patterns for each cell and tissue type that they examined.
Nathans explains that the patterns are determined by the way each tissue develops. Some tissues are created from a very small number of “founder cells” in the early embryo; others are created from a large number. Statistically, the larger the group of founder cells, the greater the chances are of having a nearly equivalent number of red and green cells. Although the ratio in the founding group is roughly preserved as the tissue grows, the distribution of those cells is determined by how much movement occurs during the development of the tissue. For example, in a tissue like blood, where the cells move a lot, the red and green cells are finely intermingled. By contrast, in skin, where the cells show little movement, each patch of skin consists of the descendants of a single cell, which share the same inactive X chromosome — and therefore the same color — creating a coarse patchwork of red and green.
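The founder-cell statistics can be sketched with a short simulation (illustrative only; the founder counts and the 50:50 silencing probability are modeling assumptions, not measurements from the study):

```python
import random

def red_fraction(n_founders: int) -> float:
    """Each founder cell independently silences its maternal or paternal X
    (here with probability 0.5); descendants inherit that choice, so the
    tissue's red/green ratio is fixed at the founder stage."""
    reds = sum(random.random() < 0.5 for _ in range(n_founders))
    return reds / n_founders

def spread(n_founders: int, n_tissues: int = 10_000) -> float:
    """Average deviation from a 50:50 mix across many simulated tissues."""
    return sum(abs(red_fraction(n_founders) - 0.5)
               for _ in range(n_tissues)) / n_tissues

# Tissues seeded by few founders vary widely between individuals;
# large founder pools land close to an even red/green split.
for n in (4, 16, 64, 256):
    print(n, round(spread(n), 3))
```

The deviation shrinks roughly with the square root of the founder count, which is why tissues built from many founders show a nearly even mix while tissues built from a handful do not.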
Normally, the pattern of X chromosome inactivation is not easily visualized. This color-coding technique is likely to be valuable for many studies, Nathans says, especially for research on variations caused by changes in the DNA sequence of the X chromosome, referred to as X-linked variation. X-linked genetic variations, such as hemophilia or color blindness, are relatively common, in part because the X chromosome carries many genes — approximately 1,000, or close to 4 percent of the total.
Males who inherit an X-linked disease usually suffer its effects because they have no second X chromosome to compensate for the mutant version of the gene. Female relatives, on the other hand, are more typically “carriers” of X-linked diseases. They have the ability to pass the disease along to their male progeny, but they do not suffer from it themselves due to the normal copy of the gene on their second X chromosome.
In the tissues of certain carrier females, however, the cells that have silenced the X chromosome with a mutated gene cannot compensate for the defect in the cells that have silenced the X chromosome with the normal gene. Nathans and his team saw such a pattern when they examined the retinas of mice that were carriers for mutations in the Norrie disease gene, which is located on the X chromosome. The Norrie disease gene codes for a protein, Norrin, which controls blood vessel formation in the retina. Women who are carriers for Norrie disease can have defects in their retinas, but some women are more affected than others, and sometimes one eye is more affected than the other eye in the same individual.
The team found that in female mice that were Norrie disease carriers, variation in blood vessel structure corresponded to localized variations in X chromosome inactivation. When the X chromosome carrying the normal copy of the Norrie disease gene was silenced in a group of cells, the blood vessels nearby failed to form properly. In contrast, when the X chromosome carrying the mutated copy of the Norrie disease gene was silenced, the nearby blood vessels developed normally.
“X chromosome inactivation is a fascinating aspect of mammalian biology,” says Nathans. “This new technique for visualizing the pattern of X chromosome inactivation should be particularly useful for looking at the role that this process plays in brain development, including the ways that it contributes to differences between the left and right sides of the female brain, and to differences in brain structure between males and females and among different females, including identical twins.”
The research, published today in the journal Cell Metabolism, provides further insights into how insulin-producing beta cells are formed in the pancreas. The team discovered that mutations in two specific genes that are important for development of the pancreas can cause neonatal diabetes. These findings bring the number of known genetic causes of neonatal diabetes to 20. The study was funded by the Wellcome Trust, Diabetes UK and the European Community’s Seventh Framework Programme, with some of the authors supported by the National Institute for Health Research (NIHR).

Dr Sarah Flanagan, lead author on the paper, said: “We are very proud to be able to give answers to the families involved on why their child has diabetes. Neonatal diabetes is diagnosed when a child is less than six months old, and some of these patients have added complications such as muscle weakness and learning difficulties with or without epilepsy.
“Our genetic discovery is critical to the advancement of knowledge on how insulin-producing beta cells are formed in the pancreas, which has implications for research into manipulating stem cells, which could one day lead to a cure.”
Dr Alasdair Rankin, Diabetes UK Director of Research, said: “As well as shedding further light on the genetic causes of neonatal diabetes and providing answers for parents of children with this rare condition, this work helps us understand how the pancreas develops. Many people with diabetes can no longer make insulin and would benefit from therapies that replace the insulin producing beta cells of the pancreas. The results of this study are critical to bringing the day closer when this type of treatment is possible.”
Neonatal diabetes is caused by a change in a gene which affects insulin production. This means that levels of blood glucose (sugar) in the body rise dangerously high.
The Exeter team is the leading centre for neonatal diabetes, having recruited over 1,200 patients from more than 80 countries. This specific study focussed on 147 young people with neonatal diabetes, a rare condition which affects approximately 1 in 100,000 births. Following a systematic screen, 110 patients received a genetic diagnosis. The remaining 37 patients were then screened for mutations in genes important for human pancreatic development. Mutations were found in 11 of these patients, four of whom carried mutations in one of two genes not previously known to cause neonatal diabetes (NKX2-2 and MNX1).
For many of the 121 (82%) patients who received a genetic diagnosis, knowing the cause of the diabetes will result in improved treatment, and for all the patients it will provide important information on risk of neonatal diabetes in future pregnancies. These patients also provide important scientific insights into pancreatic development.
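The diagnostic yield quoted above follows directly from the counts reported earlier, as a quick arithmetic check shows:

```python
total_patients = 147
systematic_screen = 110    # diagnosed in the first, systematic screen
developmental_genes = 11   # found among the remaining 37 patients

diagnosed = systematic_screen + developmental_genes
print(diagnosed)                                 # 121 genetic diagnoses
print(round(100 * diagnosed / total_patients))   # 82 percent yield
```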
Although drugs have been developed that counteract the imbalance of neurotransmitters in the brain – a condition underlying many brain disorders and nervous system diseases – exactly how these drugs work has not been fully explained.

Now, researchers at the Hebrew University of Jerusalem, using baker’s yeast as a model, have deciphered the mode by which the inhibitors affect the neurological transmission process and have even been able to manipulate it.
Their work, reported in a recent article in the Journal of Biological Chemistry, raises hopes that these insights could eventually guide clinical scientists to develop new and more effective drugs for brain disorders associated with neurotransmitter imbalance.
All of the basic tasks of our existence – breathing, heartbeat, memory building, physical movement – are executed by the brain, and they depend on the highly regulated and efficient release of neurotransmitters: chemicals that act as messengers, enabling extremely rapid connections between neurons in the brain.
When even one part of the everyday “conversation” between neighboring neurons breaks down, the results can be devastating. Many brain disorders and nervous system diseases, including Huntington’s disease, various motor dysfunctions and even Parkinson’s disease, have been linked to problems with neurotransmitter transport.
The neurotransmitters are stored in the neuron in small, bubble-like compartments called vesicles, whose membranes contain transport proteins responsible for loading the neurotransmitters into the vesicles.
The storage of certain neurotransmitters is controlled by what is called the vesicular monoamine transporter (VMAT), which is known to transport a variety of vital neurotransmitters, such as adrenaline, dopamine and serotonin.
It can also transport the detrimental MPP+, a neurotoxin used in models of Parkinson’s disease.
A number of studies demonstrated the significance of VMAT as a target for drug therapy in a variety of pathologic states, such as high blood pressure, hyperkinetic movement disorders and Tourette syndrome.
Many of the drugs that target VMAT act as inhibitors, including the classical VMAT2 inhibitor, tetrabenazine. Tetrabenazine has long been used for the treatment of motor dysfunctions associated with Huntington’s disease and other movement disorders. However, the mechanism by which the drug affects the storage of neurotransmitters was not fully understood.
The Hebrew University study set out, therefore, to achieve an understanding of the basic biochemical mechanism underlying the VMAT reaction, with a view towards better controlling it through new drug designs.
The research was conducted in the laboratory of Prof. Shimon Schuldiner of the Hebrew University’s Department of Biological Chemistry by Dr. Yelena Ugolev, a postdoctoral fellow in the laboratory, and research students Tali Segal, Dana Yaffe and Yael Gros.
To identify protein sequences responsible for tetrabenazine binding, the Hebrew University scientists harnessed the power of yeast genetics along with the method of directed evolution.
Expressing the human VMAT protein in baker’s yeast cells confers on them the ability to grow in the presence of toxic substrates, such as the neurotoxin MPP+. Directed evolution mimics natural evolution in the laboratory and is a method used in protein engineering.
By applying rounds of random mutation targeted to the gene encoding the protein of interest, proteins can be tuned to acquire new properties or to adapt to new functions or environments.
The study led to the identification of important flexible domains (or regions) in the structure of VMAT that undergo conformational rearrangements during tetrabenazine binding and that also regulate the velocity of neurotransmitter transport.
Utilizing these new, controllable adaptations could serve as a guide for clinical scientists to develop more efficient drugs for brain disorders associated with neurotransmitter imbalance, say the Hebrew University researchers.
Although brain growth slows as individuals age, some regions of the brain continue to develop for longer than others, creating new connections and remodeling existing circuitry. How this happens is a key question in neuroscience, with implications for brain health and neurodegenerative diseases. New research shows that those areas of the adult brain that consume more fuel than scientists might expect also share key characteristics with the developing brain. Two Allen Brain Atlas resources – the Allen Human Brain Atlas and the BrainSpan Atlas of the Developing Human Brain – were crucial to uncovering the significance of these sugar-hungry regions. The results are published this month in the journal Cell Metabolism.

"These experiments and analysis represent the first union of its kind between functional imaging data and a biological mechanism, with the Allen Brain Atlas resources helping to bridge that gap," comments Michael Hawrylycz, Ph.D., Investigator with the Allen Institute for Brain Science and co-author of the study. Data from PET scans show where the brain is consuming fuel but, until now, have not been able to elucidate the underlying biology. "Now we can make the comparison between the functional data and the gene expression data," says Hawrylycz, "so instead of just the ‘where,’ we now also have the ‘what’ and ‘how.’"
The brain needs to constantly metabolize fuel in order to keep running, most often through glycolysis: the breaking down of stored sugar into usable energy. PET scans of the brain, which illuminate regions consuming sugar, show that select areas of the brain exhibit fuel consumption above and beyond what is needed for basic functioning. In cancer biology, this same well-known phenomenon of consuming extra fuel, called “aerobic glycolysis,” is thought to provide support pathways for cell proliferation. In the brain, aerobic glycolysis is dramatically increased during childhood and accounts for as much as one third of total brain glucose consumption at its peak around 5 years of age, which is also the peak of synapse development.
Since aerobic glycolysis varies by region of the brain, Hawrylycz and co-author Marcus Raichle, Ph.D., at Washington University in St. Louis, wondered whether regions of the brain with higher levels of aerobic glycolysis might be associated with equivalent growth processes, like synapse formation. If so, this would point to aerobic glycolysis as a reflection of “neoteny,” or persistent brain development like the kind that takes place during early childhood.
In order to delve into the significance of aerobic glycolysis, researchers examined the genes expressed at high levels in those regions where aerobic glycolysis was taking place. The team identified 16 regions of the brain with elevated levels of aerobic glycolysis and ranked their neotenous characteristics. True to prediction, they found that gene expression data from those 16 regions suggested highly neotenous behavior.
The next phase was to identify which genes were specifically correlated with aerobic glycolysis in those regions. The Allen Brain Atlas resources proved crucial in this task, helping to pinpoint gene expression in different regions at various points in development. The Allen Human Brain Atlas was used to investigate the adult human brain, while the BrainSpan Atlas of the Developing Human Brain, developed by a consortium of partners and funded by the National Institutes of Health, provided a window into how gene expression changes as the brain ages.
Analysis of those genes pointed clearly towards roles in growth and development: top genes included those responsible for axon guidance, potassium ion channel development, synaptic transmission and plasticity, and many more. The consistent theme was development, pointing to aerobic glycolysis as a hallmark of neotenous, continually developing regions of the brain.
"Using both the adult and developmental data, we were able to study gene expression at each point in time," describes Hawrylycz. "From there, we were able to see the roles of those genes that were highly expressed in regions with aerobic glycolysis. As it turns out, those genes are consistently involved in the remodeling and maturation process, synaptic growth and neurogenesis—all factors in neoteny." "The regions we identified as being neotenous are areas of the cortex particularly associated with development of intelligence and learning," explains Hawrylycz. "Our results suggest that aerobic glycolysis, or extra fuel consumption, is a marker for regions of the brain that continue to grow and develop in similar ways to the early human brain."
Australian researchers have shed more light on an underexplored aspect of the important brain-signaling system that controls appetite, body composition and energy use. Their findings suggest that a specific gene regulating our body clock may play a central role in determining how fat we become.
Evolution has preserved the ‘neuropeptide Y (NPY) system’, as it is known, in most species – indicating its importance – and much of our understanding comes from studying it in mice. There is one important difference, however, between the NPY system in mouse and man.
In man, the neurotransmitter NPY communicates with four well-known ‘cell surface receptors’ in the brain (Y1, Y2, Y4 and Y5), which in turn trigger the system’s effects.
The new study has shown that mice have an additional receptor, Y6, which has profound effects on their body composition. Y6 is produced in a very small region of the brain that regulates the body clock, as well as growth hormone production.
PhD student Ernie Yulyaningsih, Dr Kim Loh, Dr Shu Lin and Professor Herbert Herzog from Sydney’s Garvan Institute of Medical Research, together with Associate Professor Amanda Sainsbury-Salis, now at the University of Sydney, deleted the Y6 gene from mice to understand its effects. Their study showed that mice without the Y6 gene were smaller, and had less lean tissue, than normal mice. On the other hand, as they aged, these ‘knockout mice’ grew fatter than the normal mice, especially when fed a high-fat diet. In that case, they became obese and developed metabolic problems similar to diabetes. These findings are now published online in the prestigious international journal, Cell Metabolism.
While the gene encoding the Y6 receptor is altered in man, Professor Herzog believes it would be unwise to ignore it because the development of anti-obesity drugs relies heavily on mouse studies.
“It is now clear to us that signaling through the Y6 receptor system is critical for the ways in which energy is used at different times of the day,” said Professor Herbert Herzog.
“Our work shows that Pancreatic Polypeptide has a very high affinity for Y6 in mice. It’s a satiety signal, and probably controls the circadian aspect of food intake – because the same amount of calories eaten at different times of the day has different effects on body weight.”
“The Y6 gene is highly expressed in a part of the brain called the ‘hypothalamic suprachiasmatic nucleus’, which is known to control the body’s circadian rhythm and may also critically modulate metabolic processes in response to food. The gene stimulates higher levels of certain peptides, including vasoactive intestinal peptide (VIP) – which controls growth hormone release.”
“While it is not clear whether the Y6 receptor is fully active in humans, Pancreatic Polypeptide is highly expressed – even more so than in mice – and it’s possible that another receptor to which the peptide has high affinity, such as Y4, could have taken over this function.”
Associate Professor Amanda Sainsbury-Salis expressed surprise at the impact of the Y6 gene deletion on mice, commenting “I find it amazing that one gene, which is expressed in the small part of the brain that controls the body clock, has such a profound impact on how much fat is stored on the body, and how much lean tissue is maintained.”
“Importantly, we use mice as models of human beings in research, and so when looking for anti-obesity drugs, we need to fully understand the function of the NPY system in this animal model to understand how similar circuits in humans connect with the body clock.”
An experimental treatment for Parkinson’s disease reduced by nearly two hours, on average, the daily period when medication failed to control patients’ slowness and shaking, according to results from a double-blind, phase III clinical trial published in December 2013 in Lancet Neurology.

The study compared AbbVie’s levodopa-carbidopa intestinal gel against the same medication in pill form in patients with advanced disease.
The University of Alabama at Birmingham was among the sites for the study, with David G. Standaert, M.D., Ph.D., chair of the UAB Department of Neurology, as an author. The study was led by the Mount Sinai School of Medicine; preliminary results were first presented at the annual meeting of the American Academy of Neurology in April 2012.
Parkinson’s disease results from the loss of brain cells that make dopamine, which helps to control movement. As dopamine levels fall, patients experience tremors, muscle stiffness and loss of balance. A commonly prescribed treatment, the levodopa-carbidopa combination works as the body converts levodopa into dopamine and carbidopa escorts levodopa to the right part of the brain. The problem is that patients face hours of uncontrolled slowness, freezing and tremors each day — called “off-time” — as the treatment gets into place or wears off.
One reason for the break in treatment coverage is that the medication comes in a pill, and pills can sit in the stomach for up to six hours waiting for it to empty into the small intestine. Only there does levodopa encounter the proteins capable of transporting it into the bloodstream en route to the brain. Researchers therefore envisioned a system that steadily delivers levodopa gel directly into the small intestine through a surgically placed tube, with the help of a pump worn on the belt.
“The results are very exciting, considering that other recently approved drugs on the market reduce off-time by, at most, just over an hour,” said Standaert. “In the study, the gel treatment helped patients who had run out of alternatives with current medications. We believe it may be an important new option for patients with severe Parkinson’s, with benefits comparable to more invasive techniques like deep brain stimulation.”
Patients using the gel system saw an average reduction in daily off-time of 1.91 hours, and an increase in “on-time” without troublesome dyskinesia of 1.86 hours compared with the pill form. Nearly all subjects experienced at least one side effect, although most were short-lived and moderate.
QBI scientists at The University of Queensland have found that honeybees use the pattern of polarised light in the sky – invisible to humans – to direct one another to a honey source.

The study, conducted in Professor Mandyam Srinivasan’s laboratory at the Queensland Brain Institute, a member of the Australian Research Council Centre of Excellence in Vision Science (ACEVS), demonstrated that bees navigate to and from honey sources by reading the pattern of polarised light in the sky.
“The bees tell each other where the nectar is by converting their polarised ‘light map’ into dance movements,” Professor Srinivasan said.
“The more we find out how honeybees make their way around the landscape, the more awed we feel at the elegant way they solve very complicated problems of navigation that would floor most people – and then communicate them to other bees,” he said.
The discovery shines new light on the astonishing navigational and communication skills of an insect with a brain the size of a pinhead.
The researchers allowed bees to fly down a tunnel to a sugar source, shining only polarised light from above, either aligned with the tunnel or at right angles to the tunnel.
They then filmed what the bees ‘told’ their peers, by waggling their bodies when they got back to the hive.
“It is well known that bees steer by the sun, adjusting their compass as it moves across the sky, and then convert that information into instructions for other bees by waggling their body to signal the direction of the honey,” Professor Srinivasan said.
“Other laboratories have shown from studying their eyes that bees can see a pattern of polarised light in the sky even when the sun isn’t shining: the big question was could they translate the navigational information it provides into their waggle dance.”
The researchers conclude that even when the sun is not shining, bees can tell one another where to find food by reading and dancing to their polarised sky map.
In addition to revealing how bees perform their remarkable tasks, Professor Srinivasan says it also adds to our understanding of some of the most basic machinery of the brain itself.
Professor Srinivasan’s team conjectures that flight under polarised illumination activates discrete populations of cells in the insect’s brain.
When the polarised light was aligned with the tunnel, one pair of ‘place cells’ – neurons important for spatial navigation – became activated, whereas when the light was oriented across the tunnel a different pair of place cells was activated.
The researchers suggest that depending on which set of cells is activated, the bee can work out if the food source lies in a direction toward or opposite the direction of the sun, or in a direction ninety degrees to the left or right of it.
Children are likely to have stronger muscles if their mothers had a higher level of vitamin D in their body during pregnancy, according to new research from the Medical Research Council Lifecourse Epidemiology Unit (MRC LEU) at the University of Southampton.

Low vitamin D status has been linked to reduced muscle strength in adults and children, but little is known about how variation in a mother’s status during pregnancy affects her child.
Low vitamin D concentrations are common among young women in the UK, and although an additional 10μg/day of vitamin D is recommended during pregnancy, many women do not take supplements.
In the research, published in the January edition of the Journal of Clinical Endocrinology and Metabolism, vitamin D levels were measured in 678 mothers in the later stages of pregnancy.
When the children were four years old, grip strength and muscle mass were measured. Results showed that the higher the levels of vitamin D in the mother, the higher the grip strength of the child, with an additional, but less pronounced association between mother’s vitamin D and child’s muscle mass.
Lead researcher Dr Nicholas Harvey, Senior Lecturer at the MRC LEU at the University of Southampton, comments: “These associations between maternal vitamin D and offspring muscle strength may well have consequences for later health; muscle strength peaks in young adulthood before declining in older age and low grip strength in adulthood has been associated with poor health outcomes including diabetes, falls and fractures. It is likely that the greater muscle strength observed at four years of age in children born to mothers with higher vitamin D levels will track into adulthood, and so potentially help to reduce the burden of illness associated with loss of muscle mass in old age.”
The 678 women who took part in the study are part of the Southampton Women’s Survey, one of the largest and best characterised such studies globally.
Professor Cyrus Cooper, Professor of Rheumatology and Director of the MRC LEU at the University of Southampton, who oversaw this work, added: “This study forms part of a larger programme of research at the MRC Lifecourse Epidemiology Unit and University of Southampton in which we are seeking to understand how factors such as diet and lifestyle in the mother during pregnancy influence a child’s body composition and bone development. This work should help us to design interventions aimed at optimising body composition in childhood and later adulthood and thus improve the health of future generations.”
Researchers from the University of Illinois at Chicago College of Medicine have found that dysfunction in a single gene in mice causes fasting hyperglycemia, one of the major symptoms of type 2 diabetes. Their findings were reported online in the journal Diabetes.
If a gene called MADD is not functioning properly, insulin is not released into the bloodstream to regulate blood sugar levels, says Bellur S. Prabhakar, professor and head of microbiology and immunology at UIC and lead author of the paper.
Type 2 diabetes affects roughly 8 percent of Americans and more than 366 million people worldwide. It can cause serious complications, including cardiovascular disease, kidney failure, loss of limbs and blindness.
In a healthy person, beta cells in the pancreas secrete the hormone insulin in response to increases in blood glucose after eating. Insulin allows glucose to enter cells where it can be used as energy, keeping glucose levels in the blood within a narrow range. People with type 2 diabetes don’t produce enough insulin or are resistant to its effects. They must closely monitor their blood glucose throughout the day and, when medication fails, inject insulin.
In previous work, Prabhakar isolated several genes from human beta cells, including MADD, which is also involved in certain cancers. Analysis of small genetic variations among thousands of human subjects revealed that a mutation in MADD was strongly associated with type 2 diabetes in Europeans and Han Chinese.
People with this mutation had high blood glucose and problems of insulin secretion – the “hallmarks of type 2 diabetes,” Prabhakar said. But it was unclear how the mutation was causing the symptoms, or whether it caused them on its own or in concert with other genes associated with type 2 diabetes.
To study the role of MADD in diabetes, Prabhakar and his colleagues developed a mouse model in which the MADD gene was deleted from the insulin-producing beta cells. All such mice had elevated blood glucose levels, which the researchers found was due to insufficient release of insulin.
“We didn’t see any insulin resistance in their cells, but it was clear that the beta cells were not functioning properly,” Prabhakar said. Examination of the beta cells revealed that they were packed with insulin. “The cells were producing plenty of insulin, they just weren’t secreting it,” he said.
The finding shows that type 2 diabetes can be directly caused by the loss of a properly functioning MADD gene alone, Prabhakar said. “Without the gene, insulin can’t leave the beta cells, and blood glucose levels are chronically high.”
Prabhakar now hopes to investigate the effect of a drug that allows for the secretion of insulin in MADD-deficient beta cells.
“If this drug works to reverse the deficits associated with a defective MADD gene in the beta cells of our model mice, it may have potential for treating people with this mutation who have an insulin-secretion defect and/or type 2 diabetes,” he said.
In the journal Neurology, researchers report a novel technique that enables a patient with “word blindness” to read again.

Word blindness is a rare neurological condition. (The medical term is “alexia without agraphia.”) Although a patient can write and understand the spoken word, the patient is unable to read.
The article is written by Jason Cuomo, Murray Flaster, MD, PhD and Jose Biller, MD, of Loyola University Medical Center.
Here’s how the technique works: When shown a word, the patient looks at the first letter. Although she clearly sees it, she cannot recognize it. So beginning with the letter A, she traces each letter of the alphabet over the unknown letter until she gets a match. For example, when shown the word Mother, she will trace the letters of the alphabet, one at a time, until she comes to M and finds a match. Three letters later, she guesses correctly that the word is Mother.
"To see this curious adaptation in practice is to witness the very unique and focal nature" of the deficit, the authors write.
The authors describe how word blindness came on suddenly to a 40-year-old kindergarten teacher and reading specialist. She couldn’t make sense of her lesson plan, and her attendance sheet was as incomprehensible as hieroglyphs. She also couldn’t tell time.
The condition was due to a stroke that probably was caused by an unusual type of blood vessel inflammation within the brain called primary central nervous system angiitis.
Once a passionate reader, she was determined to learn how to read again. But none of the techniques that she had taught her students – phonics, sight words, flash cards, writing exercises, etc. – worked. So she taught herself a remarkable new technique that employed tactile skills that she still possessed.
The woman can have an emotional reaction to a word, even if she can’t read it. Shown the word “dessert,” she says “Oooh, I like that.” But when shown “asparagus,” she says, “Something’s upsetting me about this word.”
Shown two personal letters that came in the mail, she correctly determined which was sent by a friend of her mother’s and which was sent by one of her own friends. “When asked who these friends were, she could not say, but their names nevertheless provoked an emotional response that served as a powerful contextual clue,” the authors write.
What she most misses is reading books to children. She teared up as she told the authors: “One day my mom was with the kids in the family, and they were all curled up next to each other, and they were reading. And I started to cry, because that was something I couldn’t do.”
It is a common belief that consciously thinking about what we are doing interferes with our performance. The origins of this idea go far back. Consider, for instance, the centipede’s dilemma:
A centipede was happy – quite!
Until a toad in fun
Said, “Pray, which leg moves after which?”
This raised her doubts to such a pitch,
She fell exhausted in the ditch
Not knowing how to run.

The centipede performs a very complex task with ease, unless she thinks about the task. The story was thought to illustrate something fundamental about human nature. English psychologist George Humphrey wrote “[the poem] contains a profound truth which is illustrated daily in the lives of all of us.” Humphrey and others thought that not having to think about everything that we do provides a great advantage. According to the famed philosopher Alfred North Whitehead, “Civilization advances by extending the number of important operations which we can perform without thinking about them.” Whitehead believed that thinking must be reserved only for decisive moments.
Though common, this idea is misleading. It is never optimal to run on autopilot. Even the motor tasks that we have learned to do fluently without much cognitive control are better performed while engaged. The key is to realize that we can apply cognitive control at a higher level. Moreover, gaining fluency at a motor task often comes at a cost: rigidity. Deliberately breaking the flow in response to changing contexts often pays off. Musicians, athletes, public speakers, architects, designers, and others whose jobs require complex sequential actions can improve their performance if they understand that they are not trapped in the centipede’s dilemma.
In a fascinating paper, brain researchers Eitan Globerson and Israel Nelken started with the observation that piano playing involves a very complex sequential motor task, often executed at speeds that do not allow cognitive control of individual muscle movements. Through practice, pianists learn to execute fast and complex motor tasks with little cognitive control. Once this is achieved, it is possible to play in a disengaged way with little cognitive involvement. However, Globerson and Nelken suggest another way. Instead of focusing on individual finger movements or not focusing on anything, pianists may focus on higher-level mental events, such as the character of a longer musical phrase. This allows constant engagement with the music making and deliberate control without disrupting the mechanics of playing. Globerson and Nelken argue that this may dramatically improve performance.
If we follow their argument, it is easy to come up with our own examples of how to use higher-level cognitive control. While playing, a pianist may actively focus on the relationships between different musical ideas. A public speaker may develop a “mental script” that includes bigger-picture ideas, the connections between those ideas, where the climax of the speech should be, and what general effect the speech should have on the audience. During the speech, the public speaker may be constantly engaged with this mental script instead of trying to select words individually or mechanically replicating a previous performance. While shooting, a basketball player may focus on the arc that the ball should follow instead of focusing on arm movements or focusing on nothing. You can create your own examples of higher-level cognitive control for dancing, driving a car, designing a house, or doing the work of a carpenter.
Experts have long been aware of the power of focusing on higher-level mental processes. In 1924, Russian pianist and piano teacher Josef Lhevinne wrote the book Basic Principles in Pianoforte Playing, which later became a classic. In his discussion of memory, he wrote, “the thing to remember is the thought, not the symbols. When you remember a poem you do not remember the alphabetical symbols, but the poet’s beautiful vision, his thought pictures. … Get the thought, the composer’s idea; that is the thing that sticks.”
Higher-level cognitive control is capable of changing the motor action in a beneficial way. When a pianist decides to play a passage in an expressive fashion, for instance, this high-level command changes the character of playing through initiating a sequence of associated motor movements. There is experimental evidence that suggests that performance in highly automatized tasks can be improved by increasing the level of engagement. Musicians in symphony orchestras are typically asked to play the same pieces many times over the course of their careers. The playing of these pieces becomes mostly automatic; and the job satisfaction of orchestra players is typically dismal. Psychologists Ellen Langer, Timothy Russell, and Noah Eisenkraft recently asked a symphony orchestra to record, under different experimental conditions, the finale from Brahms’s Symphony No. 1. A local community chorus listened to and rated the recordings. The musicians were either asked to replicate a previous fine performance or to offer “subtle new nuances” to their performance. Musicians enjoyed the latter performance more; and the majority of the listeners preferred the recording of the latter performance.
There is always an unconscious component of the link between our intentions and the motor actions those intentions create. Even if I deliberately stretch my arm to grab a coffee mug, I do not have conscious control over the way the individual muscles in my arm operate to give rise to the specific stretching movement. Deliberate cognitive control is always less complex than the actual motor action. However, we often learn to apply cognitive control in an even more summary-like way. That is, we can learn to apply cognitive control in a single step over longer and more complex sequences of motor actions. Through practice, sequences of motor actions merge into a single unit that can be initiated by a single deliberate command. This is often called chunking. When children first learn how to brush their teeth or lace their shoes, they deliberately control individual movements that make up the task. After some practice, the individual movements are chunked and the whole sequence can be initiated by a single mental command. Many other daily activities such as riding a bike or writing one’s signature involve chunking. It is possible to merge chunked sequences into even longer sequences and reduce cognitive involvement even more.
Once initiated, a chunked motor sequence is executed automatically. As a consequence, we lose control over individual movements. This type of rigidity is often undesirable because we live in a constantly changing environment. In her book The Power of Mindful Learning Harvard psychologist Ellen Langer talks about how automaticity may get in the way of adapting to new circumstances. Overlearned driving skills may put one in danger while driving in a different country or in different weather conditions. Holding a baseball bat in the same overlearned way after getting older or stronger will hinder performance.
We can disrupt automaticity and appropriately respond to the situation at hand by orienting ourselves in the present and being sensitive to different contexts. We can think at a level higher than the mechanics of the motor action. We can be engaged with the task by using these two approaches simultaneously. In any case, thinking should never be reserved only for decisive moments.
In the largest ever assessment of substance use among people with severe psychiatric illness, researchers at Washington University School of Medicine in St. Louis and the University of Southern California have found that rates of smoking, drinking and drug use are significantly higher among those who have psychotic disorders than among those in the general population.
The study is published online in the journal JAMA Psychiatry.

The finding is of particular concern because individuals with severe mental illness are more likely to die younger than people without severe psychiatric disorders.
“These patients tend to pass away much younger, with estimates ranging from 12 to 25 years earlier than individuals in the general population,” said first author Sarah M. Hartz, MD, PhD, assistant professor of psychiatry at Washington University. “They don’t die from drug overdoses or suicide — the kinds of things you might suspect in severe psychiatric illness. They die from heart disease and cancer, problems caused by chronic alcohol and tobacco use.”
The study analyzed smoking, drinking and drug use in nearly 20,000 people. That included 9,142 psychiatric patients diagnosed with schizophrenia, bipolar disorder or schizoaffective disorder — an illness characterized by psychotic symptoms such as hallucinations and delusions, and mood disorders such as depression.
The investigators also assessed nicotine use, heavy drinking, heavy marijuana use and recreational drug use in more than 10,000 healthy people without mental illness.
The researchers found that 30 percent of those with severe psychiatric illness engaged in binge drinking, defined as drinking four servings of alcohol at one time. In comparison, the rate of binge drinking in the general population is 8 percent.
Among those with mental illness, more than 75 percent were regular smokers. This compares with 33 percent of those in the control group who smoked regularly. There were similar findings with heavy marijuana use: 50 percent of people with psychotic disorders used marijuana regularly, versus 18 percent in the general population. Half of those with mental illness also used other illicit drugs, while the rate of recreational drug use in the general population is 12 percent.
“I take care of a lot of patients with severe mental illness, many of whom are sick enough that they are on disability,” said Hartz. “And it’s always surprising when I encounter a patient who doesn’t smoke or hasn’t used drugs or had alcohol problems.”
Hartz said another striking finding from the study is that once a person develops a psychotic illness, protective factors such as race and gender don’t have their typical influence.
Previous research indicates that Hispanics and Asians tend to have lower rates of substance abuse than European Americans. The same is true for women, who tend to smoke, drink and use illicit drugs less often than men.
“We see protective effects in these subpopulations,” Hartz explained. “But once a person has a severe mental illness, that seems to trump everything.”
That’s particularly true, she said, with smoking. During the last few decades, smoking rates have declined in the general population. People over age 50 are much more likely than younger people to have been regular smokers at some point in their lives. For example, about 40 percent of those over 50 used to smoke regularly. Among those under 30, fewer than 20 percent have been regular smokers. But among the mentally ill, the smoking rate is more than 75 percent, regardless of the patient’s age.
“With public health efforts, we’ve effectively cut smoking rates in half in healthy people, but in the severely mentally ill, we haven’t made a dent at all,” she said.
Until recently, smoking was permitted in most psychiatric hospitals and mental wards. Hartz believes that many psychiatrists decided their sickest patients had enough problems without having to worry about quitting smoking, too. Because smoking is so prevalent among the mentally ill, there also were concerns about potential dangers of using nicotine-replacement therapy while continuing to smoke. Recent studies, however, have found those concerns were overblown.
The question, she said, is whether being more aggressive in trying to curb nicotine, alcohol and substance use in patients with severe psychiatric illness can lengthen their lives. Hartz believes health professionals who treat the mentally ill need to do a better job of trying to get them to stop smoking, drinking and using drugs.
“Some studies have shown that although we psychiatrists know that smoking, drinking and substance use are major problems among the mentally ill, we often don’t ask our patients about those things,” she said. “We can do better, but we also need to develop new strategies because many interventions to reduce smoking, drinking and drug use that have worked in other patient populations don’t seem to be very effective in these psychiatric patients.”
University of Queensland (UQ) researchers have made a significant discovery that could one day halt a number of neurodegenerative diseases.

Scientists at the Queensland Brain Institute (QBI) have identified a gene that protects against spontaneous, adult-onset progressive nerve degeneration.
Dr Massimo Hilliard said the discovery that loss of the gene mec-17 causes axon (nerve fibre) degeneration could open the door to a better understanding of the mechanisms of neuronal injury and of neurodegenerative diseases characterised by axonal pathology, such as motor neuron disease and Parkinson’s, Alzheimer’s and Huntington’s diseases.
“This is an important step toward fully understanding how axonal degeneration occurs, and thus facilitating the development of therapies to prevent or halt this damaging biological event,” Dr Hilliard said.
Dr Hilliard runs a laboratory at QBI specialising in neuronal development, and focuses on how nerves both degenerate and regenerate.
The team found that mec-17 protects the neuron by stabilising its cytoskeletal structure, allowing proper transport of essential molecules and organelles, including mitochondria, throughout the axon.
This discovery also has the potential to accelerate the identification of human neurodegenerative conditions caused by mutations in genes similar to mec-17.
“It’s our hope that this could one day lead to more effective treatments for patients suffering from conditions causing neuronal degeneration,” Dr Hilliard said.
This discovery highlights the axon as a major focal point for the health of the neuron.
Findings of the research have been published in the journal Cell Reports, and lead author Dr Brent Neumann anticipates that the research into the gene will soon lead to further discoveries.
“This study demonstrates that mec-17 normally functions to protect the nervous system from damage,” Dr Neumann said.
“This knowledge can now be used to understand precisely how the gene achieves this and to discover other molecules that are used by the nervous system for similar protective functions,” he said.
“We can now start to look into means of bypassing the function of mec-17, such as activating other genes or alternative mechanisms that can protect the nervous system from damage.”
Previous research has shown that mec-17 is conserved across species, including humans, suggesting a possible shared function of protection.
“We identified mec-17 from a genetic screening method aimed at identifying molecules that cause axonal degeneration when they become inactive through genetic mutations,” Dr Neumann said.
Stroke rehabilitation researchers report improvement in spatial neglect with prism adaptation therapy. This new study supports behavioral classification of patients with spatial neglect as a valuable tool for assigning targeted, effective early rehabilitation. Results of the study, “Presence of motor-intentional aiming deficit predicts functional improvement of spatial neglect with prism adaptation” were published ahead of print in Neurorehabilitation and Neural Repair on December 27, 2013.

The article is authored by Kelly M. Goedert, PhD, of Seton Hall University, Peii Chen, PhD, of Kessler Foundation, Raymond C. Boston, PhD, of the University of Pennsylvania, Anne L. Foundas, MD, of the University of Missouri, and A.M. Barrett, MD, director of Stroke Rehabilitation Research at Kessler Foundation, and chief of Neurorehabilitation Program Innovation at Kessler Institute for Rehabilitation. Drs. Barrett and Chen have faculty appointments at Rutgers New Jersey Medical School.
“Spatial neglect, an under-recognized but disabling disorder, often complicates recovery from right brain stroke,” noted Dr. Barrett. “Our study suggests we need to know what kind of neglect patients have in order to assign treatment.” The research team tested the hypothesis that classifying patients by their spatial neglect profile, i.e., by Where (perceptual-attentional) versus Aiming (motor-intentional) symptoms, would predict response to prism adaptation therapy. Moreover, they hypothesized that patients with Aiming bias would respond better to prism adaptation therapy than those with isolated Where bias.
The study involved 24 patients with right brain stroke who completed 2 weeks of prism adaptation treatment. Participants also completed the Behavioral Inattention Test and Catherine Bergego Scale (CBS) tests of neglect recovery weekly for 6 weeks. Results showed that those with only Aiming deficits improved on the CBS, whereas those with only Where deficits did not improve. Participants with both types of deficits demonstrated intermediate improvement. “These findings suggest that patients with spatial neglect and Aiming deficits may benefit the most from early intervention with prism adaptation therapy,” said Dr. Barrett. “More broadly, classifying spatial deficits using modality-specific measures should be an important consideration of any stroke trial intending to obtain the most valid, applicable, and valuable results for recovery after right brain stroke.”
Reaching for Froot Loops and grabbing Lego pieces to build a tower are different challenges for toddlers. Depending on what they’re trying to do, tots tend to develop handedness for different tasks at different ages, according to new research.

Most people are right-handed. Babies start using their right hand to reach for cereal nuggets by age 1. However, children take until age 4 to show such a preference when building Lego models. The findings, published in this month’s issue of Developmental Psychobiology, imply that tendencies to use one hand more than the other emerge depending on the tasks kids confront, rather than on their age.
Preference for the right or left hand is, in part, genetic. Prior studies have shown that some of these one-sided tendencies emerge early. Fetuses suck their right thumb more often than their left; newborns on their back turn to the right more frequently. Most children grow up to be right-handed—in part because of these innate, early leanings, scientists believe.
But the timing of when one hand emerges as the dominant one for most tasks remained unclear.
"As a parent and a scientist, I was surprised to find researchers thought 3-year-olds don’t display a hand preference," said neurobiologist Claudia Gonzalez of the University of Lethbridge in Alberta, Canada.
To study how handedness emerged between ages 1 and 5, Gonzalez and her colleagues assigned about 50 tiny participants to a familiar task: grabbing a colorful object or a tasty tidbit. Children ages 1 to 2 picked up Froot Loops or Cheerios to munch at snack time. Four- and 5-year-olds grasped Lego blocks to build a small model. Three-year-old subjects tackled both tasks.
Even the youngest children had strong right-handed leanings when reaching for food, the team found. Three-year-olds were right-handed eaters, but they were just as likely to use their left hand when playing with blocks. The 4- and 5-year-olds used their left hand to hold the base of their model steady, but they manipulated blocks into the correct positions with their other hand—a clear preference for right-handedness.
"There is a developmental milestone between the ages of 3 and 4 when something clicks," Gonzalez said. "Maybe they become more skilled, or they understand the task better."
Until that developmental “click,” this study shows, hand preference isn’t constant across tasks, regardless of a child’s age.
The study “uses a very clever design to get at the question of how handedness varies across tasks,” said Klaus Libertus, an infant development researcher at the University of Pittsburgh. “We did not know handedness is connected to tasks in this way. I would have expected the 3-year-olds to show the same pattern on both tasks, especially since the demands were so similar.”
Developing a hand preference might also correlate with other functions that rely strongly on just one side of the brain, such as language and certain decision-making skills, Gonzalez noted. Preliminary data from children in her lab suggests that when handedness is evident earlier, these other functions also mature more quickly.
Finding the right task to study handedness at different ages will give researchers a firmer grasp on how young brains develop right- or left-handed tendencies, she said.
"You could say hand preference develops before 1, or you could say it doesn’t emerge until age 4—just depending on what task you are looking at," said Gonzalez.
“Good to see you. I’m sorry. It sounds like you’ve had a tough, tough, week.” Spoken by a doctor to a cancer patient, that statement is an example of compassionate behavior observed by a University of Rochester Medical Center team in a new study published by the journal Health Expectations.

Rochester researchers believe they are the first to systematically pinpoint and catalogue compassionate words and actions in doctor-patient conversations. By breaking down the dialogue and studying the context, scientists hope to create a behavioral taxonomy that will guide medical training and education.
“In health care, we believe in being compassionate but the reality is that many of us have a preference for technical and biomedical issues over establishing emotional ties,” said senior investigator Ronald Epstein, M.D., professor of Family Medicine, Psychiatry, Oncology, and Nursing and director of the UR Center for Communication and Disparities Research.
Epstein is a national and international keynote speaker and investigator on mindfulness and communication in medical education.
His team recruited 23 oncologists from a variety of private and hospital-based oncology clinics in the Rochester, N.Y., area. The doctors and their stage III or stage IV cancer patients volunteered to be recorded during routine visits. Researchers then analyzed the 49 audio-recorded encounters that took place between November 2011 and June 2012, and looked for key observable markers of compassion.
In contrast to empathy – another quality that Epstein and his colleagues have studied in the medical community — compassion involves a deeper and more active imagination of the patient’s condition. An important part of this study, therefore, was to identify examples of the three main elements of compassion: recognition of suffering, emotional resonance, and movement towards addressing suffering.
Emotional resonance, or a sense of sharing and connection, was illustrated by this dialogue: Patient: “I should just get a room here.” Oncologist: “Oh, I hope you don’t really feel like you’re spending that much time here.”
Another conversation included this response from a physician to a patient, who complained about a drug patch for pain: “Who wants a patch that makes you drowsy, constipated and fuzzy? I’ll pass, thank you very much.”
Some doctors provided good examples of how they use humor to raise a patient’s spirits without deviating from the seriousness of the situation. In one case, for example, a patient was concerned that he would not be able to drink two liters of barium sulfate in preparation for a CT scan.
Doctor: “If you just get down one little cup it will tell us what’s going on in the stomach. What I tell people when we’re not being recorded is to take a cup and then pour the rest down the toilet and tell them you drank it all (laughter)… Just a creative interpretation of what you are supposed to take.”
Patient: “I love it, I love it. Well, I thank you for that. I’m prepared to do what I’ve got to do to get this right.”
Researchers evaluated tone of voice, animation that conveyed tenderness and understanding, and other ways in which doctors offered reassurance or psychological comfort.
Here’s an instance in which an oncologist encouraged a reluctant patient to follow through with a planned trip to Arizona: “You know, if you decide to do it, break down and allow somebody to meet you at the gates and use a cart or wheelchair to get you to your next gate and things like that. And having just sent my father-in-law off to Hawaii and told him he had to do that, he said no, no, I can get there. Just, it’s okay. Nobody is gonna look at you and say, ‘What’s an able-bodied man doing in a cart?’ Just, it’s okay. It’s part of setting limits.”
Researchers also observed non-verbal communication, such as pauses or sighs at appropriate times, as well as speech features and voice quality (tone, pitch, loudness) and metaphorical language that conveyed certain attitudes and meaning.
Compassion unfolds over time, researchers concluded. During the process, physicians must challenge themselves to stay with a difficult discussion, which opens the door for the patient to admit uncertainty and grieve the loss of normalcy in life.
“It became apparent that compassion is not a quality of a single utterance but rather is made up of presence and engagement that suffuses an entire conversation,” the study said. First author Rachel Cameron, B.A., is a student at the University of Rochester School of Medicine and Dentistry. The audio recordings were reviewed by a diverse group of medical professionals with backgrounds in literature and linguistics, as well as palliative care specialists.