In a landmark discovery, the final piece in the puzzle of understanding how the brain circuitry vital to normal fertility in humans and other mammals operates has been put together by researchers at New Zealand’s University of Otago.
Their new findings, which appear in the leading international journal Nature Communications, will be critical to enabling the design of novel therapies for infertile couples as well as new forms of contraception.
The research team, led by Otago neuroscientist Professor Allan Herbison, has discovered the key cellular location of signalling between a small protein known as kisspeptin and its receptor, called Gpr54. Kisspeptin had earlier been found to be crucial for fertility in humans, and in a subsequent major breakthrough Professor Herbison showed that this molecule was also vital for ovulation to occur.
In the latest research, Professor Herbison and colleagues at Otago and Heidelberg University, Germany, provide conclusive evidence that the kisspeptin-Gpr54 signalling occurs in a small population of nerve cells in the brain called gonadotropin-releasing hormone (GnRH) neurons.
Using state-of-the-art techniques, the researchers studied mice that lacked Gpr54 receptors in only their GnRH neurons and found that these mice did not undergo puberty and were infertile. They then showed that the infertile mice could be restored to completely normal fertility by inserting the Gpr54 gene into just the GnRH neurons.
Professor Herbison says the findings represent a substantial step forward in enabling new treatments for infertility and new classes of contraceptives to be developed.
"Infertility is a major issue affecting millions of people worldwide. It’s currently estimated that up to 20 per cent of New Zealand couples are infertile, and it is thought that up to one-third of all cases of infertility in women involve disorders in the area of brain circuitry we are studying.
"Our new understanding of the exact mechanism by which kisspeptin acts as a master controller of reproduction is an exciting breakthrough which opens up avenues for tackling what is often a very heart-breaking health issue. Through detailing this mechanism we now have a key chemical switch to which drugs can be precisely targeted," Professor Herbison says.
Beyond their benefits for advancing new therapies for infertility and approaches to controlling fertility, the findings suggest that targeting kisspeptin may be valuable in treating diseases such as prostate cancer that are influenced by sex steroid hormone levels in the blood, he says.
Professor Herbison noted that the research findings are the result of a long-standing collaboration with the laboratory of Professor Gunther Schutz at Heidelberg University, Germany.
Professor Herbison is Director of the University’s Centre for Neuroendocrinology, which is the world-leading research centre investigating how the brain controls fertility.
"We are delighted to have published this work in one of the top scientific journals and also to be able to maintain the leading role of New Zealand researchers in understanding fertility control," he says.
The toxoplasma parasite can be deadly, causing spontaneous abortion in pregnant women or killing immune-compromised patients, but it has even stranger effects in mice.

Infected mice lose their fear of cats, which is good for both cats and the parasite, because the cat gets an easy meal and the parasite gets into the cat’s intestinal tract, the only place it can sexually reproduce and continue its cycle of infection.
New research by graduate student Wendy Ingram at the University of California, Berkeley, reveals a scary twist to this scenario: the parasite’s effects seem to be permanent. The fearless behavior in mice persists long after the mouse recovers from the flu-like symptoms of toxoplasmosis, and for months after the parasitic infection is cleared from the body, according to research published today (Sept. 18) in the journal PLoS ONE.
“Even when the parasite is cleared and it’s no longer in the brains of the animals, some kind of permanent long-term behavior change has occurred, even though we don’t know what the actual mechanism is,” Ingram said. She speculated that the parasite could damage the smell center of the brain so that the odor of cat urine can’t be detected. The parasite could also directly alter neurons involved in memory and learning, or it could trigger a damaging host response, as in many human autoimmune diseases.
Ingram became interested in the protozoan parasite, Toxoplasma gondii, after reading about its behavior-altering effects in mice and rats and possible implications for its common host, the domesticated cat, and even humans. One-third of people around the world have been infected with toxoplasma and probably have dormant cysts in their brains. Kept in check by the body’s immune system, these cysts sometimes revive in immune-compromised people, leading to death, and some preliminary studies suggest that chronic infection may be linked to schizophrenia or suicidal behavior.
Pregnant women are already warned to steer clear of kitty litter, since the parasite is passed through cat feces and can cause blindness or death in the fetus. One main source of spread is undercooked pork, Ingram said.
With the help of Michael Eisen and Ellen Robey, UC Berkeley professors of molecular and cell biology, Ingram set out three years ago to discover how toxoplasma affects mice’s hard-wired fear of cats. She tested mice by seeing whether they avoided bobcat urine, which is normal behavior, versus rabbit urine, to which mice don’t react. While earlier studies showed that mice lose their fear of bobcat urine for a few weeks after infection, Ingram showed that the three most common strains of Toxoplasma gondii make mice less fearful of cats for at least four months.
Using a genetically altered strain of toxoplasma that is not able to form cysts and thus is unable to cause chronic infections in the brain, she demonstrated that the effect persisted for four months even after the mice completely cleared the microbe from their bodies. She is now looking at how the mouse immune system attacks the parasite to see whether the host’s response to the infection is the culprit.
“This would seem to refute – or at least make less likely – models in which the behavior effects are the result of direct physical action of parasites on specific parts of the brain,” Eisen wrote in a blog post about the research.
“The idea that this parasite knows more about our brains than we do, and has the ability to exert desired change in complicated rodent behavior, is absolutely fascinating,” Ingram said. “Toxoplasma has done a phenomenal job of figuring out mammalian brains in order to enhance its transmission through a complicated life cycle.”
Brain regions associated with memory shrink as adults age, and this size decrease is more pronounced in those who go on to develop neurodegenerative disease, reports a new study published Sept. 18 in the Journal of Neuroscience. The volume reduction is linked with an overall decline in cognitive ability and with increased genetic risk for Alzheimer’s disease, the authors say.

Image: A network of brain regions, highlighted in red and yellow, shows atrophy in both healthy aging and neurodegenerative disease. The highlighted regions are susceptible to both normal aging and dementia.
“Our results identify a specific pattern of structural brain changes that may provide a possible brain marker for the onset of Alzheimer’s disease,” said Nathan Spreng, assistant professor of human development and the Rebecca Q. and James C. Morgan Sesquicentennial Faculty Fellow in Cornell’s College of Human Ecology.
The study is one of the first to measure structural changes in a collection of brain regions – not just one single area – over the adult life course and from normal aging to neurodegenerative disease, said Spreng, who co-authored the study with Gary R. Turner of York University in Toronto.
Overall, they analyzed brain data from 848 individuals spanning the adult lifespan, drawn from the Open Access Series of Imaging Studies and the Alzheimer’s Disease Neuroimaging Initiative (ADNI). About half of the ADNI sample was assessed multiple times over several years, allowing the researchers to measure brain changes over time and determine who did and did not progress to dementia.
The researchers found that brain volume in the default network (a set of brain regions associated with internally generated thoughts such as memory) declined in both healthy and pathological aging. The researchers noted the greatest decline in Alzheimer’s patients and in those who progressed from mild cognitive impairment to Alzheimer’s disease. Reduced brain volumes in these regions were associated with declines in cognitive ability, with the presence of known biological markers of Alzheimer’s disease and with carrying the APOE4 variant of the APOE gene, a known risk factor for Alzheimer’s.
“While elements of the default network have previously been implicated in aging and neurodegenerative disease, few studies have examined broad network changes over the full adult life course with such large participant samples and including both behavioral and genetic data,” said Spreng. “Our findings provide evidence for a network-based model of neurodegenerative disease, in which progressive brain changes spread through networks of connected brain regions.”
Scientists at the University of Alabama at Birmingham have identified a molecular pathway that seems to contribute to the ability of malignant glioma cells in a brain tumor to spread and invade previously healthy brain tissue. Researchers said the findings, published Sept. 19, 2013, in the journal PLOS ONE, provide new drug-discovery targets to rein in the ability of these cells to move.

Gliomas account for about a third of brain tumors, and survival rates are poor; only about half of the 10,000 Americans diagnosed with malignant glioma survive the first year, and only about one quarter survive for two years.
“Malignant gliomas are notorious, not only because of their resistance to conventional chemotherapy and radiation therapy, but also for their ability to invade the surrounding brain, thus causing neurological impairment and death,” said Hassan Fathallah-Shaykh, M.D., Ph.D., associate professor in the UAB Department of Neurology. “Brain invasion, a hallmark of gliomas, also helps glioma cells evade therapeutic strategies.”
Fathallah-Shaykh said there is a great deal of interest among scientists in the idea that a low-oxygen environment induces glioma cells to react with aggressive movement, migration and brain invasion. A relatively new cancer strategy to shrink tumors is to cut off the tumor’s blood supply – and thus its oxygen source – through the use of anti-angiogenesis drugs. Angiogenesis is the process of making new blood vessels.
“Stop angiogenesis and you shut off a tumor’s blood and oxygen supply, denying it the components it needs to grow,” said Fathallah-Shaykh. “Drugs that stop angiogenesis are believed to create a kind of killing field. This study identified four glioma cell lines that dramatically increased their motility when subjected to a low-oxygen environment – in effect escaping the killing field to create a new colony elsewhere in the brain.”
Fathallah-Shaykh and his team then identified two proteins that form a pathway linking low oxygen, or hypoxia, to increased motility.
“We identified a signaling protein that is activated by hypoxia called Src,” said Fathallah-Shaykh. “We also identified a downstream protein called neural Wiskott-Aldrich syndrome protein (N-WASP), which is regulated by Src in the cell lines with increased motility.”
The researchers then used protein inhibitors to shut off Src and N-WASP. When either protein was inhibited, low oxygen lost its ability to augment cell movement.
“These findings indicate that Src, N-WASP and the linkage between them – which is something we don’t fully understand yet – are key targets for drugs that would interfere with the ability of a cell to move,” said Fathallah-Shaykh. “If we can stop them from moving, then techniques such as anti-angiogenesis should be much more effective. Anti-motility drugs could be a key component in treating gliomas in the years to come.”
Researchers at UT Southwestern Medical Center have identified a cellular switch that potentially can be turned off and on to slow down, and eventually inhibit, the growth of the most commonly diagnosed and aggressive malignant brain tumor.

Findings of their investigation show that the protein RIP1 acts as a mediator of brain tumor cell survival, either protecting or destroying cells. Researchers believe that the protein, found in most glioblastomas, can be targeted to develop a drug treatment for these highly malignant brain tumors. The study was published online Aug. 22 in Cell Reports.
"Our study identifies a new mechanism involving RIP1 that regulates cell division and death in glioblastomas," said senior author Dr. Amyn Habib, associate professor of neurology and neurotherapeutics at UT Southwestern, and staff neurologist at VA North Texas Health Care System. "For individuals with glioblastomas, this finding identified a target for the development of a drug treatment option that currently does not exist."
In the study, researchers used animal models to examine the interactions of the cell receptor EGFRvIII and RIP1. Both activate NFκB, a family of proteins that is important to the growth of cancerous tumor cells. When RIP1 is switched off in the experimental model, NFκB and the signaling that promotes tumor growth are also inhibited. Furthermore, the findings show that RIP1 can be activated to divert cancer cells into a death mode so that they self-destruct.
According to the American Cancer Society, about 30 percent of brain tumors are gliomas, a fast-growing, treatment-resistant type of tumor that includes glioblastomas, astrocytomas, oligodendrogliomas, and ependymomas. In many cases, survival is tied to novel clinical trial treatments and research that will lead to drug development.
Research on synapse stabilization could aid understanding of autism, schizophrenia, intellectual disability

When we’re born, our brains aren’t very organized. Every brain cell talks to lots of other nearby cells, sending and receiving signals across connections called synapses.
But as we grow and learn, things get a bit more stable. The brain pathways that will serve us our whole lives start to organize, and less-active, inefficient synapses shut down.
But why and how does this happen? And what happens when it doesn’t go normally? New research from the University of Michigan Medical School may help explain.
In a new paper in Nature Neuroscience, a team of U-M neuroscientists reports important findings about how brain cells called neurons keep their most active connections with other cells, while letting other synapses lapse.
Specifically, they show that SIRP-alpha, a protein found on the surface of various cells throughout the body, appears to play a key role in the process of cementing the most active synaptic connections between brain cells. The research, done in mouse brains, was funded by the National Institutes of Health and several foundations.
The findings boost understanding of basic brain development – and may aid research on conditions like autism, schizophrenia, epilepsy and intellectual disability, all of which have some basis in abnormal synapse function.
“For the brain to be really functional, we need to keep the most active and most efficient connections,” says senior author Hisashi Umemori, M.D., Ph.D., a research assistant professor at U-M’s Molecular and Behavioral Neuroscience Institute and assistant professor of biological chemistry in the Medical School. “So, during development it’s crucial to establish efficient connections, and to eliminate inactive ones. We have identified a key molecular mechanism that the brain uses to stabilize and mature the most active connections.”
Umemori says the new findings on SIRP-alpha grew directly out of previous work on competition between neurons, which enables the most active ones to become part of pathways and circuits.
The team suspected that there must be some sort of signal between the two cells on either side of each synapse — something that causes the most active synapses to stabilize. So they set out to find out what it was.
SIRP-rise findings
The group had previously shown that SIRP-alpha was involved in some way in a neuron’s ability to form a presynaptic nerve terminal – an extension of the cell that reaches out toward a neighboring cell, and can send the chemical signals that brain cells use to talk to one another.
SIRP-alpha is also already known to serve an important function in the rest of the body – essentially, helping normal cells tell the immune system not to attack them. It may also help cancer cells evade detection by the immune system’s watchdogs.
In the new study, the team studied SIRP-alpha function in the brain – and started to understand its role in synapse stabilization. They focused on the hippocampus, a region of the brain very important to learning and memory.
Through a range of experiments, they showed that when a brain cell receives signals from a neighboring cell across a synapse, it actually releases SIRP-alpha into the space between the cells. It does this through the action of molecules inside the cell – called CaMK and MMP – that act like molecular scissors, cutting a SIRP-alpha protein in half so that it can float freely away from the cell.
The part of the SIRP-alpha protein that floats into the synapse “gap” latches on to a receptor on the other side, called a CD47 receptor. This binding, in turn, appears to tell the cell that the signal it sent earlier was indeed received – and that the synapse is a good one. So, the cell brings more chemical signaling molecules down that way, and releases them into the synapse.
As more and more nerve messages travel between the “sending” and “receiving” cells on either side of that synapse, more SIRP-alpha gets cleaved, released into the synapse, and bound to CD47.
The researchers believe this repeated process is what helps the cells determine which synapses to keep – and which to let wither.
Umemori says the team next wants to look at what happens when SIRP-alpha doesn’t get cleaved as it should – and at what’s happening in cells when a synapse gets eliminated.
“This step of shedding SIRP-alpha must be critical to developing a functional neural network,” he says. “And if it’s not done well, disease or disorders may result. Perhaps we can use this knowledge to treat diseases caused by defects in synapse formation.”
He notes that the gene for the CD47 receptor is found in the same general area of our DNA as several genes that are suspected to be involved in schizophrenia.
If the development of our nervous system is disturbed, we risk developing serious neurological diseases, impairing our sensory systems, movement control or cognitive functions. This is true for all organisms with a well-developed nervous system, from man to worm. New research from BRIC, University of Copenhagen reveals how a tiny molecule called mir-79 regulates neural development in roundworms. The molecule is required for correct migration of specific nerve cells during development and malfunction causes defects in the nervous system of the worm. The research has just been published in the journal Science.
Hundreds of worms lie in a small plastic plate under the laboratory microscope. Over the last three years, the group of Associate Professor Roger Pocock has used the roundworm C. elegans to study the development of the nervous system. They have just made an important discovery.

“Our new results show that a small molecule called mir-79 is indispensable for development of the worm’s nervous system. mir-79 acts by equipping special signal molecules with a transmitter, which tells the nerve cells how they should migrate during development of the worm. If we remove mir-79 with gene technology, development of the worm nervous system goes awry”, says postdoc Mikael Egebjerg Pedersen, who is responsible for the experimental studies.
mir-79 adds just the right combination of sugar
The research shows that mir-79 acts by controlling the addition of certain groups of sugars to selected signaling molecules. In the world of cells, sugar molecules act as transmitters.

When the nerve cells come into contact with the sugar-transmitters, they are informed where to locate themselves during neural development. If the researchers remove mir-79, the migration of the nerve cells is misguided, causing neuronal defects in the worms.
“It has earlier been shown that signaling molecules guide nerve migration, but our research shows that mir-79 regulates nerve cell migration by controlling the correct balance of sugar-transmitters on signaling molecules. If mir-79 does not function, the worm nervous system is malformed. In the wild, such defects would be harmful for worm survival”, explains Roger Pocock who leads the research group behind the finding.
Worm studies reveal important clues for neuronal repair
A version of mir-79 called mir-9 is found in humans. Therefore, these results are important for understanding how our nervous system develops during fetal development. In addition, the results add to the understanding of how nerve cells may be stimulated to repair damage in our brain or spinal cord.
“Our nervous system is a tissue which is not easily repaired after damage. So, how certain molecular cues can stimulate nerve cells to migrate is an important brick in the puzzle. This will enable us to understand how nerve tissue can be regenerated after, for example, a stroke or an accident. If we can use such knowledge to mimic the signals, we may be able to stimulate nerve cells to migrate into a damaged area”, says Roger Pocock.
Worms are a fantastic model to study how the nervous system develops and how nerve cells form neuronal circuits. Most of the genes that control nervous system development in the worm are also found in humans. At the same time, the reduced complexity of the worm nervous system allows researchers to investigate central biological mechanisms. With new technologies they can mark single cells or molecules, and as worms are transparent, the researchers can track the marked molecules or cells live during worm development.
The next step for the researchers is to investigate whether the regulatory pathway they have uncovered also operates in cultures of human cells.
NIH-funded discovery began with asking how the brain learns to see
A class of proteins that controls visual system development in the young brain also appears to affect vulnerability to Alzheimer’s disease in the aging brain. The proteins, which are found in humans and mice, join a limited roster of molecules that scientists are studying in hopes of finding an effective drug to slow the disease process.

Image: PirB (red) is heavily concentrated on the surface of growing nerve cells. Courtesy of Dr. Carla Shatz, Stanford.
"People are just beginning to look at what these proteins do in the brain. While more research is needed, these proteins may be a brand new target for Alzheimer’s drugs," said Carla Shatz, Ph.D., the study’s lead investigator. Dr. Shatz is a professor of biology and neurobiology at Stanford University in California, and the director of Stanford’s interdisciplinary biosciences program, BioX.
She and her colleagues report that LilrB2 (pronounced “leer-bee-2”) in humans and PirB (“peer-bee”) in mice can physically partner with beta-amyloid, a protein fragment that accumulates in the brain during Alzheimer’s disease. This in turn triggers a harmful chain reaction in brain cells. In a mouse model of Alzheimer’s, depleting PirB in the brain prevented the chain reaction and reduced memory loss.
The research was funded in part by the National Eye Institute, the National Institute on Aging (NIA), and the National Institute of Neurological Disorders and Stroke (NINDS), all part of the National Institutes of Health. It is reported in the Sept. 20 issue of Science.
"These findings provide valuable insight into Alzheimer’s, a complex disorder involving the abnormal build-up of proteins, inflammation and a host of other cellular changes," said Neil Buckholtz, Ph.D., director of the neuroscience division at NIA. "Our understanding of the various proteins involved, and how these proteins interact with each other, may one day result in effective interventions that delay, treat or even prevent this dreaded disease."
Alzheimer’s disease is the most common cause of dementia in older adults, and affects as many as 5 million Americans. Large clumps—or plaques—of beta-amyloid and other proteins accumulate in the brain during Alzheimer’s, but many researchers believe the disease process starts long before the plaques appear. Even in the absence of plaques, beta-amyloid has been shown to cause damage to brain cells and the delicate connections between them.
Dr. Shatz’s discovery took a unique path. She is a renowned neuroscientist, but Alzheimer’s disease is not her focus area. For decades, she has studied plasticity—the brain’s capacity to learn and adapt—focusing mostly on the visual system.
"Dr. Shatz has always been a leader in the field of plasticity, and now she’s taken yet another innovative step—giving us new insights into the abnormal plasticity that occurs in Alzheimer’s disease," said Michael Steinmetz, Ph.D., a program director at NEI. "These findings rest squarely on basic research into the development of the visual system." NEI has funded Dr. Shatz for more than 35 years.
During development, the eyes compete to connect within a limited territory of the brain—a process known as ocular dominance plasticity. The competition takes place during a limited time in early life. If visual experience through one eye is impaired during that time—for example, by a congenital cataract (present from birth)—it can permanently lose territory to the other eye.
"Ocular dominance is a classic example of how a brain circuit can change with experience," Dr. Shatz said. "We’ve been trying to understand it at a molecular level for a long time."
Her search eventually led to PirB, a protein on the surface of nerve cells in the mouse brain. She discovered that mice without the gene for PirB have an increase in ocular dominance plasticity. In adulthood, when the visual parts of their brains should be mature, the connections there are still flexible. This established PirB as a “brake on plasticity” in the healthy brain, Dr. Shatz said.
It wasn’t long before she began to wonder if PirB might also put a brake on plasticity in Alzheimer’s disease. In the current study, she pursued that question with Taeho Kim, Ph.D., a postdoctoral fellow in her lab, and Christopher M. William, M.D., Ph.D., a neuropathology fellow at Massachusetts General Hospital in Boston. Bradley Hyman, M.D., Ph.D., a professor of neurology at Mass General, was a collaborator on the project.
First, the team repeated the genetic experiment that Dr. Shatz had done in normal mice—but this time, they deleted the PirB gene in the Alzheimer’s mice. By about nine months of age, these mice typically develop learning and memory problems. But that didn’t happen in the absence of PirB.
Next, the researchers began thinking about how PirB might fit into the Alzheimer’s disease process, and particularly how it might interact with beta-amyloid. Dr. Kim theorized that since PirB resides on the surface of nerve cells, it might act as a binding site—or receptor—for beta-amyloid. Indeed, he found that PirB binds tightly to beta-amyloid, especially to tiny clumps of it that are believed to ultimately grow into plaques.
Beta-amyloid is known to weaken synapses—the connections between nerve cells. The researchers found that PirB appears to be an accomplice in this process. Without PirB, synapses in the mouse brain were resistant to the effects of beta-amyloid. Other experiments showed that binding between PirB and beta-amyloid can trigger a cascade of harmful reactions that can lead to the breakdown of synapses.
Although PirB is a mouse protein, humans have a closely related protein called LilrB2. The researchers found that this protein also binds tightly to beta-amyloid. By examining brain tissue from people with Alzheimer’s disease, they also found evidence that LilrB2 may trigger the same harmful reactions that PirB can trigger in the mouse brain.
"These are novel results, and direct interaction between beta-amyloid and PirB-related proteins opens up welcome avenues for investigating new drug targets for Alzheimer’s disease," said Roderick Corriveau, Ph.D., a program director at NINDS.
Dr. Shatz said she hopes to interest other researchers in developing drugs to block PirB and LilrB2. Currently, no drugs treat the underlying causes of Alzheimer’s disease. Most of the interventions that have reached clinical testing are designed to clear away beta-amyloid. To date, only two other beta-amyloid receptors (PrP-C and EphB2) have been found and are being pursued as drug targets.
Clock’s rhythm ensures steady energy supply to cells during times of fasting
Each of our cells has an energy furnace, and it is called a mitochondrion. A Northwestern University-led research team now has identified a new mode of timekeeping that involves priming the cell’s furnace to properly use stored fuel when we are not eating.
The interdisciplinary team has identified the “match” and “flint” responsible for lighting this tiny furnace. And the match is only available when the circadian clock says so, underscoring the importance of the biological timing system to metabolism.
“Circadian clocks are with us on Earth because they have everything to do with energy,” said Joe Bass, M.D., who led the research. “If an organism burns its energy efficiently, it has a better chance of survival. Our results tell us how the circadian clock triggers the cell’s energy-burning process. Cells are most capable of using fuel when the clock is working properly.”
Bass is the Charles F. Kettering Professor and chief of the division of endocrinology, metabolism and molecular medicine at Northwestern University Feinberg School of Medicine and an endocrinologist at Northwestern Memorial Hospital.
Mitochondria regulate the supply of energy to cells when we are at rest, with no glucose available from food. In a study of mice, the researchers found that the circadian clock supplies the match to light the furnace and on the match tip is a critical compound called NAD+. It combines with an enzyme in mitochondria called Sirtuin 3, which acts as the flint, to light the furnace. When the clock in an animal isn’t working, the animal can’t metabolize stored energy and the process doesn’t ignite.
This pathway through which the body clock controls activities within the mitochondria shows how energy generation is tied tightly to the light-dark/activity-rest cycle each day.
The findings, which could be useful in the development of therapies to treat metabolic disorders related to circadian disruption, are published today (Sept. 19) by the journal Science.
The results demonstrate that the circadian clock – a genetic timekeeper that evolved early in the history of life to enable organisms to track the daily transition from light to darkness – generates oscillations in mitochondrial energy capacity through rhythmic regulation of NAD+ biosynthesis.
The clock facilitates oxidative rhythms that anticipate an animal’s fasting/feeding cycle that occurs during the transition from light to darkness and wakefulness to sleep each day, and, in so doing, prevents the cell from “starving” during the night.
To understand how mitochondria are affected by circadian clock disorder, the researchers genetically removed the clocks in laboratory mice and compared them to controls. Both groups of mice were studied in a state of fasting; this “stress” test enabled the researchers to pinpoint just how the clock maintains “energy reserves” (akin to stress testing of a bank).
Bass and his research group worked together with Navdeep S. Chandel, a colleague of Bass’ at Feinberg, and John M. Denu, at the University of Wisconsin-Madison. They found the mice lacking clocks had defects in their mitochondria: the mitochondria could not metabolize stored energy and had no reserve to prevent depletion of the main currency, ATP. (Adenosine triphosphate is an energy-bearing molecule found in all living cells.)
Working with Northwestern colleague Milan Mrksich, they went on to show that removal of the clock depletes the necessary ingredient to turn on an enzyme within mitochondria, Sirtuin 3, which activates energy burning during fasting.
The researchers also showed that when disruption of the circadian clock resulted in a lack of NAD+, NAD+ supplements could restore function to the mitochondria.
The findings expand the understanding of the molecular pathways linking the circadian clock with metabolism and show that the clock provides an essential buffer to stabilize the cell as organisms transition between eating and fasting each day. This knowledge has implications for disease intervention and prevention, including of diabetes, and potentially for states of increased cell demand for metabolism (including inflammation and cancer).
“We have established the chain of events that couples the clock’s control switch with the machinery of the mitochondria,” said Bass, who also is a member of the department of neurobiology at the Weinberg College of Arts and Sciences. “We now have identified an additional link in the supply chain that provides energy to the cell at different phases of our daily sleep-wake cycle. These findings establish a key role for the NAD+ biosynthetic cycle in this process.”
Major senior authors from Northwestern include Chandel, a professor in medicine-pulmonary and cell and molecular biology at Feinberg, and Mrksich, the Henry Wade Rogers Professor of Biomedical Engineering, Chemistry and Cell and Molecular Biology at Feinberg, Weinberg and the McCormick School of Engineering and Applied Science. Chandel and Mrksich are members of the Robert H. Lurie Comprehensive Cancer Center of Northwestern University.
The co-first authors are Clara Bien Peek, a postdoctoral fellow, and Alison H. Affinati, an M.D./Ph.D. candidate, both working in Bass’ lab. They have literally worked around the clock on the research, which builds on the earlier work of co-author Kathryn Moynihan Ramsey. In 2009, she and colleagues reported in Science that the compound NAD, together with the enzyme SIRT1, functions as a molecular “switch” to coordinate the internal clock with metabolic systems.
The current research team combined Northwestern expertise in basic circadian clock research, chemistry and physiology with outside collaborators who were able to verify the Northwestern findings.
Co-author Eric Goetzman, from the University of Pittsburgh School of Medicine, an expert in the rare children’s disease called metabolic myopathy, was able to confirm that the pattern the researchers observed in mice was the same as that seen in these children. Fasting can be life-threatening for children with this disorder because they can’t metabolize stored energy due to defects in their mitochondria.
Analyses by co-author Christopher B. Newgard at Duke University Medical Center identified a signature profile of the metabolic myopathy in mice with altered circadian clock genes.
Research opens up longer therapy window for children with neurodevelopmental disorders
The development of fine motor control – the ability to use your fingertips to manipulate objects – takes longer than previously believed, and isn’t entirely the result of brain development, according to a pair of complementary studies.
The research opens up the potential to use therapy to continue improving the motor control skills of children suffering from neurodevelopmental disorders such as cerebral palsy, a blanket term for central motor disorders that affect about 764,000 children and adults nationwide.
“These findings show that it’s not only possible, but critical to continue or begin physical therapy in adolescence,” said Francisco Valero-Cuevas, corresponding author of two studies on the matter – one appearing in the Journal of Neurophysiology and the other in the Journal of Neuroscience.
“We find we likely do not have a narrow window of opportunity in early childhood to improve manipulation skills, as previously believed, but rather developmental plasticity lasts much longer and provides opportunity throughout adolescence,” he said. “This complements similarly exciting findings showing brain plasticity in adulthood and old age.”
Researchers had previously been able to detect improvements in fine motor control of the hand only until around ages 8-10. However, Valero-Cuevas – a professor of biomedical engineering and of biokinesiology and physical therapy – invented a tool that allows for more precise measurement of fine motor control.
The tool, which Valero-Cuevas has patented, is simple: springs of varying stiffness and length set between plastic pads. Motor skill is then determined by the individual’s ability to compress the increasingly awkward spring devices. Sudarshan Dayanidhi, during his PhD studies at USC with Valero-Cuevas, developed and applied clinically useful versions of this technology with great success.
With this new tool, and in collaboration with Åsa Hedberg and Hans Forssberg of the Astrid Lindgren Children’s Hospital in Stockholm, they tested 130 typically developing children between 4 and 16 years of age, and demonstrated that even the 16-year-olds were continuing to hone their fine motor skills. Their findings will appear in the Journal of Neurophysiology on Oct. 1.
To further this study, Dayanidhi and Valero-Cuevas joined forces with Assistant Professor of biokinesiology and physical therapy Jason Kutch (also of USC), to explore if this longer developmental timeline for dexterity was tied not just to brain maturation, but also to muscular development.
It has long been thought that improved dexterity involved only brain development, and that muscle growth – muscles simply getting bigger and stronger – did not add to dexterous skills, since these are performed at low forces. The research by Dayanidhi, Kutch and Valero-Cuevas indicates otherwise.
“Combining our metrics of dexterity from Dayanidhi’s PhD work with novel and noninvasive measures of muscle contraction time developed by Prof. Kutch, we were able to show a previously unknown strong association between gains in dexterity and improvements in low-force muscle contraction time,” Valero-Cuevas said.
This second facet of the research showing how both dexterity and muscle function improve in children will appear in the Journal of Neuroscience on Sept. 18.
University of Adelaide researchers have identified a likely molecular pathway that causes a group of untreatable neurodegenerative diseases, including Huntington’s disease and Lou Gehrig’s disease.
The group of about 20 diseases, which show overlapping symptoms that typically include nerve cell death, share a similar genetic mutation mechanism – but how this form of mutation causes these diseases has remained a mystery.
"Despite the genes for some of these diseases having been identified 20 years ago, we still haven’t understood the underlying mechanisms that lead to people developing clinical symptoms," says Professor Robert Richards, Head of Genetics in the University’s School of Molecular and Biomedical Sciences.
"By uncovering the molecular pathway for these diseases, we now expect to be able to define targets for intervention and so come up with potential therapies. Ultimately this will help sufferers to reduce the amount of nerve cell degeneration or slow its progression."
In an article published in Frontiers in Molecular Neuroscience, Professor Richards and colleagues describe their innovative theory and new evidence for the key role of RNA in the development of the diseases. RNA is a large molecule in the cell that copies genetic code from the cell’s DNA and translates it into the proteins that drive biological functions.
People with these diseases all have expanded numbers of copies of particular sequences of the ‘nucleotide bases’ which make up DNA.
"In most cases people with these diseases have increased numbers of repeat sequences in their RNA," says Professor Richards. "The disease develops when people have too many copies of the repeat sequence. Above a certain threshold, the more copies they have the earlier the disease develops and the more severe the symptoms. The current gap in knowledge is why having these expanded repeat sequences of genes in the RNA translates into actual symptoms."
Professor Richards says evidence points towards a dysfunctional RNA and a pivotal role of the body’s immune system in the development of the disease.
"Rather than recognising the ‘expanded repeat RNA’ as its own RNA, we believe the ‘expanded repeat RNA’ is being seen as foreign, like the RNA in a virus, and this activates the innate immune system, resulting in loss of function and ultimately the death of the cell," he says.
The University of Adelaide laboratory modelled and defined the expanded repeat RNA disease pathway using flies (Drosophila). Other laboratories have reported tell-tale, but previously inexplicable, signs characteristic of this pathway in studies of patients with Huntington’s disease and Myotonic Dystrophy.
"This new understanding, once proven in each of the relevant human diseases, opens the way for potential treatments, and should give cause for hope to those with these devastating diseases," Professor Richards says.
A new technique that allows scientists to measure the electrical activity in the communication junctions of the nervous system has been developed by a researcher at Queen Mary University of London.
The junctions in the central nervous system that enable information to flow between neurons, known as synapses, are around 100 times smaller than the width of a human hair (one micrometre or less) and as such are difficult to target, let alone measure.
By applying high-resolution scanning probe microscopy, which allows three-dimensional visualisation of the structures, the team were able to measure and record the flow of current in small synaptic terminals for the first time.
“We replaced the conventional low-resolution optical system with a high-resolution microscope based on a nanopipette,” said Dr Pavel Novak, a bioengineering specialist from Queen Mary’s School of Engineering and Materials Science.
“The nanopipette hovers above the surface of the sample and scans the structure to reveal its three-dimensional topography. The same nanopipette then attaches to the surface at selected locations on the structure to record electrical activity. By repeating the same procedure for different locations of the neuronal network we can obtain a three-dimensional map of its electrical properties and activity.”
The research, published (Wednesday 18 September) in Neuron, opens a new window into the neuronal activity at nanometre scale, and may contribute to the wider effort of understanding the function of the brain represented by the Brain Activity Map Project (BRAIN initiative), which aims to map the function of each individual neuron in the human brain.
A team from the University of Rochester Medical Center has shown scientifically what many women report anecdotally: that the breast cancer drug tamoxifen is toxic to cells of the brain and central nervous system, producing mental fogginess similar to “chemo brain.”
However, in the Journal of Neuroscience, researchers also report they’ve discovered an existing drug compound that appears to counteract or rescue brain cells from the adverse effects of the breast cancer drug.
Corresponding author Mark Noble, Ph.D., professor of Biomedical Genetics and director of the UR Stem Cell and Regenerative Medicine Institute, said it’s exciting to potentially be able to prevent a toxic reaction to one of the oldest and most widely used breast cancer medications on the market. Although tamoxifen is more easily tolerated compared to most cancer treatments, it nonetheless produces troubling side effects in a subset of the large number of people who take it.
By studying tamoxifen’s impact on central nervous system cell populations and then screening a library of 1,040 compounds already in clinical use or clinical trials, his team identified a substance known as AZD6244, and showed that it essentially eliminated tamoxifen-induced killing of brain cells in mice.
“As far as I know, no one else has discovered an agent that singles out and protects brain and central nervous system cells while also not protecting cancer cells,” said Noble, who also collaborates with researchers at the UR’s James P. Wilmot Cancer Center. “This creates a whole new paradigm; it’s where we need to go.”
The research is the result of two separate but related projects from Noble’s lab. One investigates the science underlying a condition known as “chemo brain,” and another is looking at how to exploit tamoxifen’s attributes for use in other types of cancer besides early-stage, less-aggressive breast cancer. (The drug is a type of hormonal therapy, which works by stopping the growth of estrogen-sensitive tumors.)
In the Journal of Neuroscience paper, Noble’s team first identified central nervous system (CNS) cells that are most vulnerable to tamoxifen toxicity. Chief among these were oligodendrocyte-type 2 astrocyte progenitor cells (O-2A/OPCs), cells that are essential for making the insulating sheaths (called myelin) required for nerve cells to work properly. Exposure to clinically relevant levels of tamoxifen for 48 hours killed more than 75 percent of these cells.
In earlier work, while studying the biology of the cognitive difficulties that linger in some people being treated for cancer, Noble and colleagues discovered that 5-fluorouracil, cisplatin, cytarabine, carmustine and multiple other types of chemotherapy damage populations of stem cells in the CNS. Published in the Journal of Biology (1, 2) in 2006 and 2008, these studies pioneered analysis of the biological foundations of chemo brain.
“It’s critical to find safe treatments that can rescue the brain from impairment,” Noble said, “because despite increasing awareness and research in this area, some people continue to endure short-term memory loss, mental cloudiness, and trouble concentrating. For some patients the effects wear off over time, but others experience symptoms that can lead to job loss, depression, and other debilitating events.”
Noble’s lab, led by postdoctoral fellow Hsing-Yu Chen, Ph.D., identified 27 drugs that protected O-2A/OPCs from the effects of tamoxifen. Further testing singled out AZD6244, a compound developed by other laboratories as a potential cancer therapy.
In mice co-treated with tamoxifen plus AZD6244, cell death in the corpus callosum, the largest white matter (myelinated) structure in the brain, was prevented, the paper reported. Meanwhile, several national clinical trials are testing the safety and effectiveness of AZD6244 in treating multiple cancers, from breast and colon to melanoma and lung.
Researchers were also optimistic about finding that while AZD6244 protected brain cells, it did not also protect cancer cells. New drug compounds have greater value if they do not compromise the effects of existing treatments, and in this case, Noble said, the experiments in his laboratory agreed with studies by other research groups, who found that the combined use of AZD6244 and chemotherapy enhances targeting of cancer cells.
In future work, Noble’s group plans to identify the dosage of AZD6244 that provides maximum protection and minimum disruption to differentiating brain cells. Their research was supported by the U.S. Department of Defense, National Institutes of Health, Susan Komen Race for the Cure, and the Carlson Stem Cell Fund.
This is the second tamoxifen-related study to come from Noble’s lab in 2013. In April they showed in pre-clinical research they could leverage the drug’s various cellular activities so that it might work on more aggressive triple-negative breast cancer. In the journal EMBO Molecular Medicine, Noble and Chen also reported finding an experimental compound that enhances tamoxifen’s ability to work in this new way.
Gene expression analysis shows bird brain an even better model for research.
Explorers need good maps, which they often end up drawing themselves.
Pursuing their interests in using the brains of birds as a model for the human brain, an international team of researchers led by Duke neuroscientist Erich Jarvis and his collaborators Chun-Chun Chen and Kazuhiro Wada have just completed a mapping of the bird brain based on a 10-year exploration of the tiny cerebrums of eight species of birds.
In a special issue appearing online in the Journal of Comparative Neurology, two papers (1, 2) from the Jarvis group propose a dramatic redrawing of some boundaries and functional areas based on a computational analysis of the activity of 52 genes across 23 areas of the bird brain.
Jarvis, who is a professor of neurobiology at Duke, member of the Duke Institute for Brain Sciences, and a Howard Hughes Medical Institute investigator, said the most important takeaway from the new map is that the brains of all vertebrates, a group that includes birds as well as humans, have some important similarities that can be useful to research.
Most significantly, the new map argues for and supports the existence of columnar organization in the bird brain. “Columnar organization is a rule, rather than an exception found only in mammals,” Jarvis said. “One way I visualize this view is that the avian brain is one big, giant gyrus folding around a ventricle space, functioning like what you’d find in the mammalian brain,” he said.
To create different patterns of gene expression for the analysis, the birds were exposed to various environmental factors such as darkness or light, silence or bird song, hopping on a treadmill, and in the case of migratory warblers, a magnetic field that stimulated their navigational circuits.
The new map follows up on a 2004 model proposed by the Avian Brain Nomenclature Consortium, also led by Jarvis and colleagues, which officially overturned the prevailing century-old view that the avian brain contained mostly primitive regions. They argued instead that the avian brain has a cortical-like area and other forebrain regions similar to mammals, but organized differently.
"The change in terminology is small this time, but the change in concept is big," Jarvis said. For this special issue, the of Journal of Comparative Neurology commissioned a commentary by Juan Montiel and Zoltan Molnar, experts in brain evolution, to summarize the large amount of data presented in the studies by the Jarvis group.
One of the major findings is that two populations of cells on either side of a void called the ventricle are actually the same cell types with similar patterns of gene expression. Earlier investigators had thought of the ventricle as a physical barrier separating cell types, but in development studies led by Jarvis’ postdoctoral fellow Chun-Chun Chen, the Duke researchers showed how dividing cells spread in a sheet and flow around the ventricle as they multiply.
The new map simplifies the bird cortex, called the pallium, from seven populations of cells down to four major populations. Humans have five populations of cells in six layers.
Part of this refinement is simply that the tools are getting better, says Harvey Karten, a professor of neurosciences at the University of California-San Diego who proposed a dramatic re-thinking of bird cortical organization in the late 1960s. The best tools in that era were microscopes, specific cell stains and electrophysiology. Karten and colleagues are authors of a fourth paper in the special issue which announces a database of gene expression profiles of the avian brain containing some of the data that the Jarvis group used.
Jarvis said having a more specific map is necessary for properly sampling cell populations for gene expression analysis to do even more functional analysis of how the brain operates. As a next step, his team is considering doing an even more detailed bird map with “several hundred” genes rather than the 52 used to make this map.
Jarvis and colleagues are working now on a similar mapping of the crocodile brain, with the ultimate goal of being able to say something about how dinosaur brains were organized, since birds are descended from dinosaurs and crocodiles are their closest living relatives. At a Society for Neuroscience conference in November, they’ll be presenting some early findings from that project.
Though the specifics of this newest map may only be of interest within the bird research community, Jarvis said, it builds the awareness that birds can be a useful model for many questions about the human brain.
"Where does the mammalian brain come from?" Karten asks. "And what’s the origin of these structures at the cellular and molecular level?" Some neuroscientists have argued that the mammalian cortex — the one we have — is something apart from the brains of other vertebrates. Jarvis and Karten now think vertebrate brains have more commonalities than differences.
That awareness is making birds an ever more useful model for questions about the human brain. “There are very few animal models where you can learn — at the molecular level — what’s going on in vocal learning,” Karten said. Birds are also being used as models for research on Parkinson’s, Huntington’s, deafness and other degenerative conditions in humans.
UC Davis MIND Institute research finds rigorous evaluations are needed to accurately diagnose autism in children with 22q11.2 deletion syndrome
Children with a genetic disorder called 22q11.2 deletion syndrome, who frequently are believed to also have autism, often may be misidentified because the social impairments associated with their developmental delay may mimic the features of autism, a study by researchers with the UC Davis MIND Institute suggests.
The study is the first to examine autism in children with chromosome 22q11.2 deletion syndrome, in whom the prevalence of autism has been reported at between 20 and 50 percent, using rigorous gold-standard diagnostic criteria. The research found that none of the children with 22q11.2 deletion syndrome “met strict diagnostic criteria” for autism.
The researchers said the finding is important because treatments designed for children with autism, such as widely used discrete-trial training methods, may exacerbate the anxiety that is commonplace among this population.
Rather, evaluations should be performed to assess autism and guide the selection of appropriate therapies based on the children’s symptoms, such as language and communication delay, the researchers said. The study, “Social impairments in Chromosome 22q11.2 Deletion Syndrome (22q11.2DS): Autism Spectrum Disorder or a different Endophenotype?” is published online today in Springer’s Journal of Autism and Developmental Disorders.
A high prevalence of autism spectrum disorder has been reported in children with 22q11.2 deletion syndrome – as high as 50 percent based on parent-report measures. Children diagnosed with 22q11.2 deletion syndrome – or 22q – may experience mild to severe cardiac anomalies, weakened immune systems and malformations of the head and neck and the roof of the mouth, or palate. They also experience developmental delay, with IQs in the borderline-to-low-average range. They characteristically experience significant anxiety and appear socially awkward.
“The results show that no child involved in our study actually met strict diagnostic criteria for an autism spectrum disorder,” said Kathleen Angkustsiri, study lead author and assistant professor of developmental-behavioral pediatrics at the MIND Institute.
“This is very important because the literature reports that anywhere from 20 to 50 percent of children with the disorder also have an autism spectrum disorder. Our findings lead us to question whether this is the correct label for these children, who clearly have social impairments. We need to find out what interventions are most appropriate for their difficulties.”
The disorder’s name describes both its location on the 22nd chromosome and the nature of the genetic mutation, which is associated with a variety of anatomical and intellectual deficits. It has previously been known as Velocardiofacial Syndrome and DiGeorge Syndrome, named for the pediatric endocrinologist who described it in the 1960s.
The risk of 22q is about 1 in 2000 in the general population. The condition is seen in individuals of all backgrounds. Notably, people with 22q are at significantly heightened risk of developing mental-health disorders in adolescence and young adulthood. A person with 22q has a 30 times greater risk of developing schizophrenia than individuals in the general population.
“Because of the high rates of psychiatric disorders in childhood and adulthood, 22q is a very special population for prospective study looking at what’s happening throughout childhood that might either increase risk or provide protection against some of the later developing serious psychiatric illnesses, such as schizophrenia, that are associated with the disorder,” said Tony J. Simon, professor of psychiatry and behavioral sciences and director of the chromosome 22q11.2 deletion program at the MIND Institute.
The study was conducted among individuals recruited through the website of the Cognitive Analysis and Brain Imaging Laboratory (CABIL), which Simon directs. Simon and Angkustsiri said that the parents of children with 22q deletion syndrome often had commented that their children “seemed different” from other children with autism diagnoses, but that they hadn’t discovered a better diagnosis.
The clinical impression of the MIND Institute’s 22q deletion syndrome team, which includes psychologists Ingrid Leckliter and Janice Enriquez, was that the children were experiencing significant social impairments, but their presentation diverged from that of children with autism. To determine whether the children met the criteria for classic autism, they decided to test a subset of the children recruited from participants in a larger study of neurocognitive functioning, based on stringent methods and using multiple testing instruments.
The researchers selected 29 children – 16 boys and 13 girls – for additional scrutiny, administering two tests. The Autism Diagnostic Observation Schedule (ADOS), a gold-standard assessment for autism, was administered to the children. The Social Communication Questionnaire (SCQ), a 40-question parent screening tool for communication and social functioning based on the gold-standard Autism Diagnostic Interview-Revised, was administered to their parents.
Typically, a diagnosis of autism spectrum disorder requires elevated scores on both a parent report measure, such as the SCQ, and a directly administered assessment such as the ADOS. Prior studies of autism in chromosome 22q11.2 deletion syndrome have only used parent report measures.
Only five of the 29 children had scores in the elevated range on the ADOS diagnostic tool. Four of the five had significant anxiety. Only two – 7 percent – had SCQ scores above the cutoff. No child had both SCQ and ADOS scores in the relevant ranges that would lead to an ASD diagnosis.
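The diagnostic logic described above, requiring elevated scores on both a parent-report measure and a directly administered assessment, amounts to a simple conjunction. The sketch below illustrates that rule only; the cutoff values are illustrative assumptions, not the study's actual instrument thresholds.

```python
# Illustrative cutoffs only; real ADOS/SCQ thresholds are
# instrument- and module-specific and are not given in this article.
SCQ_CUTOFF = 15    # assumed parent-report threshold
ADOS_CUTOFF = 7    # assumed direct-assessment threshold

def meets_asd_criteria(scq_score, ados_score):
    """An ASD classification requires elevated scores on BOTH measures."""
    return scq_score >= SCQ_CUTOFF and ados_score >= ADOS_CUTOFF

# A child elevated on only one measure does not meet criteria, which is
# how five elevated ADOS scores can still yield zero ASD classifications.
print(meets_asd_criteria(20, 3))   # elevated parent report only -> False
```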
“Over the years, a number of children came to us as part of the research or the clinical assessments that we perform, and their parents told us that they had an autism spectrum diagnosis. It’s quite clear that children with the disorder do have social impairments,” Simon said. “But it did seem to us that they did not have a classic case of autism spectrum disorder. They often have very high levels of social motivation. They get a lot of pleasure from social interaction, and they’re quite socially skilled.”
Simon said that the team also noted that the children’s social deficits might be more a function of their developmental delay and intellectual disability than autism.
“If you put them with their younger siblings’ friends they function very well in a social setting,” Simon continued, “and they interact well with an adult who accommodates their expectations for social interaction.”
Angkustsiri said that further study is needed to assess more appropriate treatments for children with 22q, such as improving their communication skills, treating their anxiety, helping them to remain focused and on task.
“There are a variety of different avenues that might be pursued rather than treatments that are designed to treat children with autism,” Angkustsiri said. “There are readily available, evidence-based treatments that may be more appropriate to help maximize these children’s potential.”
Study suggests musical training could possibly sharpen language processing
People who are better able to move to a beat show more consistent brain responses to speech than those with less rhythm, according to a study published in the September 18 issue of The Journal of Neuroscience. The findings suggest that musical training could sharpen the brain’s response to language.
Scientists have long known that moving to a steady beat requires synchronization between the parts of the brain responsible for hearing and movement. In the current study, Professor Nina Kraus, PhD, and colleagues at Northwestern University examined the relationship between the ability to keep a beat and the brain’s response to sound.
More than 100 teenagers from the Chicago area participated in the Kraus Lab study, where they were instructed to listen and tap their finger along to a metronome. The teens’ tapping accuracy was computed based on how closely their taps aligned in time with the “tic-toc” of the metronome. In a second test, the researchers used a technique called electroencephalography (EEG) to record brainwaves from a major brain hub for sound processing as the teens listened to the synthesized speech sound “da” repeated periodically over a 30-minute period. The researchers then calculated how similarly the nerve cells in this region responded each time the “da” sound was repeated.
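The two measures described above, tapping accuracy and neural response consistency, can be sketched in simplified form. This is a minimal illustration under stated assumptions, not the study's actual analysis pipeline: the function names, the nearest-tap scoring rule, and the pairwise-correlation definition of consistency are choices made here for clarity.

```python
import numpy as np

def tapping_accuracy(tap_times, tick_times):
    """Mean absolute asynchrony (seconds) between each metronome tick
    and the nearest tap; lower values mean better beat-keeping."""
    taps = np.asarray(tap_times, dtype=float)
    errors = [np.min(np.abs(taps - tick)) for tick in tick_times]
    return float(np.mean(errors))

def response_consistency(trials):
    """Mean pairwise Pearson correlation across repeated neural
    responses (one row per trial); higher means the brain responded
    more similarly each time the same sound was presented."""
    r = np.corrcoef(np.asarray(trials, dtype=float))
    upper = r[np.triu_indices_from(r, k=1)]  # unique trial pairs only
    return float(np.mean(upper))

# Perfectly timed taps give zero asynchrony; identical trial
# responses give a consistency of 1.0.
print(tapping_accuracy([0.0, 1.0, 2.0], [0.0, 1.0, 2.0]))
print(response_consistency([[0, 1, 2, 1], [0, 1, 2, 1]]))
```

On real data, the tap and tick times would come from the metronome task, and each trial row would be an epoch of the EEG recording time-locked to one presentation of the “da” sound.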
“Across this population of adolescents, the more accurate they were at tapping along to the beat, the more consistent their brains’ response to the ‘da’ syllable was,” Kraus said. Because previous studies show a link between reading ability and beat-keeping ability as well as reading ability and the consistency of the brain’s response to sound, Kraus explained that these new findings show that hearing is a common basis for these associations.
“Rhythm is inherently a part of music and language,” Kraus said. “It may be that musical training, with an emphasis on rhythmic skills, exercises the auditory system, leading to strong sound-to-meaning associations that are so essential in learning to read.”
John Iversen, PhD, who studies how the brain processes music at the University of California, San Diego, and was not involved with this study, noted that the findings raise the possibility that musical training may have important impacts on the brain. “This study adds another piece to the puzzle in the emerging story suggesting that musical rhythmic abilities are correlated with improved performance in non-music areas, particularly language,” he said.
Kraus’ group is now working on a multi-year study to evaluate the effects of musical training on beat synchronization, response consistency, and reading skills in a group of children engaging in musical training.
Cognitive enhancers—drugs taken to enhance concentration, memory, alertness and moods—do not improve cognition or function in people with mild cognitive impairment in the long term, according to a new study by researchers at St. Michael’s Hospital.
In fact, patients on these medications experienced significantly more nausea, diarrhea, vomiting and headaches, according to the study published today in the Canadian Medical Association Journal.
“Our findings do not support the use of cognitive enhancers for mild cognitive impairment,” wrote Dr. Andrea Tricco and Dr. Sharon Straus, who are both scientists in the hospital’s Li Ka Shing Knowledge Institute. Dr. Straus is also a geriatrician at the hospital.
Mild cognitive impairment is a condition characterized by memory complaints without significant limitations in everyday activity. Estimates of how common it is vary widely, with between 3 and 42 per cent of people diagnosed each year and about 4.6 million people affected worldwide. Each year, about 3 to 17 per cent of people with mild cognitive impairment will develop dementia, such as Alzheimer’s disease. Given the aging population, the number of Canadians with dementia is estimated to double to more than 1 million in the next 25 years.
It has been hypothesized that cognitive enhancers may delay the onset of dementia. Families and patients are increasingly requesting these drugs even though their efficacy for patients with mild cognitive impairment has not been established. In Canada, cognitive enhancers can be obtained only with special authorization.
Drs. Tricco and Straus conducted a review of existing evidence to understand the efficacy and safety of cognitive enhancers. They looked at eight randomized trials that compared one of four cognitive enhancers (donepezil, rivastigmine, galantamine or memantine) to a placebo among patients diagnosed with mild cognitive impairment.
While they found short-term benefits to using these drugs on one cognition scale, there were no long-term effects after about a year and a half. No other benefits were observed on the second cognition scale or on function, behaviour, and mortality. As well, patients on these medications experienced significantly more nausea, diarrhea, vomiting and headaches. One study also found a higher risk of a heart condition known as bradycardia (slow heartbeat) among patients who received galantamine.
“Our results do not support the use of cognitive enhancers for patients with mild cognitive impairment,” the authors wrote. “These agents were not associated with any benefit and led to an increase in harms. Patients and their families should consider this information when requesting these medications. Similarly, health care decision-makers may not wish to approve the use of these medications for mild cognitive impairment, because these drugs might not be effective and are likely associated with harm.”
This study was funded by the Drug Safety and Effectiveness Network/Canadian Institutes of Health Research.
Another St. Michael’s study published in the CMAJ in April found no evidence that drugs, herbal products or vitamin supplements help prevent cognitive decline in healthy older adults. That review, led by Dr. Raza Naqvi, a University of Toronto resident, found some evidence that mental exercises, such as computerized memory training programs, might help.
A girl who does not feel physical pain has helped researchers identify a gene mutation that disrupts pain perception. The discovery may spur the development of new painkillers that will block pain signals in the same way.

People with congenital analgesia cannot feel physical pain and often injure themselves as a result – they might badly scald their skin, for example, through being unaware that they are touching something hot.
By comparing the gene sequence of a girl with the disorder against those of her parents, who do not have it, Ingo Kurth at Jena University Hospital in Germany and his colleagues identified a mutation in a gene called SCN11A.
This gene controls the development of channels on pain-sensing neurons. Sodium ions travel through these channels, creating electrical nerve impulses that are sent to the brain, which registers pain.
Blocked signals
Overactivity in the mutated version of SCN11A prevents the build-up of the charge that the neurons need to transmit an electrical impulse, numbing the body to pain. “The outcome is blocked transmission of pain signals,” says Kurth.
To confirm their findings, the team inserted a mutated version of SCN11A into mice and tested their ability to perceive pain. They found that 11 per cent of the mice with the modified gene developed injuries similar to those seen in people with congenital analgesia, such as bone fractures and skin wounds. They also tested a control group of mice with the normal SCN11A gene, none of which developed such injuries.
The altered mice also took 2.5 times longer on average than the control group to react to the “tail flick” pain test, which measures how long it takes for mice to flick their tails when exposed to a hot light beam. “What became clear from our experiments is that although there are similarities between mice and men with the mutation, the degree of pain insensitivity is more prominent in humans,” says Kurth.
The team has now begun the search for drugs that block the SCN11A channel. “It would require drugs that selectively block this but not other sodium channels, which is far from simple,” says Kurth.
Completely unexpected
"This is a cracking paper, and great science," says Geoffrey Woods of the University of Cambridge, whose team discovered in 2006 that mutations in another, closely related ion channel gene can cause insensitivity to pain. "It’s completely unexpected and not what people had been looking for," he says.
Woods says that there are three ion channels, called SCN9A, 10A and 11A, on pain-sensing neurons. People experience no pain when either of the first two doesn’t work, and agonising pain when they’re overactive. “With this new gene, it’s the opposite: when it’s overactive, they feel no pain. So maybe it’s some kind of gatekeeper that stops neurons from firing too often, but cancels pain signals completely when it’s overactive,” he says. “If you could get a drug that made SCN11A overactive, it should be a fantastic analgesic.”
"It’s fascinating that SCN11A appears to work the other way, and that could really advance our knowledge of the role of sodium channels in pain perception, which is a very hot topic,” says Jeffrey Mogil at McGill University in Canada, who was not involved in the new study.
Lefties and righties can thank same DNA that puts hearts on left side for hand dominance
Left- or right-handedness may be determined by the genes that position people’s internal organs.

About 10 percent of people prefer using their left hand. That ratio is found in every population in the world, and scientists have long suspected that genetics controls hand preference. But finding the genes has been no simple task, says Chris McManus, a neuropsychologist at University College London who studies handedness but was not involved in the new research.
“There’s no single gene for the direction of handedness. That’s clear,” McManus says. Dozens of genes are probably involved, he says, which means that one person’s left-handedness might be caused by a variant in one gene, while another lefty might carry variants in an entirely different gene.
To find handedness genes, William Brandler, a geneticist at the University of Oxford, and colleagues conducted a statistical sweep of DNA from 3,394 people. Statistical searches such as this are known as genome-wide association studies; scientists often do such studies to uncover genes that contribute to complex diseases or traits such as diabetes and height. The people in this study had taken tests involving moving pegs on a board. The difference in the amount of time they took with one hand versus the other reflected how strongly left- or right-handed they were.
A variant in a gene called PCSK6 was most tightly linked with strong hand preference, the researchers report in the Sept. 12 PLOS Genetics. The gene has been implicated in handedness before, including in a 2011 study by the same research group. PCSK6 is involved in the asymmetrical positioning of internal organs in organisms from snails to vertebrates.
Brandler, who happens to be a lefty, knew the gene wasn’t the only cause of hand preference, so he and his colleagues looked at other genetic variants that didn’t quite cross the threshold of statistical significance. Many of the genes the team uncovered had previously been shown in studies of mice to be necessary for correctly placing organs such as the heart and liver. Four of the genes, when disrupted in mice, can cause cilia-related diseases. Cilia are hairlike appendages on cells that act a bit like GPS units and direct many aspects of development in a wide range of species, including humans.
One of the cilia genes, GLI3, also helps build the corpus callosum, a bundle of nerves that connects the two hemispheres of the brain. Some studies have suggested that the structure is bigger in left-handers.
It’s still a mystery how these genes direct handedness, says Larissa Arning, a human geneticist at Ruhr University Bochum in Germany. In addition to genes that direct body plans, she says, the study suggests that many more yet-to-be-discovered genes probably play a role in handedness.
Brandler hopes the study will also help remove some of the stigma of being left-handed. Left-handedness isn’t a character flaw or a sign of being sinister, he says: “It’s an outcome of genetic variation.”
An Oxford University study has shown that a representative sample of UK schoolchildren aged seven to nine years had low levels of key Omega-3 fatty acids in their blood. Furthermore, the study found that children’s blood levels of the long-chain Omega-3 DHA (the form found in most abundance in the brain) ‘significantly predicted’ how well they were able to concentrate and learn. Oxford University researchers explained the findings, recently published in the journal PLOS ONE, at a conference in London on 4 September.

The study was presented at the conference by co-authors Dr Alex Richardson and Professor Paul Montgomery from Oxford University’s Centre for Evidence-Based Intervention in the Department of Social Policy and Intervention. It is one of the first to evaluate blood Omega-3 levels in UK schoolchildren. The long-chain Omega-3 fats (EPA and DHA), found in fish, seafood and some algae, are essential for the brain’s structure and function as well as for maintaining a healthy heart and immune system. Parents also reported on their child’s diet, revealing to the researchers that almost nine out of ten children in the sample ate fish less than twice a week, and nearly one in ten never ate fish at all. The government’s guidelines for a healthy diet recommend at least two portions of fish a week. This is because, like vitamins, Omega-3 fats have to come from our diets – and although humans can in theory make some EPA and DHA from shorter-chain Omega-3 (found in some vegetable oils), research has shown this conversion is not reliable, particularly for DHA, say the researchers.
Blood samples were taken from 493 schoolchildren, aged between seven and nine years, from 74 mainstream schools in Oxfordshire. All of the children were thought to have below-average reading skills, based on national assessments at the age of seven or their teachers’ current judgements. Analyses of their blood samples showed that, on average, just under two per cent of the children’s total blood fatty acids were Omega-3 DHA (docosahexaenoic acid) and 0.5 per cent were Omega-3 EPA (eicosapentaenoic acid), a total of 2.45 per cent for these long-chain Omega-3 combined. This is below the minimum of 4 per cent recommended by leading scientists to maintain cardiovascular health in adults, with 8-12 per cent regarded as optimal for a healthy heart, the researchers reported.
Co-author Professor Paul Montgomery said: ‘From a sample of nearly 500 schoolchildren, we found that levels of Omega-3 fatty acids in the blood significantly predicted a child’s behaviour and ability to learn. Higher levels of Omega-3 in the blood, and DHA in particular, were associated with better reading and memory, as well as with fewer behaviour problems as rated by parents and teachers. These results are particularly noteworthy given that we had a restricted range of scores, especially with respect to blood DHA but also for reading ability, as around two-thirds of these children were still reading below their age-level when we assessed them. Although further research is needed, we think it is likely that these findings could be applied generally to schoolchildren throughout the UK.’
Co-author Dr Alex Richardson added: ‘The longer term health implications of such low blood Omega-3 levels in children obviously can’t be known. But this study suggests that many, if not most UK children, probably aren’t getting enough of the long-chain Omega-3 we all need for a healthy brain, heart and immune system. That gives serious cause for concern because we found that lower blood DHA was linked with poorer behaviour and learning in these children.
‘Most of the children we studied had blood levels of long-chain Omega-3 that in adults would indicate a high risk of heart disease. This was consistent with their parents’ reports that most of them failed to meet current dietary guidelines for fish and seafood intake. Similarly, few took supplements or foods fortified with these Omega-3.’
The current findings build on earlier work by the same researchers, showing that dietary supplementation with Omega-3 DHA improved both reading progress and behaviour in children from the general school population who were behind on their reading. Their previous research has already shown benefits of supplementation with long-chain omega-3 (EPA+DHA) for children with ADHD, Dyspraxia, Dyslexia, and related conditions. The DHA Oxford Learning and Behaviour (DOLAB) Studies have now extended these findings to children from the general school population.
‘Technical advances in recent years have enabled the measurement of individual Omega-3 and other fatty acids from fingerstick blood samples. These new techniques have been revolutionary – because in the past, blood samples from a vein were needed for assessing fatty acids, and that has seriously restricted research into the blood Omega-3 status of healthy UK children until now,’ said Dr Richardson.
Understanding alternate pathways for how mental meds work could lead to faster-acting drug targets
The reason it often takes people several weeks to feel the effect of newly prescribed antidepressants remains something of a mystery – and likely a frustration to both patients and physicians.

(Image: Mouse hippocampus expressing the Cre- virus. Credit: Julie Blendy, PhD; Brigitta Gunderson, PhD; Perelman School of Medicine, University of Pennsylvania)
Julie Blendy, PhD, professor of Pharmacology, at the Perelman School of Medicine, University of Pennsylvania; Brigitta Gunderson, PhD, a former postdoctoral fellow in the Blendy lab, and colleagues, have been working to find out why and if there is anything that can be done to shorten the time in which antidepressants kick in.
“Our goal is to find ways for antidepressants to work faster,” says Blendy.
The proteins CREB and CREM are both transcription factors, which bind to specific DNA sequences to control the “reading” of genetic information from DNA to messenger RNA (mRNA). Both CREB and CREM bind to the same 8-base-pair DNA sequence in the cell nucleus. But, the comparative influence of CREM versus CREB on the action of antidepressants is a “big unknown,” says Blendy.
CREB, and CREM to some degree, has been implicated in the pathophysiology of depression, as well as in the efficacy of antidepressants. However, whenever CREB is deleted, CREM is upregulated, further complicating the story.
Therefore, comparing how an antidepressant acts on the biochemistry and behavior of a mouse in which the CREB protein is deleted only in the hippocampus versus a wild-type mouse in which CREM is overexpressed let the researchers tease out the relative influence of CREB and CREM on the pharmacology of an antidepressant. They saw the same results in each mouse line – increased nerve-cell generation in the hippocampus and a quicker response to the antidepressant. Their findings appear in the Journal of Neuroscience.
“This is the first demonstration of CREM within the brain playing a role in behavior, and specifically in behavioral outcomes, following antidepressant treatment,” says Blendy.
A Flood of Neurotransmitters
Antidepressants like SSRIs, NRIs, and older tricyclic drugs work by causing an immediate flood of neurotransmitters like serotonin, norepinephrine, and in some cases dopamine, into the synaptic space. However, it can take three to four weeks for patients to feel changes in mental state. Long-term behavioral effects of the drugs may take longer to manifest because of the need to activate downstream CREB targets such as BDNF and trkB, or as-yet-unidentified targets, which could also be developed as new antidepressant drug targets.
The Penn team compared the behavior of the control, wild-type mice to the CREB mutant mice using a test in which a mouse is trained to eat a treat – Reese’s Pieces, to be exact – in the comfort of its home cage. The treat-loving mice are then placed in a new cage to make them anxious. They are given the treat again, and the time it takes each mouse to approach the treat is recorded.
Animals that receive no drug treatment take a long time to venture out into the anxiety-provoking environment to retrieve the treat. However, if given an antidepressant drug for at least three weeks, the time it takes a mouse to get the treat decreases significantly, from about 400 seconds to 100 seconds. In mice in which CREB is deleted, or in mice in which CREM is upregulated, this reduction happens in one to two days versus the three weeks seen in wild-type mice.
The accelerated time to approach the treat in mice on the medication was accompanied by an increase in new nerve growth in the hippocampus.
“Our results suggest that activation of CREM may provide a means to accelerate the therapeutic efficacy of current antidepressant treatment,” says Blendy. Upregulation of CREM, observed after CREB deletion, appears to functionally compensate for CREB loss at a behavioral level and leads to maintained or increased expression of some CREB target genes. The researchers’ next step is to identify any unique CREM target genes in brain areas such as the hippocampus, which may lead to the development of faster-acting antidepressants.
If the violins were taken away from the musicians performing Beethoven’s 9th symphony, the resulting composition would sound very different. If the violins were left on stage but the violinists were removed, the same mutant version of the symphony would be heard.
But what if it ended up sounding like “Hey Jude” instead?
This sort of surprise is what scientists from the Virginia Tech Carilion Research Institute had during what they assumed to be a routine experiment in neurodevelopment. Previous studies had shown that the glycoprotein Reelin is crucial to developing healthy neural networks. Logically, taking away the two receptors that Reelin is known to act on early in the brain’s development should create the same malformations as taking away Reelin itself.
It didn’t.
“We conducted the experiment thinking we’d see the same defects for both cases – Reelin deficiency and its receptors’ deletion – but we didn’t,” said Michael Fox, an associate professor at the research institute and the lead author of the study. “If you take away the receptors instead of the targeting molecule, you get an entirely separate set of abnormalities. The results raise the question of the identity of other molecules with which Reelin and the two receptors are interacting.”
The study, first published online in June in Neural Development, could prove useful for the development of therapies and diagnostics to combat brain disease.
In the early stages of neural development, neurons grow from the retina to a small portion of the brain called the thalamus. All sensory information coming into the brain gets routed through this region, before being transmitted to the cerebral cortex for further processing. Because these retinal neurons carry specific types of information, they must connect to specific places in the thalamus, which Reelin helps them find.
In the experiment, the scientists bred mice lacking both Reelin receptors known to be critical for neurons to navigate their targets during development. The scientists expected the neurons in the mutants to become lost and unable to find their targets, which is what happens in Reelin-deficient mice. Instead, the neurons were able to locate their targets, but those targets had wandered off.
While these results were surprising, they weren’t the most interesting finding of the experiment. Although most neurons look the same to people without advanced training in neuroscience, many different types are intermixed in distinct regions with strict borders. How these borders are formed, however, is still an open question.
“Many of us have questioned how you can have such a crisp boundary between two regions of the brain,” said Jianmin Su, a research assistant professor at the research institute and first author of the study. “I always thought it was a large number of cells creating some kind of cue or environment, but that isn’t what this experiment indicates.”
In the mice without the Reelin receptors, neurons from one part of the thalamus migrated to an area where they weren’t supposed to be. Even though only a handful of neurons were misplaced, they did not mingle with their new neighbors. They stayed separate.
“The result is a baffling curiosity that nobody in the lab expected – just how distinct these little regions can be,” Fox said. “How do just a few cells create such a barrier? How many cells does it take? Maybe these little islands can teach us something about how you create boundaries between larger regions of functionally similar cells.”
This experiment isn’t the only example Fox has had recently of neurons invading regions in which they weren’t supposed to be. In a second experiment, researchers examined how neurons from the cortex connect to the thalamus during the initial stages of development.
And neurons seem to be polite.
The results showed that neurons from the cortex grow to the edge of the part of the thalamus dedicated to visual signals, called the dorsal lateral geniculate nucleus, but then stop. In fact, they stay on standby for nearly two weeks before making their way into the region. It seems as though they’re waiting for the retinal neurons to make their connections before beginning to make their own. If researchers surgically removed the eyes or genetically removed the retinal cells connecting the eyes to the thalamus, neurons from the cortex invaded more than a week earlier than they were supposed to.
“It turns out that the cortical neurons are waiting for the retinal axons to mature and find the most appropriate spots to connect before they’re allowed to come in,” said Fox. “There’s some form of instructional role that retinal axons play in the timing of the cortical axons entering.”
Alzheimer’s disease is thought to be caused by the buildup of abnormal, thread-like protein deposits in the brain, but little is known about the molecular structures of these so-called beta-amyloid fibrils. A study published by Cell Press September 12th in the journal Cell has revealed that distinct molecular structures of beta-amyloid fibrils may predominate in the brains of Alzheimer’s patients with different clinical histories and degrees of brain damage. The findings pave the way for new patient-specific strategies to improve diagnosis and treatment of this common and debilitating disease.

"This work represents the first detailed characterization of the molecular structures of beta-amyloid fibrils that develop in the brains of patients with Alzheimer’s disease," says senior study author Robert Tycko of the National Institutes of Health. "This detailed structural model may be used to guide the development of chemical compounds that bind to these fibrils with high specificity for purposes of diagnostic imaging, as well as compounds that inhibit fibril formation for purposes of prevention or therapy."
Tycko and his team had previously noticed that beta-amyloid fibrils grown in a dish have different molecular structures, depending on the specific growth conditions. Based on this observation, they suspected that fibrils found in the brains of patients with Alzheimer’s disease are also variable and that these structural variations might relate to each patient’s clinical history. But it has not been possible to directly study the structures of fibrils found in patients because of their low abundance in the brain.
To overcome this hurdle, Tycko and his collaborators developed a new experimental protocol. They extracted beta-amyloid fibril fragments from the brain tissue of two patients with different clinical histories and degrees of brain damage and then used these fragments to grow a large quantity of fibrils in a dish. They found that a single fibril structure prevailed in the brain tissue of each patient, but the molecular structures were different between the two patients.
"This may mean that fibrils in a given patient appear first at a single site in the brain, then spread to other locations while retaining the identical molecular structure," Tycko says. "Our study also shows that certain fibril structures may be more likely than others to cause Alzheimer’s disease, highlighting the importance of developing imaging agents that target specific fibril structures to improve the reliability and specificity of diagnosis."
Scientists from the Florida campus of The Scripps Research Institute (TSRI) have found a group of proteins essential to the formation of long-term memories.
The study, published online ahead of print on September 12, 2013 by the journal Cell Reports, focuses on a family of proteins called Wnts. These proteins send signals from the outside to the inside of a cell, inducing a cellular response crucial for many aspects of embryonic development, including stem cell differentiation, as well as for normal functioning of the adult brain.
“By removing the function of three proteins in the Wnt signaling pathway, we produced a deficit in long-term but not short-term memory,” said Ron Davis, chair of the TSRI Department of Neuroscience. “The pathway is clearly part of the conversion of short-term memory to the long-term stable form, which occurs through changes in gene expression.”
The findings stem from experiments probing the role of Wnt signaling components in olfactory memory formation in Drosophila, the common fruit fly—a widely used doppelgänger for human memory studies. In the new study, the scientists inactivated the expression of several Wnt signaling proteins in the mushroom bodies of adult flies—part of the fly brain that plays a role in learning and memory.
The resulting memory disruption, Davis said, suggests that Wnt signaling participates actively in the formation of long-term memory, rather than having some general, non-specific effect on behavior.
“What is interesting is that the molecular mechanisms of adult memory use the same processes that guide the early development of the organism, except that they are repurposed for memory formation,” he said. “One difference, however, is that during early development the signals are intrinsic, while in adults they require an outside stimulus to create a memory.”
Isabelle Arnulf and colleagues from the Sleep Disorders Unit at the Université Pierre et Marie Curie (UPMC) have outlined case studies of patients with Auto-Activation Deficit who reported dreams when awakened from REM sleep – even though they demonstrated a mental blank during the daytime. The paper shows that even patients with Auto-Activation Deficit retain the ability to dream, and that a “bottom-up” process generates the dream state.

In a new paper for the neurology journal Brain, Arnulf et al compare the dream states of patients with Auto-Activation Deficit (AAD) with those of healthy control subjects. AAD is caused by bilateral damage to the basal ganglia, and it is a syndrome characterized by a striking apathy, a lack of spontaneous activation of thought, and a loss of self-driven behaviour. AAD patients must be stimulated by their caregivers in order to take part in everyday tasks like standing up, eating, or drinking. If you were to ask an AAD patient, “What are you thinking?”, they would report that they have no thoughts.
During sleep, the brain is operating on an exclusively internal basis. In REM sleep, the higher cortex areas are internally stimulated by the brainstem. When awakened, most normal subjects will remember some dreams that were associated with their previous sleep state, especially in REM sleep. Would the self-stimulation of the cortex by the brainstem be sufficient to stimulate spontaneous dreams during sleep in AAD patients?
Discovering the answer to this question would go some way to proving either the top-down or bottom-up theories of dreaming. The top-down theory stipulates that dreaming begins in higher cortex memory structures and then proceeds backwards as imagination develops during wakefulness. The bottom-up theory posits that the brainstem structures which elicit rapid eye movements and cortex activation during REM sleep result in the emotional, visual, sensory, and auditory elements of dreaming.
Thirteen patients with AAD agreed to participate in the study and to record their dreams in dream diaries during the week leading up to the evaluation. These patients were compared with thirteen non-medicated, healthy control subjects. Video and sleep monitoring were performed on all twenty-six participants for two consecutive nights. The first night evaluated each participant’s sleep duration and structure and the architecture of their dreams. During the second night, the researchers woke the subjects as they entered the second non-REM sleep cycle, and again after 10 minutes of established REM sleep during the following sleep cycle, and asked them what they had been dreaming before being woken. The dream reports were then independently analysed and scored for complexity, bizarreness, and elaboration.
Four of the thirteen patients with AAD reported dreaming when awakened from REM sleep, even though they demonstrated a mental blank during the daytime, compared with twelve of the thirteen control subjects. However, the four AAD patients’ dreams were devoid of any complex, bizarre, or emotional elements. The presence of simple yet spontaneous dreams in REM sleep, despite the absence of thoughts during wakefulness in AAD patients, supports the notion that simple dream imagery is generated by brainstem stimulation and sent to the sensory cortex. The lack of complexity in the dreams of the four AAD patients, as opposed to the complexity of the control subjects’ dreams, demonstrates that the full dreaming process requires these sensations to be interpreted by a higher-order cortical area.
Therefore, this study shows for the first time that it is the bottom-up process that causes the dream state.
Yet, despite the simplicity of the dreams, Isabelle Arnulf commented that the banal tasks that the AAD patients dreamt about were fascinating. For instance, Patient 10 dreamt of shaving – an activity he never initiated during the daytime without motivation from his caregivers, and an activity he could not do by himself due to severe hand dystonia. Similarly, Patient 5 dreamt about writing even though he would never write in the daytime without being invited by his caregivers to do so.
Interestingly, there were no real differences in the sleep measures between the AAD patients and the controls, apart from one: six of the thirteen AAD patients (46%) showed a complete absence of sleep spindles (bursts of oscillatory brain activity, visible on an EEG, that occur during stage 2 sleep). The striking absence of sleep spindles in these six AAD patients, all of whom had localized lesions in the basal ganglia, highlights the role of the pallidum and striatum in spindling activity during non-REM sleep. This is a key distinction between the two groups; all thirteen control subjects displayed sleep spindles.