Learning the smell of fear: Mothers teach babies their own fears via odor
Babies can learn what to fear in the first days of life just by smelling the odor of their distressed mothers, new research suggests. And not just “natural” fears: If a mother experienced something before pregnancy that made her fear something specific, her baby will quickly learn to fear it too — through the odor she gives off when she feels fear.
In the first direct observation of this kind of fear transmission, a team from the University of Michigan Medical School and New York University studied mother rats that had learned to fear the smell of peppermint – and showed how they “taught” this fear to their babies in their first days of life through the alarm odor they released during distress.
Their findings in animals may help explain a phenomenon that has puzzled mental health experts for generations: how a mother’s traumatic experience can affect her children in profound ways, even when it happened long before they were born.
The researchers also hope their work will lead to better understanding of why not all children of traumatized mothers, or of mothers with major phobias, other anxiety disorders or major depression, experience the same effects.
“During the early days of an infant rat’s life, they are immune to learning information about environmental dangers. But if their mother is the source of threat information, we have shown they can learn from her and produce lasting memories,” says Jacek Debiec, M.D., Ph.D., the U-M psychiatrist and neuroscientist who led the research.
“Our research demonstrates that infants can learn from maternal expression of fear, very early in life,” he adds. “Before they can even make their own experiences, they basically acquire their mothers’ experiences. Most importantly, these maternally-transmitted memories are long-lived, whereas other types of infant learning, if not repeated, rapidly perish.”
Peering inside the fearful brain
Debiec, who treats children and mothers with anxiety and other conditions in the U-M Department of Psychiatry, notes that the research on rats allows scientists to see what’s going on inside the brain during fear transmission, in ways they could never do in humans.
He began the research during his fellowship at NYU with Regina Marie Sullivan, Ph.D., senior author of the new paper, and continues it in his new lab at U-M’s Molecular and Behavioral Neuroscience Institute.
The researchers taught female rats to fear the smell of peppermint by exposing them to mild, unpleasant electric shocks while they smelled the scent, before they were pregnant. Then after they gave birth, the team exposed the mothers to just the minty smell, without the shocks, to provoke the fear response. They also used a comparison group of female rats that didn’t fear peppermint.
They exposed the pups of both groups of mothers to the peppermint smell, under many different conditions with and without their mothers present.
Using special brain imaging, and studies of genetic activity in individual brain cells and cortisol in the blood, they zeroed in on a brain structure called the lateral amygdala as the key location for learning fears. During later life, this area is key to detecting and planning response to threats – so it makes sense that it would also be the hub for learning new fears.
But the fact that these fears could be learned in a way that lasted, during a time when the baby rat’s ability to learn any fears directly was naturally suppressed, is what makes the new findings so interesting, says Debiec.
The team even showed that the newborns could learn their mothers’ fears when the mothers weren’t present. When just the odor of the frightened mother was piped into a chamber where baby rats were exposed to the peppermint smell, the babies developed a fear of that same smell, and their blood cortisol levels rose when they smelled it.
And when the researchers gave the baby rats a substance that blocked activity in the amygdala, they failed to learn the fear of peppermint smell from their mothers. This suggests, Debiec says, that there may be ways to intervene to prevent children from learning irrational or harmful fear responses from their mothers, or reduce their impact.
From animals to humans: next steps
The new research builds on what scientists have learned over time about the fear circuitry in the brain, and what can go wrong with it. That work has helped psychiatrists develop new treatments for human patients with phobias and other anxiety disorders – for instance, exposure therapy that helps them overcome fears by gradually confronting the thing or experience that causes their fear.
In much the same way, Debiec hopes that exploring the roots of fear in infancy, and how maternal trauma can affect subsequent generations, could help human patients. While it’s too soon to know if the same odor-based effect happens between human mothers and babies, the role of a mother’s scent in calming human babies has been shown.
Debiec, who hails from Poland, recalls working with the grown children of Holocaust survivors, who experienced nightmares, avoidance instincts and even flashbacks related to traumatic experiences they never had themselves. While they would have learned about the Holocaust from their parents, this deeply ingrained fear suggests something more at work, he says.
The bit of your brain that signals how bad things could be
An evolutionarily ancient and tiny part of the brain tracks expectations about nasty events, finds new UCL research.
The study, published in Proceedings of the National Academy of Sciences, demonstrates for the first time that the human habenula, half the size of a pea, tracks predictions about negative events, like painful electric shocks, suggesting a role in learning from bad experiences.
Brain scans from 23 healthy volunteers showed that the habenula activates in response to pictures associated with painful electric shocks, with the opposite occurring for pictures that predicted winning money.
Previous studies in animals have found that habenula activity leads to avoidance as it suppresses dopamine, a brain chemical that drives motivation. In animals, habenula cells have been found to fire when bad things happen or are anticipated.
"The habenula tracks our experiences, responding more the worse something is expected to be," says senior author Dr Jonathan Roiser of the UCL Institute of Cognitive Neuroscience. "For example, the habenula responds much more strongly when an electric shock is almost certain than when it is unlikely. In this study we showed that the habenula doesn’t just express whether something leads to negative events or not; it signals quite how much bad outcomes are expected."
During the experiment, healthy volunteers were placed inside a functional magnetic resonance imaging (fMRI) scanner, and brain images were collected at high resolution because the habenula is so small. Volunteers were shown a random sequence of pictures each followed by a set chance of a good or bad outcome, occasionally pressing a button simply to show they were paying attention. Habenula activation tracked the changing expectation of bad and good events.
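The expectation-tracking described above can be sketched as a simple prediction-error update, in which each trial nudges the learner's estimate of shock probability toward what actually happened. This is an illustrative Rescorla-Wagner-style toy model, not the analysis used in the study; the learning rate and trial sequence here are invented for the example.

```python
def update_expectation(expected, outcome, learning_rate=0.2):
    """Nudge the shock expectation toward the observed outcome (1 = shock, 0 = none)."""
    return expected + learning_rate * (outcome - expected)

# A picture followed by a shock on 8 of 10 trials: the expectation
# climbs toward 0.8, mirroring a signal that scales with how strongly
# a bad outcome is anticipated.
expected = 0.0
outcomes = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]
for outcome in outcomes:
    expected = update_expectation(expected, outcome)
print(round(expected, 2))
```

The key property is that the running estimate grows with the probability of the aversive event, which is the kind of graded "how bad will this be" signal the study attributes to the habenula.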
"Fascinatingly, people were slower to press the button when the picture was associated with getting shocked, even though their response had no bearing on the outcome," says lead author Dr Rebecca Lawson, also at the UCL Institute of Cognitive Neuroscience. "Furthermore, the slower people responded, the more reliably their habenula tracked associations with shocks. This demonstrates a crucial link between the habenula and motivated behaviour, which may be the result of dopamine suppression."
The habenula has previously been linked to depression, and this study shows how it could be involved in causing symptoms such as low motivation, pessimism and a focus on negative experiences. A hyperactive habenula could cause people to make disproportionately negative predictions.
"Other work shows that ketamine, which has profound and immediate benefits in patients who failed to respond to standard antidepressant medication, specifically dampens down habenula activity," says Dr Roiser. "Therefore, understanding the habenula could help us to develop better treatments for treatment-resistant depression."
Slow Walking Speed and Memory Complaints Can Predict Dementia
A study involving nearly 27,000 older adults on five continents found that nearly 1 in 10 met criteria for pre-dementia based on a simple test that measures how fast people walk and whether they have cognitive complaints. People who tested positive for pre-dementia were twice as likely as others to develop dementia within 12 years. The study, led by scientists at Albert Einstein College of Medicine of Yeshiva University and Montefiore Medical Center, was published online on July 16, 2014 in Neurology®, the medical journal of the American Academy of Neurology.
The new test diagnoses motoric cognitive risk syndrome (MCR). Testing for the newly described syndrome relies on measuring gait speed (how fast a person walks) and asking a few simple questions about a patient’s cognitive abilities, both of which take just seconds. The test is not reliant on the latest medical technology and can be done in a clinical setting, diagnosing people in the early stages of the dementia process. Early diagnosis is critical because it allows time to identify and possibly treat the underlying causes of the disease, which may delay or even prevent the onset of dementia in some cases.
“In many clinical and community settings, people don’t have access to the sophisticated tests—biomarker assays, cognitive tests or neuroimaging studies—used to diagnose people at risk for developing dementia,” said Joe Verghese, M.B.B.S., professor in the Saul R. Korey Department of Neurology and of medicine at Einstein, chief of geriatrics at Einstein and Montefiore, and senior author of the Neurology paper. “Our assessment method could enable many more people to learn if they’re at risk for dementia, since it avoids the need for complex testing and doesn’t require that the test be administered by a neurologist. The potential payoff could be tremendous—not only for individuals and their families, but also in terms of healthcare savings for society. All that’s needed to assess MCR is a stopwatch and a few questions, so primary care physicians could easily incorporate it into examinations of their older patients.”
The U.S. Centers for Disease Control and Prevention estimates that up to 5.3 million Americans—about 1 in 9 people age 65 and over—have Alzheimer’s disease, the most common type of dementia. That number is expected to more than double by 2050 due to population aging.
“As a young researcher, I examined hundreds of patients and noticed that if an older person was walking slowly, there was a good chance that his cognitive tests were also abnormal,” said Dr. Verghese, who is also the Murray D. Gross Memorial Faculty Scholar in Gerontology at Einstein. “This gave me the idea that perhaps we could use this simple clinical sign—how fast someone walks—to predict who would develop dementia. In a 2002 New England Journal of Medicine study, we reported that abnormal gait patterns accurately predict whether people will go on to develop dementia. MCR improves on the slow gait concept by evaluating not only patients’ gait speed but also whether they have cognitive complaints.”
The Neurology paper reported on the prevalence of MCR among 26,802 adults without dementia or disability aged 60 years and older enrolled in 22 studies in 17 countries. A significant number of adults—9.7 percent—met the criteria for MCR (i.e., abnormally slow gait and cognitive complaints). While the syndrome was equally common in men and women, highly educated people were less likely to test positive for MCR compared with less-educated individuals. A slow gait, said Dr. Verghese, is a walking speed slower than about one meter per second, which is about 2.2 miles per hour (m.p.h.). Less than 0.6 meters per second (or 1.3 m.p.h.) is “clearly abnormal.”
To test whether MCR predicts future dementia, the researchers focused on four of the 22 studies that tested a total of 4,812 people for MCR and then evaluated them annually over an average follow-up period of 12 years to see which ones developed dementia. Those who met the criteria for MCR were nearly twice as likely to develop dementia over the following 12 years compared with people who did not.
Dr. Verghese emphasized that a slow gait alone is not sufficient for a diagnosis of MCR. “Walking slowly could be due to conditions such as arthritis or an inner ear problem that affects balance, which would not increase risk for dementia. To meet the criteria for MCR requires having a slow gait and cognitive problems. An example would be answering ‘yes’ to the question, ‘Do you think you have more memory problems than other people?’”
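As a rough illustration of the screening logic described above, the MCR check combines a timed walk with a yes/no cognitive complaint. The cutoffs below follow the article's approximate figures (slower than about 1 meter per second counts as slow, with less than 0.6 m/s clearly abnormal); the actual study criteria were adjusted for factors such as age and sex, so this sketch is purely illustrative.

```python
def gait_speed(distance_m, time_s):
    """Walking speed in meters per second over a timed walk."""
    return distance_m / time_s

def meets_mcr_criteria(speed_m_per_s, has_cognitive_complaint,
                       slow_gait_cutoff=1.0):
    """MCR requires BOTH a slow gait and a cognitive complaint -
    slow gait alone (e.g., from arthritis) does not qualify."""
    return speed_m_per_s < slow_gait_cutoff and has_cognitive_complaint

speed = gait_speed(distance_m=4.0, time_s=5.0)   # 0.8 m/s
print(meets_mcr_criteria(speed, has_cognitive_complaint=True))   # True
print(meets_mcr_criteria(speed, has_cognitive_complaint=False))  # False
```

The two-part conjunction matches Dr. Verghese's point: a slow walker without cognitive complaints, or a complainer with normal gait speed, would not meet the criteria.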
For patients meeting MCR criteria, said Dr. Verghese, the next step is to look for the causes of their slow gait and cognitive complaints. The search may reveal underlying—and controllable—problems. “Evidence increasingly suggests that brain health is closely tied to cardiovascular health—meaning that treatable conditions such as hypertension, smoking, high cholesterol, obesity and diabetes can interfere with blood flow to the brain and thereby increase a person’s risk for developing Alzheimer’s and other dementias,” said Dr. Verghese.
What about people who meet MCR criteria but no treatable underlying problems can be found?
“Even in the absence of a specific cause, we know that most healthy lifestyle factors, such as exercising and eating healthier, have been shown to reduce the rate of cognitive decline,” said Dr. Verghese. “In addition, our group has shown that cognitively stimulating activities—playing board games, card games, reading, writing and also dancing—can delay dementia’s onset. Knowing they’re at high risk for dementia can also help people and their families make arrangements for the future, which is an aspect of MCR testing that I’ve found is very important in my own clinical practice.”
Researchers discover that Klotho is neuroprotective against Alzheimer's disease
Boston University School of Medicine researchers may have found a way to delay or even prevent Alzheimer’s disease (AD). They discovered that pre-treatment of neurons with the anti-aging protein Klotho can prevent neuron death in the presence of the toxic amyloid protein and glutamate. These findings currently appear in the Journal of Biological Chemistry.
Alzheimer’s disease is the most frequent age-related dementia, affecting 5.4 million Americans, including 13 percent of people age 65 and older and more than 40 percent of people over the age of 85. In AD, the cognitive decline and dementia result from the death of nerve cells that are involved in learning and memory. The amyloid protein and an excess of the neurotransmitter glutamate are partially responsible for the neuronal demise.
Nerve cells were grown in petri dishes and treated with or without Klotho for four hours. Amyloid or glutamate then were added to the dish for 24 hours. In the dishes where Klotho was added, a much higher percentage of neurons survived than in the dishes without Klotho.
"Finding a neuroprotective agent that will protect nerve cells from amyloid that accumulates as a function of age in the brain is novel and of major importance," explained corresponding author Carmela R. Abraham, PhD, professor of biochemistry and pharmacology at BUSM. "We now have evidence that if more Klotho is present in the brain, it will protect the neurons from the oxidative stress induced by amyloid and glutamate."
According to the researchers, Klotho is a large protein that cannot penetrate the blood brain barrier so it can’t be administered by mouth or injection. However in a separate study the researchers have identified small molecules that can enter the brain and increase the levels of Klotho. “We believe that increasing Klotho levels with such compounds would improve the outcome for Alzheimer’s patients, and if started early enough would prevent further deterioration. This potential treatment has implications for other neurodegenerative diseases such as Parkinson’s, Huntington’s, ALS and brain trauma, as well,” added Abraham.
Anti-inflammatory drug XPro1595 shows promise in animal model of Parkinson's disease
New findings demonstrate that the drug, called XPro1595, can reach the brain at sufficient levels and have beneficial effects when administered by subcutaneous injection, like an insulin shot. Previous studies of XPro1595 in animals tested more invasive modes of delivery, such as direct injection into the brain.
“This is an important step forward for anti-inflammatory therapies for Parkinson’s disease,” says Malu Tansey, PhD, associate professor of physiology at Emory University School of Medicine. “Our results provide a compelling rationale for moving toward a clinical trial in early Parkinson’s disease patients.”
The new research on subcutaneous administration of XPro1595 was funded by the Michael J. Fox Foundation for Parkinson’s Research (MJFF). XPro1595 is licensed by FPRT Bio, which is seeking funding for a clinical trial to test the drug’s efficacy in the early stages of Parkinson’s disease.
“We are proud to have supported this work and glad to see positive pre-clinical results,” said Marco Baptista, PhD, MJFF associate director of research programs. “A therapy that could slow Parkinson’s progression would be a game changer for the millions living with this disease, and this study is a step in that direction.”
In addition, Tansey and Yoland Smith, PhD, from Yerkes National Primate Research Center, were awarded a grant this week from the Parkinson’s Disease Foundation to test XPro1595 in a non-human primate model of Parkinson’s.
Evidence has been piling up that inflammation is an important mechanism driving the progression of Parkinson’s disease. XPro1595 targets tumor necrosis factor (TNF), a critical inflammatory signaling molecule, and is specific to the soluble form of TNF. This specificity would avoid compromising immunity to infections, a known side effect of existing anti-TNF drugs used to treat disorders such as rheumatoid arthritis.
“Inflammation is probably not the initiating event in Parkinson’s disease, but it is important for the neurodegeneration that follows,” Tansey says. “That’s why we believe that an anti-inflammatory agent, such as one that counteracts soluble TNF, could substantially slow the progression of the disease.”
Postdoctoral fellow Christopher Barnum, PhD and colleagues used a model of Parkinson’s disease in rats in which the neurotoxin 6-hydroxydopamine (6-OHDA) is injected into only one side of the brain. This reproduces some aspects of Parkinson’s disease: neurons that produce dopamine in the injected side of the brain die, leading to impaired movement on the opposite side of the body.
When XPro1595 was given to the animals three days after 6-OHDA injection, just 15 percent of the dopamine-producing neurons were lost five weeks later, compared with controls, in which 55 percent of the same neurons were lost. By reducing dopamine neuron loss with XPro1595, the researchers were also able to reduce motor impairment. In fact, the degree of dopamine cell loss was highly correlated with both the degree of motor impairment and the extent of immune cell activation.
When XPro1595 was given two weeks after injection, 44 percent of the vulnerable neurons were still lost, suggesting that there is a limited window of opportunity to intervene.
“Recent clinical studies indicate there is a four- or five-year window between diagnosis of Parkinson’s disease and the time when the maximum number of vulnerable neurons is lost,” Dr. Tansey says. “If this is true, and if inflammation is playing a key role during this window, then we might be able to slow or halt the progression of Parkinson’s with a treatment like XPro1595.”
Experiences at every stage of life contribute to cognitive abilities in old age
Early life experiences, such as childhood socioeconomic status and literacy, may have greater influence on the risk of cognitive impairment late in life than such demographic characteristics as race and ethnicity, a large study by researchers with the UC Davis Alzheimer’s Disease Center and the University of Victoria, Canada, has found.
“Declining cognitive function in older adults is a major personal and public health concern,” said Bruce Reed, professor of neurology and associate director of the UC Davis Alzheimer’s Disease Center.
“But not all people lose cognitive function, and understanding the remarkable variability in cognitive trajectories as people age is of critical importance for prevention, treatment and planning to promote successful cognitive aging and minimize problems associated with cognitive decline.”
The study, “Life Experiences and Demographic Influences on Cognitive Function in Older Adults,” is published online in Neuropsychology, a journal of the American Psychological Association. It is one of the first comprehensive examinations of the multiple influences of varied demographic factors early in life and their relationship to cognitive aging.
The research was conducted in a group of over 300 diverse men and women who spoke either English or Spanish. They were recruited from senior citizen social, recreational and residential centers, as well as churches and health-care settings. At the time of recruitment, all study participants were 60 or older and had no major psychiatric illnesses or life-threatening medical illnesses. Participants were Caucasian, African-American or Hispanic.
The extensive testing included multidisciplinary diagnostic evaluations through the UC Davis Alzheimer’s Disease Center in either English or Spanish, which permitted comparisons across a diverse cohort of participants.
Consistent with previous research, the study found that non-Latino Caucasians scored 20 to 25 percent higher on tests of semantic memory (general knowledge) and 13 to 15 percent higher on tests of executive functioning compared to the other ethnic groups. However, ethnic differences in executive functioning disappeared and differences in semantic memory were reduced by 20 to 30 percent when group differences in childhood socioeconomic status, adult literacy and extent of physical activity during adulthood were considered.
“This study is unusual in that it examines how many different life experiences affect cognitive decline in late life,” said Dan Mungas, professor of neurology and associate director of the UC Davis Alzheimer’s Disease Research Center.
“It shows that variables like ethnicity and years of education that influence cognitive test scores in a single evaluation are not associated with rate of cognitive decline, but that specific life experiences like level of reading attainment and intellectually stimulating activities are predictive of the rate of late-life cognitive decline. This suggests that intellectual stimulation throughout the life span can reduce cognitive decline in old age.”
Regardless of ethnicity, advanced age and apolipoprotein E (APOE) genotype were associated with increased cognitive decline over the average of four years that participants were followed. APOE is the largest known genetic risk factor for late-onset Alzheimer’s. Less decline was experienced by persons who reported more engagement in recreational activities in late life and who maintained their levels of activity engagement from middle age to old age. Single-word reading — the ability to decode a word on sight, which often is considered an indication of quality of educational experience — was also associated with less cognitive decline, a finding that was true for both English and Spanish readers, irrespective of their race or ethnicity. These findings suggest that early life experiences affect late-life cognition indirectly, through literacy and late-life recreational pursuits, the authors said.
“These findings are important,” explained Paul Brewster, lead author of the study, a doctoral student at the University of Victoria, Canada, and a pre-doctoral psychology intern at the UC San Diego Department of Psychiatry, “because it challenges earlier research that suggests associations between race and ethnicity, particularly among Latinos, and an increased risk of late-life cognitive impairment and dementia.
“Our findings suggest that the influences of demographic factors on late-life cognition may be reflective of broader socioeconomic factors, such as educational opportunity and related differences in physical and mental activity across the life span.”
Researchers Uncover an Unexpected Role for Endostatin in the Nervous System
Researchers at UC San Francisco have discovered that endostatin, a protein that once aroused intense interest as a possible cancer treatment, plays a key role in the stable functioning of the nervous system.
A substance that occurs naturally in the body, endostatin potently blocks the formation of new blood vessels. In studies in mice in the late 1990s, endostatin treatment virtually eliminated cancer by shutting down the blood supply to tumors, but subsequent human clinical trials proved disappointing.
“It was a very big surprise” to find that endostatin, through some other mechanism, helps to maintain the proper workings of synapses, the sites where communication between nerve cells takes place, said Graeme W. Davis, PhD, Hertzstein Distinguished Professor of Medicine in the Department of Biochemistry and Biophysics at UCSF and senior author of the new study. “Endostatin was not on our radar.”
The findings were reported online July 24 in the journal Neuron.
Synapses are continually shaped and reshaped by experience, a phenomenon known as plasticity. But for those changes to be meaningful, said Davis, they must take place against a stable background, which paradoxically requires another form of change that he and colleagues call “homeostatic plasticity.” Just as we change our pace, slowing down or speeding up, to keep abreast of a running partner, neurons adjust aspects of their function at synapses to compensate for changes in their synaptic partners brought on by aging, illness, or other factors.
In an example of homeostatic plasticity, in the neuromuscular disease myasthenia gravis, as muscle cells become less responsive to the neurotransmitter acetylcholine, nerve cells ramp up their secretion of the neurotransmitter to keep the system in balance for as long as possible. Some researchers believe that in other disorders, including autism and schizophrenia, a failure in such homeostatic mechanisms keeps synapses from functioning properly.
In previous research Davis noticed that applying a toxin to a muscle cell in the fruit fly Drosophila melanogaster triggers homeostatic plasticity in the neuron that forms a synapse on that muscle cell: the neuron—which is called presynaptic, because it is “before” the synapse with the muscle cell—reliably releases more neurotransmitter, just as happens when muscle cells begin to malfunction in myasthenia gravis.
Davis has since built on this model of homeostatic plasticity by painstakingly knocking out Drosophila genes one by one and recording from presynaptic neurons to see which genes are necessary for the homeostatic response, because it is these genes that may be compromised in diseases affecting the process.
“So far we’ve tested about 1,000 genes this way, which has entailed close to 10,000 recordings,” Davis said.
Using this technique Davis and colleagues observed at one point that knocking out a gene called multiplexin significantly hampered homeostatic plasticity in presynaptic neurons. But because that gene helps to form a structural protein known as collagen—which in humans is a component of ligaments, tendons, and cartilage—the finding wasn’t immediately considered relevant to synaptic function.
The team learned that the multiplexin protein can be snipped by an enzyme to produce endostatin, so in experiments led by postdoctoral fellow Tingting Wang, PhD, they tested whether endostatin might play a role in homeostatic plasticity.
“Nobody picked up multiplexin to work on for a couple of years, because we didn’t think a collagen could be that interesting,” Davis said. “Then, when a new postdoc, Tingting Wang, came to the lab, we started thinking about it harder.”
When the group genetically deleted the portion of Drosophila multiplexin that forms endostatin, presynaptic neurons behaved normally, but homeostatic plasticity was severely compromised when toxin was applied to postsynaptic muscle cells. Conversely, when the team overexpressed endostatin at Drosophila synapses lacking multiplexin, homeostasis was restored, whether endostatin was expressed in muscle cells or in presynaptic neurons.
The research team is unsure precisely how and where endostatin exerts its effects on homeostatic plasticity, but they believe that multiplexin is cleaved at the postsynaptic site to form endostatin, and that the endostatin signal is conveyed to the presynaptic neuron to alter its function. “Because so many people in the cancer world have studied endostatin, there is a great set of tools available” to study the protein, Davis said, so he expects his group to make rapid progress in addressing these questions.
“Despite its checkered history in cancer, we know endostatin is a signaling molecule and we know that the brain has a great deal of collagen—we just haven’t known what it does, and we certainly don’t know what endostatin’s receptors in the brain might be,” Davis said. “But it’s pretty exciting to think about a new signaling molecule with a profound role in the stabilization of the function of neural circuits.”
Researchers find new mechanism for neurodegeneration
A research team led by Jackson Laboratory Professor and Howard Hughes Medical Institute Investigator Susan Ackerman, Ph.D., has pinpointed a surprising mechanism behind neurodegeneration in mice, one that involves a defect in a key component of the cellular machinery that makes proteins, known as transfer RNA or tRNA.
The researchers report in the journal Science that a mutation in a gene that produces tRNAs operating only in the central nervous system results in a “stalling” or pausing of the protein production process in the neuronal ribosomes. When another protein the researchers identified, GTPBP2, is also missing, neurodegeneration results.
“Our study demonstrates that individual tRNA genes can be tissue-specifically expressed in vertebrates,” Ackerman says, “and mutations in such genes may cause disease or modify other phenotypes. This is a new area to look for disease mechanisms.”
Neurodegeneration—the process through which mature neurons decay and ultimately die—is poorly understood, yet it underlies major human diseases, such as Alzheimer’s disease, Parkinson’s disease, Huntington’s disease and ALS (amyotrophic lateral sclerosis, also known as Lou Gehrig’s disease).
While the causes of neurodegeneration are still coming to light, there is mounting evidence that neurons are exquisitely sensitive—much more so than other types of cells—to disruptions in how proteins are made and how they fold.
tRNAs are critical in translating the genetic code into proteins, the workhorses of the cell. tRNAs possess a characteristic cloverleaf shape with two distinct “business” ends—one that reads out the genetic code in three-letter increments (or triplets), and another that transports the protein building block specified by each triplet (known as an amino acid).
In higher organisms, tRNAs are strikingly diverse. For example, while there are 61 distinct triplets that are recognized by tRNAs in humans, the human genome contains roughly 500 tRNA genes. To date little is known about why they are so numerous, whether they carry out overlapping or redundant functions, or whether they possibly have roles beyond the making of proteins.
“Multiple genes encode almost all tRNA types,” Ackerman says. “In fact, AGA codons are decoded by five tRNAs in mice. Until now, this apparent redundancy has caused us to completely overlook the disease-causing potential of mutations in tRNAs, as well as other repetitive genes.”
Ackerman and her colleagues at The Jackson Laboratory in Bar Harbor, Maine, and Farmington, Conn., The Scripps Research Institute in La Jolla, Calif., and Kumamoto University in Japan pinpointed a mutation in the tRNA gene n-Tr20 as a genetic culprit behind the neurodegeneration observed in mice lacking GTPBP2.
Remarkably, the tRNA’s activity is confined to the brain and other parts of the central nervous system, in both mice and humans. The tRNA encoded by n-Tr20 recognizes the triplet code, AGA (which specifies the amino acid arginine).
The n-Tr20 defect disrupts how proteins are made. Specifically, it causes the “factories” responsible for synthesizing proteins, called ribosomes, to stall when they encounter an AGA triplet.
Such stalling can be largely overcome, thanks to the work of a partner protein called GTPBP2. But when this partner is missing—as it is in the mutant mice that Ackerman and her colleagues studied—the stalling intensifies. This is thought to be a driving force behind the neurodegeneration seen in these mice.
The secrets to better treating the pernicious disorders of obesity and dementia reside in the brain, according to a paper from American University’s Center for Behavioral Neuroscience. In the paper, researchers make the case for treating obesity with therapies aimed at areas of the brain responsible for memory and learning. Furthermore, treatments that focus on the hippocampus could play a role in reducing certain dementias.
"In the struggle to treat these diseases, therapies and preventive measures often fall short. This is a new way for providers who treat people with weight problems and for researchers who study dementias to think about obesity and cognitive decline," said Prof. Terry Davidson, center director and lead study author.
In the paper, published in the journal Physiology & Behavior, Davidson and colleague Ashley A. Martin review research findings linking obesity with cognitive decline, including the center’s findings about the “vicious cycle” model, which explains how weight-challenged individuals who suffer from particular kinds of cognitive impairment are more susceptible to overeating.
Obesity, Memory Deficits and Lasting Effects
It is widely accepted that overconsumption of dietary fats, sugar and sweeteners can cause obesity. These types of dietary factors are also linked to cognitive dysfunction. Foods that are risk factors for cognitive impairment (i.e., foods high in saturated fats and simple carbohydrates that make up the modern Western diet) are so widespread and readily available in today’s food environment that their consumption is all but encouraged, Davidson said.
Across age groups, evidence reveals links between excess food intake, body weight and cognitive dysfunction. Childhood obesity and consumption of the Western diet can have lasting effects, surfacing later in the normal aging process as cognitive deficits and brain pathologies. Several analyses of cases of mild cognitive impairment progressing to full-blown cases of Alzheimer’s disease show that the first signs of brain disease can occur at least 50 years prior to the emergence of serious cognitive dysfunction. These signs originate in the hippocampus, the area of the brain where memory, learning, decision making, behavior control and other cognitive functions come into play.
Still, most research on the role of the brain in obesity focuses on areas thought to be involved with hunger motivation (e.g., hypothalamus), taste (e.g., brain stem), reinforcement (e.g., striatum) and reward (e.g., nucleus accumbens) or with hormonal or metabolic disorders. This research has not yet been successful in generating therapies that are effective in treating or preventing obesity, Davidson says.
Experiments in rats by Davidson and colleagues show that overconsumption of the Western diet can damage or change the blood-brain barrier, the tight network of blood vessels protecting the brain and substrates for cognition. Certain kinds of dementias are known to arise from the breakdown in these brain substrates.
"Breakdown in the blood-brain barrier is further rationale for treating obesity as a learning and memory disorder," Davidson said. "Treating obesity successfully may also reduce the incidence of dementias, because the deterioration in the brain is often produced by the same diets that promote obesity."
The “vicious cycle” model AU researchers put forth says eating a Western diet high in saturated fats, sugar and simple carbohydrates produces pathologies in brain structures and circuits, ultimately changing brain pathways and disrupting cognitive abilities.
It works like this: People become less able to resist temptation when they encounter environmental cues (e.g., food itself or the sight of McDonald’s Golden Arches) that remind them of the pleasures of consumption. They then eat more of the same type of foods that produce the pathological changes in the brain, leading to progressive deterioration in those areas and impairments in cognitive processes important for providing control over one’s thoughts and behaviors. These cognitive impairments can weaken a person’s ability to resist thinking about food, making them more easily distracted by food cues in the environment and more susceptible to overeating and weight gain.
"People have known at least since the time of Hippocrates that one key to a healthy life is to eat in moderation. Yet many of us are unable to follow that good advice," Davidson said. "Our work suggests that new therapeutic interventions that target brain regions involved with learning and memory may lead to success in controlling both the urge to eat, as well as the undesirable consequences produced by overeating."
Rhymes can inspire learning during the third trimester in the womb
Mozart, Beethoven or even Shakespeare — pregnant mothers have been known to expose their babies to many forms of auditory stimulation. But according to researchers at the University of Florida, all a baby really needs is the music of mom’s voice.
Research published in the most recent issue of the journal Infant Behavior and Development shows that babies in utero begin to respond to the rhythm of a nursery rhyme — showing evidence of learning — by 34 weeks of pregnancy and are capable of remembering a set rhyme until just prior to birth. Nursing researcher Charlene Krueger and her team studied pregnant women who recited a rhyme to their babies twice a day for six weeks, beginning at 28 weeks’ gestational age, which is the start of the third trimester of pregnancy.
“The mother’s voice is the predominant source of sensory stimulation in the developing fetus,” said Krueger, an associate professor in the UF College of Nursing. “This research highlights just how sophisticated the third trimester fetus really is and suggests that a mother’s voice is involved in the development of early learning and memory capabilities. This could potentially affect how we approach the care and stimulation of the preterm infant.”
Krueger’s team recruited 32 pregnant women during their 28th week of pregnancy, as determined by fetal ultrasound. The participants were between 18 and 39 years of age, spoke English as a primary language and were pregnant with their first baby. Once recruited, the women were randomly assigned to either an experimental or a control group. The mean age of the women in the group was 25. In addition, 68 percent of the women were white, 28 percent were black and 4 percent were of another race or ethnicity.
From 28 to 34 weeks of pregnancy, all mothers in the study recited a passage or nursery rhyme out loud twice a day and then came in for testing at 28, 32, 33 and 34 weeks’ gestation. To determine whether the fetus could remember the pattern of speech beyond 34 weeks’ gestation, all mothers were asked to stop speaking the passage. Then the fetuses were tested again at 36 and 38 weeks’ gestational age.
During testing, researchers used a fetal heart monitor, similar to what is used during traditional labor and delivery, to record heart rate and determine any changes. Researchers interpret a small heart rate deceleration in the fetus as an indicator of learning or familiarity with a stimulus.
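The deceleration measure can be sketched as a baseline-versus-stimulus comparison of mean heart rate. The sampling scheme and all numbers below are illustrative assumptions, not parameters or data from Krueger's study:

```python
def cardiac_response(hr_baseline, hr_stimulus):
    """Mean change in heart rate (bpm) from a baseline window to a stimulus window.

    A negative value is a deceleration — the response the researchers
    interpret as familiarity with the stimulus.
    """
    base = sum(hr_baseline) / len(hr_baseline)
    stim = sum(hr_stimulus) / len(hr_stimulus)
    return stim - base

# Hypothetical per-second fetal heart-rate samples (bpm)
baseline = [140, 141, 140, 139, 140]
during_rhyme = [138, 137, 136, 137, 138]

print(cardiac_response(baseline, during_rhyme))  # negative => deceleration
```

A real analysis would of course work from continuous monitor traces and test group differences statistically; this only shows the direction-of-change logic.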
At testing, the fetuses in the experimental group were played a recording of the same rhyme their mother had been reciting at home but spoken by a female stranger. Those in the control group heard a different rhyme also spoken by a stranger. This was to help determine if the fetus was responding simply to its mother’s voice or to a familiar pattern of speech, which is a more difficult task, Krueger said.
The researchers found that the fetuses’ heart rates began to respond to the familiar rhyme recited by a stranger’s voice by 34 weeks of gestational age — once the mothers had spoken the rhyme out loud at home for six weeks. The fetuses continued to respond with a small cardiac deceleration for as long as four weeks after the mothers had stopped saying the rhyme, until about 38 weeks. At 38 weeks, there was a statistically significant difference between the two groups in responding to the strangers’ recited rhymes — the experimental group, which heard the original rhyme, responded with a deeper and more sustained cardiac deceleration, whereas the control group, which heard a new rhyme, responded with a cardiac acceleration.
Further research is needed to more fully understand how ongoing development affects learning and memory, Krueger said. Her aim is to recognize how this type of research can influence care in preterm infants and their long-term outcomes.
“This study helped us understand more about how early a fetus could learn a passage of speech and whether the passage could be remembered weeks later even without daily exposure to it,” Krueger said. “This could have implications for preterm infants who are born before 37 weeks’ gestation and for the impact an intervention such as their mother’s voice may have on influencing better outcomes in this high-risk population.”
Contrary to previous assumptions, researchers find that preschoolers are able to gauge the strength of their memories and make decisions based on their self-assessments. The study findings are published in Psychological Science, a journal of the Association for Psychological Science.
“Previously, developmental researchers assumed that preschoolers did not introspect much on their mental states, and were not able to reflect on their own uncertainty when problem solving,” says psychological scientist Emily Hembacher of the University of California, Davis, lead author of the study. “This is partly because young children are not usually able to tell us much about their own mental processes due to verbal limitations.”
In several previous studies in their lab, Hembacher and co-author Simona Ghetti observed that preschoolers reported feeling uncertain after giving wrong answers during tasks, suggesting the preschoolers were capable of metacognition — the ability to evaluate one’s own thoughts and mental states.
The researchers decided to examine preschoolers’ metacognition about their memories, given its importance for learning. They investigated whether kids could assess their confidence in their memories and use those assessments in deciding whether to exclude answers they had generated but were unsure of when given the option.
Eighty-one children ages 3, 4, and 5 participated in the study. The preschoolers viewed a series of drawings of various items, such as a piano or a balloon. Half of the images were presented once, and the other half were shown twice. Next, the children were presented with a pair of images: one they had seen, and a new one they had not seen. The children were instructed to pick which image they’d seen before in the previous task.
After making their choice, the preschoolers rated how confident they were that their choice was correct. They then sorted their answers into two boxes. One box was for the responses that children were confident about and wanted researchers to evaluate for a prize. The other one was for responses the children thought might be mistaken and that they didn’t want researchers to see.
The data revealed that only 4- and 5-year-olds reported being less confident in their incorrect than their correct memory responses. They were also more confident about images they’d seen twice, suggesting that they could distinguish between stronger and weaker memories. Older preschoolers were also more likely to decide whether they wanted researchers to see their answers based on their confidence level.
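The core comparison behind that finding — confidence lower on incorrect answers than on correct ones — can be sketched in a few lines. The trial data and the 1-to-3 confidence scale below are invented for illustration, not taken from the study:

```python
def mean_confidence(trials, correct):
    """Average confidence over trials whose accuracy matches `correct`."""
    scores = [conf for conf, was_correct in trials if was_correct == correct]
    return sum(scores) / len(scores)

# Hypothetical (confidence, was_correct) pairs for one child
trials = [(3, True), (3, True), (2, True), (1, False), (2, False), (1, False)]

gap = mean_confidence(trials, True) - mean_confidence(trials, False)
# A positive gap means the child is less confident on incorrect answers —
# the signature of metacognitive monitoring reported for 4- and 5-year-olds.
print(gap)
```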
Although 3-year-olds didn’t display the same kind of metacognitive capability on individual responses, the data showed that 3-year-olds who had scored well reported higher confidence overall than kids who hadn’t scored as well.
When the researchers analyzed just the correct answers, they found that preschoolers of all ages sorted responses they weren’t as confident about into the box they didn’t want researchers to evaluate. So, while they may not be as advanced as their older peers, even children as young as 3 seem to display some ability to reflect on their own knowledge.
The findings contribute to research on the reliability of children’s eyewitness testimony in a court of law, and they carry important implications for educational practices.
“Previous emphasis on the development of metacognition during middle childhood has influenced education practices aimed at strengthening children’s monitoring and control of their own learning,” says Hembacher. “Now we know that some of these ideas may be adapted to meet preschoolers’ learning needs.”
Researchers discover neuroprotective role of immune cell
A type of immune cell widely believed to exacerbate chronic adult brain diseases, such as Alzheimer’s disease and multiple sclerosis (MS), can actually protect the brain from traumatic brain injury (TBI) and may slow the progression of neurodegenerative diseases, according to Cleveland Clinic research published today in the online journal Nature Communications.
The research team, led by Bruce Trapp, PhD, Chair of the Department of Neurosciences at Cleveland Clinic’s Lerner Research Institute, found that microglia can help synchronize brain firing, which protects the brain from TBI and may help alleviate chronic neurological diseases. They provided the most detailed study and visual evidence of the mechanisms involved in that protection.
"Our findings suggest the innate immune system helps protect the brain after injury or during chronic disease, and this role should be further studied," Dr. Trapp said. "We could potentially harness the protective role of microglia to improve prognosis for patients with TBI and delay the progression of Alzheimer’s disease, MS, and stroke. The methods we developed will help us further understand mechanisms of neuroprotection."
Microglia are the brain’s primary responders after injury or during illness. While researchers have long believed that activated microglia cause harmful inflammation that destroys healthy brain cells, some speculate that they play a more protective role. Dr. Trapp’s team used an advanced technique called 3D electron microscopy to visualize the activation of microglia and the subsequent events in animal models.
They found that when chemically activated, microglia migrate to inhibitory synapses, the connections between brain cells that slow the firing of impulses. The microglia dislodge these synapses (a process called “synaptic stripping”), thereby increasing neuronal firing and setting off a cascade of events that enhances the survival of brain cells.
Trapp is internationally known for his work on mechanisms of neurodegeneration and repair in multiple sclerosis. His past research has included investigation of the cause of neurological disability in MS patients, cellular mechanisms of brain repair in neurodegenerative diseases, and the molecular biology of myelination in the central and peripheral nervous systems.
Brain imaging study examines second-language learning skills
With enough practice, some learners of a second language can process their new language as well as native speakers, research at the University of Kansas shows.
Using brain imaging, a trio of KU researchers was able to examine to the millisecond how the brain processes a second language. They then compared their findings with their previous results for native speakers and saw both followed similar patterns.
The research by Robert Fiorentino and Alison Gabriele, both associate professors in the linguistics department, and José Alemán Bañón, a former KU graduate student who is now a postdoctoral researcher at the University of Reading in the United Kingdom, was published this month in the journal Second Language Research.
For years, linguists have debated whether second-language learners would ever resemble native speakers in their ability to process language properties that differ between the first and second language, such as gender agreement, which is a property of Spanish but not English. In Spanish, all nouns are categorized as masculine or feminine, and various elements in the sentence, such as adjectives, need to carry the gender feature of the noun as well.
Some researchers argued that even those who spoke a second language with a high level of accuracy were using a qualitatively different mechanism than native speakers.
“We realized that these different theories, proposing that second-language learners use either the same mechanism or a different one, could actually be teased apart by using brain-imaging techniques,” Gabriele said.
The team studied 26 high-level Spanish speakers who hadn’t learned to speak Spanish until after age 11 and grew up with English as the majority language. The speakers used Spanish on a daily basis and had spent an average of a year and a half in a Spanish-speaking country.
They were compared with 24 native speakers, who were raised in Spanish-speaking countries and stayed in their home country until age 17.
To measure language processing as it happens, the team used a method known as electroencephalography (EEG), which uses an array of electrodes placed on the scalp to detect patterns of brain activity with high accuracy in timing.
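EEG studies of this kind typically average many stimulus-locked segments, or epochs, so that brain activity tied to the stimulus stands out from background noise. Below is a minimal sketch of that averaging step with invented voltages; the paper's actual analysis pipeline is not described here, so treat this only as an illustration of the general technique:

```python
def erp_average(epochs):
    """Average stimulus-locked EEG epochs sample by sample.

    Each epoch is a list of voltage samples aligned to stimulus onset;
    averaging across trials cancels activity that is not time-locked
    to the stimulus, leaving the event-related response.
    """
    n = len(epochs)
    return [sum(samples) / n for samples in zip(*epochs)]

# Three hypothetical single-trial epochs (microvolts)
epochs = [
    [0.2, 1.1, 2.9, 1.2],
    [-0.1, 0.9, 3.1, 0.8],
    [0.0, 1.0, 3.0, 1.0],
]
print(erp_average(epochs))
```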
Once hooked up to the EEG, the test subjects were asked to read sentences, some of which had grammatical errors in either number agreement or gender agreement.
The researchers then compared the results of the second-language learners to native speakers. They found that the highly proficient second-language speakers showed the same patterns of brain activity as native speakers when processing grammatical violations in sentences.
“We show that the learners’ brain activity looks qualitatively similar to that of native speakers, suggesting that they are using the same mechanisms,” Fiorentino said.
The study highlights the brain’s plasticity and its ability to acquire a new complex system even in adulthood.
“A lot of researchers have argued that there is some sort of language learning mechanism that might atrophy over the life span, particularly before puberty. And, we certainly have a lot of evidence that it is difficult to process your second language at nativelike levels and you have to go through quite a bit of effort to find people who can,” Gabriele said. “But I think what this paper shows is that it is possible.”
Gabriele and Fiorentino are working on a second phase of the research, studying how the brain processes a second language at the initial stages of exposure. Their preliminary results suggest that properties that are shared between the first and second language show patterns of brain activity that are very similar in learners and native speakers. This suggests that learners build on the representation for language that is already in place when learning a second language.
Scientists from the Sloan-Kettering Institute for Cancer Research in New York, with the help of Plymouth University Peninsula Schools of Medicine and Dentistry, have completed research that for the first time brings us closer to understanding how some cells in the brain and nervous system become cancerous.
The results of their study are published in the prestigious journal Cancer Cell.
The research team led by Sloan-Kettering researchers studied a tumour suppressor called Merlin.
The results of the study have identified a new mechanism whereby Merlin suppresses tumours, and shown that this mechanism operates within the nucleus. The research team has discovered that, when Merlin is lost, tumour cells proliferate via a core signalling system, the Hippo pathway, and has identified the route and method by which this signalling occurs.
By identifying the signalling system and understanding how, when present, Merlin suppresses it, the way is open for research into drug therapies which may suppress the signalling in a similar way to Merlin.
Tumour suppressors exist in cells to prevent abnormal cell division in our bodies. The loss of Merlin leads to tumours in many cell types within our nervous systems. There are two copies of a tumour suppressor gene, one on each of the chromosomes we inherit from our parents. The loss of Merlin can be caused by random loss of both copies in a single cell, causing sporadic tumours, or by inheriting one abnormal copy and losing the second copy during our lifetime, as is seen in the inherited condition neurofibromatosis type 2 (NF2).
No effective therapy for these tumours exists other than radiotherapy or repeated invasive surgery, which targets a single tumour at a time and is unlikely to eradicate the full extent of the tumours.
Professor Oliver Hanemann, Director of the Institute of Translational and Stratified Medicine at Plymouth University Peninsula Schools of Medicine and Dentistry, and who led the Plymouth aspect of the study, commented:
“We have known for some time that the loss of the tumour suppressor Merlin resulted in the development of nervous system tumours, and we have come tantalisingly close to understanding how this occurs. Our joint study with colleagues at the Sloan-Kettering Institute for Cancer Research shows for the first time how this mechanism works. By understanding the mechanism, we can use this knowledge to develop effective drug therapies – in some cases adapting existing drugs – to treat patients for whom current therapies are limited and potentially devastating.”
Study finds potential genetic link between epilepsy and neurodegenerative disorders
A recent scientific discovery showed that mutations in prickle genes cause epilepsy, a brain disorder characterized in humans by repeated seizures over time. However, the mechanism responsible for generating prickle-associated seizures was unknown.
A new University of Iowa study, published online July 14 in the Proceedings of the National Academy of Sciences, reveals a novel pathway in the pathophysiology of epilepsy. UI researchers have identified the basic cellular mechanism that goes awry in prickle mutant flies, leading to the epilepsy-like seizures.
“This is to our knowledge the first direct genetic evidence demonstrating that mutations in the fly version of a known human epilepsy gene produce seizures through altered vesicle transport,” says John Manak, senior author and associate professor of biology in the College of Liberal Arts and Sciences and pediatrics in the Carver College of Medicine.
Seizure suppression in flies
A neuron has an axon (nerve fiber) that projects from the cell body to different neurons, muscles, and glands. Information is transmitted along the axon to help a neuron function properly.
Manak and his fellow researchers show that seizure-prone prickle mutant flies have behavioral defects (such as uncoordinated gait) and electrophysiological defects (problems in the electrical properties of biological cells) similar to other fly mutants used to study seizures. The researchers also show that altering the balance of two forms of the prickle gene disrupts neural information flow and causes epilepsy.
Further, they demonstrate that reducing either of two motor proteins responsible for directional movement of vesicles (small organelles within a cell that contain biologically important molecules) along tracks of structural proteins in axons can suppress the seizures.
“The reduction of either of two motor proteins, called Kinesins, fully suppressed the seizures in the prickle mutant flies,” says Manak, faculty member in the Interdisciplinary Graduate Programs in Genetics, Molecular and Cellular Biology, and Health Informatics. “We were able to use two independent assays to show that we could suppress the seizures, effectively ‘curing’ the flies of their epileptic behaviors.”
Genetic link between epilepsy and Alzheimer’s
This new epilepsy pathway was previously shown to be involved in neurodegenerative diseases, including Alzheimer’s and Parkinson’s.
Manak and his colleagues note that two Alzheimer’s-associated proteins, amyloid precursor protein and presenilin, are components of the same vesicle, and mutations in the genes encoding these proteins in flies affect vesicle transport in ways that are strikingly similar to how transport is impacted in prickle mutants.
“We are particularly excited because we may have stumbled upon one of the key genetic links between epilepsy and Alzheimer’s, since both disorders are converging on the same pathway,” Manak says. “This is not such a crazy idea. In fact, Dr. Jeff Noebels, a leading epilepsy researcher, has presented compelling evidence suggesting a link between these disorders. Indeed, patients with inherited forms of Alzheimer’s disease also present with epilepsy, and this has been documented in a number of published studies.”
Manak adds, “If this connection is real, then drugs that have been developed to treat neurodegenerative disorders could potentially be screened for anti-seizure properties, and vice versa.”
Manak’s future research will involve treating seizure-prone flies with such drugs to see if he can suppress their seizures.
Researchers find epigenetic tie to neuropsychiatric disorders
Dysfunction in dopamine signaling profoundly changes the activity level of about 2,000 genes in the brain’s prefrontal cortex and may be an underlying cause of certain complex neuropsychiatric disorders, such as schizophrenia, according to UC Irvine scientists.
This epigenetic alteration of gene activity in brain cells that receive this neurotransmitter showed for the first time that dopamine deficiencies can affect a variety of behavioral and physiological functions regulated in the prefrontal cortex.
The study, led by Emiliana Borrelli, a UCI professor of microbiology & molecular genetics, appears online in the journal Molecular Psychiatry.
“Our work presents new leads to understanding neuropsychiatric disorders,” Borrelli said. “Genes previously linked to schizophrenia seem to be dependent on the controlled release of dopamine at specific locations in the brain. Interestingly, this study shows that altered dopamine levels can modify gene activity through epigenetic mechanisms despite the absence of genetic mutations of the DNA.”
Dopamine is a neurotransmitter that acts within certain brain circuitries to help manage functions ranging from movement to emotion. Changes in the dopaminergic system are correlated with cognitive, motor, hormonal and emotional impairment. Excesses in dopamine signaling, for example, have been identified as a trigger for neuropsychiatric disorder symptoms.
Borrelli and her team wanted to understand what would happen if dopamine signaling was hindered. To do this, they used mice that lacked dopamine receptors in midbrain neurons, which radically affected regulated dopamine synthesis and release.
The researchers discovered that this receptor mutation profoundly altered gene expression in neurons receiving dopamine at distal sites in the brain, specifically in the prefrontal cortex. Borrelli said they observed a remarkable decrease in expression levels of some 2,000 genes in this area, coupled with a widespread increase in modifications of histones, the proteins that package DNA – particularly modifications associated with reduced gene activity.
Borrelli further noted that the dopamine receptor-induced reprogramming led to psychotic-like behaviors in the mutant mice and that prolonged treatment with a dopamine activator restored regular signaling, pointing to one possible therapeutic approach.
The researchers are continuing their work to gain more insights into the genes altered by this dysfunctional dopamine signaling.
Twin study suggests language delay due more to nature than nurture
A study of 473 sets of twins followed since birth found that 47 percent of 24-month-old identical twins had language delay, compared with 31 percent of nonidentical twins. Overall, twins had twice the rate of late language emergence of single-born children. None of the children had disabilities affecting language acquisition.
University of Kansas Distinguished Professor Mabel Rice, lead author, said that all of the language traits analyzed in the study—vocabulary, combining words and grammar—were significantly heritable with genes accounting for about 43 percent of the overall twins’ deficit.
The “twinning effect” — a lower level of language performance for twins than single-born children — was expected to be comparable for both kinds of twins, but was greater for identical twins, said Rice, strengthening the case for the heritability of language development.
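Twin designs of this kind commonly quantify heritability with Falconer's formula, which doubles the gap between identical (MZ) and fraternal (DZ) twin correlations. The correlations below are hypothetical, chosen only to show the arithmetic; they are not figures from this study:

```python
def falconer_h2(r_mz, r_dz):
    """Falconer's estimate of heritability from twin correlations.

    MZ twins share ~100% of their segregating genes and DZ twins ~50%,
    so doubling the MZ-DZ correlation gap estimates the share of trait
    variance attributable to genes.
    """
    return 2 * (r_mz - r_dz)

# Hypothetical twin correlations on a language-score trait
print(falconer_h2(0.70, 0.48))  # ≈ 0.44, i.e. roughly 44% heritable
```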
“This finding disputes hypotheses that attribute delays in early language acquisition of twins to mothers whose attention is reduced due to the demands of caring for two toddlers,” Rice said. “This should reassure busy parents who worry about giving sufficient individual attention to each child.”
However, said Rice, prematurity and birth complications, which are more common in identical twins, could also contribute to their higher rates of language delay. A study of pregnancy and birth risks for late talking in twins is currently under way by the study authors.
Further, the study will continue at least until 2017, following the twins through the preschool and school years up to adolescence to answer the question of whether late-talking twins catch up to their peers.
“Twin studies provide unique opportunities to study inherited and environmental contributions to language acquisition,” Rice said. “The outcomes inform our understanding of how these influences contribute to language acquisition in single-born children as well.”
Late language emergence means that a child’s language is below age and gender expectations in the number of words spoken and in combining two or more words into sentences. In this study, 71 percent of 2-year-old twins were not combining words compared with 17 percent of single-born children.
While previous behavioral genetics studies of toddlers have largely focused on vocabulary, the researchers introduced an innovative measure of early grammatical ability based on the correct use of the past tense and the “to be” and “to do” verbs. The measure was inspired by the Rice/Wexler Test of Early Grammar Impairment, developed in 2001 by Rice and Kenneth Wexler, Massachusetts Institute of Technology professor. It was the first test to detect the subtle but common language disorder, specific language impairment.
Rice’s collaborators in the international longitudinal project that began in 2002 are Professors Cate Taylor and Stephen Zubrick from the Telethon Kids Institute in Perth, Western Australia, and Professor Shelley Smith at the University of Nebraska Medical Center.
The study population is located in the vicinity of Perth, Western Australia, an area demographically similar to Kansas City and several other U.S. Midwestern communities. In Australia, health records are available, and the Western Australia Twin Registry is a unique resource for researchers since it is a record of all multiple births, Rice said.
The research group has followed the development of 1,000 sets of Western Australian twins from their first words. In 2012, the group was granted $2.8 million by the National Institute on Deafness and Other Communication Disorders for a fourth five-year cycle that will enable researchers to continue to monitor the twins as they develop through adolescence. In addition to formal language tests, researchers have collected genetic and environmental data as well as assessments of the twins’ siblings.
Investigators at The Feinstein Institute for Medical Research have utilized a new image-based strategy to identify and measure placebo effects in randomized clinical trials for brain disorders. The findings are published in the August issue of The Journal of Clinical Investigation.
Parkinson’s disease is the second most common neurodegenerative disease in the US. Those who suffer from Parkinson’s disease most often experience tremors, slowness of movement (bradykinesia), rigidity, and impaired balance and coordination. Patients may have difficulty walking, talking or completing simple daily tasks. They may also experience depression and difficulty sleeping due to the disease. The current standard for diagnosis of Parkinson’s disease relies on a skilled healthcare professional, usually an experienced neurologist, to determine through clinical examination that someone has it. There currently is no cure for Parkinson’s disease, but medications can improve symptoms.
A team of researchers at the Feinstein Institute’s Center for Neurosciences, led by David Eidelberg, MD, has developed a method to identify brain patterns that are abnormal or indicate disease using imaging techniques. To date, this approach has been used successfully to identify specific networks in the brain that indicate a patient has or is at risk for Parkinson’s disease and other neurodegenerative disorders.
"One of the major challenges in developing new treatments for neurodegenerative disorders such as Parkinson’s disease is that it is common for patients participating in clinical trials to experience a placebo or sham effect," noted Dr. Eidelberg. "When patients involved in a clinical trial commonly experience benefits from placebo, it’s difficult for researchers to identify if the treatment being studied is effective. In a new study conducted by my colleagues and myself, we have used a new image-based strategy to identify and measure placebo effects in brain disorder clinical trials."
In the current study, the researchers used their network mapping technique to identify specific brain circuits underlying the response to sham surgery in Parkinson’s disease patients participating in a gene therapy trial. The expression of this network measured under blinded conditions correlated with the sham subjects’ clinical outcome; the network changes were reversed when the subjects learned of their sham treatment status. Finally, an individual subject’s network expression value measured before the treatment predicted his/her subsequent blinded response to sham treatment. This suggests that this novel image-based measure of the sham-related network can help to reduce the number of subjects assigned to sham treatment in randomized clinical trials for brain disorders by excluding those subjects who are more likely to display placebo effects under blinded conditions.