Posts tagged science

Striking Patterns: Skill for Forming Tools and Words Evolved Together
When did humans start talking? There are nearly as many answers to this perplexing question as there are researchers studying it. A new brain imaging study claims to support the hypothesis that language emerged long before Homo sapiens and coevolved with the invention of the first finely made stone tools nearly 2 million years ago. However, some experts think it’s premature to draw sweeping conclusions.
Unlike ancient bones and stone tools, language does not fossilize. Researchers have to guess about its origins based on proxy indicators. Does painting cave walls indicate the capacity for language? How about the ability to make a fancy tool? Yet, in recent years, scientists have made some progress. A series of brain imaging studies by Dietrich Stout, an archaeologist at Emory University in Atlanta, and Thierry Chaminade, a cognitive neuroscientist at Aix-Marseille University in France, have shown that toolmaking and language use similar parts of the brain, including regions involved in manual manipulations and speech production. Moreover, the overlap is greater the more sophisticated the toolmaking techniques are. Thus, there was little overlap when modern-day flint knappers were making stone tools using the oldest known techniques, dated to 2.5 million years ago and called the Oldowan technology. But when knappers used a more sophisticated approach, called Acheulean technology and dating to as much as 1.75 million years ago, the parallels between toolmaking and language were more evident. Stout and Chaminade have used functional magnetic resonance imaging (fMRI) and positron emission tomography (PET) scans, although not on the same subjects at the same time.
In the new work, published online today in PLOS ONE, archaeologist Natalie Uomini and experimental psychologist Georg Meyer, both at the University of Liverpool in the United Kingdom, attempted to advance these earlier studies in several ways. They applied a technique called functional transcranial Doppler ultrasonography (fTCD), which measures blood flow to the brain’s cerebral cortex and which—unlike fMRI and PET—is highly portable and can be used on subjects in the field through a device attached to their heads (see video). The fTCD approach makes it much easier to monitor subjects’ brains during vigorous activity, such as the somewhat violent motions that are required to make stone tools. Uomini and Meyer are also the first to study both toolmaking and language tasks in the same subjects.
The researchers recruited 10 expert flint knappers and gave them two different tasks. In the first, the knappers crafted an Acheulean hand ax, a symmetrical tool that requires considerable planning and skill. The procedure involves shaping a flint core with another stone called a hammerstone. While wearing the fTCD monitor, the knappers worked on the tool for periods of about 30 seconds each, interspersed with control periods of about 20 seconds in which they simply struck the core with the hammerstone without trying to make a tool.
In the second task, the knappers were asked to silently think up words beginning with a given letter. The control periods consisted of simply resting quietly and not thinking of words.
The team found that the pattern of blood flow changes in the brain during the critical first 10 seconds of each experimental period—when the knappers were strategizing about how to shape the core or thinking up their first words—was very similar, again involving areas of the brain implicated in manual manipulations and language. Moreover, although there were some variations in the patterns between the 10 knappers, the toolmaking and language patterns within each individual were very closely aligned—suggesting, the team concludes, that the same brain areas are recruited in both tasks.
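The within-subject comparison the team describes amounts to correlating each knapper's blood-flow pattern across the two tasks. A minimal sketch of that kind of comparison is below; the correlation function is standard, but the two traces are invented for illustration, not the study's data:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation between two equal-length signals."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xm, ym = x - x.mean(), y - y.mean()
    return float((xm @ ym) / np.sqrt((xm @ xm) * (ym @ ym)))

# Hypothetical per-subject lateralization traces sampled over the
# first 10 s of each task (illustrative values only).
toolmaking = [0.2, 0.5, 0.9, 1.1, 0.8, 0.6, 0.4, 0.3, 0.2, 0.1]
word_task  = [0.1, 0.4, 0.8, 1.0, 0.9, 0.5, 0.3, 0.3, 0.1, 0.0]

# Similar shapes yield a correlation close to 1.
print(round(pearson_r(toolmaking, word_task), 3))
```

A high within-subject correlation of this sort is what would suggest shared neural recruitment across the two tasks.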
The results, Uomini and Meyer argue, support earlier hypotheses that language and toolmaking coevolved, perhaps beginning as early as 1.75 million years ago. This doesn’t necessarily mean that early humans were talking in the same rapid-fire way that we do today, Uomini points out, but that “the circuits for both activities were there early on.”
Stout calls the new study “exciting work” that provides “one more piece of evidence supporting a link between stone-tool making and language evolution.” Yet a number of questions remain, he says, such as whether the correlation is between the motor skills involved in making tools and in making the sounds of speech, or whether toolmaking and language share higher cognitive functions such as those used in symbolic behavior.
That question is critical, some researchers say, because the knappers in this study and the ones that Stout conducted probably used a technique known as the Late Acheulean, dating from about 500,000 years ago, which put a much greater emphasis on symmetry and aesthetic considerations than did the earliest Acheulean, dating from 1.75 million years ago. “There is an enormous difference” between these varieties of Acheulean toolmaking, says Michael Petraglia, an archaeologist at the University of Oxford in the United Kingdom, who adds that “future experimental studies should thus examine the range of techniques and methods used.”
Thus the new work is “consistent with the hypothesis” of coevolution between language and toolmaking, “but not proof of it,” says Michael Corballis, a psychologist at the University of Auckland in New Zealand. “It is possible that language itself emerged much later, but was built on circuits established during the Acheulean” period.
Thomas Wynn, an archaeologist at the University of Colorado, Colorado Springs, is even more cautious about the results. He thinks that the fTCD technique, which measures blood flow to large areas of the cerebral cortex but does not have as high a resolution as fMRI or PET, “is a crude measure, even for brain imaging techniques.” As a result, Wynn says, he is “far from convinced” that the study has anything new to say about language evolution.
Learning how the brain takes out its trash may help decode neurological diseases
Imagine that garbage haulers don’t exist. Slowly, trash accumulates in our offices and homes; it clogs the streets, damages our cars, causes illness, and renders normal life impossible.
Garbage in the brain, in the form of dead cells, must also be removed before it accumulates, because it can cause both rare and common neurological diseases, such as Parkinson’s. Now, University of Michigan researchers are a leap closer to decoding the critical process of how the brain clears dead cells, said Haoxing Xu, associate professor in the U-M Department of Molecular, Cellular and Developmental Biology.
A new U-M study identified two critical components of this cell-clearing process: an essential calcium channel protein, TRPML1, that helps the so-called garbage-collecting cells, the macrophages and microglia, clear out the dead cells; and a lipid molecule, which helps activate TRPML1 and the process that allows these cells to remove the dead cells.
Moreover, the Xu lab identified a synthetic chemical compound that can activate TRPML1. Because this chemical compound ultimately helps activate this cell-clearing process, it provides a drug target that could help combat these neurological diseases.
"This is clearly a drug target," Xu said. "What this paper picks out is exactly what is going wrong in this process."
The scientists began by looking at Type IV Mucolipidosis, a very rare childhood neurodegenerative disease characterized by multiple disabilities.
Xu’s group found that loss of function in TRPML1—the channel through which calcium is released from the lysosome, the cell’s recycling center, into the macrophage cell—contributes to these neurodegenerative conditions. If this calcium channel doesn’t work, calcium cannot be released, and dead cells aren’t removed, Xu said. The synthetic chemical compound stimulates the TRPML1 calcium channel to release the calcium into the cell.
Further, dead cells “are bad for live cells,” Xu said. An excess of dead cells leads the macrophage cells to also kill healthy neurons necessary for neurological function, which in turn can lead to these neurodegenerative diseases.
There are many neurodegenerative diseases, some very rare and some more common, such as Parkinson’s and ALS. The common thread among them is the dearth of live and functioning neurons, which prevents the neurological system from carrying out normal functions, Xu said.
Thus, identifying a lipid molecule, as well as chemical compounds that stimulate proper TRPML1 function, could revolutionize the treatment of these neurodegenerative diseases.
The next step in Xu’s research is to test how these general observations apply to neurological diseases and whether the compound is effective in animal models of those diseases.
The paper, “A TRP channel in the lysosome regulates large particle phagocytosis via focal exocytosis,” appeared Aug. 29 online in Developmental Cell.
Left brain, right brain: Different patterns of cortical interaction
The human brain is divided into two hemispheres – left and right – in which neural functions are said to be lateralized. (For example, language and motor abilities are associated with the left hemisphere, and visuospatial attention with the right.) Although hemispheric lateralization is generally thought to benefit brain function, relationships between lateralization degree and functioning levels have not been quantified. Recently, however, scientists at the National Institutes of Health in Bethesda, MD demonstrated that the two hemispheres have qualitatively different biases: the left prefers to interact with itself – especially for regions associated with language and fine motor coordination – while the right visuospatial and attentional processing regions interact with both hemispheres. Moreover, the researchers provided direct evidence that an individual’s degree of lateralization is associated with enhanced cognitive ability.
Dr. Stephen J. Gotts spoke with Medical Xpress about the research that he, Dr. Hang Joon Jo, Dr. Alex Martin, and their colleagues conducted – and the challenges they faced in so doing. “One of the tricky things about studying lateralization of function is that it’s hard to know exactly which points in the two hemispheres are correspondent,” Gotts told Medical Xpress. This is the case, he explains, because while the hemispheres are roughly symmetrical, there are idiosyncratic differences in cortical folding between left and right for any given individual. In addition, he notes, the exact location of particular folds (known as gyri) varies across individuals.
"Neuroimaging studies have historically adopted a couple of different approaches to deal with this situation," Gotts explains. Some studies, he illustrates, transform the geometry of the brain for each individual into a so-called standard three-dimensional coordinate reference brain – for example, the Talairach-Tournoux atlas. This allows them to estimate symmetrical corresponding points by flipping the left/right x-coordinate about zero. However, he acknowledges that this technique is prone to error by as much as 1-2 centimeters in some brain locations.
"Another approach," Gotts continues, "has been to compare the magnitude of the neural response in each hemisphere during the performance of a task – for example, a language comprehension task – and calculate a quantitative laterality index to enumerate the extent of lateralization. While this approach makes a lot of sense, and doesn’t necessarily require one to solve the correspondence problem, it will be strictly limited to the brain areas that can be activated by the task.” In other words, if an area isn’t engaged by the task, it’s hard to know whether or not it’s lateralized. Moreover, it requires many different tasks to be selected in order to address the spatial scope of the entire brain – and Gotts points out that this hasn’t been carried out to date.
"Our solution addressed the correspondence problem more directly," Gotts says. The scientists first flattened out a model of each individual’s folded cortex onto a smooth surface, spatially warping and stretching each individual brain so that each cortical landmark – that is, gyrus or sulcus – was aligned across individuals. They then found corresponding points in the two hemispheres by their position on this standardized, flattened surface relative to the full set of cortical landmarks. (Sulci are depressions or fissures in the surface of the brain surrounding the gyri.) "Applying the same spatial warping to the functional data then allowed us to compare ongoing, resting brain dynamics between the hemispheres at every position on the cortical surface," Gotts explains.
Utilizing a more traditional, task-based approach to measuring laterality has another downside: researchers typically assess the average magnitude of neural response to a task condition across many individual stimulus events, meaning that dynamical interactions of brain areas aren’t as easily assessed. “It’s not impossible,” notes Gotts, “but to eliminate the effects of stimulus artifacts on connectivity estimates, it requires particular choices of neuroimaging task timing – and it’s been done a lot less often than magnitude estimation. The qualitative distinction that we observed in our study between how the hemispheres interact with one another really requires the examination of time-varying neural responses and their co-variation. I don’t think that you’d be able to anticipate this finding solely from examining average activity levels.”
With respect to the correlations with behavioral ability, Gotts points out that there are probably many different tasks that one could have chosen. “Our choice was to use tasks that have been well-studied and well-normed across individuals as part of the Wechsler intelligence scales – specifically, Vocabulary, which is correlated with many aspects of verbal abilities, and Block Design and Matrix Reasoning, which index aspects of visuospatial processing. These obviously aren’t the only possible choices, and it would be nice to follow up this work with a more thorough battery of tasks that would allow us to examine more detailed aspects of language, fine motor control, and visuospatial abilities.”
It is important to point out, Gotts adds, that there have been several previous task-based studies examining the relationships between lateralization magnitude and cognitive ability, with some reporting a direct relationship, as the current study does. “The main contribution of our study is to demonstrate, at a whole-brain scope, the qualitative differences between the hemispheres in their within- and between-hemisphere interactions. The correlations with behavioral ability really hammer this distinction home, since one needed to use the appropriate metric – that is, segregation versus integration – to see these correlations.”
One of the interesting things about the distinction between the hemispheres that the scientists observed, Gotts notes, is that there are implicit hints about it in the literatures on individual cognitive domains. “When people discuss language lateralization, the notion is more like classic modularity: language is operating in the left hemisphere in a manner somewhat isolated, or segregated, from the right hemisphere. This notion may come in large part from the neuropsychological literature, which shows that brain damage to the left hemisphere is much more likely to cause aphasia than damage to the right hemisphere in right-handed individuals.”
In contrast, Gotts continues, visuospatial processing and attention involve coordinated processing across the entire visual field, with the left and right halves of visual space represented separately in the right and left occipital cortex, respectively. “Visual processing over the entire visual field requires inter-hemispheric integration, and that integration relates to the visuospatial attentional control that is more right-hemisphere lateralized. Our findings highlight this implicit distinction, making it more explicit and showing that the respective cognitive abilities benefit from it. As a field, I think that we’ve always assumed that hemispheric lateralization was somehow beneficial for function, but very few brain imaging studies have even examined the issue directly, much less at a whole-brain scope across the range of cognitive domains known to be lateralized.”
Moving forward, says Gotts, one of the key outstanding questions is: What is the developmental time course of these hemispheric differences? That is, does the left hemisphere bias for self-interaction exist prior to skilled motor control and language function – or does it emerge later as a consequence of these functions? “If it were to exist prior to handedness and language acquisition in the first few months of age, or even in utero, then the bias could plausibly serve as the cause of the preferential left-lateralization of these functions. One could even try to predict the degree of lateralization present later in life during various tasks, or when at rest, from estimates measured early in life.”
A similar set of questions exists for the domain of visuospatial function and the right-hemisphere bias for bilateral interaction, Gotts adds. “Because our method for assessing lateralization only requires measuring resting brain activity and not the performance of complex cognitive tasks, these experiments are actually possible to perform with young infants in a reasonably parallel manner.”
According to Gotts, another crucial question for the field of human neuroscience is: What changed from monkeys to apes to humans with respect to lateralization? “Several decades ago, there was the suggestion that monkeys exhibit hand preferences like the ones humans exhibit. After much research, it became clear that monkeys are more symmetrical in their brain control of both motor and visuospatial function. However, apes – such as chimpanzees – appear to be a different story. They appear to exhibit some hand preference lateralization with accompanying brain lateralization, although perhaps not to the extremes to which humans do.” (Roughly 80-90% of human males and females are right-handed.) “As with infants, resting brain scans can be performed on monkeys and chimpanzees in a manner similar to those conducted on adult humans.”
Regarding other areas of research that might benefit from this study, Gotts thinks it would be possible to apply their methods for assessing lateralization to a range of psychiatric disorders, such as autism and schizophrenia. “There’s some suggestion in the literature that lateralization of function is altered in these disorders. Is lateralization qualitatively different from the hemispheric biases we demonstrate for typical individuals – or do they differ only in magnitude? We’d also like to understand more about the relationship between handedness and cognitive ability.”
Being left-handed, he illustrates, is associated with a more bilateral representation of language – but this doesn’t appear to mandate poorer cognitive abilities in left-handed individuals. “It may be that in left-handed individuals a different optimal weighting or balance of power between the hemispheres is achieved which differs from what we’ve observed in right-handed males,” Gotts concludes. “Our methods could certainly be applied to examine this set of issues.”
Findings Point to New Potential Drug Target—GABA Neurons—to Treat Patients with Depression and Other Mood Disorders
A new drug target to treat depression and other mood disorders may lie in a group of GABA neurons (neurons that release gamma-aminobutyric acid, a neurotransmitter that inhibits other cells) shown to contribute to symptoms like social withdrawal and increased anxiety, Penn Medicine researchers report in a new study in the Journal of Neuroscience.
Experts know that people suffering from depression and other mood disorders often react to rejection or bullying by withdrawing socially more than the average person, who takes it in stride; yet the biological processes behind these responses have remained unclear.
Now, a preclinical study from the labs of Olivier Berton, PhD, an assistant professor in the department of Psychiatry, with Collin Challis of the Neuroscience Graduate Group, and Sheryl Beck, PhD, a professor in the department of Anesthesiology at Children’s Hospital of Philadelphia, has found that bullying and other social stresses triggered symptoms of depression in mice by activating GABA neurons, a never-before-seen direct relationship between social stimuli and this neural circuitry. Activation of those neurons, they found, directly inhibited levels of serotonin, long known to play a vital role in behavioral responses; without it, a depressed person is more likely to withdraw socially.
Conversely, when the researchers successfully put the brake on the GABA neurons, mice became more resilient to bullying and didn’t avoid once-perceived threats.
“This is the first time that GABA neuron activity—found deep in the brainstem—has been shown to play a key role in the cognitive processes associated with social approach or avoidance behavior in mammals,” said Dr. Berton. “The results help us to understand why current antidepressants may not work for everyone and how to make them work better—by targeting GABA neurons that put the brake on serotonin cells.”
Less serotonin elicits socially defensive responses such as avoidance or submission, whereas enhancing it (the main goal of antidepressants) induces a positive shift in the perception of socio-affective stimuli, promoting affiliation and dominance. However, current antidepressants targeting serotonin, such as SSRIs, are effective in only about 50 percent of patients.
These new findings point to GABA neurons as a new, neural drug target that could help treat the other patients who don’t respond to today’s treatment.
For the study, “avoidant” mice were exposed to brief bouts of aggression from trained “bully” mice. By comparing gene expression in the brains of resilient and avoidant mice, Berton and colleagues discovered that bullying puts the GABA neurons of avoidant mice into a more excitable state, and these mice exhibit signs of social defeat. Resilient mice, by contrast, showed no such change in neuronal excitability or behavior.
To better understand the link between GABA and the development of stress resilience, Berton, Beck, and colleagues also devised an approach to manipulate this inhibition directly: lifting GABA inhibition of serotonin neurons reduced social-avoidance and anxiety symptoms in mice exposed to bullies and also fully prevented the neurobiological changes due to stress.
“Our paper provides a novel cellular understanding of how social defensiveness and social withdrawal develop in mice and gives us a stepping stone to better understand the basis of similar social symptoms in humans,” said Berton. “This has important implications for the understanding and treatment of mood disorders.”
(Source: uphs.upenn.edu)

Scientists use latest stem cell and gene-editing techniques to generate neurons in a dish, and reveal new clues behind deadly diseases of the brain
There is no easy way to study diseases of the brain. Extracting neurons from a living patient is both difficult and risky, while examining a patient’s brain post-mortem usually only reveals the disease’s final stages. And animal models, while incredibly informative, have frequently fallen short during the crucial drug-development stage of research. The result: we are woefully unprepared to fight—and win—the war against this class of diseases.
But scientists at the Gladstone Institutes and the University of California, San Francisco (UCSF) are taking a potentially more powerful approach: an advanced stem-cell technique that creates a human model of degenerative disease in a dish.
Using this model, the team uncovered a molecular process that causes neurons to degenerate, a hallmark sign of conditions such as Alzheimer’s disease and frontotemporal dementia (FTD). The results, published in the latest issue of Stem Cell Reports, offer fresh ammunition in the continued battle against these and other deadly neurodegenerative disorders.
The research team, led by Gladstone Investigator Yadong Huang, MD, PhD, identified an important mechanism behind tauopathies. A group of disorders that includes both Alzheimer’s and FTD, tauopathies are characterized by the abnormal accumulation of the protein Tau in neurons. This buildup is thought to contribute to the degeneration of these neurons over time, leading to debilitating symptoms such as dementia and memory loss. But while this notion has been around for a long time, the underlying processes have largely remained unclear.
“So much about the mechanisms that cause tauopathies is a mystery, in part because traditional approaches—such as post-mortem brain analysis and animal models—give an incomplete picture,” explained Dr. Huang. “But by using the latest stem-cell technology, we generated human neurons in a dish that exhibited the same pattern of cell degeneration and death that occurs inside a patient’s brain. Studying these models allowed us to see for the first time how a specific genetic mutation may kick start the tauopathy process.”
Other scientists recently discovered that the Tau mutation in question could increase a person’s risk of developing different tauopathies, including Alzheimer’s or FTD. So the research team, in collaboration with Bruce Miller, MD, who directs the UCSF Memory and Aging Center and who provided skin cells from a patient with this mutation, transformed these cells into induced pluripotent stem cells, or iPS cells. This technique, pioneered by Gladstone Investigator and 2012 Nobel Laureate Shinya Yamanaka, MD, PhD, allows scientists to reprogram adult skin cells into cells that are virtually identical to stem cells. These stem cells can then develop into almost any cell in the body.
The team combined this method with a cutting-edge gene-editing technique that essentially eliminated the Tau mutation in some of the iPS cells. The result was a system that allowed the team to compare neurons that had the mutation to those that did not.
“Our approach allowed us to grow human neurons in a dish that contained the exact same mutation as the neurons in the brain of the patient,” explained first author Helen Fong, PhD, who is also a California Institute for Regenerative Medicine postdoctoral scholar. “By comparing these diseased neurons with the ‘genetically corrected’ healthy neurons, we could see—cell by cell—how the Tau mutation leads to the abnormal buildup of Tau and, over time, neuronal degeneration and death.”
“Tau’s main functions include keeping the skeletal structure of individual neurons intact and regulating neuronal activity,” said Dr. Huang. “But our research showed that the Tau produced by neurons from people with the Tau mutation is different; so it is red-flagged by the cell and targeted for destruction. However, instead of being flushed out, Tau gets chopped into pieces. These potentially toxic fragments accumulate over time and may in fact cause the neuron to degenerate and die.”
But by correcting the Tau mutation, the team effectively removed Tau’s red flag. The protein remained in one piece, the abnormal buildup ceased, and the neurons remained healthy. Ongoing studies aim to determine whether the abnormal fragmentation and buildup of mutant Tau is really the main cause of the neuronal death and, if so, how to block it.
Finding a way to block this toxic buildup of Tau fragments has been a key focus of drug development—but has thus far been unsuccessful. Dr. Huang and his colleagues are optimistic, however, that their approach could be exactly what researchers need to fight back against deadly tauopathies.
“These findings not only offer a glimpse into how these powerful new models can shed light on mechanisms of disease,” said Dr. Miller, “they may also prove invaluable for screening potential drugs that could be developed into better treatments for Alzheimer’s disease, FTD and related conditions.”
PET Scans Offer a New Way to Track Huntington’s Disease Before Symptoms Appear
Investigators at The Feinstein Institute for Medical Research have discovered a new way to measure the progression of Huntington’s disease, using positron emission tomography (PET) to scan the brains of carriers of the gene. The findings are published in the September issue of The Journal of Clinical Investigation.
Huntington’s disease causes the progressive breakdown of nerve cells in the brain, which leads to impairments in movement, thinking and emotions. Most people with Huntington’s disease develop signs and symptoms in their 40s or 50s, but the onset of disease may be earlier or later in life. Medications are available to help manage the symptoms of Huntington’s disease, but treatments do not prevent the physical, mental and behavioral decline associated with the condition.
Huntington’s disease is an inherited disease, passed from parent to child through a mutation in the normal gene. Each child of a parent with Huntington’s disease has a 50/50 chance of inheriting the Huntington’s disease gene, and a child who inherits the gene will eventually develop the disease. Genetic testing for Huntington’s disease can be performed to determine whether a person carries the gene and is developing the disease even before symptoms appear. Having this ability provides an opportunity for scientists to study how the disease first develops and how it progresses in its early, presymptomatic stages. Even though a carrier of the Huntington’s disease gene may not have experienced symptoms, changes in the brain have already taken place, which ultimately lead to severe disability. Brain imaging is one tool that could be used to track how quickly Huntington’s disease progresses in gene carriers. Having a better way to track the disease at its earliest stages will make it easier to test drugs designed to delay or even prevent the onset of symptoms.
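The 50/50 odds described above compound in a simple way across a family. A small sketch of that arithmetic, assuming one affected parent and independent inheritance for each child:

```python
def prob_at_least_one_inherits(n_children: int, p: float = 0.5) -> float:
    """Probability that at least one of n children inherits an
    autosomal-dominant allele carried by one parent, where each
    child independently inherits it with probability p."""
    return 1.0 - (1.0 - p) ** n_children

for n in (1, 2, 3):
    # 1 child: 0.5; 2 children: 0.75; 3 children: 0.875
    print(n, prob_at_least_one_inherits(n))
```

Because the allele is fully penetrant, each of these probabilities is also the chance that at least one child eventually develops the disease.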
Researchers at the Feinstein Institute used PET scanning to map changes in brain metabolism in 12 people with the Huntington’s disease gene who had not developed clinical signs of the illness. The researchers scanned the subjects repeatedly over a seven-year period and found a characteristic set (network) of abnormalities in their brains. The network was used to measure the rate of disease progression in the study participants. The Feinstein Institute investigators then confirmed the progression rate through independent measurements in scans from a separate group of Huntington’s disease gene carriers who were studied in the Netherlands. The investigators believe that progression networks similar to the one identified in Huntington’s disease carriers will have an important role in evaluating new drugs for degenerative brain disorders.
“Huntington’s disease is an extremely debilitating disease. The findings make it possible to evaluate the effects of new drugs on disease progression before symptoms actually appear. This is a major advance in the field,” said David Eidelberg, MD, Susan and Leonard Feinstein Professor and head of the Center for Neurosciences at the Feinstein Institute.
(Source: northshorelij.com)
The brain doesn’t require simultaneous visual and audio stimulation to locate the source of a sound

As ventriloquists have long known, your eyes can sometimes tell your brain where a sound is coming from more convincingly than your ears can.
A series of experiments in humans and monkeys by Duke University researchers has found that the brain does not require simultaneous visual and audio stimulation to locate the source of a sound. Rather, visual feedback obtained from trying to find a sound with the eyes had a stronger effect than visual stimuli presented at the same time as the audio, according to the Duke study.
The findings could help those with mild hearing loss learn to localize voices better, improving their ability to communicate in noisy environments, said Jennifer Groh, a professor of psychology and neuroscience at Duke.
Locating where a sound is coming from is partially learned with the aid of vision. Researchers sought to learn more about how the brain locates the source of a sound when the source is unclear and there are a number of possible visual matches.
"Our study is related to ventriloquism, in which the visual image of a puppet’s mouth ‘captures’ the sound of the puppeteer’s voice," Groh said. "It is thought that one reason this illusion occurs is because vision normally teaches the brain how to tell where sounds are coming from. We investigated how the brain knows which visual stimulus should capture the location of a sound, such as why it is the puppet’s mouth and not some other visual stimulus."
The study, which appears Thursday (Aug. 29) in the journal PLOS ONE, tested two competing hypotheses. In one, the brain determines the location of a sound based on the simultaneous occurrence of audio and its visual source. In the other, the brain uses a “guess and check” method. In this scenario, visual feedback sent to the brain after the eye focuses on a sound affects how the eye searches for that sound in the future, possibly through the brain’s reward-related circuitry.
In both paradigms, the visual stimulus — an LED — was displaced from the sound. Groh’s team then looked for evidence that the LED caused a persistent mislocation of the sound.
"Surprisingly, we found that visual feedback exerts the more powerful effect on altering localization of sounds," Groh said. "This suggests that the active behavior of looking at the puppet during a ventriloquism performance plays a role in causing the shift in where you hear the voice."
Participants in the study — 11 humans and two rhesus monkeys — shifted their sight to a sound under different visual and audio scenarios.
In one scenario, called the “synchrony-only” task, a visual stimulus appeared at the same time as a sound but too briefly to provide feedback after an eye movement to that sound.
In another, the “feedback-only” task, the visual stimulus appeared during the execution of an eye movement to a sound, but was never on at the same time as the sound.
The study found that the “feedback-only” task exerted a much more powerful effect on the estimation of sound location, as measured with eye tracking, than did the synchrony-only task. This suggests that those who have difficulty localizing sounds may benefit from practice involving eye movements.
On average, participants altered their eye movements in the direction of the lights’ location to a greater degree, about a quarter of the way, when the visual stimulus was presented as feedback than when it was presented at the same time as the sound, the study found.
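One way to read that quarter-of-the-way shift is as a simple error-correction rule: each bout of visual feedback moves the internal estimate of the sound’s location a fixed fraction of the remaining discrepancy between where the sound was heard and where the light appeared. The sketch below is an illustration of that idea, not the authors’ analysis; the function name and degree values are assumptions:

```python
# A minimal sketch (not the study's model): the sound-localization
# estimate is nudged toward the visual feedback by a fixed learning rate.
LEARNING_RATE = 0.25  # "about a quarter of the way", per the reported shift

def update_estimate(heard_at, light_at, rate=LEARNING_RATE):
    """After an eye movement, feedback reveals the light's position;
    the estimate shifts a fraction of the discrepancy toward it."""
    return heard_at + rate * (light_at - heard_at)

# Hypothetical trial: sound at 0 degrees, LED displaced to +8 degrees.
estimate = 0.0
for trial in range(3):
    estimate = update_estimate(estimate, light_at=8.0)
    print(f"trial {trial + 1}: estimate = {estimate:.3f} degrees")
# Each repetition pulls the estimate further toward the light:
# 2.0, then 3.5, then 4.625 degrees.
```

Under this reading, the persistence of the mislocation across trials is exactly what a feedback-driven update would predict, whereas a purely simultaneous-capture account would not accumulate in the same way.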
"This is about the brain’s self-improvement skills," said co-author Daniel Pages, a graduate student in Psychology & Neuroscience at Duke. "What we’re getting at is how the brain uses different types of information to improve how it does its job. In this case, it uses vision coupled with eye movements to improve hearing."
"We were surprised at how important the eye movements were," Groh said. "But finding sounds is really hard. Feedback about your performance is important for anything that is difficult, whether it is the B- you get on your homework or the error your eyes detect in localizing a sound."
(Source: today.duke.edu)
The white arrow highlights the primary neuronal cilium, a hair-like structure on nerve cells. The neuron on the right has no cilium because of the loss of a protein linked to intellectual disability in humans. Credit: YOSHIHO IKEUCHI
Intellectual disability linked to nerve cells that lose their ‘antennae’
An odd and little-known feature of nerve cells may be linked to several forms of inherited intellectual disability, researchers at Washington University School of Medicine in St. Louis have learned.
The scientists report that a genetic mutation that causes intellectual disability also blocks formation of the neuronal primary cilium, a hair-like structure that protrudes from the bodies of nerve cells.
"The primary cilium acts as a kind of antenna for nerve cells,” said first author Yoshiho Ikeuchi, PhD, a staff scientist. “It’s covered in receptors that monitor environmental conditions outside the cell and may influence the cell’s functions.”
Learning more about how the mutation sabotages production of the nerve cell cilium eventually will help scientists develop drugs to treat intellectual disability, according to senior author Azad Bonni, MD, PhD, the Edison Professor and chairman of the Department of Anatomy and Neurobiology.
"Intellectual disability—sometimes known as mental retardation—affects 1 to 2 percent of the general population, and researchers have identified more than 100 genes on the X chromosome that can cause these conditions,” Bonni said. “But we don’t know what most of these genes do, and that information is essential for new treatments.”
The research appears online Aug. 29 in Cell Reports.
Nearly every cell in the mammalian body has a primary cilium—a structure that acts as an environmental sensor. Some cells have many cilia that move together in waves. Problems with cilia are associated with disorders throughout the body, including illnesses of the kidneys, eyes and reproductive organs.
"Some of the X-linked intellectual disorders are syndromes that not only hamper brain development but also cause problems elsewhere in the body,” Bonni said. “That makes sense in the context of this new connection we’ve identified between intellectual disability and the primary cilium.”
Scientists only recently have recognized the potential of a primary cilium malfunction to impair nerve cell development and function. Studies have suggested that the primary cilium may be where nerve cells receive the growth signals that allow them to extend branches to each other and form circuits. Other research has shown that blocking of signal receptors on the primary cilium leads to memory problems in mice.
Bonni’s path to the primary cilium led through the nucleus, the command center that contains a cell’s DNA. Proteins found inside a cell’s nucleus often regulate the turning on or off of other genes, making them influential in orchestrating the responses and functions of cells.
Bonni and his colleagues scanned the literature on X chromosome genes linked to intellectual disability to learn which genes produce proteins found in the nucleus. When they disabled 15 such genes in individual nerve cells, they found that the loss of the gene for polyglutamine-binding protein 1 (PQBP1) produced the most dramatic effect, leaving nerve cells with shortened primary cilia or no cilia at all.
In other cell types outside the brain, PQBP1 is typically found only in the nucleus. But the new results show that in neurons the protein is present both in the nucleus and, surprisingly, at the base of the primary cilium.
The scientists learned PQBP1 binds to another protein outside the nucleus that suppresses growth of the primary cilium. By binding to the suppressor, PQBP1 gets that suppressor out of the way, allowing cilium formation to proceed normally.
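The logic here is a double negative, a pattern biologists call disinhibition: the suppressor blocks cilium growth, and PQBP1 blocks the suppressor. A toy truth-table sketch (hypothetical function and argument names, not from the paper) captures the relationship:

```python
def cilium_grows(pqbp1_present: bool, suppressor_present: bool = True) -> bool:
    """Disinhibition sketch: the suppressor halts cilium growth
    unless PQBP1 is around to sequester it."""
    active_suppressor = suppressor_present and not pqbp1_present
    return not active_suppressor

print(cilium_grows(pqbp1_present=True))   # normal neuron: cilium forms
print(cilium_grows(pqbp1_present=False))  # PQBP1 lost/mutated: no cilium
```

The same table explains the therapeutic idea in the next paragraph: a drug that neutralizes the suppressor directly would restore cilium growth even when PQBP1 itself is mutated.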
Scientists may one day try to imitate this effect with drugs, potentially allowing the brain to develop more normally when PQBP1 is mutated. For now, the researchers want to learn more about the suppressor protein and also are investigating the possibility that PQBP1 may continue to influence the functions of the primary cilium after it is formed.
Schizophrenia is one of the most devastating neurological conditions, with only 30 percent of sufferers ever experiencing full recovery. While current medications can control most psychotic symptoms, their side effects can leave individuals so severely impaired that the disease ranks among the top ten causes of disability in developed countries.
Now, in this week’s issue of the Proceedings of the National Academy of Sciences, Thomas Albright and Ricardo Gil-da-Costa of the Salk Institute for Biological Studies describe a model system that completes the bridge between cellular and human studies of schizophrenia, an advance that should help speed the development of therapeutics for schizophrenia and other neurological disorders.
"Part of the terror of schizophrenia is that the brain can’t properly integrate sensory information, so the world is a disorientating series of unrelated bits of input," says Albright, the Conrad T. Prebys Chair in Vision Research. "We’ve created a model that tests the ability to do sensory integration, which should be extremely useful for pharmaceutical research."
Currently, over 1.1 percent of the world’s population has schizophrenia, with an estimated three million individuals in the United States alone. The economic cost is high: In 2002, Americans spent nearly $63 billion on treatment and disability management. The emotional cost is higher still: Ten percent of those with schizophrenia are driven to commit suicide by the burden of coping with the disease.
Initially, it was thought that excessive amounts of the neurotransmitter dopamine caused psychotic symptoms, and indeed, current anti-psychotic drugs work by blocking dopamine from entering brain cells. But nearly all of these drugs have severe cognitive side effects, which led researchers to speculate that some other mechanism must also be involved.
A major clue to understanding schizophrenia came with the development of phencyclidine (PCP) in 1956. It was intended to keep patients safely asleep during surgeries, but many woke up with symptoms similar to those experienced by people with schizophrenia, including hallucinations and the disorientation of feeling “dissociated” from their limbs, resulting in PCP being abandoned for clinical purposes. A decade later, it was replaced by a derivative called ketamine. At doses high enough to put patients to sleep, ketamine is an effective anesthetic. At lower doses, it temporarily produces the same schizophrenia-like effects as PCP.
The two drugs are part of a class called N-methyl-D-aspartate (NMDA) receptor antagonists. Essentially, they work by gumming up the receptors through which glutamate, the brain’s main excitatory neurotransmitter, excites brain cells. Thus, it became clear that glutamate dysfunction accounts for some of the symptoms of psychosis, and that dopamine is probably not the full story.
"While dopamine has limited reach in the brain, any dysfunction in glutamate would be expected to have the sort of widespread effects we see in the perceptual disorders associated with schizophrenia," says Albright. "Nevertheless, which neurotransmitter was primary to these disorders—glutamate or dopamine—has been argued about for years."
Standing in the way of a definitive answer was a researcher’s Catch-22: Many experiments designed to understand cognitive disorders such as schizophrenia or Alzheimer’s require a participant’s conscious attention, yet these disorders interfere with attention.
To get around this, scientists turned to electroencephalograms (EEGs), which can detect changes even when a subject is not consciously paying attention to a stimulus, by recording the brain’s electrical signals through electrodes placed in a scalp cap. In one test, a series of tones is played, but an “oddball” tone breaks the pattern in the sequence. A healthy brain can still easily spot the difference, even if the participant is concentrating on another task, such as reading a magazine.
"The test works because the brain is a prediction machine: it’s built to anticipate what should come next," says Albright. "If you have healthy working memory, you should be able to perceive a pattern and notice when something violates it, but patients suffering from some mental health disorders lack that basic ability."
In their latest research, Albright’s team detected the difference through two signals, event-related brain potentials called mismatch negativity (MMN) and P3. The MMN reflects differential brain activity to the detected oddball tone, below the level of conscious awareness. P3 picks up the next phase: a subject’s attention orientation to the oddball tone.
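The prediction-machine logic behind the oddball test can be sketched in a few lines: track the most frequent tone heard so far and flag any tone that violates that expectation, loosely analogous to what MMN indexes below the level of awareness. The tone values and function below are illustrative assumptions, not the study’s actual stimuli or analysis:

```python
from collections import Counter

# Hypothetical oddball sequence: mostly "standard" tones, rare deviants.
STANDARD, ODDBALL = 1000, 1200  # tone frequencies in Hz (illustrative)
sequence = [STANDARD] * 7 + [ODDBALL] + [STANDARD] * 4 + [ODDBALL]

def flag_deviants(tones):
    """Predict the modal tone heard so far; flag tones that break the
    pattern, loosely analogous to an MMN response to an oddball."""
    deviants = []
    seen = Counter()
    for i, tone in enumerate(tones):
        if seen and tone != seen.most_common(1)[0][0]:
            deviants.append(i)  # expectation violated at this position
        seen[tone] += 1
    return deviants

print(flag_deviants(sequence))  # -> [7, 12]: the two oddball positions
```

Note that the detector needs no instruction about which tone is the oddball; the violation is defined purely by the statistics of what came before, which is why the paradigm works even when a participant’s attention is elsewhere.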
Still, a gap in understanding remained. While scientists could do cellular work in animal models on the role of dopamine versus glutamate, and they could do EEGs in human beings, a bridge between the two remained elusive. Such a bridge can deepen scientists’ understanding of how healthy and disordered brains work, from the cellular level all the way up to the interactions among brain areas. Moreover, it can enable preclinical and clinical trials that link the cellular and systems levels in pursuit of successful therapeutic avenues.
Gil-da-Costa has at last crossed the bridge by crafting the first non-invasive scalp EEG setup that records accurately from the brains of non-human primates, with the same proportional density of electrodes as a human cap and no distortions in signal caused by an incorrect fit. This setup allows him to get accurate measurements of MMN and P3, with the same protocols that are followed in humans. As a result, the lab has come closer than ever before to untangling the roles of dopamine and glutamate.
"While rodents are essential for understanding mechanisms at a cellular or molecular level, at a higher cognitive level, the best you could do was a sort of rough analogy. Now, finally, we can have a one-to-one correspondence," says Gil-da-Costa. "For sensory integration, our findings with this model support the glutamate hypothesis."
Pharmaceutical companies are interested in the model because of the potential for more precise testing and the universality of the MMN/P3 assays. “These brain markers are the same across dozens of neurological diseases, as well as brain trauma, so you can test potential therapies not just for schizophrenia, but for conditions such as Parkinson’s, Alzheimer’s, bipolar disorder, and traumatic brain injuries,” says Gil-da-Costa. “We hope this will help begin a new era in neurological therapeutics.”
(Source: salk.edu)
Study is the first to find functional MRI differences in working memory in people with primary insomnia

A new brain imaging study may help explain why people with insomnia often complain that they struggle to concentrate during the day even when objective evidence of a cognitive problem is lacking.
"We found that insomnia subjects did not properly turn on brain regions critical to a working memory task and did not turn off ‘mind-wandering’ brain regions irrelevant to the task," said lead author Sean P.A. Drummond, PhD, associate professor in the department of psychiatry at the University of California, San Diego, and the VA San Diego Healthcare System, and Secretary/Treasurer of the Sleep Research Society. "Based on these results, it is not surprising that someone with insomnia would feel like they are working harder to do the same job as a healthy sleeper."
The research team led by Drummond and co-principal investigator Matthew Walker, PhD, studied 25 people with primary insomnia and 25 good sleepers. Participants had an average age of 32 years. The study subjects underwent a functional magnetic resonance imaging scan while performing a working memory task.
Results published in the September issue of the journal Sleep show that participants with insomnia did not differ from good sleepers in objective cognitive performance on the working memory task. However, the MRI scans revealed that people with insomnia could not modulate activity in brain regions typically used to perform the task.
As the task got harder, good sleepers used more resources within the working memory network of the brain, especially the dorsolateral prefrontal cortex. Insomnia subjects, however, were unable to recruit more resources in these brain regions. Furthermore, as the task got harder, participants with insomnia did not dial down the “default mode” regions of the brain that are normally only active when our minds are wandering.
"The data help us understand that people with insomnia not only have trouble sleeping at night, but their brains are not functioning as efficiently during the day," said Drummond. "Some aspects of insomnia are as much of a daytime problem as a nighttime problem. These daytime problems are associated with organic, measurable abnormalities of brain activity, giving us a biological marker for treatment success."
According to the authors, the study is the largest to examine cerebral activation with functional MRI during cognitive performance in people with primary insomnia, relative to well-matched good sleepers. It also is the first to characterize functional MRI differences in working memory in people with primary insomnia.
The American Academy of Sleep Medicine reports that about 10 to 15 percent of adults have an insomnia disorder with distress or daytime impairment. Most often insomnia is a comorbid disorder occurring with another problem such as depression or chronic pain, or caused by a medication or substance. Fewer people suffering from insomnia are considered to have primary insomnia, which is defined as a difficulty falling asleep or maintaining sleep in the absence of a coexisting condition.
(Source: eurekalert.org)