Posts tagged brain

Avoid impulsive acts by imagining future benefits
Why is it so hard for some people to resist the least little temptation, while others seem to possess incredible patience, passing up immediate gratification for a greater long-term good?
The answer, suggests a new brain imaging study from Washington University in St. Louis, lies in how effective people are at feeling good right now about all the future benefits that may come from passing up a smaller immediate reward. Researchers found that activity in two regions of the brain distinguished impulsive and patient people.
“Activity in one part of the brain, the anterior prefrontal cortex, seems to show whether you’re getting pleasure from thinking about the future reward you are about to receive,” explains study co-author Todd Braver, PhD, professor of psychology in Arts & Sciences. “People can relate to this idea that when you know something good is coming, just that waiting can feel pleasurable.”
The study, which was published in the first issue of the Journal of Neuroscience this year, was designed to examine what happens in the brain as people wait for a reward, especially whether people characterized as “impulsive” would show different brain responses than those considered “patient.”
The lead author of the study was Koji Jimura, then a postdoctoral researcher in Braver’s Cognitive Control and Psychopathology Laboratory, and now a research associate professor at the Tokyo Institute of Technology, in Japan.
Unlike previous research on delayed gratification, which had people choose between hypothetical monetary rewards over long delays (e.g., $500 now or $1,000 a year from now), this Washington University study presented its participants with real rewards of squirts of juice, which they chose to receive either immediately or after a delay of up to a minute.
“It’s kind of funny because we treated the people in our study like researchers that work with animals do, and we actually squirted juice into their mouths,” Braver says.
Results show that a brain region called the ventral striatum (VS) ramped up its activity in impulsive people as they got closer and closer to receiving their delayed reward. The VS activity of patient people, on the other hand, stayed more constant.
The researchers interpreted these different brain responses to mean that impulsive people initially did not find the prospect of waiting for a reward very appealing. However, as they approached the time they’d receive that reward, they became more excited and their VS reflected that excitement.
“This gradual increase may reflect impatience or excessive anticipation of the upcoming reward in impulsive individuals,” says Jimura. This was unlike patient people, who were likely content with waiting for the reward from the start, as no changes in VS activity were observed for them.
The most novel finding of the study concerned the anterior prefrontal cortex (aPFC), the part of the brain that helps you think about the future. Here, the researchers found that patient people showed heightened activity in the aPFC when they first started waiting for their reward, which then decreased as the time to receive the reward approached. Impulsive people didn’t show this brain activity pattern.
“The aPFC appears to allow you to create a mental simulation of the future. It helps you consider what it’ll be like getting the future reward. In this way, you can get access to the utility and satisfaction in the present,” says Braver.
By thinking about the future reward, patient people were able to gain what economists call “anticipatory utility.” While their reward was far away in time, they were giddy with anticipation in the present. Conversely, impulsive people weren’t thinking beyond the present and so did not feel pleasure when they were told they had to wait. Their excitement built only as they got closer to receiving their reward.
Overall, this study suggests that people may be impulsive because they do not or cannot imagine the future, so they prefer rewards right away. This research could be useful for assessing the effects of clinical treatments for impulsivity problems, which can lead to issues such as problem gambling and substance abuse disorders. A brain imaging approach similar to the one used in the Washington University study could allow clinicians to track the effects of an intervention not only on changes in impulsive behavior but also on changes in patients’ brain responses.
“One possible treatment approach could be to enhance mental functions in aPFC, a brain region well-known to be associated with cognitive control,” says Jimura. By increasing cognitive control, impulsive patients could learn to reject their immediate impulses.
Impulsivity occurs not only in a clinical setting but also every day in our own lives. Applying his research to his personal life, Braver says, “When I’m successful at achieving long-term goals it’s from explicitly trying to activate that goal and imagining each decision as helping me achieve it, to keep me on track.” Perhaps adopting this strategy of focusing on the long-term could help us move past present distractions and move toward our future goals.
Surprisingly, yes.
The modern lobotomy originated in the 1930s, when doctors realized that by severing fiber tracts connected to the frontal lobe, they could help patients overcome certain psychiatric problems, such as intractable depression and anxiety. Over the next two decades, the procedure became simple and popular, performed by inserting a sharpened tool above the eyeball. According to one study, about two thirds of patients showed improvement after surgery.
Unfortunately, not all lobotomy practitioners were responsible, and the technique left some patients with severe side effects, including seizures, lethargy, changes in personality, and incontinence. In response, doctors refined their techniques, replacing the lobotomy with more specialized approaches: the cingulotomy, the anterior capsulotomy, and the subcaudate tractotomy. Studies of these procedures found evidence of benefit for at least one fourth of patients suffering from problems such as OCD and depression.
Even with the risk of side effects, those in the field still say the procedures were by and large successful. “I feel that the principle behind ablative surgery was somewhat exonerated by the research findings, which showed that it worked for very specific indications,” says Konstantin Slavin, president of the American Society for Stereotactic and Functional Neurosurgery, and professor at the University of Illinois at Chicago.
By the 1980s, lobotomies had fallen out of fashion. “In general, the entire functional neurosurgery field moved away from destruction—from ablative surgery,” Slavin says. A then-new technique called deep-brain stimulation made ablative surgery obsolete. In the procedure, a surgeon drills holes in the head and inserts electrodes into the neural tissue. When current passes through the leads, they activate or inactivate patches of the brain. “The attractive part is that we don’t destroy the tissue,” Slavin says. Doctors can also adjust treatment if a patient suffers side effects. They can turn the current down or suspend it altogether—so as to “give the brain a holiday,” as Slavin calls it.
Most deep-brain stimulation is now used to treat movement disorders such as Parkinson’s disease. Deep-brain stimulation for OCD is FDA-approved but reserved only for extreme cases. Slavin and his colleagues have been examining broader uses in an ongoing study. “Within the next five years, we hope we’ll have a definitive answer of whether or not it works.”

Awake imaging device moves diagnostics field forward
A technology being developed at the Department of Energy’s Oak Ridge National Laboratory promises to provide clear images of the brains of children, the elderly and people with Parkinson’s and other diseases without the use of uncomfortable or intrusive restraints.
Awake imaging uses motion-compensated reconstruction, which removes blur caused by motion, allowing physicians to get a clear picture of the functioning brain without anesthetics that can mask conditions and alter test results. Anesthetics and patient restraints are not ideal because they can trigger brain activity that alters the normal brain functions being studied.
With this new capability, researchers hope to better understand brain development in babies, pre-teens and teenagers. In addition, they believe the technology will provide unprecedented insight into conditions such as autism, drug addiction, alcoholism, traumatic brain injuries and Alzheimer’s disease.
"With this work, we’re hoping to establish a new paradigm in noninvasive diagnostic imaging," said Justin Baba, a biomedical engineer who heads the ORNL development team.
The study, which was performed in collaboration with Thomas Jefferson National Accelerator Laboratory and Johns Hopkins University, utilized an awake imaging scanner and awake, unanesthetized, unrestrained mice that had been injected with a radiotracer known as DaTSCAN, provided by GE-Medical.
With awake imaging using DaTSCAN and other molecular probes, Baba and colleagues envision development of new, more effective therapies for a wide assortment of conditions and diseases while also contributing to pharmaceutical drug discovery, development and testing. The technology could also help with real-time stabilization and registration of targets during surgical intervention.
Baba noted that this technical accomplishment, detailed in a paper published in The Journal of Nuclear Medicine, has its origins in past DOE-supported research on biomedical imaging. The paper is titled “Conscious, Unrestrained Molecular Imaging of Mice with AwakeSPECT.” Jim Goddard of ORNL’s Measurement Science and Systems Engineering Division is a co-author.
While a working prototype scanner is located at Johns Hopkins School of Medicine, ORNL is pursuing commercialization of the technology.
A “light switch” in the brain illuminates neural networks
Researchers from NTNU’s Kavli Institute for Systems Neuroscience are able to see which cells communicate with each other in the brain by flipping a neural light switch. The results of their efforts are presented in an article in the 5 April issue of the journal Science.
There are cells in your brain that recognize very specific places, and have that and nothing else as their job. These cells, called place cells, are found in an area behind your temple called the hippocampus. Place cells must receive information from nearby cells to do their job, but so far no one has been able to determine exactly which kinds of cells work with place cells to craft the code they create for each location. Neurons come in many different types with specialized functions. Some respond to edges and borders, others to specific locations, and others act like a compass, reacting to which way you turn your head.
Now, researchers at the Kavli Institute for Systems Neuroscience have developed a range of advanced techniques that enable them to identify which neurons communicate with each other at different times in the rat brain, and in doing so, create the animal’s sense of direction.
"A rat’s brain is the size of a grape. Inside there are about fifty million neurons that are connected at roughly 450 billion junctions," explains Professor Edvard Moser, director of the Kavli Institute. "Inside this grape-sized brain are areas on each side that are smaller than a grape seed, where we know that memory and the sense of location reside. This is also where we find the neurons that respond to specific places, the place cells. But from which cells do these place cells get information?"
From spaghetti to light switches
The problem is, of course, that researchers cannot simply cut open the rat brain to see which cells have had contact. That would be the equivalent of taking a giant pile of cooked spaghetti, chopping it into little pieces, and then trying to figure out how the various spaghetti strands were tangled together before the pile was cut up.
A job like this requires the use of a completely different set of neural tools, which is where the “light switches” come into play.
Neurons share many similarities with electric cables when they send signals to each other. They send an electric current in one direction – from the “body” of the neuron and down a long arm, called the axon, which goes to another nerve cell next in line. Place cells thus get their small electric signals from a whole series of such arms.
So how do light switches play into all of this?
Viruses do the work
“What we did first was to give these nerve arms a harmless viral infection,” Moser says. “We designed a unique virus that does not cause disease, but that acts as a pathway for delivering genes to specific cells. The virus creeps into the neurons, crawls up against the electric current, and uses the nerve cell’s own factory to carry out the genetic recipe that we gave it to carry.”
The genetic recipe enabled the cell to make the equivalent of a light switch. Our eyes actually contain the same kind of biological light switch, which allows us to see. The viral infection makes neurons that had previously existed only in darkness, deep inside the brain, sensitive to light.
Then the researchers inserted optical fibres in the rat’s brain to transmit light to the place cells that had light switches in them. They also implanted thin microelectrodes down between the cells so they could detect the signals sent through the axons every time the light from the optical fibre was turned on.
"Now we had everything set up, with light switches installed in cells around the place cells, a lamp, and a way to record the activity," Moser said.
10,000 times
The researchers then turned the lights on and off more than ten thousand times in their rat lab partners, while they monitored and recorded the activity of hundreds of individual cells in the rats’ grape-sized brains. The researchers did this research while the rats ran around in a metre-square box, gathering treats. As the rats explored their box and found the treats, the researchers were able to use the light-sensitive cells to reveal how the rat’s brain created the map of where the rat had been.
When the researchers put together all the information afterwards, they concluded that a whole range of different specialized cells together provide place cells with their information. The brain’s GPS – its sense of place – is created by signals from head direction cells, border cells, grid cells, and cells with no known role in creating location points. Place cells receive not only information about the rat’s surroundings and landmarks, but also continuously updated information about the rat’s own movement, which is independent of sensory input.
"The biggest mystery is the role that the cells that are not part of the sense of direction play. They send signals to place cells, but what do they actually do?" wonders Moser.
"We also wonder how the cells in the hippocampus are able to sort out the various signals they receive. Do they ‘listen’ to all of the cells equally effectively all the time, or are there some cells that get more time than others to ‘talk’ to place cells?"

Brain-imaging tool and stroke risk test help identify cognitive decline early
The connection between stroke risk and cognitive decline has been well established by previous research. Individuals with higher stroke risk, as measured by factors like high blood pressure, have traditionally performed worse on tests of memory, attention and abstract reasoning.
The current small study demonstrated that not only stroke risk, but also the burden of plaques and tangles, as measured by a UCLA brain scan, may influence cognitive decline.
The imaging tool used in the study was developed at UCLA and reveals early evidence of amyloid beta “plaques” and neurofibrillary tau “tangles” in the brain — the hallmarks of Alzheimer’s disease.
The study, published in the April issue of the Journal of Alzheimer’s Disease, demonstrates that taking both stroke risk and the burden of plaques and tangles into account may offer a more powerful assessment of factors determining how people are doing now and will do in the future.
"The findings reinforce the importance of managing stroke risk factors to prevent cognitive decline even before clinical symptoms of dementia appear," said first author Dr. David Merrill, an assistant clinical professor of psychiatry and biobehavioral sciences at the Semel Institute for Neuroscience and Human Behavior at UCLA.
This is one of the first studies to examine both stroke risk and plaque and tangle levels in the brain in relation to cognitive decline before dementia has even set in, Merrill said.
According to the researchers, the UCLA brain-imaging tool could prove useful in tracking cognitive decline over time and offer additional insight when used with other assessment tools.
For the study, the team assessed 75 people who were healthy or had mild cognitive impairment, a risk factor for the future development of Alzheimer’s. The average age of the participants was 63.
The individuals underwent neuropsychological testing and physical assessments to calculate their stroke risk using the Framingham Stroke Risk Profile, which examines age, gender, smoking status, systolic blood pressure, diabetes, atrial fibrillation (irregular heart rhythm), use of blood pressure medications, and other factors.
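The Framingham profile is a points-based score built from the factors listed above. The sketch below shows the general shape of such a calculation; the point values are illustrative placeholders only, not the published Framingham coefficients:

```python
# Illustrative sketch of a points-based stroke-risk score in the style of the
# Framingham Stroke Risk Profile. All point values below are PLACEHOLDERS for
# illustration -- they are NOT the published Framingham coefficients.

def stroke_risk_points(age, systolic_bp, smoker, diabetes,
                       atrial_fibrillation, on_bp_meds):
    """Sum illustrative points for each risk factor (placeholder weights)."""
    points = 0
    points += max(0, (age - 54) // 3)            # older age adds points
    points += max(0, (systolic_bp - 105) // 10)  # higher pressure adds points
    points += 3 if smoker else 0
    points += 2 if diabetes else 0
    points += 4 if atrial_fibrillation else 0
    points += 2 if on_bp_meds else 0
    return points

# Example: a 63-year-old non-smoker with systolic BP of 135 and no other factors
print(stroke_risk_points(63, 135, False, False, False, False))  # prints 6
```

In the real profile, the total is then mapped to a 10-year stroke probability via published lookup tables stratified by sex.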
In addition, each participant was injected with a chemical marker called FDDNP, which binds to deposits of amyloid beta plaques and neurofibrillary tau tangles in the brain. The researchers then used positron emission tomography (PET) to image the brains of the subjects — a method that enabled them to pinpoint where these abnormal proteins accumulate.
The study found that greater stroke risk was significantly related to lower performance in several cognitive areas, including language, attention, information-processing speed, memory, visual-spatial functioning (e.g., ability to read a map), problem-solving and verbal reasoning.
The researchers also observed that FDDNP binding levels in the brain correlated with participants’ cognitive performance. For example, volunteers who had greater difficulties with problem-solving and language displayed higher levels of the FDDNP marker in areas of their brain that control those cognitive activities.
"Our findings demonstrate that the effects of elevated vascular risk, along with evidence of plaques and tangles, are apparent early on, even before vascular damage has occurred or a diagnosis of dementia has been confirmed," said the study’s senior author, Dr. Gary Small, director of the UCLA Longevity Center and a professor of psychiatry and biobehavioral sciences who holds the Parlow–Solomon Chair on Aging at UCLA’s Semel Institute.
Researchers found that several individual factors in the stroke assessment stood out as predictors of decline in cognitive function, including age, systolic blood pressure and use of blood pressure–related medications.
Small noted that the next step in the research would be studies with a larger sample size to confirm and expand the findings.

New Genetic Evidence Suggests a Continuum Among Neurodevelopmental and Psychiatric Disorders
A paper published this month in the prestigious medical journal The Lancet Neurology suggests that a broad spectrum of developmental and psychiatric disorders, ranging from autism and intellectual disability to schizophrenia, should be conceptualized as different manifestations of a common underlying denominator, “developmental brain dysfunction,” rather than completely independent conditions with distinct causes.
In “Developmental Brain Dysfunction: Revival and Expansion of Old Concepts Based on New Genetic Evidence,” the authors make two key points:
According to Andres Moreno De Luca, M.D., research scientist at the Autism and Developmental Medicine Institute at Geisinger Health System and article co-author, “Recent genetic studies conducted in thousands of individuals have shown that identical genetic mutations are shared among neurodevelopmental disorders that are thought to be clinically distinct. What we have seen over the past few years is that genetic mutations that were initially found in individuals with one disorder, such as intellectual disability or autism, are then identified in people with an apparently different condition like schizophrenia, epilepsy, or bipolar disorder.”
“It turns out that the genes don’t respect our diagnostic classification boundaries, but that really isn’t surprising given the overlapping symptoms and frequent co-existence of neurodevelopmental disorders,” said Scott M. Myers, M.D., autism specialist at Geisinger Health System and article co-author.
“We believe this study supports use of the term ‘developmental brain dysfunction’ or DBD, which would encompass the broad spectrum of neurodevelopmental and neuropsychiatric disorders,” said David H. Ledbetter, Ph.D., executive vice president and chief scientific officer at Geisinger Health System, and article co-author. “Additionally, it is clear that diagnostic tools such as whole genome analysis for both children and their families are essential when diagnosing and treating these disorders in order to ensure the most personalized treatment.”
An example used in the study was analysis of intelligence quotient (IQ) scores. The average IQ score in the general population is 100. Historically, the medical community has defined intellectual disability as an IQ of less than 70 (with concurrent deficits in adaptive functioning). But according to Dr. Ledbetter, there is little difference in the function of a child with an IQ of 69 versus 71, yet one may be diagnosed with a disability and the other may not.
“We know a variety of factors contribute to IQ score, including genetics, as a child’s IQ is highly correlated with that of his or her parents and siblings. Therefore, an important factor to take into consideration when interpreting IQ is family background,” said Dr. Ledbetter. “Imagine if we have a child with a genetic abnormality, but the child’s IQ is 85. Technically, we would not diagnose this child with a disability. However, if the family of this child has IQs around 130, we could consider that this child’s genetic anomaly has ‘cost’ him or her 45 IQ points – a very substantial difference.”
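The arithmetic behind Dr. Ledbetter’s example is simply the gap between the familial expectation and the observed score. A minimal sketch (the function name is ours, not from the study):

```python
def iq_cost(expected_family_iq, observed_iq):
    """Estimate IQ points 'lost' to a genetic anomaly, measured against
    the familial background rather than the population mean of 100."""
    return expected_family_iq - observed_iq

# The article's example: family IQs around 130, child's IQ of 85
print(iq_cost(130, 85))  # prints 45 -- relative to family background
print(iq_cost(100, 85))  # prints 15 -- relative to the population mean
```

The contrast between the two calls is the point of the example: against the population mean the child looks only mildly affected, while against the family background the anomaly appears far more costly.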
According to Dr. Myers, “One implication of this concept is that studies designed to investigate the causes and mechanisms of developmental brain dysfunction should focus on measurement of quantifiable neuropsychological and neurobehavioral traits across groups of individuals with different clinical diagnoses. Another is that whenever possible, individuals with a particular genetic variant or other risk factor should be compared to their unaffected family members, not just to population norms.”
The feeling of hunger itself may protect against Alzheimer’s disease, according to a study published today in the journal PLOS ONE. Interestingly, the results of this study in mice suggest that mild hunger pangs, and related hormonal pathways, may be as important to the much-discussed value of “caloric restriction” as actually eating less.

Caloric restriction is a regimen where an individual consumes fewer calories than average, but not so few that they become malnourished. Studies in many species have suggested that it could protect against neurodegenerative disorders and extend lifespans, but the effect has not been confirmed in human randomized clinical trials.
Efforts to understand how cutting calories may protect the brain have grown increasingly important with news that American Alzheimer’s deaths are increasing, and because the best available treatments only delay onset in a subset of patients.
Study authors argue that hormonal signals are the middlemen between an empty gut and the perception of hunger in the brain, and that manipulating them may effectively counter age-related cognitive decline in the same way as caloric restriction.
“This is the first paper, as far as we are aware, to show that the sensation of hunger can reduce Alzheimer’s disease pathology in a mouse model of the disease,” said Inga Kadish, Ph.D., assistant professor in the Department of Cell, Developmental and Integrative Biology (CDIB) within the School of Medicine at the University of Alabama at Birmingham. “If the mechanisms are confirmed, hormonal hunger signaling may represent a new way to combat Alzheimer’s disease, either by itself or combined with caloric restriction.”
The team theorizes that feeling hungry creates mild stress. That, in turn, fires up metabolic signaling pathways that counter plaque deposits known to destroy nerve cells in Alzheimer’s patients. The idea is an example of hormesis theory, where damaging stressors like starvation are thought to be good for you when experienced to a lesser degree.
To study the sensation of hunger, the research team analyzed the effects of the hormone ghrelin, which is known to make us feel hungry. They used a synthetic ghrelin agonist in pill form, which let them control the dose so that treated mice felt steadily, mildly hungry.
If it could be developed, a treatment targeting biochemical pathways downstream of hunger signals might help delay cognitive decline without consigning people to a life of feeling hungry. Straight caloric restriction would not be tolerable for many people over the long run, but manipulating post-hunger signaling might be.
This line of thinking becomes important because any protective benefit brought about by drugs or diets that mildly adjust post-hunger signals might be most useful if started in those at risk as early in life as possible. Attempts to treat the disease years later – when nerve networks are damaged enough for neurological symptoms to appear – may be too late. In the current study, it was long-term treatment with a ghrelin agonist that improved cognitive performance in mice tested when they had reached an advanced age.
Study details
The study looked at whether or not the feeling of hunger, in the absence of caloric restriction, could counter Alzheimer’s pathology in mice genetically engineered to have three genetic mutations known to cause the disease in humans.
Study mice were divided into three groups: one that received the ‘synthetic ghrelin’ (ghrelin agonist), a second that underwent caloric restriction (20 percent less food) and a third group that was fed normally. Study measures looked at each group’s ability to remember, their degree of Alzheimer’s pathology and their level of related, potentially harmful immune cell activation.
Results of such studies are most appropriately presented in terms of general trends in the data and statistical assessments of how likely those trends would be if only chance were at play, captured in each result’s P value (the smaller the better). Thus, the first formal result of the study is that, in mice with the human Alzheimer’s mutations, both the group treated with the ghrelin agonist LY444711 and the group that underwent caloric restriction performed significantly better in a water maze than mice fed normally (p=0.023).
The water maze is the standard test used to measure mouse memory. Researchers put mice in a pool with an invisible platform on which they could rest, and measured how quickly the mice found the platform in a series of tests. Mice with normal memory will remember where the platform is, and find it more quickly each time they are placed in the pool. Ghrelin agonist-treated mice found the hidden platform 26 percent faster than control mice, with calorically restricted mice doing so 23 percent faster than control mice.
The second result was a measure of the buildup of a protein called amyloid beta in the forebrain, an early step in the destruction of nerve cells that accompanies Alzheimer’s disease. The formal amyloid beta results show that mice either treated with the ghrelin agonist or calorically restricted had significantly less buildup of amyloid beta in the dentate gyrus, a part of the brain central to memory function, than mice fed normally (control, 3.95±0.83%; LY, 2.05±0.26%; CR, 1.28±0.17%; Wilcoxon p=0.04).
The above results translate roughly into a 67 percent reduction of this pathology in caloric-restricted mice as compared to control mice, and a 48 percent reduction of amyloid beta deposits when comparing the ghrelin-treated mice with the control group. These percentages are neither final nor translatable to humans, but are simply meant to convey the idea of “better.”
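These percentages follow directly from the reported group means; a quick check of the arithmetic, using the values as reported here:

```python
def percent_reduction(control_mean, treated_mean):
    """Relative reduction of the treated group's mean vs. control, in percent."""
    return (control_mean - treated_mean) / control_mean * 100

# Mean amyloid beta levels reported above
control, ly, cr = 3.95, 2.05, 1.28

print(round(percent_reduction(control, cr), 1))  # prints 67.6 (caloric restriction)
print(round(percent_reduction(control, ly), 1))  # prints 48.1 (ghrelin agonist)
```

Rounded down, these match the "67 percent" and "48 percent" figures quoted in the text.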
Finally, the team examined differences in immune responses related to Alzheimer’s pathology across the three groups. Microglia are the immune cells of the brain, engulfing and removing invading pathogens and dead tissue. They have also been implicated in several diseases, in which their misplaced activation damages tissue. The team found that mice receiving the ghrelin agonist treatment had reduced levels of microglial activation compared with the control group, similar to the effect of caloric restriction.
The ghrelin agonist used in the study does not lend itself to clinical use and will not play a role in the future prevention of Alzheimer’s disease, said Kadish. It was meant instead to prove a principle that hormonal hunger signaling itself can counter Alzheimer’s pathology in a mammal. The next step is to understand exactly how it achieved this as a prerequisite to future treatment design.
Ghrelin is known to create hunger signals by interacting with the arcuate nucleus in the part of the brain called the hypothalamus, which then sends out signaling neuropeptides that help the body sense and respond to energy needs. Studies already underway in Kadish’s lab seek to determine the potential role of these pathways and related genes in countering disease.
“Our group in the School of Public Health was studying whether or not a ghrelin agonist could make mice hungry as we sought to unravel mechanisms contributing to the life-prolonging effects of caloric restriction,” said David Allison, Ph.D., associate dean for Science in the UAB School of Public Health and the project’s initiator.
“Because of the interdisciplinary nature of UAB, our work with Dr. Allison led to an amazing conversation with Dr. Kadish about how we might combine our research with her longtime expertise in neurology because caloric restriction had been shown in early studies to counter Alzheimer’s disease,” said Emily Dhurandhar, Ph.D., a trainee in the UAB Nutrition Obesity Research Center and first study author. “The current study is the result.”
(Source: uab.edu)

Vitamin P as a potential approach for the treatment of damaged motor neurons
Biologists from the Ruhr-Universität Bochum have explored how to protect the neurons that control movement from dying off. In the journal Molecular and Cellular Neuroscience, they report that the molecule 7,8-dihydroxyflavone, also known as vitamin P, ensures the survival of motor neurons in culture. It sends the survival signal along a different pathway than the molecule Brain-Derived Neurotrophic Factor (BDNF), which was previously considered a candidate for treating motor neuron diseases or spinal cord damage. “Brain-Derived Neurotrophic Factor had only a limited effect when tested on humans, and even had partially negative consequences”, says Prof. Dr. Stefan Wiese from the RUB Work Group for Molecular Cell Biology. “Therefore we are looking for alternative approaches to the treatment of neurodegenerative diseases such as amyotrophic lateral sclerosis.”
Same effect, different mode of action
In previous studies, researchers hypothesised that vitamin P is an analogue of BDNF and thus works in the same way. This theory has been disproved by the team led by Dr. Teresa Tsai and Prof. Stefan Wiese from the Group for Molecular Cell Biology and the Department of Cell Morphology and Molecular Neurobiology headed by Prof. Andreas Faissner. Both substances ensure that isolated motor neurons of the mouse survive in cell culture and grow new processes, but what exactly the molecules trigger at the protein level varies. BDNF activates two signalling pathways, the so-called MAP kinase and PI3K/AKT signal paths. Vitamin P on the other hand makes use only of the latter.
The dose is crucial
However, vitamin P exerted its positive effect on the motor neurons only within a very narrow concentration range. “These results show how important an accurate determination of dose and effect is,” says Prof. Wiese. An overdose of vitamin P reduced the survival effect, and above a certain amount no positive effect occurred at all. The researchers hope that vitamin P could have fewer negative side effects than BDNF. “It is also easier to administer, because vitamin P, in contrast to BDNF, can pass the blood-brain barrier and therefore does not have to be introduced into the cerebrospinal fluid using pumps,” says Wiese.
Speaking a tonal language (such as Cantonese) primes the brain for musical training
Non-musicians who speak tonal languages may have a better ear for learning musical notes, according to Canadian researchers.
Tonal languages, found mainly in Asia, Africa and South America, have an abundance of high and low pitch patterns as part of speech. In these languages, differences in pitch can alter the meaning of a word. Vietnamese, for example, has eleven different vowel sounds and six different tones. Cantonese also has an intricate six-tone system, while English has no tones.
Researchers at Baycrest Health Sciences’ Rotman Research Institute (RRI) in Toronto have found the strongest evidence yet that speaking a tonal language may improve how the brain hears music. While the findings may boost the egos of tonal language speakers who excel in musicianship, they are exciting neuroscientists for another reason: they represent the first strong evidence that music and language – which share overlapping brain structures – have bi-directional benefits!
The findings are published today in PLOS ONE, an international, peer-reviewed open-access science journal.
The benefits of music training for speech and language are already well documented (showing positive influences on speech perception and recognition, auditory working memory, aspects of verbal intelligence, and awareness of the sound structure of spoken words). The reverse – the benefits of language experience for learning music – has largely been unexplored until now.
"For those who speak tonal languages, we believe their brain’s auditory system is already enhanced to allow them to hear musical notes better and detect minute changes in pitch," said lead investigator Gavin Bidelman, who conducted the research as a post-doctoral fellow at Baycrest’s RRI, supported by a GRAMMY Foundation® grant.
"If you pick up an instrument, you may be able to acquire the skills faster to play that instrument because your brain has already built up these auditory perceptual advantages through speaking your native tonal language."
But Bidelman, now assistant professor with the Institute for Intelligent Systems and School of Communication Science & Disorders at the University of Memphis, was quick to dispel the notion that people who speak tonal languages make better musicians. Musicianship requires much more than the sense of hearing and plenty of English-speaking musical icons will put that quick assumption to rest.
That music and language – two key domains of human cognition – can influence each other offers exciting possibilities for devising new approaches to rehabilitation for people with speech and language deficits, said Bidelman.
"If music and language are so intimately coupled, we may be able to design rehabilitation treatments that use musical training to help individuals improve speech-related functions that have been impaired due to age, aphasia or stroke," he suggested. Bidelman added that similar benefits might also work in the opposite direction. Musical listening skills could be improved by designing well-crafted speech and language training programs.
The study
Fifty-four healthy adults in their mid-20s were recruited for the study from the University of Toronto and the Greater Toronto Area. They were divided into three groups: English-speaking trained musicians (instrumentalists), Cantonese-speaking non-musicians, and English-speaking non-musicians. Wearing headphones in a sound-proof lab, participants were tested on their ability to discriminate complex musical notes. They were assessed on measures of auditory pitch acuity and music perception, as well as general cognitive abilities such as working memory and fluid intelligence (abstract reasoning, thinking quickly).
While the musicians demonstrated superior performance on all auditory measures, the Cantonese-speaking non-musicians performed comparably to the musicians on the music and cognitive behavioural tasks, scoring 15 to 20 percent higher than the English-speaking non-musicians.
Bidelman added that not all tonal languages may offer the music listening benefits seen with the Cantonese speakers in his study. Mandarin, for example, has more “curved” tones and the pitch patterns vary with time – which is different from how pitch occurs in music. Musical pitch resembles “stair step, level pitch patterns” which happen to share similarities with the Cantonese language, he explained.

BRAIN Initiative Launched to Unlock Mysteries of Human Mind
Today at the White House, President Barack Obama unveiled the “BRAIN” Initiative — a bold new research effort to revolutionize our understanding of the human mind and uncover new ways to treat, prevent, and cure brain disorders such as Alzheimer’s disease, schizophrenia, autism, epilepsy, and traumatic brain injury.
The NIH Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative is part of a new Presidential focus aimed at revolutionizing our understanding of the human brain. By accelerating the development and application of innovative technologies, researchers will be able to produce a revolutionary new dynamic picture of the brain that, for the first time, shows how individual cells and complex neural circuits interact in both time and space. Long desired by researchers seeking new ways to treat, cure, and even prevent brain disorders, this picture will fill major gaps in our current knowledge and provide unprecedented opportunities for exploring exactly how the brain enables the human body to record, process, utilize, store, and retrieve vast quantities of information, all at the speed of thought.
Why is the NIH BRAIN Initiative needed?
With nearly 100 billion neurons and 100 trillion connections, the human brain remains one of the greatest mysteries in science and one of the greatest challenges in medicine. Neurological and psychiatric disorders, such as Alzheimer’s disease, Parkinson’s disease, autism, epilepsy, schizophrenia, depression, and traumatic brain injury, exact a tremendous toll on individuals, families, and society. Despite the many advances in neuroscience in recent years, the underlying causes of most neurological and psychiatric conditions remain largely unknown, due to the vast complexity of the human brain. If we are ever to develop effective ways of helping people who suffer from these devastating conditions, researchers will first need a more complete arsenal of tools and information for understanding how the brain functions in both health and disease.
Why is now the right time for the NIH BRAIN Initiative?
In the last decade alone, scientists have made a number of landmark discoveries that now create the opportunity to unlock the mysteries of the brain. We have witnessed the sequencing of the human genome, the development of new tools for mapping neuronal connections, the increasing resolution of imaging technologies, and the explosion of nanoscience. These discoveries have yielded unprecedented opportunities for integration across scientific fields. For instance, by combining advanced genetic and optical techniques, scientists can now use pulses of light in animal models to determine how specific cell activities within the brain affect behavior. What’s more, through the integration of neuroscience and physics, researchers can now use high-resolution imaging technologies to observe how the brain is structurally and functionally connected in living humans.
While these technological innovations have contributed substantially to our expanding knowledge of the brain, significant breakthroughs in how we treat neurological and psychiatric disease will require a new generation of tools to enable researchers to record signals from brain cells in much greater numbers and at even faster speeds. This cannot currently be achieved, but great promise for developing such technologies lies at the intersections of nanoscience, imaging, engineering, informatics, and other rapidly emerging fields of science.
How will the NIH BRAIN Initiative work?
Given the ambitious scope of this pioneering endeavor, it is vital that planning for the NIH BRAIN Initiative be informed by a wide range of expertise and experience. Therefore, NIH is establishing a high level working group of the Advisory Committee to the NIH Director (ACD) to help shape this new initiative. This working group, co-chaired by Dr. Cornelia “Cori” Bargmann (The Rockefeller University) and Dr. William Newsome (Stanford University), is being asked to articulate the scientific goals of the BRAIN initiative and develop a multi-year scientific plan for achieving these goals, including timetables, milestones, and cost estimates.
As part of this planning process, input will be sought broadly from the scientific community, patient advocates, and the general public. The working group will be asked to produce an interim report by fall 2013 that will contain specific recommendations on high priority investments for Fiscal Year (FY) 2014. The final report will be delivered to the NIH Director in June 2014.
How will the NIH BRAIN Initiative be supported?
In total, NIH intends to allocate $40 million in FY14. Given the cross-cutting nature of this project, the NIH Blueprint for Neuroscience Research — an initiative spanning 14 NIH Institutes and Centers — will be the leading NIH contributor to its implementation in FY14. Of course, a goal this audacious will require ideas from the best scientists and engineers across many diverse disciplines and sectors. Therefore, NIH is working in close collaboration with other government agencies, including the Defense Advanced Research Projects Agency (DARPA) and the National Science Foundation (NSF). Strong interest has also been expressed by several private foundations, including the Howard Hughes Medical Institute, the Allen Institute for Brain Science, The Kavli Foundation, and the Salk Institute for Biological Studies. Private industry has also expressed a high level of interest in participating in this groundbreaking initiative.