Neuroscience

Articles and news from the latest research reports.

270 notes

Do Patients in a Vegetative State Recognize Loved Ones?

TAU researchers find unresponsive patients’ brains may recognize photographs of their family and friends

image

Patients in a vegetative state are awake, breathe on their own, and seem to go in and out of sleep. But they do not respond to what is happening around them and exhibit no signs of conscious awareness. With communication impossible, friends and family are left wondering if the patients even know they are there.

Now, using functional magnetic resonance imaging (fMRI), Dr. Haggai Sharon and Dr. Yotam Pasternak of Tel Aviv University’s Functional Brain Center and Sackler Faculty of Medicine and the Tel Aviv Sourasky Medical Center have shown that the brains of patients in a vegetative state emotionally react to photographs of people they know personally as though they recognize them.

"We showed that patients in a vegetative state can react differently to different stimuli in the environment depending on their emotional value," said Dr. Sharon. "It’s not a generic thing; it’s personal and autobiographical. We engaged the person, the individual, inside the patient."

The findings, published in PLOS ONE, deepen our understanding of the vegetative state and may offer hope for better care and the development of novel treatments. Researchers from TAU’s School of Psychological Sciences, Department of Neurology, and Sagol School of Neuroscience and the Loewenstein Hospital in Ra'anana contributed to the research.

Talking to the brain

For many years, patients in a vegetative state were believed to have no awareness of self or environment. But in recent years, doctors have made use of fMRI to examine brain activity in such patients. They have found that some patients in a vegetative state can perform complex cognitive tasks on command, like imagining a physical activity such as playing tennis, or, in one case, even answering yes-or-no questions. But these cases are rare and don’t provide any indication as to whether patients are having personal emotional experiences in such a state.

To gain insight into “what it feels like to be in a vegetative state,” the researchers worked with four patients in a persistent (lasting a month or more) or permanent (lasting more than three months) vegetative state. They showed them photographs of people they did and did not personally know, then gauged the patients’ reactions using fMRI, which measures blood flow in the brain to detect areas of neurological activity in real time. In response to all the photographs, a region specific to facial recognition was activated in the patients’ brains, indicating that their brains had correctly identified that they were looking at faces.

But in response to the photographs of close family members and friends, brain regions involved in emotional significance and autobiographical information were also activated in the patients’ brains. In other words, the patients reacted with activations of brain centers involved in processing emotion, as though they knew the people in the photographs. The results suggest patients in a vegetative state can register and categorize complex visual information and connect it to memories – a groundbreaking finding.

The ghost in the machine

However, the researchers could not be sure if the patients were conscious of their emotions or just reacting spontaneously. So they then verbally asked the patients to imagine their parents’ faces. Surprisingly, one patient, a 60-year-old kindergarten teacher who was hit by a car while crossing the street, exhibited complex brain activity in the face- and emotion-specific brain regions, identical to brain activity seen in healthy people. The researchers say her response is the strongest evidence yet that vegetative-state patients can be “emotionally aware.” A second patient, a 23-year-old woman, exhibited activity just in the emotion-specific brain regions. (Significantly, both patients woke up within two months of the tests. They did not remember being in a vegetative state.)

"This experiment, a first of its kind, demonstrates that some vegetative patients may not only possess emotional awareness of the environment but also experience emotional awareness driven by internal processes, such as images," said Dr. Sharon.

Research focused on the “emotional awareness” of patients in a vegetative state is only a few years old. The researchers hope their work will eventually contribute to improved care and treatment. They have also begun working with patients in a minimally conscious state to better understand how regions of the brain interact in response to familiar cues. Emotions, they say, could help unlock the secrets of consciousness.

(Source: aftau.org)

Filed under vegetative state emotion neuroimaging brain activity facial recognition consciousness neuroscience science

180 notes

The brain’s got rhythm: Extracting temporal patterns from visual input

To understand how the brain recognizes speech, appreciates music and performs other higher-level functions, it is necessary to understand how neural systems process temporal information. Recently, scientists at Beijing Normal University studied a simple but powerful network model by which a neural system can extract long-period (several seconds in duration) external rhythms from visual input. Moreover, the study’s findings suggest that a large neural network with a scale-free topology – that is, a network in which the probability distribution of the number of connections between its nodes follows a power law – is analogous to a repertoire where neural loops and chains form the mechanism by which exogenous rhythms are learned. Importantly, their model suggests that the brain does not necessarily require an internal clock to acquire and memorize these rhythms.
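
As an illustration of what scale-free means here (this is not the authors' model code, and the parameters are arbitrary), a power-law-like degree distribution can be grown with a simple preferential-attachment rule:

```python
import random
from collections import Counter

def preferential_attachment(n, m=2, seed=0):
    """Grow a network in which each new node links to m existing nodes
    chosen with probability proportional to their degree, yielding the
    power-law-like degree distribution of a scale-free network."""
    rng = random.Random(seed)
    # Seed graph: a small fully connected core of m + 1 nodes.
    edges = [(i, j) for i in range(m + 1) for j in range(i)]
    pool = [node for edge in edges for node in edge]  # degree-weighted pool
    for new in range(m + 1, n):
        targets = set()
        while len(targets) < m:
            targets.add(rng.choice(pool))  # picks well-connected nodes more often
        for t in targets:
            edges.append((new, t))
            pool.extend((new, t))
    return edges

degree = Counter(v for e in preferential_attachment(2000) for v in e)
hubs = [v for v, d in degree.items() if d >= 30]  # a few highly connected nodes
low = [v for v, d in degree.items() if d <= 4]    # the many low-degree nodes
print(len(hubs), len(low))
```

Most nodes end up with only a handful of connections while a few early nodes accumulate many; in the model described above, these two populations respectively supply the rhythm-holding loops and the synchrony-triggering hubs.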

Prof. Si Wu and Prof. Gang Hu discussed the paper that they and their co-authors recently published in Proceedings of the National Academy of Sciences. “The challenge for generating slow oscillation – that is, on the order of seconds – in a neural system is that the dynamics of single neurons and neuronal synapses are too short,” Wu tells Medical Xpress. “In other words, for an unstructured network, a strong input will typically generate a strong transient response, and hence the system is unable to retain slow oscillation.” To solve this problem, the scientists came up with the idea of using the propagation of activity along a long loop of neurons to hold the rhythm information. “Neurons in the loop need to have low-connectivity degrees to avoid inducing synchronous firing of the network,” Hu adds.

Hu also comments on constructing a network model with scale-free structure. “We knew that a scale-free network had the structure we wanted – namely, it consists of a large number of low-degree neurons which can form different sizes of loops and chains, as well as a few hub neurons which can trigger synchronous firing of the network. Furthermore,” he continues, “we didn’t want hub neurons to be easily elicited; otherwise, the network will always get into epileptic firings.” To solve this problem, the researchers required that the neuronal interactions have the proper form to easily activate low-degree neurons while also making it hard to activate hub neurons. Wu points out that biologically plausible electrical synapses and scaled chemical synapses naturally hold this property.

Wu says that the researchers did not develop innovative techniques in this study. “Our main contribution was to propose a simple and yet effective mechanism for a neural system encoding temporal information,” he explains, noting that this mechanism consists of five key points:

1. Hub neurons, through their massive connections to others, induce synchronous firing of the network

2. Loops of low-degree neurons hold rhythm information, with the loop size deciding the rhythm

3. Proper electrical or scaled chemical neuronal synapses ensure that activating a hub neuron is difficult in comparison with a low-degree neuron – and also avoid epileptic network firing, in which periods of rapid spiking are followed by quiescent (silent) periods

4. A large scale-free network is like a reservoir, containing a large number and various sizes of loops and chains formed by low-degree neurons, and hence can encode a broad range of rhythmic information

5. When an external rhythmic input is presented, the network selects a loop from its reservoir, with the loop size matching the input rhythm – and this matching operation can be achieved by a synaptic plasticity rule
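
The second point can be sketched in a few lines: if a single packet of activity circulates around a closed loop, a readout neuron in that loop fires once per lap, so the loop's size alone fixes the rhythm's period. This is a toy caricature, not the paper's spiking-network model:

```python
def loop_rhythm(loop_size, steps):
    """Propagate one activity packet around a ring of neurons, one hop
    per time step. The readout neuron (index 0) fires once every
    `loop_size` steps, so loop length alone sets the period."""
    active = 0                      # inject activity at neuron 0
    firing_times = []
    for t in range(steps):
        if active == 0:
            firing_times.append(t)  # readout neuron fires
        active = (active + 1) % loop_size
    return firing_times

# A loop of 7 neurons yields a period-7 rhythm; 12 neurons, period 12.
print(loop_rhythm(7, 30))   # [0, 7, 14, 21, 28]
print(loop_rhythm(12, 30))  # [0, 12, 24]
```

A reservoir holding loops of many sizes can therefore represent many periods at once, which is the sense in which the network "contains" a repertoire of rhythms.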

The team’s findings imply that, in terms of neural information processing, a neural system can use loops and chains of connected neurons to hold the memory trace of input information, and that the latter might serve as the substrate for processing temporal events. “These implications for temporal information processing in neural systems have two aspects,” Wu points out. “Firstly, there’s been a long-standing debate on whether the brain has a global clock that counts time and coordinates temporal events. Our study suggests that this is not necessary: By using intrinsic network dynamics, the neural system can process temporal information in a distributed manner.”

Secondly, Wu continues, the brain may not use very complicated strategies to process temporal information; rather, by fully utilizing its enormous number of neurons, it may rely on quite simple ones. “Our study suggests that a large-size scale-free network has various lengths of loops and chains to hold different rhythms of inputs, making information encoding very simple. This is not economically efficient, but it simplifies computation, which could be crucial for animals responding quickly in a naturally competitive environment.”

In the presence of an external rhythmic input, Wu says, the neural system responds and holds the residual activity as a memory trace of the input for a sufficiently long time. If the input is presented repeatedly, neuron pairs that fire together become connected through the biological synaptic plasticity rule, and a loop matching the input rhythm is thereby established.
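
That selection step can be caricatured with a Hebbian counter: each pair of neurons that fire in consecutive time bins of a period-P input has its synapse strengthened, and after repeated presentations the strengthened synapses close into a loop of exactly P neurons. This is a toy sketch under the "fire together, wire together" assumption, not the paper's biologically detailed plasticity rule:

```python
def hebbian_loop(period, presentations):
    """Strengthen the synapse between each pair of consecutively firing
    neurons while a rhythmic input of the given period is replayed."""
    weights = {}  # (pre, post) -> accumulated strength
    for _ in range(presentations):
        for t in range(period):
            pair = (t, (t + 1) % period)  # fire together, wire together
            weights[pair] = weights.get(pair, 0) + 1
    return sorted(weights), max(weights.values())

loop, strength = hebbian_loop(5, 10)
print(loop)      # [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)] -- a closed loop of 5
print(strength)  # 10 -- each synapse reinforced once per presentation
```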

Hu tells Medical Xpress that the network topology is not required to be perfectly scale-free, but rather that the network consists of a few neurons having many connections and a large number of neurons with few connections. “For the convenience of analysis, we considered a scale-free network in which the distribution of neuronal connections satisfies a power law. However, in practice, we don’t need such a strong condition. Rather, what we really need is a large number of low-degree neurons forming loops and chains, and a few hub neurons triggering synchronous firing. In other words, scale-free topology is a sufficient, but not a necessary, condition for our model to work.” Although the researchers focused on the visual system and have not applied their model to the auditory system, Hu suspects that it can be applied to the latter, where temporal processing is more critical.

Moving forward, the scientists’ next step is to build large networks having a similar structure but with more realistic neurons and synapses. “Based on this model,” Wu concludes, “we can explore how temporal information encoded in the way proposed in our model is involved in higher brain functions. Moreover, other dynamical systems which generate slow oscillation and need to hold temporal information by network dynamics might benefit from our study.”

Filed under neurons auditory system neural system synapses neural networks neuroscience science

106 notes

Turning off major memory switch dulls memories

A faultily formed memory sounds like hitting random notes on a keyboard while a proper one sounds more like a song, scientists say.

image

When they turned off a major switch for learning and memory, brain cells communicated, but the relationship was superficial, said Dr. Joe Tsien, neuroscientist at the Medical College of Georgia at Georgia Regents University and Co-Director of the GRU Brain & Behavior Discovery Institute.

“We have begun to crack the neural code, which allows us to look in real time at how thoughts happen and how memories are made,” Tsien said. “That has enabled us to understand for the first time how and whether the right keys are struck at the right time and in the right place and manner to make the beautiful sound of coherent memories and to compare what happens when a key element is missing.”

With the NMDA receptor intact, chatter reverberates, associations are made and helpful memories – like how touching a hot stove results in a burn – are easily retrieved.

“You see a face and think of a name, you see your office, and you think you need to work; everything is associative,” said Tsien, corresponding author of the study in the journal PLOS ONE. “But in mice lacking an NMDA receptor, you can tell the memory patterns are dull and dissociated.”

Using the century-old Pavlovian conditioning model that first showed how repetition creates association, they found that mice lacking a functioning NMDA receptor in the hippocampus, the brain’s center of learning and memory, could not recollect even something fearful.

When they played a tone, followed 20 seconds later by a mild foot shock, normal mice quickly made the association, down to the timing. The connection essentially never registered with mice lacking the NMDA receptor.

image: Healthy brain recalling memories vs. amnesic brain recalling contextual memories

“They form the initial patterns, but don’t rehearse them,” said Tsien. “Their tones are flat, the association is poor, while everything we register in the healthy brain is associative.” To illustrate just how flat, Postdoctoral Fellow Hui Kuang assigned musical notes to the memory activity of each group, which resulted in random noise from the NMDA knockout mice compared with a dynamic rhythm from normal mice.
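
That sonification step is easy to picture in code. The sketch below is hypothetical — the article does not describe Kuang's actual mapping — and only illustrates the principle: assign each time bin's population firing count to a note in a pentatonic scale, so a repeating activity pattern produces a repeating melody while uncorrelated activity produces none.

```python
# Hypothetical mapping from neural activity to notes (illustration only).
SCALE = ["C", "D", "E", "G", "A"]  # pentatonic scale

def sonify(firing_counts):
    """Map each time bin's firing count to a note (count modulo scale size)."""
    return [SCALE[c % len(SCALE)] for c in firing_counts]

rehearsed = [1, 3, 0, 2] * 3  # the same pattern replayed (reverberation)
print(sonify(rehearsed))      # a repeating D-G-C-E melody
```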

“By knowing what these patterns look like and what they mean, you can use this signature to measure, for example, during aging, why we begin to lose memory and to identify and test drugs that are truly effective at aiding memory,” Tsien said.

“You can tell whether there is an issue with reverberation, whether your brain is repeating what you need to remember, or repeats it but somehow stores it badly, so it’s not associated with the right things. This study has revealed a lot of fascinating details about what neuroscientists call the brain’s neural code,” Tsien said.

He wants to look at how aging affects these processes as a next step. The research team also is looking at Doogie, a mouse genetically bred by Tsien and his team in 1999 to be exceptionally smart, to see if they can also learn more about how super memories are made and what they look like.

This ability to decode how and what the brain is remembering should one day help physicians better assess and treat conditions such as Alzheimer’s and schizophrenia, Tsien said. Some answers may already be out there, such as drugs that boost reverberation, or a stimulant like caffeine to help retrieve a memory.

His team first reported decoding brain cell conversations as memories were formed and recalled in PLOS ONE in 2009. As with the new study, they used a computational algorithm to translate the neuronal conversations into some of the first pictures of what memories look like.

Filed under memory NMDA receptor brain cells neuroscience science

248 notes

Repairing mitochondria in neurodegenerative disease
The relationship between fine-scale structure and function in the brain is perhaps best explored today through the study of neurodegenerative disease. Disorders like Rett syndrome may be considered developmental in origin (defined by exotic mechanisms including X-linked inactivation, DNA methylation, and genomic imprinting), but even here, the larger physical pathology evolves through the course of life and continues to be revealed almost anywhere researchers look. When diseases directly involve inputs to the brain, like vitamins or diet, and can also be controlled by them, things get even more interesting. More often than not, these disorders have a clear genetic component, are frequently linked to the mitochondria, and lead to progressive and often perplexing deficits of movement. One such enigma, in its most frequent form, is known as pantothenate kinase-associated neurodegeneration, or PKAN syndrome. A recent open-access paper in the journal Brain explains.

This particular syndrome can be caused by any of a hundred or so mutations in the PANK2 gene, which codes for the mitochondrial enzyme pantothenate kinase 2. Of the four nuclear-coded PANK genes, only PANK2 is targeted to the mitochondria. Its protein product is involved in co-enzyme A biosynthesis and catalyzes the phosphorylation of pantothenate (vitamin B5). The hallmark pathology, as defined by T2-weighted MRI, can be seen in the globus pallidus and even has its own unique name: the Eye-of-the-Tiger sign.

The researchers used a mouse model of the disease with a Pank2 double gene knockout. On a standard diet, the mice showed growth issues, azoospermia (lack of sperm) and minor mitochondrial dysfunction, but not some of the other typical issues like iron accumulation in the brain or retinal degeneration. Since co-enzyme A is crucial to several metabolic pathways, the researchers also tested the mice on a high-fat ketogenic diet. Under these conditions, ketone bodies produced through fatty acid oxidation bypass the normal glycolytic pathways and proceed directly to the citric acid cycle.

On the ketogenic diet, the mitochondria, which were already ailing with abnormal, swollen cristae, fared much worse, losing some cristae entirely. Extensive lipofuscin deposits were also found in these mice, and movement issues were amplified. It had previously been established in other organisms, like flies, that pantethine (a dimeric form of vitamin B5 linked by cysteamine bridging groups) could counteract these issues. When the mice were given pantethine, the general pathology was resolved. In particular, the mitochondria were completely rescued, presumably restored to health, or otherwise replaced in the natural course of events.

The researchers also evaluated mitochondrial membrane potential using dye staining methods. In the knockout mice, membrane potential was compromised; however, it was completely restored by the pantethine. At present there is no definitive way to predict functional variables, like membrane potential, from the morphology as it is seen on processed EM tissue. In a recent review of new brain mapping techniques, we discussed this issue, and also pointed to new technologies which may permit closer examinations.

On EM images, one of the most striking features in the interior of mitochondria is the crista junction. This protein structure functionally divides the inner and intermembrane spaces, and controls exchanges between them. While mitochondria come in a variety of forms, the junctions generally converge on a preferred shape and size. Efforts to thermodynamically characterize them in terms of shape entropy have been initiated, as have conceptions of how they evolve as conditions in the mitochondria change mechanically. The so-called “baffle model” of mitochondrial structure has been entirely replaced by the new crista junction model, which aims to relate structure to function for these organelles, just as we seek it on larger scales for the brain.

Several issues in PKAN-type neurodegeneration still stand out. The iron accumulation is still unexplained, but may be related to another unexplained issue: namely, not only does pantethine fail to cross the BBB, it does not even appear to be working through a vitamin B5 function. When pantethine is metabolized into two pantothenic acid molecules, it also forms two cysteamines. While cysteamine is associated with various side effects, and it can bind and inactivate certain liver enzymes, it also can cross the BBB, perhaps as seen here, to great effect.

The doses necessary for vitamin B5 function are far below those needed here for restorative function. More work is needed to constrain the range of possible mechanisms at play, but in addition to pointing toward cures for the disease, it will also help cure our ignorance about structure-function relations.

Filed under neurodegenerative diseases neurodegeneration mitochondria animal model neuroscience science

263 notes

Muting the Mozart effect

Children get plenty of benefits from music lessons. Learning to play instruments can fuel their creativity, and practicing can teach much-needed focus and discipline. And the payoff, whether in learning a new song or just mastering a chord, often boosts self-esteem.

But Harvard researchers now say that one oft-cited benefit — that studying music improves intelligence — is a myth.

Though it has been embraced by everyone from advocates for arts education to parents hoping to encourage their kids to stick with piano lessons, a pair of studies conducted by Samuel Mehr, a Harvard Graduate School of Education (HGSE) doctoral student working in the lab of Elizabeth Spelke, the Marshall L. Berkman Professor of Psychology, found that music training had no effect on the cognitive abilities of young children. The studies are described in a Dec. 11 paper published in the open-access journal PLoS One.

“More than 80 percent of American adults think that music improves children’s grades or intelligence,” Mehr said. “Even in the scientific community, there’s a general belief that music is important for these extrinsic reasons. But there is very little evidence supporting the idea that music classes enhance children’s cognitive development.”

The notion that music training can make someone smarter, Mehr said, can largely be traced to a single study published in Nature. In it, researchers identified what they called the “Mozart effect.” After listening to music, test subjects performed better on spatial tasks.

Though the study was later debunked, the notion that simply listening to music could make someone smarter became firmly embedded in the public imagination, and spurred a host of follow-up studies, including several that focused on the cognitive benefits of music lessons.

Though dozens of studies have explored whether and how music and cognitive skills might be connected, when Mehr and colleagues reviewed the literature they found only five studies that used randomized trials, the gold standard for determining causal effects of educational interventions on child development. Of the five, only one showed an unambiguously positive effect, and it was so small — just a 2.7 point increase in IQ after a year of music lessons — that it was barely enough to be statistically significant.

“The experimental work on this question is very much in its infancy, but the few published studies on the topic show little evidence for ‘music makes you smarter,’” Mehr said.

To explore the connection between music and cognition, Mehr and his colleagues recruited 29 parents and 4-year-old children from the Cambridge area. After initial vocabulary tests for the children and music aptitude tests for the parents, each was randomly assigned to one of two classes, one that had music training, or another that focused on visual arts.

“We wanted to test the effects of the type of music education that actually happens in the real world, and we wanted to study the effect in young children, so we implemented a parent-child music enrichment program with preschoolers,” Mehr said. “The goal is to encourage musical play between parents and children in a classroom environment, which gives parents a strong repertoire of musical activities they can continue to use at home with their kids.”

Among the key changes Mehr and his colleagues made from earlier studies were controlling for the effect of different teachers — Mehr taught both the music and visual arts classes — and using assessment tools designed to test areas of cognition, vocabulary, mathematics, and two spatial tasks.

“Instead of using something general, like an IQ test, we tested four specific domains of cognition,” Mehr said. “If there really is an effect of music training on children’s cognition, we should be able to better detect it here than in previous studies, because these tests are more sensitive than tests of general intelligence.”

The study’s results, however, showed no evidence for cognitive benefits of music training.

While the groups performed comparably on vocabulary and number-estimation tasks, the assessments showed that children who received music training performed slightly better at one spatial task, while those who received visual arts training performed better at the other.

“Study One was very small. We only had 15 children in the music group, and 14 in the visual arts,” Mehr said. “The effects were tiny, and their statistical significance was marginal at best. So we attempted to replicate the study, something that hasn’t been done in any of the previous work.”

To replicate the effect, Mehr and colleagues designed a second study that recruited 45 parents and children, half of whom received music training, and half of whom received no training.

Just as in the first study, Mehr said, there was no evidence that music training offered any cognitive benefit. Even when the results of both studies were pooled to allow researchers to compare the effect of music training, visual arts training, and no training, there was no sign that any group outperformed the others.

“There were slight differences in performance between the groups, but none were large enough to be statistically significant,” Mehr said. “Even when we used the finest-grained statistical analyses available to us, the effects just weren’t there.”

While the results suggest studying music may not be a shortcut to educational success, Mehr said there is still substantial value in music education.

“There’s a compelling case to be made for teaching music that has nothing to do with extrinsic benefits,” he said. “We don’t teach kids Shakespeare because we think it will help them do better on the SATs. We do it because we believe Shakespeare is important.

“Music is an ancient, uniquely human activity. The oldest flutes that have been dug up are 40,000 years old, and human song long preceded that,” he said. “Every single culture in the world has music, including music for children. Music says something about what it means to be human, and it would be crazy not to teach this to our children.”
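The arithmetic behind that "barely significant" 2.7-point result is easy to check. A back-of-envelope sketch (the group sizes below are illustrative, not data from the reviewed studies; IQ is assumed to have its conventional standard deviation of 15):

```python
import math

# A 2.7-point IQ gain against an IQ standard deviation of 15 is a
# standardized effect size of d = 0.18, a "small" effect by any benchmark.
gain, sd = 2.7, 15.0
d = gain / sd

def two_sample_t(d, n_per_group):
    """Approximate t statistic for a two-sample comparison with equal groups."""
    return d * math.sqrt(n_per_group / 2)

# Illustrative group sizes (not from the reviewed studies):
for n in (15, 30, 100, 500):
    print(f"n per group = {n:3d}: t = {two_sample_t(d, n):.2f}")
# With about 15 children per group, t is far below the roughly 2.0 needed
# for p < .05; only samples in the hundreds per group could reliably
# detect an effect this small.
```

This is why a gain of this size sits at the edge of detectability even after a year-long trial: it amounts to less than a fifth of a standard deviation.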

Filed under music intelligence mozart effect cognition psychology neuroscience science

227 notes

Establishing the basis of humour

The act of laughing at a joke is the result of a two-stage process in the brain, first detecting an incongruity before then resolving it with an expression of mirth. The brain actions involved in understanding humour differ between young boys and girls. These are the conclusions reached by a US-based scientist supported by the Swiss National Science Foundation. 

Since science has demonstrated that animals are also capable of planning into the future, the once deep cleft between the brain capacities of humans and animals is rapidly disappearing. Fortunately, we can still claim humour as our unique selling point. This makes it even more astonishing that researchers have considered this attribute but fleetingly (and have spent much more time on negative emotions such as fear), write the Swiss neuroscientist Pascal Vrticka and his US colleagues at Stanford University, in the journal “Nature Reviews Neuroscience”.

Strangely cheerful feelings

In their recently published article, the researchers demonstrate that, while laughter at a joke requires activity in many different areas of the brain, just two separate elements can be identified among the complex patterns of activity. In the first stage, the brain detects a logical incongruity, which, in the second stage, it proceeds to resolve. The ensuing feeling of cheerfulness arises from brain activity that can be clearly differentiated from that of other positive emotions.

Moreover, in the study of 22 children aged between six and thirteen, the research team led by Vrticka showed that sex-specific differences in the processing of humour are formed early on in life. The researchers recorded the children’s brain activity while they were enjoying film clips that were either funny – slapstick home video – or entertaining – such as clips of children break-dancing. On average, the girls’ brains responded more to the funny scenes, while the boys showed greater reaction to the entertaining clips.

Benefits of improved understanding

Vrticka speculates that these sex-based differences could play a role in helping women to select a suitable (and humorous) mate. Aside from this, humour also plays a key role in psychological health. This is demonstrated, among other things, by the fact that adults with psychological disorders such as autism or depression often show altered humour processing and respond less markedly to humour than people who do not have these disorders. Vrticka believes that an improved understanding of the processes that take place in our brain when we enjoy an amusing joke could be of great benefit in the development of treatments.

(Source: alphagalileo.org)

Filed under humour amygdala brain activity sex differences laughter neuroscience psychology science

143 notes

Picturing pain could help unlock its mysteries and lead to better treatments

Understanding the science behind pain, from a simple “ouch” to the chronic and excruciating, has been an elusive goal for centuries. But now, researchers are reporting a promising step toward studying pain in action. In a study published in the Journal of the American Chemical Society, scientists describe the development of a new technique, which they tested in rats, that could result in better ways to relieve pain and monitor healing.

Sandip Biswal, Frederick T. Chin, Justin Du Bois and colleagues note that current ways to diagnose pain basically involve asking the patient if something hurts. These subjective approaches are fraught with bias and can lead doctors in the wrong direction if a patient doesn’t want to talk about the pain or can’t communicate well. It can also be difficult to tell how well a treatment is really working. No existing method can measure pain intensity objectively or help physicians pinpoint the exact location of the pain. Past research has shown an association between pain and a certain kind of protein, called a sodium channel, that helps nerve cells transmit pain and other sensations to the brain. Certain forms of this channel are overproduced at the site of an injury, so the team set out to develop an imaging method to visualize high concentrations of this protein.

They turned to a small molecule called saxitoxin, produced naturally by certain types of microscopic marine creatures, and attached a signal to it so they could trace it by PET imaging. PET scanners are used in hospitals to diagnose diseases and injuries. When the researchers injected the molecule into rats, often a stand-in for humans in lab tests, they saw that the molecule accumulated where the rats had nerve damage. The rats didn’t show signs of toxic side effects. The work is one of the first attempts to mark these sodium channels in a living animal, they say.

Filed under pain sodium channel ion channel saxitoxin nerve cells neuroscience science

130 notes

Study breaks blood-brain barriers to understanding Alzheimer’s

A study in mice shows a breakdown of the brain’s blood vessels may amplify or cause problems associated with Alzheimer’s disease. The results published in Nature Communications suggest that blood vessel cells called pericytes may provide novel targets for treatments and diagnoses.

“This study helps show how the brain’s vascular system may contribute to the development of Alzheimer’s disease,” said study leader Berislav V. Zlokovic, M.D., Ph.D., director of the Zilkha Neurogenetic Institute at the Keck School of Medicine of the University of Southern California, Los Angeles. The study was co-funded by the National Institute of Neurological Disorders and Stroke (NINDS) and the National Institute on Aging (NIA), parts of the National Institutes of Health.

Alzheimer’s disease is the leading cause of dementia.  It is an age-related disease that gradually erodes a person’s memory, thinking, and ability to perform everyday tasks.  Brains from Alzheimer’s patients typically have abnormally high levels of plaques made up of accumulations of beta-amyloid protein next to brain cells, tau protein that clumps together to form neurofibrillary tangles inside neurons, and extensive neuron loss. 

Vascular dementias, the second leading cause of dementia, are a diverse group of brain disorders caused by a range of blood vessel problems.  Brains from Alzheimer’s patients often show evidence of vascular disease, including ischemic stroke, small hemorrhages, and diffuse white matter disease, plus a buildup of beta-amyloid protein in vessel walls.  Furthermore, previous studies suggest that APOE4, a genetic risk factor for Alzheimer’s disease, is linked to brain blood vessel health and integrity.

“This study may provide a better understanding of the overlap between Alzheimer’s disease and vascular dementia,” said Roderick Corriveau, Ph.D., a program director at NINDS.

One hypothesis about Alzheimer’s disease states that increases in beta-amyloid lead to nerve cell damage.  This is supported by genetic studies that link familial forms of the disease to mutations in amyloid precursor protein (APP), the larger protein from which plaque-forming beta-amyloid molecules are derived.  However, previous studies in mice showed that increased beta-amyloid levels reproduce only some of the problems associated with Alzheimer’s.  The animals have memory problems, beta-amyloid plaques in the brain and vascular damage, but none of the neurofibrillary tangles and neuron loss that are hallmarks of the disease.

In this study, the researchers show that pericytes may be a key to whether increased beta-amyloid leads to tangles and neuron loss.

Pericytes are cells that surround the outside of blood vessels.  Many are found in a brain plumbing system, called the blood-brain barrier.  It is a network that exquisitely controls the movement of cells and molecules between the blood and the interstitial fluid that surrounds the brain’s nerve cells.  Pericytes work with other blood-brain barrier cells to transport nutrients and waste molecules between the blood and the interstitial brain fluid.

To study how pericytes influence Alzheimer’s disease, Dr. Zlokovic and his colleagues crossbred mice genetically engineered to have a form of APP linked to familial Alzheimer’s with ones that have reduced levels of platelet-derived growth factor beta receptor (PDGFR-beta), a protein known to control pericyte growth and survival.  Previous studies showed that PDGFR-beta mutant mice have fewer pericytes than normal, decreased brain blood flow, and damage to the blood-brain barrier.

“Pericytes act like the gatekeepers of the blood-brain barrier,” said Dr. Zlokovic.

Both the APP and PDGFR-beta mutant mice had problems with learning and memory.  Crossbreeding the mice slightly enhanced these problems.  The mice also had increased beta-amyloid plaque deposition near brain cells and along brain blood vessels.  Surprisingly, the brains of the crossbred mice had enhanced neuronal cell death and extensive neurofibrillary tangles in the hippocampus and cerebral cortex, regions that are typically affected during Alzheimer’s.

“Our results suggest that damage to the vascular system may be a critical step in the development of full-blown Alzheimer’s disease pathology,” said Dr. Zlokovic.

Further experiments suggested that pericytes may transport beta-amyloid across the blood-brain barrier into the blood and showed that crossbreeding the mice slowed the rate at which beta-amyloid was cleared away from nerve cells in the brain.

Next, the researchers addressed how beta-amyloid may affect the vascular system.  The crossbred mutants had more pericyte death and more damage to the blood-brain barrier than the PDGFR-beta mutant mice, suggesting beta-amyloid may enhance vascular damage.  The investigators also confirmed previous findings showing that beta-amyloid accumulation leads to pericyte death.

Dr. Zlokovic and his colleagues concluded that their results support a two-hit vascular hypothesis of Alzheimer’s.  The hypothesis states that the toxic effects of increased beta-amyloid deposition on pericytes in aged blood vessels leads to a breakdown of the blood-brain barrier and a reduced ability to clear amyloid from the brain.  In turn, the progressive accumulation of beta-amyloid in the brain and death of pericytes may become a damaging feedback loop that causes dementia.  If true, then pericytes and other blood-brain barrier cells may be new therapeutic targets for treating Alzheimer’s disease.
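The runaway character of this two-hit loop can be caricatured with a toy model (purely illustrative equations and parameters, not the authors' model): amyloid is produced at a constant rate and cleared in proportion to the surviving pericyte fraction, while amyloid in turn accelerates pericyte death.

```python
# Toy two-variable model of the proposed feedback loop (illustrative only):
# A = amyloid load, P = surviving pericyte fraction.
dt, steps = 0.01, 2000
production, clearance, toxicity = 1.0, 1.0, 0.01

def simulate(p0):
    A, P = 0.0, p0
    for _ in range(steps):                             # forward-Euler steps
        A += dt * (production - clearance * P * A)     # clearance needs pericytes
        P += dt * (-toxicity * A * P)                  # amyloid kills pericytes
    return A, P

healthy = simulate(p0=1.0)    # full pericyte coverage
depleted = simulate(p0=0.4)   # pericyte-deficient, like the PDGFR-beta mutants
print(f"healthy:  amyloid={healthy[0]:.2f}, pericytes={healthy[1]:.2f}")
print(f"depleted: amyloid={depleted[0]:.2f}, pericytes={depleted[1]:.2f}")
# Starting with fewer pericytes slows clearance, so amyloid climbs higher,
# which in turn kills more pericytes: the self-reinforcing loop described above.
```

Running the two conditions side by side shows the qualitative point: the pericyte-depleted brain ends up with both more amyloid and fewer pericytes than the healthy one, even though the underlying rules are identical.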

(Source: ninds.nih.gov)

Filed under alzheimer's disease blood-brain barrier dementia hippocampus neurons genetics neuroscience science

178 notes

Scientists improve human self-control through electrical brain stimulation

If you have ever said or done the wrong thing at the wrong time, you should read this. Neuroscientists at The University of Texas Health Science Center at Houston (UTHealth) and the University of California, San Diego, have successfully demonstrated a technique to enhance a form of self-control through a novel form of brain stimulation.

Study participants were asked to perform a simple behavioral task that required the braking/slowing of action – inhibition – in the brain. In each participant, the researchers first identified the specific location for this brake in the prefrontal region of the brain. Next, they increased activity in this brain region using stimulation with brief and imperceptible electrical charges. This led to increased braking – a form of enhanced self-control.
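The release does not name the braking task, but tasks of this kind are typically stop-signal paradigms, in which stop-signal reaction time (SSRT) indexes braking speed. Here is a sketch of the standard integration estimate with made-up numbers (the task and all values are assumptions for illustration, not details from the study):

```python
# Made-up data for illustration: go-trial reaction times (ms), the fraction
# of stop trials on which braking failed, and the mean stop-signal delay (ms).
go_rts = sorted([412, 388, 455, 401, 430, 397, 465, 420, 409, 441])
p_respond_on_stop = 0.4
mean_ssd = 230

# Integration method: take the go RT at the quantile equal to the failure
# rate on stop trials, then subtract the mean stop-signal delay.
idx = int(p_respond_on_stop * len(go_rts))
ssrt = go_rts[idx] - mean_ssd
print(f"estimated SSRT = {ssrt} ms")   # lower SSRT means faster braking
```

If stimulation genuinely enhances braking, the expected signature would be a shorter SSRT on stimulated trials than on unstimulated ones.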

This proof-of-principle study appears in the Dec. 11 issue of The Journal of Neuroscience and its methods may one day be useful for treating attention deficit hyperactivity disorder (ADHD), Tourette’s syndrome and other severe disorders of self-control.

“There is a circuit in the brain for inhibiting or braking responses,” said Nitin Tandon, M.D., the study’s senior author and associate professor in The Vivian L. Smith Department of Neurosurgery at the UTHealth Medical School. “We believe we are the first to show that we can enhance this braking system with brain stimulation.”

A computer stimulated the prefrontal cortex exactly when braking was needed. This was done using electrodes implanted directly on the brain surface.

When the test was repeated with stimulation of a brain region outside the prefrontal cortex, there was no effect on behavior, showing the effect to be specific to the prefrontal braking system.

This was a double-blind study, meaning that participants and scientists did not know when or where the charges were being administered.

The method of electrical stimulation was novel in that it apparently enhanced prefrontal function, whereas other human brain stimulation studies mostly disrupt normal brain activity. This is the first published human study to enhance prefrontal lobe function using direct electrical stimulation, the researchers report.

The study involved four volunteers with epilepsy who agreed to participate while being monitored for seizures at the Mischer Neuroscience Institute at Memorial Hermann-Texas Medical Center (TMC). Stimulation enhanced braking in all four participants.

Tandon has been working on self-control research with researchers at the University of California, San Diego, for five years. “Our daily life is full of occasions when one must inhibit responses. For example, one must stop speaking when it’s inappropriate to the social context and stop oneself from reaching for extra candy,” said Tandon, who is a neurosurgeon with the Mischer Neuroscience Institute at Memorial Hermann-TMC. 

The researchers are quick to point out that while their results are promising, they do not yet point to the ability to improve self-control in general. In particular, this study does not show that direct electrical stimulation is a realistic option for treating human self-control disorders such as obsessive-compulsive disorder, Tourette’s syndrome and borderline personality disorder. Notably, direct electrical stimulation requires an invasive surgical procedure, which is now used only for the localization and treatment of severe epilepsy.

(Source: uth.edu)

Filed under brain stimulation electrical stimulation DBS prefrontal cortex neuroscience science

78 notes

New gene discovery sheds more light on Alzheimer’s risk

A research team from The University of Nottingham has helped uncover a second rare genetic mutation which strongly increases the risk of Alzheimer’s disease in later life.

In an international collaboration, the University’s Translational Cell Sciences Human Genetics research group has pinpointed a rare coding variant in the Phospholipase D3 (PLD3) gene which is more common in people with late-onset Alzheimer’s than in non-sufferers.

The discovery is an important milestone on the road to early diagnosis of the disease and eventual improved treatment. Having surveyed the human genome for common variants associated with Alzheimer’s, geneticists are now turning the spotlight on rare mutations which may be even stronger risk factors.

More than 820,000 people in the UK have dementia and the number is rising as the population ages. The condition, of which Alzheimer’s disease is the predominant cause, costs the UK economy £23 billion per year, much more than other diseases like cancer and heart disease.

Nottingham’s genetic experts have been working with long-term partners from Washington University, St Louis, USA and University College, London, to carry out next-generation whole exome sequencing on families where Alzheimer’s affects several members.

Earlier this year the collaboration uncovered the first rare genetic mutation implicated in disease risk, linking the TREM2 gene to a higher risk of Alzheimer’s (published in the New England Journal of Medicine). Now, in a new study published today in the international journal Nature, the team reveals that analysis of the genes of around 2,000 people with Alzheimer’s has uncovered a second genetic variation, in the PLD3 gene.

PLD3 influences processing of amyloid precursor protein, which results in the generation of the characteristic amyloid plaques seen in AD brain tissue, suggesting that it is a potential therapeutic target.

The international research team used Nottingham’s Alzheimer’s Research UK DNA bank, one of the largest collections of DNA from Alzheimer’s patients, to completely sequence the entire coding region (exome) of the PLD3 gene. The results showed several mutations in the gene occurred more frequently in people who had the disease than in non-sufferers. Carriers of PLD3 coding variants showed a two-fold increased risk for the disease.
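A "two-fold increased risk" in a case-control design like this one is an odds ratio. A sketch of how such a figure is derived from a 2x2 table, using hypothetical carrier counts (illustrative numbers only, not the study's data):

```python
import math

# Hypothetical 2x2 table (illustrative counts, not from the Nature paper):
cases_carrier, cases_noncarrier = 40, 1960    # people with Alzheimer's
ctrls_carrier, ctrls_noncarrier = 20, 1980    # unaffected controls

# Odds of carrying the variant among cases, divided by odds among controls.
odds_ratio = (cases_carrier / cases_noncarrier) / (ctrls_carrier / ctrls_noncarrier)

# Woolf's method: 95% confidence interval on the log odds ratio.
se = math.sqrt(1 / cases_carrier + 1 / cases_noncarrier
               + 1 / ctrls_carrier + 1 / ctrls_noncarrier)
lo = math.exp(math.log(odds_ratio) - 1.96 * se)
hi = math.exp(math.log(odds_ratio) + 1.96 * se)
print(f"odds ratio = {odds_ratio:.2f}  (95% CI {lo:.2f} to {hi:.2f})")
```

The confidence interval matters as much as the point estimate: with rare variants, even a doubled risk can come with a wide interval unless many thousands of genomes are sequenced.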

Leading the team at Nottingham, Professor of Human Genomics and Molecular Genetics, Kevin Morgan, said:

“This second crucial discovery has confirmed that this latest scientific approach does deliver, it is able to find these clues. However, it is also inferring that there are lots more AD-significant variations out there and before we can use it for diagnosis we need to find all of the other genetic variations involved in Alzheimer’s too.

“Our research is forming the basis of potential diagnostics later on and more importantly it shows pathways that can be diagnostic targets which could lead to therapeutic interventions in the future.

“The next step will be to examine how this particular rare gene variant functions in the cell and see if it can be targeted, to see if there are any benefits to finding out how this gene operates in both normal and diseased cells. If we can do this, we may be able eventually to correct the defect with drug therapy. Here in Nottingham we will keep looking for more rare gene variations.

“Even if we could eventually slow or halt the progress of the disease with new drugs rather than curing it completely, the benefits would be huge in terms of the real impact on patients’ lives and also in vast savings to the health economy. The group at The University of Nottingham has played a significant role in all of the recent AD genetics discoveries that have highlighted 20 new regions of interest in the genome in the last five years and we will continue to do so into the future.”

Rebecca Wood, Chief Executive of Alzheimer’s Research UK, the UK’s leading dementia research charity, said: “Advances in genetic technology are allowing researchers to understand more than ever about the genetic risk factors for the most common form of Alzheimer’s. This announcement, made just off the back of the G8 dementia research summit, is a timely reminder of the progress that can be made by worldwide collaboration. We know that late-onset Alzheimer’s is caused by a complex mix of risk factors, including both genetic and lifestyle. Understanding all of these risk factors and how they work together to affect someone’s likelihood of developing Alzheimer’s is incredibly important for developing interventions to slow the onset of the disease. Alzheimer’s Research UK is proud to have contributed to this discovery, both by funding researchers and through the establishment of a DNA collection that has been used in many of the recent genetic discoveries in Alzheimer’s.”

(Source: nottingham.ac.uk)

Filed under alzheimer's disease neurodegeneration dementia genetics neuroscience science
