Neuroscience

Articles and news from the latest research reports.

Posts tagged science

20 notes

Conscious perception is a matter of global neural networks

June 13, 2012

(Medical Xpress) — Consciousness is a selective process that allows only part of the sensory input to reach awareness. But which areas of the brain determine the content of conscious perception has yet to be clarified. Theofanis Panagiotaropoulos and his colleagues, researchers at the Max Planck Institute for Biological Cybernetics in Tübingen and University Pompeu Fabra in Barcelona, have now discovered that the content of consciousness is not localized in a single cortical area but is most likely an emergent property of global networks of neuronal populations.

Neurons in the lateral prefrontal cortex represent the content of consciousness. The red trace depicts neural activity (neuronal discharges) in the lateral prefrontal cortex when a stimulus is consciously perceived for 1 second while the green trace depicts neural activity when the same stimulus is suppressed from awareness. Credit: MPI for Biological Cybernetics

The question of which parts of the brain are responsible for the things that reach our awareness is one of the main puzzles in neurobiology today. Previous research on primate brains has shown that neurons in primary and secondary cortices provide a poor representation of visual consciousness. In contrast, neurons in the temporal lobe seem to reliably reflect the actual conscious perception of a visual stimulus. These findings indicated that not all parts of the brain contribute to the content of conscious awareness. Nevertheless, the question of whether a single brain area is responsible for the content of perception, or whether more regions are involved in the process, has so far remained unanswered.

The Max Planck scientists in Tübingen, led by Nikos Logothetis, have now addressed this issue using electrophysiological methods to monitor neural activity in the lateral prefrontal cortex of macaque monkeys during ambiguous visual stimulation. The visual stimuli used allowed for multiple perceptual interpretations even though the actual input remained the same. In doing so, Panagiotaropoulos and his team were able to show that the electrical activity monitored in the lateral prefrontal cortex correlates with what the macaque monkeys actually perceive.

They thus concluded that visual awareness is reliably reflected not only in the temporal lobe but also in the lateral prefrontal cortex of primates. The results indicate that the neuronal correlates of consciousness are embedded in this area, which has direct connections to premotor and motor areas of the brain and can therefore directly affect motor output. These findings support the “frontal lobe hypothesis” of conscious visual perception, proposed in 1995 by Francis Crick (co-discoverer of the structure of the DNA molecule) and Christof Koch, which holds that awareness is related to neural activity with direct access to the planning stages of the brain.

The results support this theory insofar as they show that the lateral prefrontal cortex is involved in visual awareness. However, the fact that neural activity in two different cortical areas reflects conscious perception suggests that the decision about which sensory input reaches our awareness is most likely not made in a single cortical area; rather, a global network of neurons from different areas of the brain is responsible for it. “Our results therefore broaden the hypothesis and create new questions regarding the cortical mechanisms of visual awareness,” Panagiotaropoulos explains. In the near future the group will record the electrical activity in both regions simultaneously.

In this way they will try to find out which of the two areas is activated first, and draw conclusions about how the two areas interact with each other during conscious perception. This may lead to a better understanding of why only certain things reach our awareness while others remain suppressed.

Provided by Max Planck Society

Source: medicalxpress.com

Filed under science neuroscience brain psychology consciousness

4 notes

In vitro fertilization linked to multiple sclerosis relapse

June 13, 2012

(HealthDay) — Women with multiple sclerosis (MS) who undergo in vitro fertilization (IVF) are at greater risk of relapse after treatment, particularly if they receive gonadotrophin releasing hormone (GnRH) agonists or if IVF fails, according to a study published online June 11 in the Journal of Neurology, Neurosurgery & Psychiatry.

Noting that pregnancy and treatment with sex steroids can affect the relapse rate in patients with MS, Laure Michel, M.D., from Hôpital Laennec in Nantes, France, and colleagues analyzed data from 32 women with MS who had undergone 70 IVF treatments during an 11-year study period: 48 with GnRH agonists and 19 with GnRH antagonists.

The researchers found that there were significantly more relapses in the three months after IVF (annualized relapse rate [ARR], 1.60), compared with one year before (ARR, 0.68) or three months before (ARR, 0.80). The increase in relapses was significantly associated with GnRH agonist use (P = 0.025) and failed IVF (P = 0.019).
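The annualized relapse rate is simply relapses counted per patient-year of observation, scaled so that windows of different lengths are comparable. A minimal sketch of the arithmetic (the relapse count below is hypothetical, chosen only to reproduce the reported post-IVF ARR of 1.60 across the 70 treatments):

```python
def annualized_relapse_rate(total_relapses, n_patients, window_months):
    """Relapses per patient-year over an observation window."""
    patient_years = n_patients * (window_months / 12)
    return total_relapses / patient_years

# Hypothetical illustration: 28 relapses observed across 70 IVF
# treatments in the 3 months after treatment gives
# 28 / (70 * 0.25) = 1.6 relapses per patient-year
rate = annualized_relapse_rate(28, 70, 3)
```

Because the rate is per patient-year, the 1.60 figure for the three months after IVF can be compared directly with the 0.68 and 0.80 figures for the longer and shorter pre-treatment windows.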

“MS patients should be aware of a possible increased risk of MS relapse after IVF, particularly if the procedure does not result in a pregnancy,” Michel and colleagues conclude. “Furthermore, because there is a reasonable doubt that GnRH agonists may make patients more prone to such an increase in relapse rate, GnRH antagonists might be preferred for IVF protocols.”

Source: medicalxpress.com

Filed under science neuroscience MS psychology

23 notes

Don’t Feel Like Exercise? Scientists Find Compound That May Help You Work Out Harder

ScienceDaily (June 12, 2012) — As science rushes to develop safe weight loss drugs, a new research report approaches this problem from an entirely new angle: What if there were a pill that would make you want to exercise harder? It may sound strange, but a new research report appearing online in The FASEB Journal suggests that it might be possible. That’s because a team of Swiss researchers found that when a hormone in the brain, erythropoietin (Epo), was elevated in mice, they were more motivated to exercise.

In addition, the form of erythropoietin used in these experiments did not elevate red blood cell counts. Such a treatment has obvious potential benefits for a wide range of health problems, from Alzheimer’s to obesity, including mental health disorders in which increased physical activity is known to improve symptoms.

"Here we show that Epo increases the motivation to exercise," said Max Gassmann, D.V.M., a researcher involved in the work from the Institute of Veterinary Physiology, Vetsuisse-Faculty and Zurich Center for Integrative Human Physiology at the University of Zurich in Switzerland. "Most probably, Epo has a general effect on a person’s mood and might be used in patients suffering from depression and related diseases."

To make this discovery, Gassmann and colleagues used three types of mice: those that received no treatment, those that were injected with human Epo, and those that were genetically modified to produce human Epo in the brain. Compared to the mice that did not have any increase in Epo, both mouse groups harboring human Epo in the brain showed significantly higher running performance without increases in red blood cells.

"If you can’t put exercise in a pill, then maybe you can put the motivation to exercise in a pill instead," said Gerald Weissmann, M.D., Editor-in-Chief of The FASEB Journal. “As more and more people become overweight and obese, we must attack the problem from all angles. Maybe the day will come when gyms are as easily found as fast food restaurants.”

Source: Science Daily

Filed under science neuroscience brain psychology

11 notes

Naturally Occurring Protein Has Role in Chronic Pain

ScienceDaily (June 12, 2012) — Researchers in France and Sweden have discovered how one of the body’s own proteins is involved in generating chronic pain in rats. The results, which also suggest therapeutic interventions to alleviate long-lasting pain, are reported in The EMBO Journal.

Chronic pain is persistent and often difficult to treat. It is due, at least in part, to changes in molecular signalling events that take place in neurons, alterations that can ultimately disrupt the transmission of nerve signals from the spinal cord to the brain.

"We are fortunate to have a wide range of technologies that allow us to look more precisely at the molecular events that lead to the onset of chronic pain in animals," said Marc Landry, lead author of the study and Professor at the University of Bordeaux.

"Our results show that the levels of the naturally occurring protein 14-3-3 zeta are higher in the spinal cord of rats that have chronic pain. Moreover, we have been able to demonstrate how 14-3-3 zeta triggers changes in the signalling pathway that leads to the symptoms of chronic pain."

The 14-3-3 zeta protein disrupts the interaction between the two subunits of the GABAB receptor, a protein complex found on the surface of nerve cells. GABAB receptors are G-protein coupled receptors, a family of receptors that regulate many physiological processes and which are frequently targeted for drug development.

The researchers used antibody labelling and microscopy techniques to investigate the molecular interactions of the signalling proteins. In cells and living animals, they were able to show that the 14-3-3 zeta protein interacts directly with the B1 subunit of the GABAB receptor. This interaction impairs the effective signalling of the receptor and limits the pain-relieving effects of the GABAB receptor under conditions of chronic pain.

The researchers also showed that the treatment of rats with a specific small interfering RNA (siRNA) or a competing peptide, molecules that interfere with the action of the 14-3-3 zeta protein, inhibited chronic pain.

"The impairment of the GABAB receptor by 14-3-3 zeta is a novel mechanism for the modulation of chronic pain,” said Landry. “We see potential in combining the use of inhibitors that interfere with the action of 14-3-3 zeta together with existing drug treatments like Baclofen for chronic pain. Targeting the GABAB dissociation process may be of therapeutic interest since it may allow classical pain killers to be more effective.”

Source: Science Daily

Filed under science neuroscience psychology pain

9 notes

Alzheimer’s Risk Gene Disrupts Brain Function in Healthy Older Women, but Not Men

ScienceDaily (June 12, 2012) — A team led by investigators at the Stanford University School of Medicine has found that the most common genetic risk factor for Alzheimer’s disease disrupts brain function in healthy, older women but has little impact on brain function in healthy, older men. Women harboring the gene variant, known to be a potent risk factor for Alzheimer’s disease, show brain changes characteristic of the neurodegenerative disorder that can be observed before any outward symptoms manifest.

Both men and women who inherit two copies (one from each parent) of this gene variant, known as ApoE4, are at extremely high risk for Alzheimer’s. But the double-barreled ApoE4 combination is uncommon, affecting only about 2 percent of the population, whereas about 15 percent of people carry a single copy of this version of the gene.

The Stanford researchers demonstrated for the first time the existence of a gender distinction among outwardly healthy, older people who carry the ApoE4 variant. In this group, women but not men exhibit two telltale characteristics that have been linked to Alzheimer’s disease: a signature change in their brain activity, and elevated levels of a protein called tau in their cerebrospinal fluid.

One implication of the study, published June 13 in the Journal of Neuroscience, is that men revealed by genetic tests to carry a single copy of ApoE4 shouldn’t be assumed to be at elevated risk for Alzheimer’s, a syndrome afflicting about 5 million people in the United States and nearly 30 million worldwide. The new findings also may help explain why more women than men develop this disease, said Michael Greicius, MD, assistant professor of neurology and neurological sciences and medical director of the Stanford Center for Memory Disorders. Most critically, identifying the prominent interaction between ApoE4 and gender opens a host of new experimental avenues that will allow Greicius’ team and the field generally to better understand how ApoE4 increases risk for Alzheimer’s disease.

For every three women with Alzheimer’s disease, only about two men have the neurodegenerative disorder, said Greicius, the study’s senior author. (The first author is Jessica Damoiseaux, PhD, a postdoctoral scholar in Greicius’ laboratory. They collaborated with colleagues at the University of California-San Francisco and UCLA.) True, women live longer than men do, on average, and old age is by far the greatest risk factor for Alzheimer’s, Greicius said. “But the disparity in Alzheimer’s risk persists even if you correct for the difference in longevity,” he said. “This disparate impact of ApoE4 status on women versus men might account for a big part of the skewed gender ratio.”

Besides age, another well-studied major risk factor is genetic: possession of a particular version of the gene known as ApoE. This gene is a recipe for a protein involved in transporting cholesterol into cells — an important job, as cholesterol is a crucial constituent of all cell membranes including those of nerve cells. And nerve cells are constantly responding to experience by developing or enhancing small, bulblike electrochemical contacts to other nerve cells, or diminishing or abolishing them. For all these processes, efficient cholesterol transport is critical.

The ApoE protein comes in three versions, each the product of a slightly different version of the ApoE gene: E2, E3 or E4. Most people have two copies of the E3 version of ApoE. A small percentage carries one copy of E3 and one of E2, and even fewer carry two copies of E2. The protein specified by the E4 version seems to be somewhat defective compared with the one encoded by either E2 or the much more common E3. Thus, while only about 10-15 percent of the population carries one copy of E4 (or, much less commonly, two), more than 50 percent of people who develop Alzheimer’s are E4 carriers.

But, as it turns out, the heightened risk E4 imposes may be largely restricted to women.

To demonstrate this, the scientists first obtained functional MRI scans of 131 healthy people, with a median age of 70, to examine connections in the brain’s memory network. Using sophisticated brain-imaging analysis, they showed that in older women carrying the E4 variant, this network of interconnected brain regions, which normally shares a synchronized pattern of activity, exhibits a loss of that synchrony — a pattern typically seen in Alzheimer’s patients. In healthy, older women (but not men) with at least one E4 allele, activity in a brain area called the precuneus appeared to be out of synch with other regions whose firing patterns generally are closely coordinated.

The brain-imaging technique Greicius and his colleagues used is known as functional-connectivity magnetic resonance imaging, or fcMRI. Performed on “resting” subjects, who remain in the scanner awake but not focusing on any particular task, fcMRI can discern on the order of 20 different brain networks, each consisting of a set of dispersed brain regions that are physically connected by nerve tracts and whose pulses of activity are synchronized, or in phase. Greicius, Damoiseaux and their associates have previously shown that the synchronous firing pattern of one network in particular, critical to memory function and known as the “default mode network,” is specifically targeted by Alzheimer’s and deteriorates as the disease progresses.
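The “synchrony” at the heart of fcMRI is, at its simplest, correlation between regional activity time courses: regions whose signals rise and fall together are counted as connected. A toy sketch of that idea with synthetic signals (not the study’s actual pipeline, which involves extensive preprocessing and network decomposition):

```python
import numpy as np

def functional_connectivity(ts):
    """Pairwise Pearson correlation between regional time series.

    ts has shape (n_regions, n_timepoints); the resulting matrix is
    a simple stand-in for the synchrony measures used in
    resting-state fcMRI analyses.
    """
    return np.corrcoef(ts)

# Synthetic illustration: two in-phase signals and one desynchronized one
t = np.linspace(0, 10, 200)
rng = np.random.default_rng(0)
regions = np.stack([
    np.sin(t) + 0.1 * rng.standard_normal(t.size),
    np.sin(t) + 0.1 * rng.standard_normal(t.size),
    rng.standard_normal(t.size),  # out of synch with the others
])
fc = functional_connectivity(regions)
```

In this toy example the first two regions show a correlation near 1 (they fire "in phase"), while the third correlates with neither; a loss of synchrony like that described in the E4-carrying women would appear as a drop in the corresponding matrix entries.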

To independently confirm their imaging-based observations, the scientists assessed records from a large public database compiled from the Alzheimer’s Disease Neuroimaging Initiative, a multi-site study of healthy aging and Alzheimer’s disease. The Stanford study focused on the healthy 55- to 90-year-old volunteers who had agreed to undergo a spinal tap and have their cerebrospinal fluid analyzed.

From this database the Greicius team extracted the records of 91 subjects, with an average age of 75, and divided them into four groups representing women with or without a copy of the E4 variant, and men with or without a copy. For each group, they checked recorded concentrations of a protein named tau in these subjects’ cerebrospinal fluid. Elevated tau levels in cerebrospinal fluid are a key biomarker of Alzheimer’s disease. The results — the CSF of women, but not men, who carried at least one E4 allele was substantially enriched in tau — confirmed the brain-imaging findings.

The tau findings constitute another first. “It was only possible to see these differences in tau levels when we separated the patients by gender,” Greicius said.

Notably, all the men and women participating in the Journal of Neuroscience study were screened for cognitive status. Only those whose ability to think and remember appeared normal for their age were admitted. Thus, the observed changes in brain activity and CSF composition were occurring well before the onset of classic Alzheimer’s symptoms such as memory loss, disorientation and dementia. It may someday be practical to substitute fcMRI, which is noninvasive, for a spinal tap as a diagnostic tool, Greicius said.

Source: Science Daily

Filed under science neuroscience brain psychology alzheimer

13 notes

When being scared twice is enough to remember

June 12, 2012

One of the brain’s jobs is to help us figure out what’s important enough to be remembered. Scientists at Yerkes National Primate Research Center, Emory University have achieved some insight into how fleeting experiences become memories in the brain.

Their experimental system could be a way to test or refine treatments aimed at enhancing learning and memory, or interfering with troubling memories. The results were published recently in the Journal of Neuroscience.

The researchers set up a system where rats were exposed to a light followed by a mild shock. A single light-shock event isn’t enough to make the rat afraid of the light, but a repeat of the pairing of the light and shock is, even a few days later.

"I describe this effect as ‘priming’," says the first author of the paper, postdoctoral fellow Ryan Parsons. "The animal experiences all sorts of things, and has to sort out what’s important. If something happens just once, it doesn’t register. But twice, and the animal remembers."

Parsons was working with Michael Davis, PhD, Robert W. Woodruff professor of psychiatry and behavioral sciences at Emory University School of Medicine, who has been studying the molecular basis for fear memory for several years.

Even though a robust fear memory was not formed after the first priming event, at that point Parsons could already detect chemical changes in the amygdala, part of the brain critical for fear responses. Long term memory formation could be blocked by infusing a drug into the amygdala. The drug inhibits protein kinase A, which is involved in the chemical changes Parsons observed.

It is possible to train rats to become afraid of something like a sound or a smell after one event, Parsons says. However, rats are less sensitive to light compared with sounds or smells, and a relatively mild shock was used.

Fear memories only formed when shocks were paired with light, instead of noise or nothing at all, for both the priming and the confirmation event. Parsons measured how afraid the rats were by gauging their “acoustic startle response” (how jittery they were in response to a loud noise) in the presence of the light, compared to before training began.

Scientists have been able to study the chemical changes connected with the priming process extensively in neurons in culture dishes, but not as much in live animals. The process is referred to as “metaplasticity,” or how the history of the brain’s experiences affects its readiness to change and learn.

"This could be a good model for dissecting the mechanisms involved in learning and memory,” Parsons says. “We’re going to be able to look at what’s going on in that first priming event, as well as when the long-term memory is triggered.”

"We believe our findings might help explain how events are selected out for long-term storage from what is essentially a torrent of information encountered during conscious experience," Parsons and Davis write in their paper.

Provided by Emory University

Source: medicalxpress.com

Filed under science neuroscience brain psychology

32 notes

Early Gut Bacteria Regulate Happiness

ScienceDaily (June 12, 2012) — UCC scientists have shown that brain levels of serotonin, the ‘happy hormone’, are regulated by the amount of bacteria in the gut during early life. Their research is being published June 12 in the international psychiatry journal Molecular Psychiatry.

Happy children. (Credit: © Marzanna Syncerz / Fotolia)

This research shows that normal adult brain function depends on the presence of gut microbes during development. Serotonin, the major chemical involved in the regulation of mood and emotion, is altered in times of stress, anxiety and depression and most clinically effective antidepressant drugs work by targeting this neurochemical.

Scientists at the Alimentary Pharmabiotic Centre in UCC used a germ-free mouse model to show that the absence of bacteria during early life significantly affected serotonin concentrations in the brain in adulthood. The research also highlighted that the influence is sex dependent, with more marked effects in male than in female animals. Finally, when the scientists colonized the animals with bacteria prior to adulthood, they found that many of the central nervous system changes, especially those related to serotonin, could not be reversed, indicating a permanent imprinting of the effects of the absence of gut flora on brain function.

This builds on earlier work, from the Cork group and others, showing that a microbiome-gut-brain axis exists that is essential for maintaining normal health and that can affect brain and behavior. The research was carried out by Dr Gerard Clarke, Professor Fergus Shanahan, Professor Ted Dinan and Professor John F Cryan and colleagues at the Alimentary Pharmabiotic Centre in UCC.

"As a neuroscientist these findings are fascinating as they highlight the important role that gut bacteria play in the bidirectional communication between the gut and the brain, and open up the intriguing opportunity of developing unique microbial-based strategies for the treatment of brain disorders," said Professor John F Cryan, senior author on the publication and Head of the Department of Anatomy & Neuroscience at UCC.

This research has multiple health implications as it shows that manipulations of the microbiota (e.g. by antibiotics, diet, or infection) can have profound knock-on effects on brain function. “We’re really excited by these findings” said lead author Dr Gerard Clarke. “Although we always believed that the microbiota was essential for our general health, our results also highlight how important our tiny friends are for our mental wellbeing.”

Source: Science Daily

Filed under science neuroscience psychology serotonin brain

28 notes

Nature or Nurture? It May Depend On Where You Live

ScienceDaily (June 12, 2012) — In a study published June 12 in the journal Molecular Psychiatry, researchers from the Twins Early Development Study at King’s College London’s Institute of Psychiatry studied data from more than 6700 families relating to 45 childhood characteristics, from IQ and hyperactivity to height and weight. They found that genetic and environmental contributions to these characteristics vary geographically in the UK and have published their results online as a series of nature-nurture maps.

Newborn twins. (Credit: © pojoslaw / Fotolia)

Our development, health and behaviour are determined by complex interactions between our genetic make-up and the environment in which we live. For example, we may carry genes that increase our risk of developing type 2 diabetes, but if we eat a healthy diet and get sufficient exercise, we may not develop the disease. Similarly, someone may carry genes that reduce his or her risk of developing lung cancer, but heavy smoking may still lead to the disease.

The UK-based Twins Early Development Study follows more than 13,000 pairs of twins, both identical and non-identical, born between 1994 and 1996. When the twins were age 12, the researchers carried out a broad survey to assess a wide range of cognitive abilities, behavioural (and other) traits, environments and academic achievement in 6759 twin pairs. The researchers then designed an analysis that reveals the UK’s genetic and environmental hotspots, something which had never been done before.
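Twin designs like this one infer the balance of nature and nurture by comparing how strongly a trait correlates within identical pairs (who share all their genes) versus non-identical pairs (who share about half). As a rough illustration of the logic only, here is the classic Falconer decomposition of trait variance; the study itself used far more sophisticated spatial models, and the correlations below are hypothetical:

```python
def falconer_ace(r_mz, r_dz):
    """Classic Falconer decomposition of trait variance from twin correlations.

    r_mz: within-pair trait correlation for identical (monozygotic) twins
    r_dz: within-pair trait correlation for non-identical (dizygotic) twins
    Returns (h2, c2, e2): genetic, shared-environment, and
    unique-environment shares of the variance.
    """
    h2 = 2 * (r_mz - r_dz)  # heritability
    c2 = r_mz - h2          # shared environment (equivalently 2*r_dz - r_mz)
    e2 = 1 - r_mz           # unique environment plus measurement error
    return h2, c2, e2

# Hypothetical correlations: identical twins 0.8, non-identical twins 0.5
h2, c2, e2 = falconer_ace(0.8, 0.5)
```

With these made-up inputs the genetic share comes out at about 0.6, echoing the roughly 60 per cent figure the researchers report for classroom behaviour problems across most of the UK; a region where the environment matters more would show a smaller gap between the two correlations.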

"These days we’re used to the idea that it’s not a question of nature or nurture; everything, including our behaviour, is a little of both," explains Dr Oliver Davis, a Sir Henry Wellcome Postdoctoral Fellow at King’s College London’s Institute of Psychiatry. "But when we saw the maps, the first thing that struck us was how much the balance of genes and environments can vary from region to region."

"Take a trait like classroom behaviour problems. From our maps we can tell that in most of the UK around 60 per cent of the difference between people is explained by genes. However, in the South East genes aren’t as important: they explain less than half of the variation. For classroom behaviour, London is an ‘environmental hotspot’."

The maps give the researchers a global overview of how the environment interacts with our genomes, without homing in on particular genes or environments. However, the patterns have given them important clues about which environments to explore in more detail.

"The nature-nurture maps help us to spot patterns in the complex data and to try to work out what’s causing these patterns," says Dr Davis. "For our classroom behaviour example, we realised that one thing that varies more in London is household income. When we compare maps of income inequality to our nature-nurture map for classroom behaviour, we find income inequality may account for some of the pattern.

"Of course, this is just one example. There are any number of environments that vary geographically in the UK, from social environments like healthcare or education provision to physical environments like altitude, the weather or pollution. Our approach is all about tracking down those environments that you wouldn’t necessarily think of at first."

It may be relatively easy to explain environmental hotspots, but what about the genetic hotspots that appear on the maps: do people’s genomes vary more in those regions? The researchers believe this is not the case; rather, genetic hotspots are areas where the environment exposes the effects of genetic variation.

For example, researchers searching for gene variants that increase the risk of hay fever may study populations from two regions. In the first region people live among fields of wind-pollinated crops, whereas the second region is miles away from those fields. In this second region, where no one is exposed to pollen, no one develops hay fever; hence any genetic differences between people living in this region would be invisible.

By contrast, in the first region, where people live among the fields of crops, they will all be exposed to pollen and differences between the people with a genetic susceptibility to hay fever and the people without will stand out. That would make the region a genetic hotspot for hay fever.

"The message that these maps really drive home is that your genes aren’t your destiny. There are plenty of things that can affect how your particular human genome expresses itself, and one of those things is where you grow up," says Dr Davis.

Source: Science Daily

Filed under science neuroscience psychology genetics

12 notes

Losing money, emotions and evolution

June 12, 2012

Financial loss can lead to irrational behavior. Now, research by Weizmann Institute scientists reveals that the effects of loss go even deeper: Loss can compromise our early perception and interfere with our grasp of the true situation. The findings, which recently appeared in the Journal of Neuroscience, may also have implications for our understanding of the neurological mechanisms underlying post-traumatic stress disorder.

The experiment was conducted by Dr. Rony Paz and research student Offir Laufer of the Neurobiology Department. Subjects underwent a learning process based on classic conditioning and involving money. They were asked to listen to a series of tones composed of three different notes. After hearing one note, they were told they had earned a certain sum; after a second note, they were informed that they had lost some of their money; and a third note was followed by the message that their bankroll would remain the same. According to the findings, when a note was tied to gain, or at least to no loss, the subjects improved over time in a learned task – distinguishing that note from other, similar notes. But when they heard the “lose money” note, they actually got worse at telling one from the other.

Functional MRI (fMRI) scans of the brain areas involved in the learning process revealed an emotional aspect: The amygdala, which is tied to emotions and reward, was strongly involved. The researchers also noted activity in another area in the front of the brain, which functions to moderate the emotional response. Subjects who exhibited stronger activity in this area showed less of a drop in their abilities to distinguish between tones.

Paz: “The evolutionary origins of that blurring of our ability to discriminate are positive: If the best response to the growl of a lion is to run quickly, it would be counterproductive to distinguish between different pitches of growl. Any similar sound should make us flee without thinking. Unfortunately, that same blurring mechanism can be activated today in stress-inducing situations that are not life-threatening – like losing money – and this can harm us.”

That harm may even be quite serious: For instance, it may be involved in post-traumatic stress disorder. If sufferers are unable to distinguish between a stimulus that should cause a panic response and similar, but non-threatening, stimuli, they may experience strong emotional reactions in inappropriate situations.

This perceptual blurring may even expand over time to encompass a larger range of stimuli. Paz intends to investigate this possibility in future research.

Provided by Weizmann Institute of Science

Source: medicalxpress.com

Filed under science neuroscience brain psychology perception stress

28 notes

Psychologists reveal how the brain performs ‘motor chunking’ tasks

June 12, 2012

You pick up your cell phone and dial the new number of a friend. Ten numbers. One. Number. At. A. Time. Because you haven’t actually typed the number before, your brain handles each button press separately, as a sequence of distinct movements.

This image shows identified brain regions linked to the parsing (left) and concatenation (right) processes involved in motor chunking. Trials with greater parsing showed increased activation of the left prefrontal and parietal cortex and trials with greater concatenation showed increased activation of the putamen. Credit: Photo by Nicholas Wymbs

After dialing the number a few more times, you find yourself typing it out as a series of three successive bursts of movement: the area code, the first three numbers, the last four numbers. Those three separate chunks allow you to type the number faster, and with greater precision. Eventually, dialed often enough, the number is stored in your brain as one chunk. Who needs speed dial?
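The progression above can be sketched in a few lines of Python. This is an illustrative analogy, not code or data from the study; the phone number and chunk sizes are hypothetical. The same ten digits are simply grouped into progressively larger chunks as the sequence becomes familiar.

```python
def parse_into_chunks(digits, chunk_sizes):
    """Split a digit string into consecutive chunks of the given sizes."""
    chunks, start = [], 0
    for size in chunk_sizes:
        chunks.append(digits[start:start + size])
        start += size
    return chunks

number = "8055551234"  # hypothetical ten-digit number

print(parse_into_chunks(number, [1] * 10))   # novice: each press is its own chunk
print(parse_into_chunks(number, [3, 3, 4]))  # practiced: area code + three + four
print(parse_into_chunks(number, [10]))       # well-learned: one single chunk
```

The three calls mirror the three stages described above: ten separate movements, then three bursts, then one fluid sequence.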

"You can think about a chunk as a rhythm," said Nicholas Wymbs, a postdoctoral researcher in UC Santa Barbara’s Department of Psychological and Brain Sciences, and the lead author of a new study on motor chunking in the journal Neuron, published by Cell Press. “We highlight the two-part process that seems to occur when we are chunking. This is demonstrated by the rhythm we use when typing the phone number: rapid bursts of finger movements that are interspersed by pauses.”

The rhythm is the human brain taking information and processing it in an efficient way, according to Wymbs. “On one level, the brain is going to try to divide up, or parse, long sequences of movement,” he said. “This parsing process functions to group or cluster movements in the most efficient way possible.”

But it is also in our brain’s best interest to assemble single or short strings of movements into longer, integrated sequences, so that a complex behavior can be performed with as little effort as possible. “The motor system in the brain wants to output movement in the most computationally low-cost way possible,” Wymbs said. “With this integrative process, it’s going to try to bind as many individual motor movements into a fluid, uniform movement as it possibly can.”

This diagram illustrates how the subjects in the experiment used their left hands to respond to the “notes” on a button box. Credit: Illustration by Nicholas Wymbs

The two processes are at odds with each other, and it’s how the brain reconciles this struggle during motor learning that intrigues Wymbs and the study’s other authors, including Scott Grafton, professor of psychology and director of the UCSB Brain Imaging Center. “What we are interested in is functional plasticity of the brain –– how the brain changes when we learn actions, or motor sequences as we refer to them in this paper,” Wymbs said.

The study was conducted using human subjects in the magnetic resonance imaging (MRI) scanner at the Brain Imaging Center. The experiment involved three days of training, with participants performing and practicing three separate motor sequences for up to 200 trials each while functional MRI data were collected. The subjects were all right-handed, but they were asked to learn the sequences using the four fingers of their left hands. Participants practiced the sequences while the scanner was running, tapping out responses on a button box that looked like a set of piano keys, with long, rectangular buttons.

"People would see a static image shown on a video screen that detailed the sequence to be typed out," Wymbs said. "They’re lying down inside the scanner and they see this image above their eyes. Interestingly, some people reported that the images looked like something out of (the video game) Guitar Hero, and, indeed, it does look a bit like guitar tablature. They would have to type out the ‘notes’ from left to right, as you normally would when reading music.

"After practicing a sequence for 200 trials, they would get pretty good at it," Wymbs added. "After a while, the note patterns become familiar. At the start of the training, it would take someone about four and a half seconds to complete each sequence of 12 button presses. By the end of the experiment, the average participant could produce the same sequence in under three seconds."
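That speed-up works out to a substantial drop in average time per press. A quick back-of-the-envelope check, using the article’s round figures of 4.5 and 3 seconds for the 12-press sequence:

```python
presses = 12
before_s, after_s = 4.5, 3.0  # round figures reported in the article

per_press_before = before_s / presses  # 0.375 s per button press at the start
per_press_after = after_s / presses    # 0.25 s per button press at the end
speedup = before_s / after_s           # 1.5x faster overall

print(f"per press: {per_press_before:.3f} s -> {per_press_after:.3f} s "
      f"({speedup:.1f}x faster)")
```

In other words, each press got roughly a third faster on average, even though the gains came mostly from eliminating pauses between chunks rather than from faster individual movements.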

The researchers’ goal was to look at which areas of the brain support the two-part process of chunking. “We feel that the motor process, or the concatenation process as we refer to it in the paper, tends to take over as you continue to practice and continue to learn the sequences,” Wymbs said. “That’s the one that’s tied to the motor output system –– the thing that’s actually accomplishing what we set out to do.”

With the experience of repeating a motor sequence, such as typing out a phone number, speaking, typing on a computer, or even texting, it becomes more automatic. “With automaticity comes the recruitment of core motor output regions,” Wymbs said.

The scientists discovered that the putamen –– a brain region that is critically important to movement –– supports the concatenation process of motor chunking, with robust connectivity to parts of the brain that are intimately tied to the output of skilled motor behavior. On the other hand, they found that cortical regions in the left hemisphere respond more during the parsing process of motor chunking. “These regions have been linked to the manipulation of motor information, which is something that we probably do more of when we just begin to learn the sequences as chunks,” Wymbs said.

"Initially, when you’re doing one of these 12-element sequences, you want to pause," Wymbs added. "That would evoke more of the parsing mechanism. But then, over time, as you learn a sequence and it becomes more automatic, the concatenation process takes over and wants to put all of these individual elements into a single fluid behavior."

According to Wymbs, the findings could have implications for the study and diagnosis of Parkinson’s and other diseases of the motor system. “We show here that there are two potentially competing processes, supported by different brain systems, that both work to allow us to process things efficiently when we’re learning,” Wymbs said.

Provided by University of California - Santa Barbara

Source: medicalxpress.com

Filed under science neuroscience psychology brain
