Neuroscience

Articles and news from the latest research reports.

Seeing Our Errors Keeps Us On Our Toes

If people are unable to perceive their own errors as they complete a routine, simple task, their skill will decline over time, Johns Hopkins researchers have found — but not for the reasons scientists assumed. The researchers report that the human brain does not passively forget our good techniques, but chooses to put aside what it has learned.

The term “motor memories” may conjure images of childhood road trips, but in fact it refers to the reason why we’re able to smoothly perform everyday physical tasks. The amount of force needed to lift an empty glass versus a full one, to shut a car door or pick up a box, even to move a limb accurately from one place to another — all of these are motor memories.

In a report published May 1 in The Journal of Neuroscience, the Johns Hopkins researchers describe their latest efforts to study how motor memories are formed and lost by focusing on one well-known experimental phenomenon: when people learn to do a task well but are asked to keep doing it while receiving deliberately misleading feedback indicating that their performance is perfect every time, their actual performance will gradually get worse.

It had been assumed that the decline was due to the decay of memories in the absence of reinforcement, says Reza Shadmehr, Ph.D., a professor in the Department of Biomedical Engineering at the Johns Hopkins University School of Medicine.

But when Shadmehr and graduate student Pavan Vaswani asked volunteers to learn a simple task with a few twists designed to deliberately manipulate the brain’s motor control system, they learned otherwise.

The volunteers were told to push a joystick quickly toward a red dot on a computer screen. But the volunteers’ hands were placed under the screen, where they couldn’t see them, and their starting point was shown on the screen as a blue dot. In addition, as the volunteers moved the joystick toward the red dot, a force within the contraption would suddenly push the joystick to the left. So the volunteers practiced until they could move the blue dot straight to and past the red dot by compensating for the leftward push with pressure toward the right.

Once the volunteers had mastered the task, Shadmehr and Vaswani changed it up without their knowing. For one group of 24 volunteers, they added a stiff spring to the joystick device that would guide the user straight to the target, but would also measure the amount of rightward force the volunteers were applying. To the volunteers, it looked as though they were now doing the task perfectly every time, and, as in previous experiments, they gradually stopped pushing to the right, apparently “forgetting” what they had learned.

For a different group of 19 volunteers, though, the researchers not only added the spring, but also changed the feedback on the screen not to reflect what was actually happening during each task, but to show feedback similar to reruns of earlier efforts. The volunteers weren’t seeing the errors they were actually making, but feedback that looked convincingly like errors they might have made. This group continued to do the task as they’d learned, applying the right amount of force to the joystick hundreds of times.
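
Motor-learning experiments of this kind are often described with a simple trial-by-trial update rule. As a rough caricature of the paper's conclusion (retention depends on whether the feedback contains believable errors, not on intrinsic decay), here is a toy simulation; the retention factor, learning rate and noise values are invented for illustration, not taken from the study.

```python
import random

def simulate(n_trials, errors_visible, x0=1.0, b=0.05, seed=0):
    """Toy model: x is the learned rightward compensation (1.0 = fully
    learned). The memory is actively set aside (retention < 1) only when
    feedback makes the task look error-free; plausible-looking errors
    keep the learner engaged. All numbers are illustrative."""
    rng = random.Random(seed)
    x = x0
    for _ in range(n_trials):
        if errors_visible:
            # Replayed feedback: small, believable errors keep the
            # learned compensation engaged and only lightly perturbed.
            x = x + b * rng.gauss(0.0, 0.02)
        else:
            # Error clamp: performance looks perfect every time, so the
            # learned policy is gradually put aside.
            x = 0.99 * x
    return x

print(round(simulate(300, errors_visible=False), 2))  # 0.05: apparent "forgetting"
print(simulate(300, errors_visible=True) > 0.9)       # True: memory preserved
```

With zero perceived error the compensation follows a geometric decline, which is the classic "forgetting" curve the clamp group shows; with replayed errors it merely jitters around the learned level.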

This shows that decline in technique “isn’t just a process of forgetting,” says Vaswani. “Your brain notices that you are doing this task perfectly, and you see what you can do differently.”

Adds Shadmehr, “Our results correct a component of knowledge we thought we understood. Neuroscientists thought decay was intrinsic to motor memories, but in fact it’s not decay — it’s selection.”

Filed under motor memories forgetting neuroscience science

Older adult clumsiness linked to brain changes

For many older adults, the aging process seems to go hand-in-hand with an annoying increase in clumsiness — difficulties dialing a phone, fumbling with keys in a lock or knocking over the occasional wine glass while reaching for a salt shaker.

While it’s easy to see these failings as a normal consequence of age-related breakdowns in agility, vision and other physical abilities, new research from Washington University in St. Louis suggests that some of these day-to-day reaching-and-grasping difficulties may be caused by changes in the mental frame of reference that older adults use to visualize nearby objects.

“Reference frames help determine what in our environment we will pay attention to and they can affect how we interact with objects, such as controls for a car or dishes on a table,” said study co-author Richard Abrams, PhD, professor of psychology in Arts & Sciences.

“Our study shows that in addition to physical and perceptual changes, difficulties in interaction may also be caused by changes in how older adults mentally represent the objects near them.”

The study, published in the journal Psychological Science, is co-authored by two recent graduates of the psychology graduate program at Washington University. The lead author, Emily K. Bloesch, PhD, is now a postdoctoral teaching associate at Central Michigan University. The third co-author, Christopher C. Davoli, PhD, is a postdoctoral psychology researcher at the University of Notre Dame.

When tested on a series of simple tasks involving hand movements, young people in this study adopted an attentional reference frame centered on the hand, while older study participants adopted a reference frame centered on the body.

Young adults, the researchers explain, have been shown to use an “action-centered” reference frame that is sensitive to the movements they are making. So, when young people move their hands to pick up an object, they remain aware of and sensitive to potential obstacles along the movement path. Older adults, on the other hand, tend to devote more attention to objects that are closer to their bodies — whether they are on the action path or not.

“We showed in our paper that older adults do not use an ‘action-centered’ reference frame. Instead they use a ‘body-centered’ one,” Bloesch said. “As a result, they might be less able to effectively adjust their reaching movements to avoid obstacles — and that’s why they might knock over the wine glass while reaching for the salt shaker.”

These findings mesh well with other research that has documented age-related physical declines in several areas of the brain that are responsible for hand-eye coordination. Older adults exhibit volumetric declines in the parietal cortex and intraparietal sulcus, as well as white-matter loss in the parietal lobe and precuneus. These declines may make the use of an action-centered reference frame difficult or impossible.

“These three areas are highly involved in visually guided hand actions like reaching and grasping and in creating attentional reference frames that are used to guide such actions. These neurological changes in older adults suggest that their representations of the space around them may be compromised relative to those of young adults and that, consequently, young and older adults might encode and attend to near-body space in fundamentally different ways,” the study finds.

As the U.S. population ages, research on these issues is becoming increasingly important. An estimated 60 to 70 percent of the elderly population reports difficulty with activities of daily living, such as eating and bathing, and many show deficiencies in performing goal-directed hand movements. Knowing more about these aging-related changes in spatial representation, the researchers suggest, may eventually inspire options for skills training and other therapies to help seniors compensate for the cognitive declines that influence hand-eye coordination.

(Source: news.wustl.edu)

Filed under aging clumsiness intraparietal sulcus parietal cortex white matter psychology neuroscience science

Never forget a face? Researchers find women have better memory recall than men

New research from McMaster University suggests women can remember faces better than men, in part because they spend more time studying features without even knowing it, a scanning strategy the researchers say could be taught to improve anyone’s memory.

The findings help to answer long-standing questions about why some people can remember faces easily while others quickly forget someone they’ve just met.

“The way we move our eyes across a new individual’s face affects our ability to recognize that individual later,” explains Jennifer Heisz, a research fellow at the Rotman Research Institute at Baycrest Health Sciences and newly appointed assistant professor in the Department of Kinesiology at McMaster University.

She co-authored the paper with David Shore, psychology professor at McMaster and psychology graduate student Molly Pottruff.

“Our findings provide new insights into the potential mechanisms of episodic memory and the differences between the sexes. We discovered that women look more at new faces than men do, which allows them to create a richer and superior memory,” Heisz says.

Eye tracking technology was used to monitor where study participants looked—be it eyes, nose or mouth—while they were shown a series of randomly selected faces on a computer screen. Each face was assigned a name that participants were asked to remember.
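
The fixation analysis described here boils down to binning gaze samples by facial region. A minimal sketch of that bookkeeping (the region boxes, coordinates and sample points are all made up for illustration; real pipelines also cluster raw samples into fixations first):

```python
# Count eye-tracker gaze samples falling in each facial region.
# Regions are axis-aligned boxes (x0, y0, x1, y1) in screen pixels;
# the coordinates below are invented for illustration.
REGIONS = {
    "eyes":  (120, 80, 280, 140),
    "nose":  (170, 140, 230, 200),
    "mouth": (150, 200, 250, 250),
}

def fixation_counts(gaze_samples):
    """Tally gaze samples (x, y) per region; samples outside all
    regions are counted under 'other'."""
    counts = {name: 0 for name in REGIONS}
    counts["other"] = 0
    for x, y in gaze_samples:
        for name, (x0, y0, x1, y1) in REGIONS.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                counts[name] += 1
                break
        else:
            counts["other"] += 1
    return counts

samples = [(200, 100), (205, 110), (190, 170), (200, 230), (400, 300)]
print(fixation_counts(samples))
# {'eyes': 2, 'nose': 1, 'mouth': 1, 'other': 1}
```

Comparing these per-region tallies between groups is what lets researchers say one group "fixated on the features far more" than another.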

One group was tested over the course of one day; another group was tested over the course of four days.

“We found that women fixated on the features far more than men, but this strategy operates completely outside of our awareness. Individuals don’t usually notice where their eyes fixate, so it’s all subconscious.”

The implications are exciting, she says, because it means anyone can be taught to scan more and potentially have better memory.

“The results open the possibility that changing our eye movement pattern may lead to better memory,” says Shore. “Increased scanning may prove to be a simple strategy to improve face memory in the general population, especially for individuals with memory impairment like older adults.”

Filed under episodic memory eye movements face memory eye tracking technology psychology neuroscience science

Common gene known to cause inherited autism now linked to specific behaviors

The genetic malady known as Fragile X syndrome is the most common cause of inherited autism and intellectual disability. Brain scientists know the gene defect that causes the syndrome and understand the damage it does in misshaping the brain’s synapses — the connections between neurons. But how this abnormal shaping of synapses translates into abnormal behavior is unclear.

Now, researchers at UCLA believe they know. Using a mouse model of Fragile X syndrome (FXS), they recorded the activity of networks of neurons in a living mouse brain while the animal was awake and asleep. They found that during both sleep and quiet wakefulness, these neuronal networks showed too much activity, firing too often and in sync, much more than a normal brain.

This neuronal excitability, the researchers said, may be the basis for symptoms in children with FXS, which can include disrupted sleep, seizures or learning disabilities. The findings may lead to treatments that could quiet the excessive activity and allow for more normal behavior.

The study results are published in the June 2 online edition of the journal Nature Neuroscience.

According to the National Fragile X Foundation, approximately one in every 3,600 to 4,000 males has the disorder, as does one in 4,000 to 6,000 females. FXS is caused by a mutation in the gene FMR1, which encodes the fragile X mental retardation protein, or FMRP. That protein is believed to be important for the formation and regulation of synapses. Mice that lack the FMR1 gene — and therefore lack the FMRP protein — show some of the same symptoms of human FXS, including seizures, impaired sleep, abnormal social relationships and learning defects.

"We wanted to find the link between the abnormal structure of synapses in the FXS mouse and the behavioral abnormalities at the level of brain circuits. That had not been previously established," said senior author Dr. Carlos Portera-Cailliau, an associate professor in the departments of neurology and neurobiology at UCLA. " So we tested the signaling between different neurons in Fragile X mice and indeed found there was abnormally high firing of action potentials — the signals between neurons — and also abnormally high synchrony — that is, too many neurons fired together. That’s a feature that is common in early brain development, but not in the adult."

"In essence, this points to a relative immaturity of brain circuits in FXS," added Tiago Gonçalves, a former postdoctoral researcher in Portera-Cailliau’s laboratory and the first author of the study.

The researchers used two-photon calcium imaging and patch-clamp electrophysiology — two sophisticated technologies that allowed them to record the signals from individual brain cells. Abnormally high firing and network synchrony, said Portera-Cailliau, are evidence that neuronal circuits are overexcitable in FXS.
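
One common way to quantify the kind of network synchrony measured with calcium imaging is to ask, time bin by time bin, what fraction of the recorded cells is active at once. A toy sketch of such a measure (the binary rasters and the 50 percent threshold are invented for illustration, not taken from the study):

```python
# Crude population-synchrony measure for binarized activity: for each
# time bin, count the fraction of neurons active, then report how often
# that fraction exceeds a threshold.

def synchrony_rate(raster, threshold=0.5):
    """raster: list of per-neuron binary event trains (equal length).
    Returns the fraction of time bins in which more than `threshold`
    of the population is active at once."""
    n_neurons = len(raster)
    n_bins = len(raster[0])
    hot = 0
    for t in range(n_bins):
        active = sum(train[t] for train in raster)
        if active / n_neurons > threshold:
            hot += 1
    return hot / n_bins

# Desynchronized toy network: activity scattered across bins.
control = [
    [1, 0, 0, 0, 1, 0],
    [0, 1, 0, 0, 0, 1],
    [0, 0, 1, 0, 1, 0],
    [0, 0, 0, 1, 0, 0],
]

# Hypersynchronous toy network: cells tend to fire together, as
# reported for the Fmr1-knockout circuits.
fxs = [
    [1, 0, 1, 0, 1, 0],
    [1, 0, 1, 0, 1, 0],
    [1, 0, 1, 0, 0, 0],
    [0, 0, 1, 0, 1, 0],
]

print(synchrony_rate(control))  # 0.0
print(synchrony_rate(fxs))      # 0.5
```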

"That likely leads to aberrant brain function or impairments in the normal computations of the brain," he said. "For example, high synchrony could lead to seizures; more neurons firing together could cause entire portions of the brain to fire synchronously, which is the basis of seizures."

And epilepsy, Portera-Cailliau said, is seen in up to 20 percent of children with FXS. High firing rates could also impair the ability of the brain to decode sensory stimuli by causing an overwhelming response to even simple sensory stimuli; this could lead to autism and the withdrawal from social interactions, he noted.

"Interestingly, we found that the high firing and synchrony were especially apparent at times when the animals were asleep," said Portera-Cailliau. "This is curious because a prominent symptom of FXS is disrupted sleep and frequent awakenings."

And, he noted, since sleep is important for encoding memories and consolidating learning, this hyperexcitability of brain networks in FXS may interfere with the process of laying down new memories, and perhaps explain the learning disability in children with FXS.

"Because brain scientists know a lot about the factors that regulate neuronal excitability, including inhibitory neurons, they can now try to use a variety of strategies to dampen neuronal excitation," he said. "Hopefully, this may be helpful to treat symptoms of FXS."

The next step, said Portera-Cailliau, is to explore whether Fragile X mice indeed exhibit exaggerated responses to sensory stimuli.

"An overwhelming reaction to a slight sound or caress, or hyperarousal to sensory stimuli, could be common to different types of autism and not just FXS," he said. "If hyperexcitability is the brain-network basis for these symptoms, then reducing neuronal excitability with certain drugs that modulate inhibition could be of therapeutic value in these devastating neurodevelopmental disorders."

Filed under fragile x syndrome brain circuits neuronal networks synapses fmr1 gene neuroscience science

Neuroscience Research Project Examines Neural Synchronization Patterns During Addiction

A cross-disciplinary collaboration of researchers in the School of Science at Indiana University-Purdue University Indianapolis (IUPUI) explores the neural synchrony between circuits in the brain and their behavior under simulated drug addiction. The two-year study could have broad implications for treating addiction and understanding brain function in conditions such as Parkinson’s disease.

Advanced mathematical models coupled with extensive laboratory testing revealed recurrent stimulant injections in rodents resulted in neural circuits that could easily synchronize but were more likely to become unstable. In other words, the introduction and restriction of drugs over time caused neurons to lose their ability to engage supervisory control over brain function and behavior. Researchers noticed these short periods of desynchronization were much more prevalent and caused changes in neurobiology and behavior.

“A better understanding of the dynamics of neural synchrony could have very important implications for understanding the addicted brain and may provide a physiological target to understand persistent neural changes that contribute to the probability of relapse,” said Christopher Lapish, Ph.D., assistant professor of psychology at IUPUI.

Lapish, with expertise in neurophysiology and addiction, and Leonid Rubchinsky, Ph.D., associate professor of mathematical sciences, collaborated on the project with support from the IUPUI Institute for Mathematical Modeling and Computational Science. Rubchinsky is an applied mathematician and neuroscientist who has extensively studied the neurophysiology of Parkinson’s disease.

Sungwoo Ahn, Ph.D., a post-doctoral fellow in mathematical sciences, also co-authored the study, recently published in the scientific journal Cerebral Cortex.

The research was patterned after the various stages of drug addiction: the first introduction of amphetamines, periods of abstinence that model withdrawal and then relapse.

The neural synchrony patterns of models injected with a stimulant were compared to those of models injected with a saline solution. Short periods of desynchronization were prevalent in both groups, but the drug-affected group displayed a marked connection between synchrony and brain function. Synchrony has long been considered to play an important role in how the brain processes data, so any disruption of this pattern could hold significant research value, according to the published study.
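
Analyses of this kind typically extract the instantaneous phase of each circuit's signal, flag the moments when the two phases are locked, and then measure the lengths of the desynchronized stretches. A minimal sketch of that last step (the locked/unlocked sequence below is invented; deriving it from real recordings would involve, for example, a Hilbert transform of each signal):

```python
# Toy episode analysis: given a per-sample flag saying whether two
# circuits are phase-locked, measure the lengths of the contiguous
# desynchronized stretches.

def desync_episodes(locked):
    """Return the lengths of maximal runs of False in `locked`."""
    episodes = []
    run = 0
    for flag in locked:
        if flag:
            if run:
                episodes.append(run)
            run = 0
        else:
            run += 1
    if run:
        episodes.append(run)
    return episodes

# Mostly locked, with brief slips: the pattern the study reports as
# dominated by many short desynchronization events.
locked = [True, True, False, True, True, False, False, True, False, True]
print(desync_episodes(locked))                     # [1, 2, 1]
print(sum(desync_episodes(locked)) / len(locked))  # 0.4, fraction of time desynchronized
```

Comparing the distribution of episode lengths between drug and saline conditions is what lets the researchers say short desynchronization events became "much more prevalent."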

“Through these long and progressive experimental examinations, we were able to explore the different areas of the brain and how they are connected to each other,” Rubchinsky said. “In addition to understanding, monitoring, diagnosing and treating addiction, this type of study is helpful in better understanding how the normal brain works.”

This collaboration moves scientists closer to understanding brain function and disruptions, Rubchinsky said, by incorporating mathematical models that recreate events and reactions in the brain over time. Lapish agreed, saying computational science ultimately will drive the growth and success of future neuroscience research.

“Neuroscience is an inherently data-rich science and, by combining experimentalists with theorists, there is a tremendous potential for discovery,” Lapish said. “The interactive effects of this collaboration are certainly greater than the sum of its parts. We’re able to create a fully dynamic picture of this process that would not be possible without combining these two areas of expertise.”

Moving forward, the team will continue to seek funding to advance their research methods and better understand the role of synchrony in brain function. By doing so, scientists could map the progress and deterioration of neural circuits in various scenarios.

Filed under addiction neural synchrony prefrontal cortex amphetamine brain function neuroscience science

Anxious? Activate Your Anterior Cingulate Cortex With a Little Meditation

Scientists, like Buddhist monks and Zen masters, have known for years that meditation can reduce anxiety, but not how. Scientists at Wake Forest Baptist Medical Center, however, have succeeded in identifying the brain functions involved.

“Although we’ve known that meditation can reduce anxiety, we hadn’t identified the specific brain mechanisms involved in relieving anxiety in healthy individuals,” said Fadel Zeidan, Ph.D., postdoctoral research fellow in neurobiology and anatomy at Wake Forest Baptist and lead author of the study. “In this study, we were able to see which areas of the brain were activated and which were deactivated during meditation-related anxiety relief.”

The study is published in the current edition of the journal Social Cognitive and Affective Neuroscience.

For the study, 15 healthy volunteers with normal levels of everyday anxiety were recruited. These individuals had no previous meditation experience or anxiety disorders. All subjects participated in four 20-minute classes to learn a technique known as mindfulness meditation. In this form of meditation, people are taught to focus on breath and body sensations and to non-judgmentally evaluate distracting thoughts and emotions.

Both before and after meditation training, the study participants’ brain activity was examined using a special type of imaging – arterial spin labeling magnetic resonance imaging – that is very effective at imaging brain processes, such as meditation. In addition, anxiety reports were measured before and after brain scanning.

The majority of study participants reported decreases in anxiety. Researchers found that meditation reduced anxiety ratings by as much as 39 percent.

“This showed that just a few minutes of mindfulness meditation can help reduce normal everyday anxiety,” Zeidan said.

The study revealed that meditation-related anxiety relief is associated with activation of the anterior cingulate cortex and ventromedial prefrontal cortex, areas of the brain involved with executive-level function. During meditation, there was more activity in the ventromedial prefrontal cortex, the area of the brain that controls worrying. In addition, when activity increased in the anterior cingulate cortex – the area that governs thinking and emotion – anxiety decreased.

“Mindfulness is premised on sustaining attention in the present moment and controlling the way we react to daily thoughts and feelings,” Zeidan said. “Interestingly, the present findings reveal that the brain regions associated with meditation-related anxiety relief are remarkably consistent with the principles of being mindful.”

Research at other institutions has shown that meditation can significantly reduce anxiety in patients with generalized anxiety and depression disorders. The results of this neuroimaging experiment complement that body of knowledge by showing the brain mechanisms associated with meditation-related anxiety relief in healthy people, he said.

Filed under anxiety mindfulness meditation brain activity anterior cingulate cortex neuroscience science

99 notes

Fear: A Justified Response or Faulty Wiring?
Fear is one of the most primal feelings known to man and beast. As we develop in society and learn, fear is hard-coded into our neural circuitry through the amygdala, a small, almond-shaped cluster of nuclei within the medial temporal lobe of the brain. For psychologists and neurologists, the amygdala is a particularly interesting region of the brain because it plays a role in emotional learning and can have profound effects on human and animal behavior.
On June 3, 2013, a new article studying amygdala activity in human beings will be published as part of JoVE Behavior, a new section of the video journal that focuses on the behavioral sciences. The technique, developed by Dr. Fred Helmstetter and his research group at the University of Wisconsin-Milwaukee, studies how the brain responds to anticipated painful stimuli, in this case an electric shock, in volunteer test subjects.
“We’re interested in how the brain reacts to stimuli in the environment and how it changes when we form a memory of what we experience,” Dr. Helmstetter explains. “The amygdala is a part of the brain that’s important for the way we determine what is dangerous and what is safe around us and how we react to threat. This experiment is novel in that we are able to look at activity in the amygdala on a very detailed time scale while it responds to human faces.”
The technique takes advantage of two neuroimaging techniques: magnetic resonance imaging and magnetoencephalography. Magnetic resonance imaging (MRI) is a method where a test subject’s brain can be imaged in high resolution while the test subject is immobilized, creating a map of the brain. Once this map has been obtained, magnetoencephalography (MEG) is used to record the magnetic fields created by the electrical activity within the brain. When the test subject is shocked, or anticipates a shock, amygdala activity is picked up by the MEG and mapped to the MRI computer model.
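Mapping MEG activity onto the MRI model amounts to transforming source coordinates from the MEG head frame into MRI voxel space with an affine transform. As an illustrative sketch only (the affine matrix, the `meg_to_voxel` helper, and the example coordinates are all hypothetical; real coregistration is computed from fiducial landmarks by the neuroimaging software), the mapping can be expressed like this:

```python
import numpy as np

# Hypothetical 4x4 affine mapping MEG head coordinates (in mm) into MRI
# voxel indices. In practice this matrix comes from fiducial-based
# coregistration performed by the imaging toolchain, not hand-typed values.
affine = np.array([
    [1.0, 0.0, 0.0,  90.0],
    [0.0, 1.0, 0.0, 126.0],
    [0.0, 0.0, 1.0,  72.0],
    [0.0, 0.0, 0.0,   1.0],
])

def meg_to_voxel(points_mm, affine):
    """Map an N x 3 array of MEG source coordinates (mm) to MRI voxel indices."""
    homogeneous = np.hstack([points_mm, np.ones((len(points_mm), 1))])
    voxels = homogeneous @ affine.T            # apply the affine transform
    return np.rint(voxels[:, :3]).astype(int)  # round to integer voxel indices

# A source localized near the left amygdala (hypothetical head coordinates)
source = np.array([[-24.0, -4.0, -18.0]])
voxel = meg_to_voxel(source, affine)
```

Each MEG-localized activation, such as the amygdala’s response to an anticipated shock, could then be painted onto the corresponding voxel of the structural MRI map.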
As an emotional control center in the brain, the amygdala serves as a key component in a line of neurological structures that identify and respond to perceived threat. Dr. Helmstetter tells us, “There is good evidence to suggest that anxiety disorders and other psychopathology might be directly related to altered functioning of the amygdala. Prior work with other non-invasive imaging modalities supports this idea but has only been able to average the results of neural activity over several seconds which results in a poor picture of how neurons react to a stimulus over time. This work represents a significant improvement and will allow new questions to be answered.”
The article is part of the launch of JoVE Behavior, the eighth section of JoVE. Founded in 2006, JoVE has rapidly expanded its scope from general biology to many disciplines by visualizing experimentation. Director of Content Aaron Kolski-Andreaco, Ph.D., explains: “By dedicating a section to behavior, JoVE has provided a platform for researchers to visualize experiments aimed at answering questions about how we think, feel, and communicate with one another. Emphasizing this area of science is the next logical step for our journal, as the multidisciplinary study of behavior is enabled by technological advancements in physics, chemistry, and the life sciences, areas JoVE has already covered.”

Filed under fear neuroimaging amygdala amygdala activity electric shock brain neuroscience psychology science

71 notes

Neuronal regeneration and the two-part design of nerves 
Researchers at the University of Michigan have evidence that a single gene controls both halves of nerve cells, and their research demonstrates the need to consider that design in the development of new treatments for regeneration of nerve cells.
A paper published online in PLOS Biology by U-M Life Sciences Institute faculty member Bing Ye and colleagues shows that manipulating genes of the fruit fly Drosophila to promote the growth of one part of the neuron simultaneously stunts the growth of the other part.
Understanding this bimodal nature of neurons is important for researchers developing therapies for spinal cord injury, neurodegeneration and other nervous system diseases, Ye said.
Nerve cells look strikingly like trees, with a crown of “branches” converging at a “trunk.” The branches, called dendrites, input information from other neurons into the nerve cell. The trunk, or axon, transmits the signal to the next cell.
"If you want to regenerate an axon to repair an injury, you have to take care of the other end, too," said Ye, assistant professor in the Department of Cell and Developmental Biology at the U-M Medical School.
The separation of the nerve cell into these two parts is so fundamental to neuroscience that it’s known as the “neuron doctrine,” but how exactly neurons create, maintain and regulate these two separate parts and functions is still largely unknown.
While the body is growing, the neuronal network grows rapidly. But nerve cells don’t divide and replicate like other cells in the body (instead, a specific type of stem cell creates them). Adult nerve cells appear to no longer have the drive to grow, so the loss of neurons due to injury or neurodegeneration can be permanent.
Ye’s paper highlights the bimodal nature of neurons by explaining how a kinase that promotes axon growth surprisingly has the opposite effect of impeding dendrite growth of the same cell.
In the quest to understand the fundamentals of nerve cell growth in order to stimulate regrowth after injury, scientists have identified the genes responsible for axon growth and were able to induce dramatic growth of the long “trunk” of the cell, but less attention has been given to dendrites.
There are technical reasons that studying axons is easier than studying dendrites: The bundle of axons in a nerve is easier to track under the microscope, but to get an image of dendrites would require labeling single neurons.
Ye’s lab circumvented that obstacle by using Drosophila as a model. Using this simple model of the nervous system, the scientists were able to reliably label both axons and dendrites of single neurons and see what happened to nerve cells with various mutations of genes that are shared between the flies and humans.
One of the genes shared by Drosophila and people is the one that makes a protein called Dual Leucine Zipper Kinase, or DLK. As described previously by other groups, DLK promotes axon growth: cells with more of the protein had very long axons, and those lacking the gene or protein showed no regeneration after nerve injury. DLK therefore seemed a promising target for therapies to regenerate nerve cells.
However, Ye’s lab found that the kinase had the opposite effect on the dendrites: Lots of DLK leads to diminished dendrites.
"This in vivo evidence of bimodal control of neuronal growth calls attention to the need to look at the other side of a neuron in terms of developing new therapies," Ye said. "If we use this kinase, DLK, as a drug target for axon growth, we’ll have to figure out a way to block its effect on dendrites."

Filed under neurodegeneration nerve cells kinase spinal cord injuries axon growth neuroscience science

100 notes

Heart Health Matters to Your Brain
People suffering from type 2 diabetes and cardiovascular disease (CVD) are at an increased risk of cognitive decline, according to a new study from Wake Forest Baptist Medical Center.
Lead author Christina E. Hugenschmidt, Ph.D., an instructor of gerontology and geriatric medicine at Wake Forest Baptist, said the results from the Diabetes Heart Study-Mind (DHS-Mind) suggest that CVD is playing a role in cognition problems before it is clinically apparent in patients. The research appears online ahead of print in the Journal of Diabetes and Its Complications.
“There has been a lot of research looking at the links between type 2 diabetes and increased risk for dementia, but this is the first study to look specifically at subclinical CVD and the role it plays,” Hugenschmidt said. “Our research shows that CVD risk caused by diabetes even before it’s at a clinically treatable level might be bad for your brain.
"The results imply that additional CVD factors, especially calcified plaque and vascular status, and not diabetes status alone, are major contributors to type 2 diabetes related cognitive decline."
Hugenschmidt said DHS-Mind is a follow-up study to the Diabetes Heart Study (DHS), which examined relationships between cognitive function, vascular calcified plaque and other major diabetes risk factors associated with cognition. The DHS investigated CVD in siblings with a high incidence and prevalence of type 2 diabetes, where extensive measurements of CVD risk factors were obtained during exams that occurred from 1998 to 2006.
The study was supported by the National Institutes of Health through NINDS R01NS058700-02S109 and NIDDK 1F32DK083214-01.
The DHS-Mind study added cognitive testing to the existing measures with the express purpose of exploring the relationships between atherosclerosis and cognition in a population heavily affected by diabetes, a novel approach given that previous studies had focused on diabetes and cognition in the context of clinically evident CVD, Hugenschmidt said. The researchers followed up with as many of the original 1,443 DHS participants as possible who had cardiovascular measures. Of the 516 they reached, 422 had type 2 diabetes and 94 did not.
Hugenschmidt said the researchers administered a battery of cognitive tests that assessed different kinds of thinking, such as memory and processing speed, as well as executive function, the set of mental skills coordinated in the brain’s frontal lobe that includes stop-and-think processes like managing time and attention, planning and organizing. She said that using siblings as the comparison group, some of whom themselves had high levels of CVD, made the results more clinically relevant because the participants shared the same environmental and genetic background.
"We still saw a difference between these two groups. Even compared to their own siblings who were not disease free, those with diabetes and subclinical cardiovascular disease had a higher risk of cognitive dysfunction," Hugenschmidt said.
CVD explains a lot of the cognitive problems that people with diabetes experience, Hugenschmidt said. “One possibility is that your brain requires a really steady blood flow and it’s possible that the cardiovascular disease that accompanies diabetes might be the main driver behind the cognitive deficits that we see.”
Hugenschmidt said the takeaway for clinicians is to take CVD risk factors into consideration when treating patients with type 2 diabetes, because even at borderline clinical levels those factors might have long-term implications for patients’ cognitive health.

Filed under cardiovascular disease diabetes cognitive decline neurodegeneration neuroscience science

50 notes

PET Finds Increased Cognitive Reserve Levels in Highly Educated Pre-Alzheimer’s Patients

Highly educated individuals with mild cognitive impairment that later progressed to Alzheimer’s disease cope better with the disease than individuals with a lower level of education in the same situation, according to research published in the June issue of The Journal of Nuclear Medicine. In the study “Metabolic Networks Underlying Cognitive Reserve in Prodromal Alzheimer Disease: A European Alzheimer Disease Consortium Project,” neural reserve and neural compensation were both shown to play a role in determining cognitive reserve, as evidenced by positron emission tomography (PET).

Cognitive reserve refers to the hypothesized capacity of an adult brain to cope with brain damage in order to maintain a relatively preserved functional level. Understanding the brain adaptation mechanisms underlying this process remains a critical question, and researchers of this study sought to investigate the metabolic basis of cognitive reserve in individuals with higher (more than 12 years) and lower (less than 12 years) levels of education who had mild cognitive impairment that progressed to Alzheimer’s disease, also known as prodromal Alzheimer’s disease.

“This study provides new insight into the functional mechanisms that mediate the cognitive reserve phenomenon in the early stages of Alzheimer’s disease,” said Silvia Morbelli, MD, lead author of the study.  “A crucial role of the dorso-lateral prefrontal cortex was highlighted by demonstrating that this region is involved in a wide fronto-temporal and limbic functional network in patients with Alzheimer’s disease and high education, but not in poorly educated Alzheimer’s disease patients.”

In the study, 64 patients with prodromal Alzheimer’s disease and 90 control subjects—coming from the brain PET project (chaired by Flavio Nobili, MD, in Genoa, Italy) of the European Alzheimer Disease Consortium—underwent brain 18F-FDG PET scans. Individuals were divided into a subgroup with a low level of education (42 controls and 36 prodromal Alzheimer’s disease patients) and a highly educated subgroup (40 controls and 28 prodromal Alzheimer’s disease patients). Brain metabolism was compared between education-matched groups of patients and controls, and then between highly and poorly educated prodromal Alzheimer’s disease patients.

Higher metabolic activity was shown in the dorso-lateral prefrontal cortex of prodromal Alzheimer’s disease patients. Correlations of metabolism between the right dorso-lateral prefrontal cortex and other brain regions were more extensive and more significant in highly educated prodromal Alzheimer’s disease patients than in less educated patients or even in highly educated controls.

This result suggests that both neural reserve and neural compensation are activated in highly educated prodromal Alzheimer’s disease patients. The researchers concluded that this evaluation of metabolic connectivity further confirms the role of cognitive reserve, and that adding a comprehensive evaluation of resting 18F-FDG PET brain distribution to standard visual inspection may allow a more complete understanding of Alzheimer’s disease pathophysiology and possibly increase the diagnostic sensitivity of 18F-FDG PET.

“This work supports the notion that employing the brain in complex tasks and developing our own education may help in forming stronger ‘defenses’ against cognitive deterioration once Alzheimer knocks at our door,” noted Morbelli. “It’s possible that, in the future, a combined approach evaluating resting metabolic connectivity and cognitive performance can be used on an individual basis to better predict cognitive decline or response to disease-modifying therapy.”

(Source: interactive.snm.org)

Filed under cognitive impairment alzheimer's disease cognitive reserve PET prodromal alzheimer’s disease education neuroscience science
