Neuroscience

Articles and news from the latest research reports.

Human brain development is a symphony in three movements

The human brain develops with an exquisitely timed choreography marked by distinct patterns of gene activity at different stages from the womb to adulthood, Yale researchers report in the Dec. 26 issue of the journal Neuron.

The Yale team conducted a large-scale analysis of gene activity in the cerebral neocortex — an area of the brain governing perception, behavior, and cognition — at different stages of development. The analysis shows that the general architecture of brain regions is largely formed in the first six months after conception by a burst of genetic activity, which is distinct for specific regions of the neocortex. This rush is followed by a sort of intermission beginning in the third trimester of pregnancy. During this period, most genes that are active in specific brain regions are quieted — except for genes that spur connections between all neocortex regions. Then, in late childhood and early adolescence, the genetic orchestra begins again and helps subtly shape neocortex regions that progressively perform more specialized tasks, a process that continues into adulthood.

The analysis is the first to show this “hourglass” sketch of human brain development, with a lull in genetic activity sandwiched between highly complex patterns of gene expression, said Nenad Sestan, professor of neurobiology at Yale’s Kavli Institute for Neuroscience and senior author of the study. Intriguingly, say the researchers, some of the same patterns of genetic activity that define this human “hourglass” sketch were not observed in developing monkeys, indicating that they may play a role in shaping the features specific to human brain development.

The findings emphasize the importance of the proper interplay between genes and environment in a child’s earliest years after birth, when the formation of synaptic connections between brain cells becomes synchronized, shaping how brain structures will be used later in life, said Sestan. For instance, disruptions in the synchronization of synaptic connections during a child’s earliest years have been implicated in autism.

Sestan says the human brain is more like a neighborhood, which is better defined by the community living within its borders than by its buildings.

“The neighborhoods get built quickly and then everything slows down and the neocortex focuses solely on developing connections, almost like an electrical grid,” said Sestan. “Later, when these regions are synchronized, the neighborhoods begin to take on distinct functional identities like Little Italy or Chinatown.”

Filed under neocortex synaptic connections gene expression genetic activity neuroscience science

Are concussions related to Alzheimer’s disease?

A new study suggests that a history of concussion involving at least a momentary loss of consciousness may be related to the buildup of Alzheimer’s-associated plaques in the brain. The research is published in the December 26, 2013, online issue of Neurology®, the medical journal of the American Academy of Neurology.

"Interestingly, in people with a history of concussion, a difference in the amount of brain plaques was found only in those with memory and thinking problems, not in those who were cognitively normal," said study author Michelle Mielke, PhD, with Mayo Clinic in Rochester, Minn.

For the study, people from Olmsted County in Minnesota were given brain scans; these included 448 people without any signs of memory problems and 141 people with memory and thinking problems called mild cognitive impairment. Participants, who were all age 70 or older, were also asked about whether they had ever experienced a brain injury that involved any loss of consciousness or memory.

Of the 448 people without any thinking or memory problems, 17 percent reported a brain injury, and 18 percent of the 141 with memory and thinking difficulties reported a concussion or head trauma.

The study found no difference in any brain scan measures among the people without memory and thinking impairments, whether or not they had head trauma. However, people with memory and thinking impairments and a history of head trauma had levels of amyloid plaques an average of 18 percent higher than those with no head trauma history.

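As a rough check on the numbers above: the percentages translate into head counts, and the “18 percent higher” figure is a relative difference. A small sketch (the figures come from the article; the rounding is ours):

```python
# Group sizes and self-reported head-injury rates from the study
cognitively_normal = 448
mild_cognitive_impairment = 141

normal_with_injury = round(0.17 * cognitively_normal)      # about 76 people
mci_with_injury = round(0.18 * mild_cognitive_impairment)  # about 25 people

# "18 percent higher" amyloid is relative: for any baseline plaque
# level, the head-trauma MCI group averaged about 1.18 times it.
relative_elevation = 1.18

print(normal_with_injury, mci_with_injury, relative_elevation)  # 76 25 1.18
```
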
"Our results add merit to the idea that concussion and Alzheimer’s disease brain pathology may be related," said Mielke. "However, the fact that we did not find a relationship in those without memory and thinking problems suggests that any association between head trauma and amyloid is complex."

Filed under concussions alzheimer's disease memory cognitive impairment neuroscience science

Babies Don’t Develop Handedness All At Once

Reaching for Froot Loops and grabbing Lego pieces to build a tower are different challenges for toddlers. Depending on what they’re trying to do, tots tend to develop handedness for different tasks at different ages, according to new research.

Most people are right-handed. Babies start using their right hand to reach for cereal nuggets by age 1. However, children take until age 4 to show such a preference when building Lego models. The findings, published in this month’s issue of Developmental Psychobiology, imply that tendencies to use one hand more than the other emerge depending on the tasks kids confront, rather than on their age.

Preference for the right or left hand is, in part, genetic. Prior studies have shown that some of these one-sided tendencies emerge early. Fetuses suck their right thumb more often than their left; newborns on their back turn to the right more frequently. Most children grow up to be right-handed—in part because of these innate, early leanings, scientists believe.

But the timing of when one hand emerges as the dominant one for most tasks remained unclear.

"As a parent and a scientist, I was surprised to find researchers thought 3-year-olds don’t display a hand preference," said neurobiologist Claudia Gonzalez of the University of Lethbridge in Alberta, Canada.

To study how handedness emerged between ages 1 and 5, Gonzalez and her colleagues assigned about 50 tiny participants to a familiar task: grabbing a colorful object or a tasty tidbit. Children ages 1 to 2 picked up Froot Loops or Cheerios to munch at snack time. Four- and 5-year-olds grasped Lego blocks to build a small model. Three-year-old subjects tackled both tasks.

Even the youngest children had strong right-handed leanings when reaching for food, the team found. Three-year-olds were right-handed eaters, but they were just as likely to use their left hand when playing with blocks. The 4- and 5-year-olds used their left hand to hold the base of their model steady, but they manipulated blocks into the correct positions with their other hand—a clear preference for right-handedness.

"There is a developmental milestone between the ages of 3 and 4 when something clicks," Gonzalez said. "Maybe they become more skilled, or they understand the task better."

Until that developmental “click,” this study shows hand preference isn’t constant across tasks – regardless of a child’s age.

The study “uses a very clever design to get at the question of how handedness varies across tasks,” said Klaus Libertus, an infant development researcher at the University of Pittsburgh. “We did not know handedness is connected to tasks in this way. I would have expected the 3-year-olds to show the same pattern on both tasks, especially since the demands were so similar.”

Developing a hand preference might also correlate with other functions that rely strongly on just one side of the brain, such as language and certain decision-making skills, Gonzalez noted. Preliminary data from children in her lab suggests that when handedness is evident earlier, these other functions also mature more quickly.

Finding the right task to study handedness at different ages will give researchers a firmer grasp on how young brains develop right- or left-handed tendencies, she said.

"You could say hand preference develops before 1, or you could say it doesn’t emerge until age 4—just depending on what task you are looking at," said Gonzalez.

(Source: livescience.com)

Filed under handedness hand preference children child development psychology neuroscience science

Getting Excited Helps with Performance Anxiety More Than Trying to Calm Down

People who tell themselves to get excited rather than trying to relax can improve their performance during anxiety-inducing activities such as public speaking and math tests, according to a study published by the American Psychological Association.

“Anxiety is incredibly pervasive. People have a very strong intuition that trying to calm down is the best way to cope with their anxiety, but that can be very difficult and ineffective,” said study author Alison Wood Brooks, PhD, of Harvard Business School. “When people feel anxious and try to calm down, they are thinking about all the things that could go badly. When they are excited, they are thinking about how things could go well.”

Several experiments conducted at Harvard University with college students and members of the local community showed that simple statements about excitement could improve performance during activities that triggered anxiety. The study was published online in APA’s Journal of Experimental Psychology: General®.

In one experiment, 140 participants (63 men and 77 women) were told to prepare a persuasive public speech on why they would be good work partners. To increase anxiety, a researcher videotaped the speeches and said they would be judged by a committee. Before delivering the speech, participants were instructed to say “I am excited” or “I am calm.” The subjects who said they were excited gave longer speeches and were more persuasive, competent and relaxed than those who said they were calm, according to ratings by independent evaluators.

“The way we talk about our feelings has a strong influence on how we actually feel,” said Brooks, an assistant professor of business administration at Harvard Business School.

In another experiment, 188 participants (80 men and 108 women) were given difficult math problems after they read “try to get excited” or “try to remain calm.” A control group didn’t read any statement. Participants in the excited group scored 8 percent higher on average than the calm group and the control group, and they reported feeling more confident about their math skills after the test.

In a trial involving karaoke, 113 participants (54 men and 59 women) were randomly assigned to say that they were anxious, excited, calm, angry or sad before singing a popular rock song on a video game console. A control group didn’t make any statement. All of the participants monitored their heart rates using a pulse meter strapped onto a finger to measure their anxiety.

Participants who said they were excited scored an average of 80 percent on the song based on their pitch, rhythm and volume as measured by the video game’s rating system. Those who said they were calm, angry or sad scored an average of 69 percent, compared to 53 percent for those who said they were anxious. Participants who said they were excited also reported feeling more excited and confident in their singing ability.

Since both anxiety and excitement are emotional states characterized by high arousal, it may be easier to view anxiety as excitement rather than trying to calm down to combat performance anxiety, Brooks said.

“When you feel anxious, you’re ruminating too much and focusing on potential threats,” she said. “In those circumstances, people should try to focus on the potential opportunities. It really does pay to be positive, and people should say they are excited. Even if they don’t believe it at first, saying ‘I’m excited’ out loud increases authentic feelings of excitement.”

Filed under anxiety performance excitement psychology neuroscience science

Diabetes Gene Common In Latinos Has Ancient Roots

When it comes to the rising prevalence of Type 2 diabetes, there are many factors to blame.

Diet and exercise sit somewhere at the top of the list. But the genes that some of us inherit from Mom and Dad also help determine whether we develop the disease, and how early it crops up.

Now an international team of scientists has identified mutations in a gene that suggest an explanation for why Latinos are almost twice as likely to develop Type 2 diabetes as Caucasians and African-Americans.

But here’s the kicker: You have to go further back on the family tree than your parents to find who’s to blame for this genetic link to diabetes. Think thousands of generations ago.

Harvard geneticist David Altshuler and his colleagues uncovered hints that humans picked up the diabetes mutations from Neanderthals, our ancient cousins who went extinct about 30,000 years ago.

"As far as I know, this is the first time a version of a gene from Neanderthal has been connected to a modern-day disease," Altshuler tells Shots. He and his colleagues published the findings Wednesday in the journal Nature.

A few years ago, geneticists at the Max Planck Institute for Evolutionary Anthropology in Germany sent shock waves through the scientific community when they sequenced the genome of a Neanderthal from a fossil. Hidden in the genetic code were patterns that matched those in human DNA. And the data strongly suggested that humans were more than just friendly neighbors with Neanderthals.

"Now it’s well accepted that humans interbred with Neanderthals," Altshuler says. On average, most of us carry about 2 percent Neanderthal DNA in our genome. So it’s not surprising, he says, that 2 percent of our traits would be inherited from the ancient primates.

The new data don’t mean that Neanderthals had diabetes, Altshuler is quick to point out. “It just happens that this disease sequence came from them,” he says.

To identify genes that contribute to Latinos’ high rate of Type 2 diabetes, Altshuler and his team analyzed DNA from over 8,000 Mexicans and other Latinos.

The team found many genes already known to be involved with diabetes, such as one related to insulin production. But a new one also popped up in the analysis: a gene that’s likely involved in fat metabolism.

Mutations in this gene increase a person’s risk of getting Type 2 diabetes by about 20 percent, Altshuler and the team found. If a person has two copies of the mutations, one from each parent, the risk rises by about 40 percent.

So for Mexican Americans, the risk for Type 2 diabetes goes from about 13 percent to 19 percent if they inherit two copies of the mutations. For other Americans, the risk gets boosted to about 11 percent from 8 percent.

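A quick sketch of how the reported figures combine, treating the roughly 20 percent increase as multiplicative per copy of the variant (the compounding model is our simplifying assumption; the baseline risks are the ones quoted above):

```python
def risk_with_copies(baseline, copies, increase_per_copy=0.20):
    """Scale a baseline Type 2 diabetes risk by ~20% per copy of the
    variant; two copies compound to ~44%, close to the reported ~40%."""
    return baseline * (1 + increase_per_copy) ** copies

# Mexican Americans: ~13% baseline rises to roughly 19% with two copies
print(round(risk_with_copies(0.13, 2) * 100))  # 19
# Other Americans: ~8% baseline rises to roughly 11-12% with two copies
print(round(risk_with_copies(0.08, 2) * 100))  # 12
```

The small mismatch with the article’s “about 11 percent” for other Americans is just rounding in the reported figures.
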
"This is a genetic factor that has a modest effect on the risk of getting the disease. Not everybody that has it will have the disease," Altshuler says. "But the genes are very common in Latinos and Asians."

About half of Latinos carry the disease mutations, while 20 percent of Asians have it. On the other hand, only 2 percent of European Americans carry the mutations.

So the new genetic data help to explain a big chunk — perhaps almost a quarter — of the difference in Type 2 diabetes prevalence in Latinos versus European Americans.

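To see how carrier frequencies that different can account for “almost a quarter” of the prevalence gap, here is an illustrative population-level calculation. The 50 percent and 2 percent carrier frequencies and the roughly 20 percent risk increase come from the article; the one-copy multiplicative model and the 10 percent baseline risk are our own illustrative assumptions:

```python
def population_prevalence(baseline_risk, carrier_freq, rel_risk=1.20):
    """Expected disease prevalence when a fraction of the population
    carries a variant multiplying their risk (simplified one-copy model)."""
    return baseline_risk * ((1 - carrier_freq) + carrier_freq * rel_risk)

BASE = 0.10  # assumed baseline risk, for illustration only

# Prevalence gap attributable to the variant alone:
gap = population_prevalence(BASE, 0.50) - population_prevalence(BASE, 0.02)
print(round(gap, 4))  # 0.0096, i.e. about one percentage point
```

If the overall Latino-versus-European-American prevalence gap were on the order of four percentage points, the variant alone would account for roughly a quarter of it, consistent with the article’s estimate.
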
"The findings are important because they give us a new biological clue about a gene involved in diabetes, which could lead to more treatments," Altshuler says. "The Neanderthal connection is interesting, but it’s not the essence of the work."

Filed under diabetes type ii diabetes mutations genetics genomics neuroscience science

Prolonged Exposure Therapy Found Beneficial in Treating Adolescent Girls with PTSD

Researchers at Penn Medicine report in the December 25 issue of JAMA that a modified form of prolonged exposure therapy – in which patients revisit and recount aloud their trauma-related thoughts, feelings and situations – shows greater success than supportive counseling for treating adolescent PTSD patients who have been sexually abused.

Despite a high prevalence of posttraumatic stress disorder (PTSD) in adolescents, evidence-based treatments like prolonged exposure therapy for PTSD in this population have never been established. 

“We hypothesized that prolonged exposure therapy could fill this gap and were eager to test its ability to provide benefit for adolescent patients,” says Edna Foa, PhD, professor of Clinical Psychology in the department of Psychiatry in the Perelman School of Medicine at the University of Pennsylvania, who developed prolonged exposure therapy.  

The concern has been that prolonged exposure therapy, while the most established evidence-based treatment for adults with PTSD, could exacerbate PTSD symptoms in adolescent patients who have not mastered the coping skills necessary for this type of exposure to be safely provided.

Adolescence is often a time when children begin to test limits and are in and out of situations, both good and bad – situations that often determine the path their lives take into adulthood.

The six-year (2006-2012) study examined the benefit of a prolonged exposure program called prolonged exposure-A (PE-A), which was modified to meet the developmental stage of adolescents, and compared it with supportive counseling in 61 adolescent girls, ages 13-18, with sexual abuse-related PTSD. In the single-blind randomized clinical trial, 31 received prolonged exposure-A and 30 got supportive counseling.

Each received fourteen 60- to 90-minute sessions of either therapy in a community mental health setting. The counselors were familiar with supportive counseling but naïve to PE-A before the study; their PE-A training consisted of a 4-day workshop followed by supervision every second week.

Outcomes were assessed before treatment, at mid-treatment, after treatment, and at three-, six- and 12-month follow-up. During treatment, patients receiving PE-A demonstrated a greater decline in PTSD and depression symptom severity and greater improvement in overall functioning. These differences were maintained throughout the 12-month follow-up period.

“Another key finding of this research was that prolonged therapy can be administered in a community setting by professionals with no prior training in evidence-based treatments and can have a positive impact on this population,” Foa says.

(Source: uphs.upenn.edu)

Filed under PTSD adolescents exposure therapy psychology neuroscience science

Take note students: Mice that ‘cram’ for exams remember less

It’s been more than 100 years since German psychologist Hermann Ebbinghaus determined that learning interspersed with rest created longer-lasting memories than so-called cramming, or learning without rest intervals.

Yet it’s only much more recently that scientists have begun to understand the underlying molecular mechanisms of this phenomenon. In a study published Monday in the journal PNAS, researchers examined the physical changes in the brain cells of mice while “training” their eyes to keep track of a moving image.

Researchers examined the horizontal optokinetic response, or HOKR, in mice to determine what rest interval was best suited to increasing their memory.

HOKR is what makes it possible for a rider on a train to visually track the moving scenery. While the process is unconscious, it involves frequent, minute eye movements.

Mice were fastened to a device that immobilized their heads and then were made to look at a revolving, checkered image that triggered the eye response. A high-speed camera was used to determine when the tracking began and when it stopped.

While the eyes of lab mice are initially unable to track the revolving image at high speed, they eventually adapt to faster and faster movement. This tracking ability is retained for a period of time before it is forgotten.

Some of the mice were allowed to rest between training sessions, while others were not. Researchers noted clear differences between the mice that were given rest time, or “spacing,” and those that received no breaks, or “massed training.”

"One hour of spacing produced the highest memory retention at 24 hours, which lasted for one month," wrote lead study author Wajeeha Aziz, a molecular physiologist at the National Institute for Physiological Sciences in Okazaki, Japan, and her colleagues.

"Surprisingly, massed training also produced long-term memory…. However, this occurred slowly over days, and the memory lasted for only one week."

Researchers compared brain tissue from the two groups of trained mice with that of mice that received no training. They found that both groups of trained mice had reduced synapses in a specific type of nerve cell, Purkinje neurons.

However, spacing the training appeared to make these structural changes in synapses occur more quickly, the authors said.

"Further investigations are needed to elucidate the precise molecular mechanisms that regulate the temporal features of long-lasting memory, and the structural modifications of synapses provides an indispensable readout for such studies," the authors concluded.

Filed under memory synaptic plasticity learning LTM neuroscience science

Researchers find ECT can rid the mind of selected memory

A team of researchers working in the Netherlands has found that partial selective memory deletion can be achieved using electroconvulsive therapy (ECT). In their paper published in the journal Nature Neuroscience, the team describes a memory experiment conducted with the assistance of severely depressed people who had already consented to undergoing ECT; they found that the treatment could be used to at least partially erase memories of a specified event.

Scientists have known since 1968 (thanks to experiments conducted by psychologist Donald Lewis) that applying a shock to the brain of a rat can cause it to forget something unpleasant it had remembered. Subsequent experiments have found that memories can be blunted using repetitive types of therapy or by injecting drugs such as propranolol into the brain. The one element all such findings have in common is that the intervention must be applied while a person is attempting to recall the event in question. Scientists hope that such research may lead to new ways to treat PTSD and other memory-related mental ailments. In this new effort, the researchers explored the idea of erasing specific memories using ECT.

Currently, people with severe depression who don’t respond to any other type of treatment are offered ECT as a last resort. It has a remarkably good success rate (approximately 86 percent rate of remission) but causes some degree of memory loss. In the Netherlands study, the team enlisted the assistance of 39 such patients who had already agreed to undergo ECT. Instead of receiving just the standard treatment, however, the volunteers were asked to watch two slide shows (along with narration) —both of which contained unsettling content. A week later the participants were divided into three groups—two to get the shock treatment and one to serve as a control group—all were asked to remember and describe one of the traumatic events described in the slide shows. Afterwards, one of the groups was given ECT and then the next day was asked to recount both stores. The other non-control group was given ECT and then were asked right afterwards to recount the unpleasant stories. The control group was asked to try to recount both stories as well.

In comparing the results between the groups, the researchers found that the first group that had been quizzed a day after receiving ECT had difficulty recalling the first story, which they had recounted prior to ECT, but remembered most of second. The second group that received ECT were able to recall both stories equally well, and the third—the control group—were able to remember both stories better than either of the groups that had received ECT.
The experiment suggests that it is possible to selectively erase short term memory in a controlled environment. Much more research will have to be conducted to determine if it would work in real world situations.


Filed under electroconvulsive therapy PTSD depression memory memory loss neuroscience science

609 notes

Researchers identify gene that influences the ability to remember faces

New findings suggest the oxytocin receptor, a gene known to influence mother-infant bonding and pair bonding in monogamous species, also plays a special role in the ability to remember faces. This research has important implications for disorders in which social information processing is disrupted, including autism spectrum disorder. In addition, the finding may lead to new strategies for improving social cognition in several psychiatric disorders.

A team of researchers from the Yerkes National Primate Research Center at Emory University in Atlanta, University College London in the United Kingdom and the University of Tampere in Finland made the discovery, which will be published in an online Early Edition of the Proceedings of the National Academy of Sciences.

According to author Larry Young, PhD, of Yerkes, the Department of Psychiatry in Emory’s School of Medicine and Emory’s Center for Translational Social Neuroscience (CTSN), this is the first study to demonstrate that variation in the oxytocin receptor gene influences face recognition skills. He and co-author David Skuse point out the implication that oxytocin plays an important role in promoting our ability to recognize one another, yet about one-third of the population carries only the genetic variant that negatively affects that ability. They say this finding may help explain why a few people remember almost everyone they have met while others have difficulty recognizing members of their own family.

Skuse is with the Institute of Child Health, University College London, and the Great Ormond Street Hospital for Children, NHS Foundation Trust, London.

Young, Skuse and their research team studied 198 families, each with a single autistic child, because these families were known to show a wide range of variability in facial recognition skills; two-thirds of the families were from the United Kingdom, and the remainder from Finland.

The Emory researchers had previously found the oxytocin receptor is essential for olfactory-based social recognition in rodents such as mice and voles, and wondered whether the same gene could also be involved in human face recognition. They examined the influence of subtle differences in oxytocin receptor gene structure on face memory competence in the parents, non-autistic siblings and autistic child, and discovered that a single change in the DNA of the oxytocin receptor gene had a big impact on face memory skills in the families. According to Young, this finding implies that oxytocin likely plays an important role more generally in social information processing, which is disrupted in disorders such as autism.

Additionally, this study is remarkable for its evolutionary aspect. Rodents use odors for social recognition, while humans use visual facial cues. This suggests an ancient conservation of the genetic and neural architectures involved in social information processing, one that transcends the sensory modalities used from mouse to man.

Skuse credits Young’s previous research, which found that mice with a mutated oxytocin receptor failed to recognize mice they had previously encountered. “This led us to pursue more information about facial recognition and the implications for disorders in which social information processing is disrupted.” Young adds that the team will continue working together to pursue strategies for improving social cognition in psychiatric disorders based on the current findings.


Filed under oxytocin facial recognition memory ASD social cognition neuroscience science

407 notes

A novel look at how stories may change the brain

Many people can recall reading at least one cherished story that they say changed their life. Now researchers at Emory University have detected what may be biological traces related to this feeling: actual changes in the brain that linger, at least for a few days, after reading a novel.

Their findings, that reading a novel may cause persistent changes in resting-state connectivity of the brain, were published in the journal Brain Connectivity.

“Stories shape our lives and in some cases help define a person,” says neuroscientist Gregory Berns, lead author of the study and the director of Emory’s Center for Neuropolicy. “We want to understand how stories get into your brain, and what they do to it.”

His co-authors included Kristina Blaine and Brandon Pye from the Center for Neuropolicy, and Michael Prietula, professor of information systems and operations management at Emory’s Goizueta Business School.

Neurobiological research using functional magnetic resonance imaging (fMRI) has begun to identify brain networks associated with reading stories. Most previous studies have focused on the cognitive processes involved in short stories while subjects read them in the fMRI scanner.

The Emory study focused on the lingering neural effects of reading a narrative. Twenty-one Emory undergraduates participated in the experiment, which was conducted over 19 consecutive days.

All of the study subjects read the same novel, “Pompeii,” a 2003 thriller by Robert Harris that is based on the real-life eruption of Mount Vesuvius in ancient Italy. “The story follows a protagonist, who is outside the city of Pompeii and notices steam and strange things happening around the volcano,” Berns says. “He tries to get back to Pompeii in time to save the woman he loves. Meanwhile, the volcano continues to bubble and nobody in the city recognizes the signs.”

The researchers chose the book for its page-turning plot. “It depicts true events in a fictional and dramatic way,” Berns says. “It was important to us that the book had a strong narrative line.”

For the first five days, the participants came in each morning for a baseline fMRI scan of their brains in a resting state. Then they were given nine sections of the novel, about 30 pages each, over a nine-day period. They were asked to read the assigned section in the evening and come in the following morning. After taking a quiz to ensure they had finished the assigned reading, the participants underwent an fMRI scan of the brain in a non-reading, resting state. After completing all nine sections of the novel, the participants returned for five more mornings to undergo additional resting-state scans.

The results showed heightened connectivity in the left temporal cortex, an area of the brain associated with receptivity for language, on the mornings following the reading assignments. “Even though the participants were not actually reading the novel while they were in the scanner, they retained this heightened connectivity,” Berns says. “We call that a ‘shadow activity,’ almost like a muscle memory.”

Heightened connectivity was also seen in the central sulcus, the brain’s primary sensorimotor region. Neurons in this region have been associated with making representations of sensation for the body, a phenomenon known as grounded cognition. Just thinking about running, for instance, can activate the neurons associated with the physical act of running.

“The neural changes that we found associated with physical sensation and movement systems suggest that reading a novel can transport you into the body of the protagonist,” Berns says. “We already knew that good stories can put you in someone else’s shoes in a figurative sense. Now we’re seeing that something may also be happening biologically.”

The neural changes were not just immediate reactions, Berns says, since they persisted the morning after the readings, and for the five days after the participants completed the novel.

“It remains an open question how long these neural changes might last,” Berns says. “But the fact that we’re detecting them over a few days for a randomly assigned novel suggests that your favorite novels could certainly have a bigger and longer-lasting effect on the biology of your brain.”


Filed under reading neuroimaging neural activity temporal cortex psychology neuroscience science
