Neuroscience

Articles and news from the latest research reports.

Posts tagged prefrontal cortex


Humans and monkeys of one mind when it comes to changing it

Covert changes of mind can be discovered by tracking neural activity when subjects make decisions, researchers from New York University and Stanford University have found. Their results, which appear in the journal Current Biology, offer new insights into how we make decisions and point to innovative ways to study this process in the future.

“The methods used in this study allowed us to see the idiosyncratic nature of decision making that was inaccessible before,” explains Roozbeh Kiani, an assistant professor in NYU’s Center for Neural Science and the study’s lead author.

The study’s other authors included Christopher Cueva and John Reppas of Stanford’s Department of Neurobiology and William Newsome, who holds appointments at the university’s Department of Neurobiology and at the Howard Hughes Medical Institute at Stanford’s School of Medicine.

Previous work on decision making—the process of forming a plan of action from evidence, prior knowledge, and expected payoff—has been methodologically limited. In earlier studies, scientists analyzed one neuron at a time, then averaged these results across neurons to develop an understanding of this activity. However, such a measurement offers only snapshots of neural activity and misses the fine-scale dynamics that lead up to a decision.

In the Current Biology study, the researchers examined many neurons at once, giving them a more detailed understanding of decision making.

“Now we can look at the nuances of this dynamic and track changes over a specified period,” explains Kiani. “Looking at one neuron at a time is ‘noisy’: results vary from trial to trial so you cannot get a clear picture of this complex activity. By recording multiple neurons at the same time, you can take out this noise and get a more robust picture of the underlying dynamics.”
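The statistical point Kiani makes can be illustrated with a minimal numpy sketch (all numbers here are invented for illustration, not taken from the study): averaging across simultaneously recorded neurons suppresses independent trial-to-trial noise while preserving the signal they share.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical numbers for illustration only: each neuron carries the same
# underlying decision signal plus independent trial-to-trial noise.
signal = 1.0     # shared decision-related firing-rate change
noise_sd = 2.0   # single-neuron trial-to-trial variability
n_trials = 1000

# One neuron at a time: the estimate on any given trial is dominated by noise.
one_neuron = signal + rng.normal(0.0, noise_sd, size=n_trials)

# Recording 100 neurons simultaneously and averaging shrinks the noise by
# roughly sqrt(100) while leaving the shared signal intact.
n_neurons = 100
population = signal + rng.normal(0.0, noise_sd, size=(n_trials, n_neurons))
pop_average = population.mean(axis=1)

print(one_neuron.std())   # close to 2.0
print(pop_average.std())  # close to 0.2
```

The tenfold drop in trial-to-trial variability is why a population recording can reveal within-trial dynamics that single-neuron averages wash out.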

The researchers studied macaque monkeys, running them through a series of tasks while monitoring the animals’ neuronal workings.

In the experiment, the monkeys viewed a patch of randomly moving dots on a computer screen. Following the stimulus, monkeys received a “Go” signal to report the motion direction by making an eye movement. The scientists sought to predict the monkeys’ choices purely based on the recorded neural responses before the Go signal. Their model achieved highly accurate predictions.
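As a toy stand-in for the decoding idea (this is not the authors' model; the neuron counts, noise levels, and the nearest-class-mean rule are all assumptions made for this sketch), one can classify synthetic population responses by which motion direction's mean activity pattern they resemble:

```python
import numpy as np

rng = np.random.default_rng(1)

# All numbers here are invented: population responses on "left" vs "right"
# motion trials differ slightly in their mean activity pattern.
n_neurons, n_train, n_test = 50, 200, 100
left_mean = rng.normal(0.0, 1.0, n_neurons)
right_mean = left_mean + 0.5   # small direction-dependent shift per neuron

def simulate(mean, n):
    """Draw n noisy population response vectors around a mean pattern."""
    return mean + rng.normal(0.0, 1.0, size=(n, n_neurons))

# "Fit" a nearest-class-mean decoder on training trials.
mu_left = simulate(left_mean, n_train).mean(axis=0)
mu_right = simulate(right_mean, n_train).mean(axis=0)

def decode(x):
    """Classify a trial by the closer class-mean pattern."""
    if np.linalg.norm(x - mu_left) < np.linalg.norm(x - mu_right):
        return "left"
    return "right"

# Evaluate on fresh "rightward" trials.
decoded = [decode(t) for t in simulate(right_mean, n_test)]
accuracy = sum(d == "right" for d in decoded) / n_test
print(accuracy)  # well above the 0.5 chance level
```

Even with a per-neuron signal far smaller than the noise, pooling 50 neurons yields reliable single-trial predictions, which is the property the researchers exploited to read out the decision before the Go signal.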

The same model was then used to study potential dynamics of the monkeys’ decision at different times before the Go signal. The scientists confirmed these predictions by stopping the decision-making process at arbitrary times and comparing the model predictions with the monkeys’ actual choices.

Surprisingly, the monkeys’ decisions were not always stable. Occasionally, they vacillated from one choice to another, indicating covert changes of mind during decision-making. These changes of mind closely matched the properties of human changes of mind, which were uncovered in a 2009 study. They were more frequent in uncertain conditions, more likely to correct an initial mistake, and more likely to happen earlier during a decision.

(Source: nyu.edu)

Filed under decision making primates prefrontal cortex changes of mind neurons neuroscience science


Finding thoughts in speech

For the first time, neuroscientists were able to find out how different thoughts are reflected in neuronal activity during natural conversations. Johanna Derix, Olga Iljina and the interdisciplinary team of Dr. Tonio Ball from the Cluster of Excellence BrainLinks-BrainTools at the University of Freiburg and the Epilepsy Center of the University Medical Center Freiburg (Freiburg, Germany) report on the link between speech, thoughts and brain responses in a special issue of Frontiers in Human Neuroscience.

"Thoughts are difficult to investigate, as one cannot observe in a direct manner what the person is thinking about. Language, however, reflects the underlying mental processes, so we can perform linguistic analyses of the subjects’ speech and use such information as a "bridge" between the neuronal processes and the subject’s thoughts," explains neuroscientist Johanna Derix.

The novelty of the authors’ approach is that the participants were not instructed to think and talk about a given topic in an experimental setting. Instead, the researchers analysed everyday conversations and the underlying brain activity, which was recorded directly from the cortical surface. This study was possible owing to the help of epilepsy patients in whom recordings of neural activity had to be obtained over several days for the purpose of pre-neurosurgical diagnostics.

For a start, borders between individual thoughts in continuous conversations had to be identified. Earlier psycholinguistic research indicates that a simple sentence is a suitable unit to contain a single thought, so the researchers opted for linguistic segmentation into simple sentences. The resulting “idea” units were classified into different categories. These included, for example, whether or not a sentence expressed memory- or self-related content. Then, the researchers analysed content-specific neural responses and observed clearly visible patterns of brain activity.
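As an illustrative sketch only (in the study, segmentation and category labels came from careful linguistic annotation, not keyword matching, and the transcript and cue words below are invented), the two-step procedure might look like:

```python
import re

# Hypothetical snippet of conversational speech.
transcript = ("I grew up near the lake. We used to swim there every summer. "
              "The weather is nice today.")

# Step 1: segment the conversation into simple-sentence "idea" units.
units = re.split(r"(?<=[.!?])\s+", transcript.strip())

# Step 2: assign each unit a coarse content category (toy cue words).
memory_cues = ("grew up", "used to", "remember")

def category(sentence):
    s = sentence.lower()
    return "memory-related" if any(cue in s for cue in memory_cues) else "other"

for unit in units:
    print(category(unit), "|", unit)
```

The resulting labeled units are what would then be aligned with the simultaneously recorded cortical activity to look for content-specific neural responses.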

Thus, the neuroscientists from Freiburg have demonstrated the feasibility of their innovative approach to investigate, via speech, how the human brain processes thoughts during real-life conditions.

Filed under speech production neural activity thinking prefrontal cortex communication autobiographical memory neuroscience science


Portable brain-mapping device allows researchers to ‘see’ where memory fails student veterans

UT Arlington researchers have successfully used a portable brain-mapping device to show limited prefrontal cortex activity among student veterans with Post Traumatic Stress Disorder when they were asked to recall information from simple memorization tasks.

The study by bioengineering professor Hanli Liu, Alexa Smith-Osborne, an associate professor of social work, and two other collaborators was published in the May 2014 edition of NeuroImage: Clinical. The team used functional near infrared spectroscopy to map brain activity during cognitive tasks involving digit learning and memory retrieval.

Smith-Osborne has used the findings to guide treatment recommendations for some veterans through her work as principal investigator for UT Arlington’s Student Veteran Project, which offers free services to veterans who are undergraduates or who are considering returning to college.

“When we retest those student veterans after we’ve provided therapy and interventions, they’ve shown marked improvement,” Smith-Osborne said. “The fNIRS data have shown improvement in brain functions and responses after the student veterans have undergone treatment.”

Liu said this type of brain imaging allows us to “see” which brain region or regions fail to memorize or recall learned knowledge in student veterans with PTSD.

“It also shows how PTSD can affect the way we learn and our ability to recall information, so this new way of brain imaging advances our understanding of PTSD,” Liu said.

The study is multidisciplinary, linking objective brain imaging with neurological disorders and social work.

UT Arlington bioengineering faculty associate Fenghua Tian is the primary author, assisted by bioengineering graduate research assistant Amarnath Yennu. Other collaborators include UT Austin psychology professor Francisco Gonzalez-Lima and psychology professor Carol North of UT Southwestern Medical Center and the Veterans Administration North Texas Health Care System.

Khosrow Behbehani, dean of the UT Arlington College of Engineering, said this collaborative research is “allowing the researchers to objectively measure the changes in the level of oxygen in the brain and relate them to some of the brain functions that may have been adversely affected by trauma or stress.”  

Numerous neuropsychological studies have linked learning dysfunctions – such as memory loss, attention deficits and learning disabilities – with PTSD.

The new study involved 16 combat veterans previously diagnosed with PTSD who were experiencing distress and functional impairment affecting cognitive and related academic performance.  The veterans were directed to perform a series of number-ordering tasks on a computer while researchers monitored their brain activity through near infrared spectroscopy, a noninvasive neuroimaging technology.

The research found that participants with PTSD had significantly more difficulty recalling the given digits than a control group did. This deficit was closely associated with dysfunction of a portion of the right frontal cortex. The team also determined that near infrared spectroscopy is an effective tool for measuring cognitive dysfunction associated with PTSD.

With that information, Smith-Osborne said, mental healthcare providers could customize a treatment plan best suited to the individual.

“It’s not a one-size-fits-all treatment plan but a concentrated effort to tailor the treatment based on where that person is on the learning scale,” Smith-Osborne said.

Smith-Osborne and Liu hope their research results will lead to better, more comprehensive care for veterans and, in turn, better outcomes in their college education.

(Source: uta.edu)

Filed under PTSD prefrontal cortex brain activity working memory neuroscience science


Stress hormone linked to short-term memory loss as we age

A new study at the University of Iowa reports a potential link between stress hormones and short-term memory loss in older adults.

The study, published in the Journal of Neuroscience, reveals that having high levels of cortisol—a natural hormone in our body whose levels surge when we are stressed—can lead to memory lapses as we age.

Short-term increases in cortisol are critical for survival. They promote coping and help us respond to life’s challenges by making us more alert and able to think on our feet. But abnormally high or prolonged spikes in cortisol—like what happens when we are dealing with long-term stress—can lead to negative consequences that numerous bodies of research have shown to include digestion problems, anxiety, weight gain, and high blood pressure.

In this study, the UI researchers linked elevated amounts of cortisol to the gradual loss of synapses in the prefrontal cortex, the region of the brain that houses short-term memory. Synapses are the connections that help us process, store, and recall information. And when we get older, repeated and long-term exposure to cortisol can cause them to shrink and disappear.

“Stress hormones are one mechanism that we believe leads to weathering of the brain,” says Jason Radley, assistant professor in psychology at the UI and corresponding author on the paper. “Like a rock on the shoreline, after years and years it will eventually break down and disappear.”

While previous studies have shown cortisol to produce similar effects in other regions of the aging brain, this was the first study to examine its impact on the prefrontal cortex.

And although preliminary, the findings raise the possibility that short-term memory decline in aging adults may be slowed or prevented by treatments that decrease levels of cortisol in susceptible individuals, says Radley. That could mean treating people who have naturally high levels of cortisol—such as those who are depressed—or those who experience repeated, long-term stress due to traumatic life events like the death of a loved one.

According to Radley and Rachel Anderson, the paper’s lead author and a second-year graduate student in psychology at the UI, short-term memory lapses related to cortisol start around age 65. That’s about the equivalent of the 21-month-old rats the pair studied to make their discovery.

The UI scientists compared the elderly rats to four-month-old rats, which are roughly the same age as a 20-year-old person. The young and elderly groups were then separated further according to whether the rats had naturally high or naturally low levels of corticosterone—the hormone comparable to cortisol in humans.

The researchers subsequently placed the rats in a T-shaped maze that required them to use their short-term memory. In order to receive a treat, they needed to recall which direction they had turned at the top of the T just 30, 60, or 120 seconds ago and then turn the opposite way each time they ran the maze.
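The alternation rule above can be sketched in a few lines (an illustrative toy, not the researchers' analysis code; the turn sequences are invented):

```python
# A run counts as correct when the turn is opposite to the previous run's turn.
def score_alternation(turns):
    """Fraction of runs (after the first) on which the rat alternated."""
    correct = sum(prev != cur for prev, cur in zip(turns, turns[1:]))
    return correct / (len(turns) - 1)

perfect = ["L", "R", "L", "R", "L"]   # always turns the opposite way
lapsing = ["L", "R", "R", "R", "L"]   # twice repeats the previous turn

print(score_alternation(perfect))  # 1.0
print(score_alternation(lapsing))  # 0.5
```

Scores like these, computed at each delay (30, 60, or 120 seconds), are what allow the high- and low-corticosterone groups to be compared.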

Though memory declined across all groups as the time rats waited before running the maze again increased, older rats with high corticosterone levels consistently performed the worst. They chose the correct direction only 58 percent of the time, compared to their older peers with low corticosterone levels who chose it 80 percent of the time.

When researchers took tissue samples from the rats’ prefrontal cortexes and examined them under a microscope, they found the poor performers had smaller and 20 percent fewer synapses than all other groups, indicating memory loss.

In contrast, older rats with low corticosterone levels showed little memory loss and ran the maze nearly as well as the younger rats, who were not affected by any level of corticosterone—low or high.

Still, researchers say it’s important to remember that stress hormones are only one of a host of factors when it comes to mental decline and memory loss as we age.

Filed under stress memory cortisol STM prefrontal cortex synapses aging neuroscience science


(Image caption: Brain scans show high activity in the medial prefrontal cortex (top) and striatum (bottom) while playing a competitive game. UC Berkeley and UIUC researchers have now found genetic variations in dopamine-regulating genes in the prefrontal cortex and striatum associated with differences in belief learning and reinforcement learning, respectively. Credit: Ming Hsu)

Your genes affect your betting behavior

Investors and gamblers take note: your betting decisions and strategy are determined, in part, by your genes.

Researchers from the University of California, Berkeley, National University of Singapore and University of Illinois at Urbana-Champaign (UIUC) have shown that betting decisions in a simple competitive game are influenced by the specific variants of dopamine-regulating genes in a person’s brain.

Dopamine is a neurotransmitter – a chemical released by brain cells to signal other brain cells – that is a key part of the brain’s reward and pleasure-seeking system. Dopamine deficiency leads to Parkinson’s disease, while disruption of the dopamine network is linked to numerous psychiatric and neurodegenerative disorders, including schizophrenia, depression and dementia.

While previous studies have shown the important role of the neurotransmitter dopamine in social interactions, this is the first study tying these interactions to specific genes that govern dopamine functioning.

“This study shows that genes influence complex social behavior, in this case strategic behavior,” said study leader Ming Hsu, an assistant professor of marketing in UC Berkeley’s Haas School of Business and a member of the Helen Wills Neuroscience Institute. “We now have some clues about the neural mechanisms through which our genes affect behavior.”

The implications for business are potentially vast but unclear, Hsu said, though one possibility is training workforces to be more strategic. But the findings could significantly affect our understanding of diseases involving dopamine, such as schizophrenia, as well as disorders of social interaction, such as autism.

“When people talk about dopamine dysfunction, schizophrenia is one of the first diseases that come to mind,” Hsu said, noting that the disease involves a very complex pattern of social and decision making deficits. “To the degree that we can better understand ubiquitous social interactions in strategic settings, it may help us understand how to characterize and eventually treat the social deficits that are symptoms of diseases like schizophrenia.”

Hsu, UIUC graduate student Eric Set and their colleagues, including Richard P. Ebstein and Soo Hong Chew from the National University of Singapore, will publish their findings the week of June 16 in the online early edition of the Proceedings of the National Academy of Sciences.

Two brain areas involved in competition

Hsu established two years ago that when people engage in competitive social interactions, such as betting games, they primarily call upon two areas of the brain: the medial prefrontal cortex, which is the executive part of the brain, and the striatum, which deals with motivation and is crucial for learning to acquire rewards. Functional magnetic resonance imaging (fMRI) scans showed that people playing these games displayed intense activity in these areas.

“If you think of the brain as a computing machine, these are areas that take inputs, crank them through an algorithm, and translate them into behavioral outputs,” Hsu said. “What is really interesting about these areas is that both are innervated by neurons that use dopamine.”

Hsu and Set of UIUC’s Department of Economics wanted to determine which genes involved in regulating dopamine concentrations in these brain areas were associated with strategic thinking, so they enlisted as subjects a group of 217 undergraduates at the National University of Singapore, all of whom had had their genomes scanned for some 700,000 genetic variants. The researchers focused on only 143 variants within 12 genes involved in regulating dopamine. Some of the 12 are primarily involved in regulating dopamine in the prefrontal cortex, while others primarily regulate dopamine in the striatum.

The competition was a game called patent race, commonly used by social scientists to study social interactions. It involves one person betting, via computer, against an anonymous opponent.

“We know from brain imaging studies that when people compete against one another, they actually engage in two distinct types of learning processes,” Set said, referring to Hsu’s 2012 study. “One type involves learning purely from the consequences of your own actions, called reinforcement learning. The other is a bit more sophisticated, called belief learning, where people try to make a mental model of the other players, in order to anticipate and respond to their actions.”
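The two learning rules Set describes can be contrasted in a minimal sketch (the payoffs, learning rate, and opponent behavior below are invented; this is not the study's fitted model):

```python
actions = [0, 1]

# Reinforcement learning: an action's value is updated only from the
# payoff actually received for taking it.
q = [0.0, 0.0]

def rl_update(action, payoff, lr=0.2):
    q[action] += lr * (payoff - q[action])

# Belief learning (fictitious play): track how often the opponent has
# played each action, then best-respond to that empirical belief.
opp_counts = [1, 1]   # start from a uniform prior over opponent actions

def belief_update(opp_action):
    opp_counts[opp_action] += 1

def belief_best_response(payoffs):
    total = sum(opp_counts)
    beliefs = [c / total for c in opp_counts]
    expected = [sum(payoffs[a][o] * beliefs[o] for o in actions)
                for a in actions]
    return max(actions, key=lambda a: expected[a])

# Row player earns 1 for matching the opponent's action (hypothetical game).
payoffs = [[1, 0],
           [0, 1]]

# If the opponent keeps choosing action 1, the belief learner converges
# on best-responding with action 1.
for _ in range(20):
    belief_update(1)
print(belief_best_response(payoffs))  # 1
```

The key design difference is what gets updated: the reinforcement learner only revalues actions it has tried, while the belief learner models the opponent and can shift strategy without ever having taken the newly favored action.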

Trial-and-error learning vs belief learning

Using a mathematical model of brain function during competitive social interactions, Hsu and Set correlated performance in reinforcement learning and belief learning with different variants or mutations of the 12 dopamine-related genes, and discovered a distinct difference.

They found that differences in belief learning – the degree to which players were able to anticipate and respond to the actions of others, or to imagine what their competitor is thinking and respond strategically – was associated with variation in three genes which primarily affect dopamine functioning in the medial prefrontal cortex.

In contrast, differences in trial-and-error reinforcement learning – how quickly they forget past experiences and how quickly they change strategy – was associated with variation in two genes that primarily affect striatal dopamine.

Hsu said that the findings correlate well with previous brain studies showing that the prefrontal cortex is involved in belief learning, while the striatum is involved in reinforcement learning.

“We were surprised by the degree of overlap, but it hints at the power of studying the neural and genetic levels under a single mathematical framework, which is only beginning in this area,” he said.

Hsu is currently collaborating with other scientists to correlate career achievements in older adults with genes and performance on competitive games, to see which brain regions and types of learning are most important for different kinds of success in life.

Filed under dopamine genes prefrontal cortex striatum learning neuroscience science

980 notes

When good people do bad things
When people get together in groups, unusual things can happen — both good and bad. Groups create important social institutions that an individual could not achieve alone, but there can be a darker side to such alliances: Belonging to a group makes people more likely to harm others outside the group.
“Although humans exhibit strong preferences for equity and moral prohibitions against harm in many contexts, people’s priorities change when there is an ‘us’ and a ‘them,’” says Rebecca Saxe, an associate professor of cognitive neuroscience at MIT. “A group of people will often engage in actions that are contrary to the private moral standards of each individual in that group, sweeping otherwise decent individuals into ‘mobs’ that commit looting, vandalism, even physical brutality.”
Several factors play into this transformation. When people are in a group, they feel more anonymous, and less likely to be caught doing something wrong. They may also feel a diminished sense of personal responsibility for collective actions.
Saxe and colleagues recently studied a third factor that cognitive scientists believe may be involved in this group dynamic: the hypothesis that when people are in groups, they “lose touch” with their own morals and beliefs, and become more likely to do things that they would normally believe are wrong.
In a study that recently went online in the journal NeuroImage, the researchers measured brain activity in a part of the brain involved in thinking about oneself. They found that in some people, this activity was reduced when the subjects participated in a competition as part of a group, compared with when they competed as individuals. Those people were more likely to harm their competitors than people who did not exhibit this decreased brain activity.
“This process alone does not account for intergroup conflict: Groups also promote anonymity, diminish personal responsibility, and encourage reframing harmful actions as ‘necessary for the greater good.’ Still, these results suggest that at least in some cases, explicitly reflecting on one’s own personal moral standards may help to attenuate the influence of ‘mob mentality,’” says Mina Cikara, a former MIT postdoc and lead author of the NeuroImage paper.
Group dynamics
Cikara, who is now an assistant professor at Carnegie Mellon University, started this research project after experiencing the consequences of a “mob mentality”: During a visit to Yankee Stadium, her husband was ceaselessly heckled by Yankees fans for wearing a Red Sox cap. “What I decided to do was take the hat from him, thinking I would be a lesser target by virtue of the fact that I was a woman,” Cikara says. “I was so wrong. I have never been called names like that in my entire life.”
The harassment, which continued throughout the trip back to Manhattan, provoked a strong reaction in Cikara, who isn’t even a Red Sox fan.
“It was a really amazing experience because what I realized was I had gone from being an individual to being seen as a member of ‘Red Sox Nation.’ And the way that people responded to me, and the way I felt myself responding back, had changed, by virtue of this visual cue — the baseball hat,” she says. “Once you start feeling attacked on behalf of your group, however arbitrary, it changes your psychology.”
Cikara, then a third-year graduate student at Princeton University, started to investigate the neural mechanisms behind the group dynamics that produce bad behavior. In the new study, done at MIT, Cikara, Saxe (who is also an associate member of MIT’s McGovern Institute for Brain Research), former Harvard University graduate student Anna Jenkins, and former MIT lab manager Nicholas Dufour focused on a part of the brain called the medial prefrontal cortex. When someone is reflecting on himself or herself, this part of the brain lights up in functional magnetic resonance imaging (fMRI) brain scans.
A couple of weeks before the study participants came in for the experiment, the researchers surveyed each of them about their social-media habits, as well as their moral beliefs and behavior. This allowed the researchers to create individualized statements for each subject that were true for that person — for example, “I have stolen food from shared refrigerators” or “I always apologize after bumping into someone.”
When the subjects arrived at the lab, their brains were scanned as they played a game once on their own and once as part of a team. The game required them to press a button whenever they saw a statement related to social media, such as “I have more than 600 Facebook friends.”
The subjects also saw their personalized moral statements mixed in with sentences about social media. Brain scans revealed that when subjects were playing for themselves, the medial prefrontal cortex lit up much more when they read moral statements about themselves than statements about others, consistent with previous findings. However, during the team competition, some people showed a much smaller difference in medial prefrontal cortex activation when they saw the moral statements about themselves compared to those about other people.
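The measurement described above amounts to a simple per-subject contrast: mean medial prefrontal cortex response to self-referential moral statements minus the mean response to statements about others, computed separately for solo and team play. A minimal sketch with made-up activation numbers (not the study's data):

```python
import numpy as np

def self_other_contrast(self_trials, other_trials):
    """Mean mPFC activation difference: self-referential minus other."""
    return float(np.mean(self_trials) - np.mean(other_trials))

# Hypothetical per-trial activations, chosen only to illustrate the pattern.
# Solo play: a robust self > other difference, as in prior findings.
solo = self_other_contrast(self_trials=[1.2, 1.0, 1.1],
                           other_trials=[0.4, 0.5, 0.3])

# Team play: in some subjects the difference shrinks toward zero.
team = self_other_contrast(self_trials=[0.60, 0.50, 0.55],
                           other_trials=[0.45, 0.50, 0.40])

# The drop from solo to team is the study's index of "losing oneself" in the group.
print(round(solo, 2), round(team, 2))
```

Subjects whose contrast collapsed during team play were the ones who later picked the unflattering photos of opponents.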
Those people also turned out to be much more likely to harm members of the competing group during a task performed after the game. Each subject was asked to select photos that would appear with the published study, from a set of four photos apiece of two teammates and two members of the opposing team. The subjects with suppressed medial prefrontal cortex activity chose the least flattering photos of the opposing team members, but not of their own teammates.
“This is a nice way of using neuroimaging to try to get insight into something that behaviorally has been really hard to explore,” says David Rand, an assistant professor of psychology at Yale University who was not involved in the research. “It’s been hard to get a direct handle on the extent to which people within a group are tapping into their own understanding of things versus the group’s understanding.”
Getting lost
The researchers also found that after the game, people with reduced medial prefrontal cortex activity had more difficulty remembering the moral statements they had heard during the game.
“If you need to encode something with regard to the self and that ability is somehow undermined when you’re competing with a group, then you should have poor memory associated with that reduction in medial prefrontal cortex signal, and that’s exactly what we see,” Cikara says.
Cikara hopes to follow up on these findings to investigate what makes some people more likely to become “lost” in a group than others. She is also interested in studying whether people are slower to recognize themselves or pick themselves out of a photo lineup after being absorbed in a group activity.

Filed under prefrontal cortex social cognition intergroup competition psychology neuroscience science

293 notes

Synchronized brain waves enable rapid learning
The human mind can rapidly absorb and analyze new information as it flits from thought to thought. These quickly changing brain states may be encoded by synchronization of brain waves across different brain regions, according to a new study from MIT neuroscientists.
The researchers found that as monkeys learn to categorize different patterns of dots, two brain areas involved in learning — the prefrontal cortex and the striatum — synchronize their brain waves to form new communication circuits.
“We’re seeing direct evidence for the interactions between these two systems during learning, which hasn’t been seen before. Category-learning results in new functional circuits between these two areas, and these functional circuits are rhythm-based, which is key because that’s a relatively new concept in systems neuroscience,” says Earl Miller, the Picower Professor of Neuroscience at MIT and senior author of the study, which appears in the June 12 issue of Neuron.
There are millions of neurons in the brain, each producing its own electrical signals. These combined signals generate oscillations known as brain waves, which can be measured by electroencephalography (EEG). The research team focused on EEG patterns from the prefrontal cortex — the seat of the brain’s executive control system — and the striatum, which controls habit formation.
The phenomenon of brain-wave synchronization likely precedes the changes in synapses, or connections between neurons, believed to underlie learning and long-term memory formation, Miller says. That process, known as synaptic plasticity, is too time-consuming to account for the human mind’s flexibility, he believes.
“If you can change your thoughts from moment to moment, you can’t be doing it by constantly making new connections and breaking them apart in your brain. Plasticity doesn’t happen on that kind of time scale,” says Miller, who is a member of MIT’s Picower Institute for Learning and Memory. “There’s got to be some way of dynamically establishing circuits to correspond to the thoughts we’re having in this moment, and then if we change our minds a moment later, those circuits break apart somehow. We think synchronized brain waves may be the way the brain does it.”
The paper’s lead author is former Picower Institute postdoc Evan Antzoulatos, who is now at the University of California at Davis.
Humming together
Miller’s lab has previously shown that during category-learning, neurons in the striatum become active early, followed by slower activation of neurons in the prefrontal cortex. “The striatum learns very simple things really quickly, and then its output trains the prefrontal cortex to gradually pick up on the bigger picture,” Miller says. “The striatum learns the pieces of the puzzle, and then the prefrontal cortex puts the pieces of the puzzle together.”
In the new study, the researchers wanted to investigate whether this activity pattern actually reflects communication between the prefrontal cortex and striatum, or if each region is working independently. To do this, they measured EEG signals as monkeys learned to assign patterns of dots into one of two categories.
At first, the animals were shown just two different examples, or “exemplars,” from each category. After each round, the number of exemplars was doubled. In the early stages, the animals could simply memorize which exemplars belonged to each category. However, the number of exemplars eventually became too large for the animals to memorize all of them, and they began to learn the general traits that characterized each category.
By the end of the experiment, when the researchers were showing 256 novel exemplars, the monkeys were able to categorize all of them correctly.
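The doubling schedule, and the pressure it creates to shift from memorization to generalization, can be sketched with a toy version of the task. The prototype-distortion setup below is an assumption for illustration, not the study's actual stimuli: categories are generated as noisy distortions of two prototypes, and a learner that averages its training exemplars into estimated prototypes can classify novel patterns it has never seen.

```python
import numpy as np

rng = np.random.default_rng(1)
dims = 10
prototypes = rng.standard_normal((2, dims)) * 3   # one prototype per category

def make_exemplars(category, n):
    """Novel exemplars: the category prototype plus random distortion."""
    return prototypes[category] + rng.standard_normal((n, dims))

def prototype_classify(x, proto_estimates):
    """Assign a pattern to the nearest estimated category prototype."""
    dists = np.linalg.norm(proto_estimates - x, axis=1)
    return int(np.argmin(dists))

# The study's training schedule: the exemplar set doubles each round.
schedule = [2 * 2 ** r for r in range(8)]   # 2, 4, 8, ..., 256 per round
print(schedule)

# Estimate prototypes by averaging training exemplars, then test on novel ones —
# a memorizer cannot label these, but a category learner can.
train0, train1 = make_exemplars(0, 32), make_exemplars(1, 32)
estimates = np.vstack([train0.mean(axis=0), train1.mean(axis=0)])

novel = make_exemplars(0, 100)
accuracy = np.mean([prototype_classify(x, estimates) == 0 for x in novel])
print(accuracy)
```

Once the exemplar count outruns any plausible memorization capacity, only the generalizing strategy keeps accuracy high — mirroring the monkeys' shift.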
As the monkeys shifted from rote memorization to learning the categories, the researchers saw a corresponding shift in EEG patterns. Brain waves known as “beta bands,” produced independently by the prefrontal cortex and the striatum, began to synchronize with each other. This suggests that a communication circuit is forming between the two regions, Miller says.
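One common way to quantify this kind of synchrony — a sketch of the general technique, not the study's actual analysis pipeline — is the phase-locking value (PLV): near 1 when two signals hold a consistent phase relationship, near 0 when their phases are unrelated. The beta frequency, noise levels, and signal names below are illustrative assumptions.

```python
import numpy as np

def analytic_signal(x):
    """Hilbert transform via FFT: zero out negative frequencies to get
    the complex analytic signal, whose angle is the instantaneous phase."""
    n = len(x)
    spectrum = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(spectrum * h)

def phase_locking_value(x, y):
    """PLV: magnitude of the mean phase-difference vector."""
    dphi = np.angle(analytic_signal(x)) - np.angle(analytic_signal(y))
    return float(np.abs(np.mean(np.exp(1j * dphi))))

fs = 1000                                  # sampling rate, Hz
t = np.arange(0, 2, 1 / fs)
beta = 20                                  # a beta-band frequency, Hz
rng = np.random.default_rng(0)

# Two signals locked at a constant phase offset, plus an unrelated one.
pfc = np.sin(2 * np.pi * beta * t) + 0.2 * rng.standard_normal(t.size)
striatum_locked = np.sin(2 * np.pi * beta * t + 0.5) + 0.2 * rng.standard_normal(t.size)
striatum_unlocked = rng.standard_normal(t.size)

plv_locked = phase_locking_value(pfc, striatum_locked)       # near 1
plv_unlocked = phase_locking_value(pfc, striatum_unlocked)   # near 0
print(plv_locked, plv_unlocked)
```

A stable phase offset is all the PLV requires, which is why synchrony can signal a functional circuit even when the two regions' waveforms are not identical.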
“There is some unknown mechanism that allows these resonance patterns to form, and these circuits start humming together,” he says. “That humming may then foster subsequent long-term plasticity changes in the brain, so real anatomical circuits can form. But the first thing that happens is they start humming together.”
A little later, as an animal nailed down the two categories, two separate circuits formed between the striatum and prefrontal cortex, each corresponding to one of the categories.
“This is the first paper that provides data suggesting that coupling in the beta-band between prefrontal cortex and striatum may play a key role in category-formation. In addition to revealing a novel mechanism involved in category-learning, the results also contribute to better understanding of the significance of coupled beta-band oscillations in the brain,” says Andreas Engel, a professor of physiology at the University Medical Center Hamburg-Eppendorf in Germany.
“Expanding your knowledge”
Previous studies have shown that during cognitively demanding tasks, there is increased synchrony between the frontal cortex and visual cortex, but Miller’s lab is the first to show specific patterns of synchrony linked to specific thoughts.
Miller and Antzoulatos also showed that once the prefrontal cortex learns the categories and sends them to the striatum, they undergo further modification as new information comes in, allowing more expansive learning to take place. This iteration can occur over and over.
“That’s how you get the open-ended nature of human thought. You keep expanding your knowledge,” Miller says. “The prefrontal cortex learning the categories isn’t the end of the game. The cortex is learning these new categories and then forming circuits that can send the categories down to the striatum as if it’s just brand-new material for the brain to elaborate on.”
In follow-up studies, the researchers are now looking at how the brain learns more abstract categories, and how activity in the striatum and prefrontal cortex might reflect that type of abstraction.

Filed under brainwaves learning prefrontal cortex striatum neuroscience science

215 notes

(Image caption: At left, the brains of adults who had ADHD as children but no longer have it show synchronous activity between the posterior cingulate cortex (the larger red region) and the medial prefrontal cortex (smaller red region). At right, the brains of adults who continue to experience ADHD do not show this synchronous activity. Illustration: Jose-Luis Olivares/MIT, based on images courtesy of the researchers)
Inside the adult ADHD brain
About 11 percent of school-age children in the United States have been diagnosed with attention deficit hyperactivity disorder (ADHD). While many of these children eventually “outgrow” the disorder, some carry their difficulties into adulthood: About 10 million American adults are currently diagnosed with ADHD.
In the first study to compare patterns of brain activity in adults who recovered from childhood ADHD and those who did not, MIT neuroscientists have discovered key differences in a brain communication network that is active when the brain is at wakeful rest and not focused on a particular task. The findings offer evidence of a biological basis for adult ADHD and should help to validate the criteria used to diagnose the disorder, according to the researchers.
Diagnoses of adult ADHD have risen dramatically in the past several years, with symptoms similar to those of childhood ADHD: a general inability to focus, reflected in difficulty completing tasks, listening to instructions, or remembering details.
“The psychiatric guidelines for whether a person’s ADHD is persistent or remitted are based on lots of clinical studies and impressions. This new study suggests that there is a real biological boundary between those two sets of patients,” says MIT’s John Gabrieli, the Grover M. Hermann Professor of Health Sciences and Technology, professor of brain and cognitive sciences, and an author of the study, which appears in the June 10 issue of the journal Brain.
Shifting brain patterns
This study focused on 35 adults who were diagnosed with ADHD as children; 13 of them still have the disorder, while the rest have recovered. “This sample really gave us a unique opportunity to ask questions about whether or not the brain basis of ADHD is similar in the remitted-ADHD and persistent-ADHD cohorts,” says Aaron Mattfeld, a postdoc at MIT’s McGovern Institute for Brain Research and the paper’s lead author.
The researchers used a technique called resting-state functional magnetic resonance imaging (fMRI) to study what the brain is doing when a person is not engaged in any particular activity. These patterns reveal which parts of the brain communicate with each other during this type of wakeful rest.
“It’s a different way of using functional brain imaging to investigate brain networks,” says Susan Whitfield-Gabrieli, a research scientist at the McGovern Institute and the senior author of the paper. “Here we have subjects just lying in the scanner. This method reveals the intrinsic functional architecture of the human brain without invoking any specific task.”
In people without ADHD, when the mind is unfocused, there is a distinctive synchrony of activity in brain regions known as the default mode network. Previous studies have shown that in children and adults with ADHD, two major hubs of this network — the posterior cingulate cortex and the medial prefrontal cortex — no longer synchronize.
In the new study, the MIT team showed for the first time that in adults who had been diagnosed with ADHD as children but no longer have it, this normal synchrony pattern is restored. “Their brains now look like those of people who never had ADHD,” Mattfeld says.
“This finding is quite intriguing,” says Francisco Xavier Castellanos, a professor of child and adolescent psychiatry at New York University who was not involved in the research. “If it can be confirmed, this pattern could become a target for potential modification to help patients learn to compensate for the disorder without changing their genetic makeup.”
Lingering problems
However, on another measure of brain synchrony, the researchers found much more similarity between the two groups of ADHD patients.
In people without ADHD, when the default mode network is active, another network, called the task positive network, is suppressed. When the brain is performing tasks that require focus, the task positive network takes over and suppresses the default mode network. If this reciprocal relationship degrades, the ability to focus declines.
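In resting-state analyses of this kind, the reciprocal relationship typically shows up as a negative correlation between the two networks' average time series; when reciprocity degrades, the correlation weakens or turns positive. A sketch with simulated signals (my toy parameters, not the study's data):

```python
import numpy as np

def network_coupling(dmn, tpn):
    """Pearson correlation between two network time series."""
    return float(np.corrcoef(dmn, tpn)[0, 1])

rng = np.random.default_rng(42)
t = np.linspace(0, 60, 600)
slow = np.sin(2 * np.pi * 0.05 * t)        # a slow resting-state fluctuation

# Intact reciprocity: the task positive network rises as the DMN falls.
dmn_healthy = slow + 0.3 * rng.standard_normal(t.size)
tpn_healthy = -slow + 0.3 * rng.standard_normal(t.size)

# Degraded reciprocity: both networks ride the same fluctuation together.
shared = slow + 0.3 * rng.standard_normal(t.size)
dmn_impaired = shared + 0.3 * rng.standard_normal(t.size)
tpn_impaired = shared + 0.3 * rng.standard_normal(t.size)

print(network_coupling(dmn_healthy, tpn_healthy))    # strongly negative
print(network_coupling(dmn_impaired, tpn_impaired))  # positive: co-activation
```

The simultaneous activation seen in both ADHD groups corresponds to the second case: the anticorrelation that normally marks healthy network switching is lost.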
Both groups of adult ADHD patients, including those who had recovered, showed patterns of simultaneous activation of both networks. This is thought to be a sign of impairment in executive function — the management of cognitive tasks — that is separate from ADHD, but occurs in about half of ADHD patients. All of the ADHD patients in this study performed poorly on tests of executive function. “Once you have executive function problems, they seem to hang in there,” says Gabrieli, who is a member of the McGovern Institute.
The researchers now plan to investigate how ADHD medications influence the brain’s default mode network, in hopes that this might allow them to predict which drugs will work best for individual patients. Currently, about 60 percent of patients respond well to the first drug they receive.
“It’s unknown what’s different about the other 40 percent or so who don’t respond very much,” Gabrieli says. “We’re pretty excited about the possibility that some brain measurement would tell us which child or adult is most likely to benefit from a treatment.”

(Image caption: At left, the brains of adults who had ADHD as children but no longer have it show synchronous activity between the posterior cingulate cortex (the larger red region) and the medial prefrontal cortex (smaller red region). At right, the brains of adults who continue to experience ADHD do not show this synchronous activity. Illustration: Jose-Luis Olivares/MIT, based on images courtesy of the researchers)

Inside the adult ADHD brain

About 11 percent of school-age children in the United States have been diagnosed with attention deficit hyperactivity disorder (ADHD). While many of these children eventually “outgrow” the disorder, some carry their difficulties into adulthood: About 10 million American adults are currently diagnosed with ADHD.

In the first study to compare patterns of brain activity in adults who recovered from childhood ADHD and those who did not, MIT neuroscientists have discovered key differences in a brain communication network that is active when the brain is at wakeful rest and not focused on a particular task. The findings offer evidence of a biological basis for adult ADHD and should help to validate the criteria used to diagnose the disorder, according to the researchers.

Diagnoses of adult ADHD have risen dramatically in the past several years. Its symptoms mirror those of childhood ADHD: a general inability to focus, reflected in difficulty completing tasks, listening to instructions, or remembering details.

“The psychiatric guidelines for whether a person’s ADHD is persistent or remitted are based on lots of clinical studies and impressions. This new study suggests that there is a real biological boundary between those two sets of patients,” says MIT’s John Gabrieli, the Grover M. Hermann Professor of Health Sciences and Technology, professor of brain and cognitive sciences, and an author of the study, which appears in the June 10 issue of the journal Brain.

Shifting brain patterns

This study focused on 35 adults who were diagnosed with ADHD as children; 13 of them still have the disorder, while the rest have recovered. “This sample really gave us a unique opportunity to ask questions about whether or not the brain basis of ADHD is similar in the remitted-ADHD and persistent-ADHD cohorts,” says Aaron Mattfeld, a postdoc at MIT’s McGovern Institute for Brain Research and the paper’s lead author.

The researchers used a technique called resting-state functional magnetic resonance imaging (fMRI) to study what the brain is doing when a person is not engaged in any particular activity. These patterns reveal which parts of the brain communicate with each other during this type of wakeful rest.
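The notion of regions "communicating" during wakeful rest can be sketched in a few lines: resting-state functional connectivity is, at its simplest, the correlation between two regions' BOLD time series. The sketch below is a hypothetical illustration of that idea, with synthetic signals and made-up region labels, not the study's actual analysis pipeline.

```python
import numpy as np

def functional_connectivity(ts_a, ts_b):
    """Pearson correlation between two regional BOLD time series.

    Synchrony between regions in resting-state fMRI is commonly
    summarized this way: values near +1 mean the regions rise and
    fall together; values near -1 mean they anticorrelate.
    """
    return float(np.corrcoef(np.asarray(ts_a, float),
                             np.asarray(ts_b, float))[0, 1])

# Synthetic example: a slow shared fluctuation plus independent noise,
# mimicking two synchronized default-mode hubs. All numbers are
# illustrative; nothing here comes from the study's data.
rng = np.random.default_rng(0)
t = np.arange(200) * 2.0                    # 200 volumes, 2 s apart
shared = np.sin(2 * np.pi * 0.01 * t)       # slow ~0.01 Hz fluctuation
pcc = shared + 0.5 * rng.standard_normal(t.size)   # "posterior cingulate"
mpfc = shared + 0.5 * rng.standard_normal(t.size)  # "medial prefrontal"

r = functional_connectivity(pcc, mpfc)      # strongly positive here
```

Because both simulated hubs share the same slow fluctuation, their correlation comes out strongly positive, which is the "synchrony" the article refers to.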

“It’s a different way of using functional brain imaging to investigate brain networks,” says Susan Whitfield-Gabrieli, a research scientist at the McGovern Institute and the senior author of the paper. “Here we have subjects just lying in the scanner. This method reveals the intrinsic functional architecture of the human brain without invoking any specific task.”

In people without ADHD, when the mind is unfocused, there is a distinctive synchrony of activity in brain regions known as the default mode network. Previous studies have shown that in children and adults with ADHD, two major hubs of this network — the posterior cingulate cortex and the medial prefrontal cortex — no longer synchronize.

In the new study, the MIT team showed for the first time that in adults who had been diagnosed with ADHD as children but no longer have it, this normal synchrony pattern is restored. “Their brains now look like those of people who never had ADHD,” Mattfeld says.

“This finding is quite intriguing,” says Francisco Xavier Castellanos, a professor of child and adolescent psychiatry at New York University who was not involved in the research. “If it can be confirmed, this pattern could become a target for potential modification to help patients learn to compensate for the disorder without changing their genetic makeup.”

Lingering problems

However, on another measure of brain synchrony, the researchers found far more similarity between the two groups of ADHD patients.

In people without ADHD, when the default mode network is active, another network, called the task positive network, is suppressed. When the brain is performing tasks that require focus, the task positive network takes over and suppresses the default mode network. If this reciprocal relationship degrades, the ability to focus declines.
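This reciprocal relationship can be illustrated with a toy simulation (entirely synthetic signals, not the study's data): in the typical pattern the two networks' time courses are anticorrelated, whereas the co-activation described for both ADHD groups shows up as a positive correlation.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300
drive = np.sin(2 * np.pi * 0.02 * np.arange(n))  # slow attentional state

# Typical reciprocal pattern: when one network is up, the other is down.
dmn_typical = drive + 0.4 * rng.standard_normal(n)   # default mode network
tpn_typical = -drive + 0.4 * rng.standard_normal(n)  # task positive network

# Degraded reciprocity, as described for both ADHD groups:
# the two networks activate together instead.
dmn_coactive = drive + 0.4 * rng.standard_normal(n)
tpn_coactive = drive + 0.4 * rng.standard_normal(n)

r_typical = np.corrcoef(dmn_typical, tpn_typical)[0, 1]    # negative
r_coactive = np.corrcoef(dmn_coactive, tpn_coactive)[0, 1] # positive
```

The sign of the correlation is the key contrast: a healthy push-pull relationship yields a negative value, while simultaneous activation of both networks yields a positive one.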

Both groups of adult ADHD patients, including those who had recovered, showed patterns of simultaneous activation of both networks. This is thought to be a sign of impairment in executive function — the management of cognitive tasks — that is separate from ADHD, but occurs in about half of ADHD patients. All of the ADHD patients in this study performed poorly on tests of executive function. “Once you have executive function problems, they seem to hang in there,” says Gabrieli, who is a member of the McGovern Institute.

The researchers now plan to investigate how ADHD medications influence the brain’s default mode network, in hopes that this might allow them to predict which drugs will work best for individual patients. Currently, about 60 percent of patients respond well to the first drug they receive.

“It’s unknown what’s different about the other 40 percent or so who don’t respond very much,” Gabrieli says. “We’re pretty excited about the possibility that some brain measurement would tell us which child or adult is most likely to benefit from a treatment.”


Finding the perfect balance — regulating brain activity to improve attention
Researchers from The University of Nottingham have found that balanced activity in the brain’s prefrontal cortex is necessary for attention. 
The research helps to make sense of attention deficits in people suffering from cognitive disorders — like schizophrenia — who often find it hard to sustain their attention. This has a significant effect on many aspects of their lives, including the ability to follow conversations, drive a car and hold down a job.
Activity in a healthy brain is controlled by inhibitory signals between neurons. The research shows that disrupting this healthy inhibition may be just as bad for attention as reducing neuron firing. It is often assumed that increasing brain activity has cognitive benefits, but the findings show that this is not always the case.
The research was carried out by a team in the University’s School of Psychology and involved inhibiting or disinhibiting the prefrontal cortex in rats and monitoring the effects. The researchers found that both extremes produced attentional deficits, and that the ability to pay attention required an appropriate balance, with neuron firing kept within a certain range.
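One simple way to picture this balance is as an inverted-U: performance is best at an intermediate level of prefrontal firing and falls off when activity is pushed too low (inhibition) or too high (disinhibition). The toy model below is purely illustrative; the optimum and width are made-up numbers, not values fitted to the rat data.

```python
import math

def attention_performance(firing_rate, optimum=10.0, width=3.0):
    """Toy inverted-U: performance peaks when prefrontal firing sits
    near an optimal rate and falls off in either direction.
    The optimum and width here are illustrative, not fitted values."""
    return math.exp(-((firing_rate - optimum) ** 2) / (2 * width ** 2))

balanced     = attention_performance(10.0)  # activity in the healthy range
inhibited    = attention_performance(4.0)   # firing pushed too low
disinhibited = attention_performance(16.0)  # inhibition disrupted: too high
```

In this symmetric toy curve, both departures from the balanced level degrade performance, mirroring the finding that inhibiting and disinhibiting the prefrontal cortex were each detrimental to attention.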
Schizophrenia and attention deficits 
Studies of the brains of people with schizophrenia point to aberrant neuron firing in the prefrontal cortex: there is evidence that firing in this region is often either too high or too low.
Dr Tobias Bast, who led the study together with first author Dr Marie Pezze, said: “The implication of our findings is that the abnormalities we see in the prefrontal cortex of schizophrenia patients, for example, are indeed a plausible cause of the attention deficit these patients have.
“It also means that if we want to treat this pharmacologically, we can’t just boost activity of the prefrontal cortex or inactivate it, because that would actually result in an impairment. What we need to do is look at restoring balance of activity through drugs which keep the activity within a certain range.”
Cognitive deficits associated with schizophrenia
In people with schizophrenia, cognitive deficits — such as problems with attention — are less striking than other issues associated with the disorder, such as hallucinations, but are nevertheless a major problem.
Dr Bast said: “Initially people focused on the so-called ‘psychotic symptoms’, including hallucinations and delusions, so that’s what probably comes to mind when you think of schizophrenia. They have been at the fore because they have been so striking, and that’s why referrals are made. But these can be treated, at least in a large proportion of patients, by using anti-psychotic medication, which we have had since the late 1950s.
“The problem is that unfortunately anti-psychotic drugs don’t improve cognitive deficits, which are very debilitating, affecting many aspects of the patients’ lives. Cognitive deficits are a big problem and something that is currently not treated, so finding something that helps is really important.”



Exceptional Evolutionary Divergence of Human Muscle and Brain Metabolomes Parallels Human Cognitive and Physical Uniqueness
Metabolite concentrations reflect the physiological states of tissues and cells. However, the role of metabolic changes in species evolution is currently unknown. Here, we present a study of metabolome evolution conducted in three brain regions and two non-neural tissues from humans, chimpanzees, macaque monkeys, and mice based on over 10,000 hydrophilic compounds. While chimpanzee, macaque, and mouse metabolomes diverge following the genetic distances among species, we detect remarkable acceleration of metabolome evolution in human prefrontal cortex and skeletal muscle affecting neural and energy metabolism pathways. These metabolic changes could not be attributed to environmental conditions and were confirmed against the expression of their corresponding enzymes. We further conducted muscle strength tests in humans, chimpanzees, and macaques. The results suggest that, while humans are characterized by superior cognition, their muscular performance might be markedly inferior to that of chimpanzees and macaque monkeys.
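The logic of "acceleration" here can be made concrete with a small hypothetical calculation: calibrate the expected rate of metabolome divergence from genetic distance on a reference lineage, then ask how far a tissue's observed divergence exceeds that expectation. All numbers below are invented for illustration and are not taken from the paper.

```python
def acceleration(observed_divergence, genetic_distance, clock_rate):
    """Ratio of observed metabolome divergence to the divergence
    expected under a lineage-neutral 'clock'
    (expected = clock_rate * genetic_distance).
    A ratio well above 1 suggests lineage-specific acceleration."""
    return observed_divergence / (clock_rate * genetic_distance)

# Hypothetical numbers (not from the paper): calibrate the clock on a
# reference lineage, then compare human tissues against it.
clock = 0.8 / 1.0          # divergence units per unit genetic distance
human_pfc = acceleration(3.2, 1.0, clock)     # 4x the clock expectation
human_kidney = acceleration(0.9, 1.0, clock)  # close to the clock
```

A tissue whose divergence tracks genetic distance (the kidney in this made-up example) yields a ratio near 1, while a tissue evolving faster than the species tree predicts (here, prefrontal cortex) yields a ratio well above 1.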
Full Article

