Neuroscience

Articles and news from the latest research reports.

Posts tagged psychology

Brain scans link concern for justice with reason, not emotion
People who care about justice are swayed more by reason than emotion, according to new brain scan research from the Department of Psychology and Center for Cognitive and Social Neuroscience.
Psychologists have found that some individuals react more strongly than others to situations that invoke a sense of justice—for example, seeing a person being treated unfairly or mercifully. The new study used brain scans to analyze the thought processes of people with high “justice sensitivity.”
“We were interested to examine how individual differences about justice and fairness are represented in the brain to better understand the contribution of emotion and cognition in moral judgment,” explained lead author Jean Decety, the Irving B. Harris Professor of Psychology and Psychiatry.    
Using a functional magnetic resonance imaging (fMRI) brain-scanning device, the team studied what happened in the participants’ brains as they judged videos depicting behavior that was morally good or bad. For example, they saw a person put money in a beggar’s cup or kick the beggar’s cup away. The participants were asked to rate on a scale how much they would blame or praise the actor seen in the video. People in the study also completed questionnaires that assessed cognitive and emotional empathy, as well as their justice sensitivity.
As expected, study participants who scored high on the justice sensitivity questionnaire assigned significantly more blame when they were evaluating scenes of harm, Decety said. They also registered more praise for scenes showing a person helping another individual.
But the brain imaging also yielded surprises. During the behavior-evaluation exercise, people with high justice sensitivity showed more activity than average participants in parts of the brain associated with higher-order cognition. Brain areas commonly linked with emotional processing showed no such difference.
The conclusion was clear, Decety said: “Individuals who are sensitive to justice and fairness do not seem to be emotionally driven. Rather, they are cognitively driven.” 
According to Decety, one implication is that the search for justice and the moral missions of human rights organizations and others do not come primarily from sentimental motivations, as they are often portrayed. Instead, that drive may have more to do with sophisticated analysis and mental calculation.
Decety adds that evaluating good actions elicited relatively high activity in the region of the brain involved in decision-making, motivation and rewards. This finding suggests that perhaps individuals make judgments about behavior based on how they process the reward value of good actions as compared to bad actions.
“Our results provide some of the first evidence for the role of justice sensitivity in enhancing neural processing of moral information in specific components of the brain network involved in moral judgment,” Decety said.

Filed under moral cognition justice sensitivity prefrontal cortex decision making empathy psychology neuroscience science

EEG study: Brain infers structure, rules of tasks
A new study documents the brain activity underlying our strong tendency to infer a structure of context and rules when learning new tasks, even when no such structure actually exists. The findings, which revealed individual differences, show how we try to apply task knowledge to similar situations and could inform future research on learning disabilities.
In life, many tasks have a context that dictates the right actions, so when people learn to do something new, they’ll often infer cues of context and rules. In a new study, Brown University brain scientists took advantage of that tendency to track the emergence of such rule structures in the frontal cortex — even when the structure was neither necessary nor helpful for learning — and to predict from EEG readings how quickly people would apply those structures to learn new tasks.
Context and rule structures are everywhere. They allow an iPhone user who switches to an Android phone, for example, to reason that dimming the screen would involve finding a “settings” icon that will probably lead to a slider control for “brightness.” But when the context changes, inflexible generalization can lead a person temporarily astray — like a small-town tourist who greets strangers on the streets of New York City. In some developmental learning disabilities, the whole process of inferring abstract structures may be impaired.
“The world tends to be organized, and so we probably develop prior [notions] over time that there is going to be a structure,” said Anne Collins, a postdoctoral scholar in the Department of Cognitive, Linguistic, and Psychological Sciences at Brown and lead author of the study published March 25 in the Journal of Neuroscience. “When the world is organized, you just reduce the size of what you have to learn about by being able to generalize across situations in which the same things usually happen together. It is efficient to generalize if there is structure, and there usually is structure.”
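As a rough illustration of the kind of generalization Collins describes (a sketch, not the study's actual computational model), consider a learner that has acquired stimulus-response rules in familiar contexts and, on entering a new one, adopts the best-matching known rule set. All contexts, stimuli, and actions below are invented:

```python
def best_matching_ruleset(known, observed):
    """Pick the known context whose rules agree most with what has been
    observed so far in the new context, enabling one-shot generalization
    of that context's untested stimulus-response mappings."""
    def agreement(rules):
        return sum(rules.get(stim) == act for stim, act in observed.items())
    return max(known.values(), key=agreement)

# Rule sets already learned in two familiar contexts (hypothetical)
known_contexts = {
    "context_A": {"circle": "left", "square": "right", "star": "left"},
    "context_B": {"circle": "right", "square": "left", "star": "right"},
}

# In a brand-new context, only one trial has been observed so far
observed_so_far = {"circle": "left"}

rules = best_matching_ruleset(known_contexts, observed_so_far)
print(rules["star"])  # "left": context_A's untested rule carries over
```

With structure in the world, one observation is enough to transfer a whole rule set; without it, the same shortcut would lead the learner astray, like the small-town tourist in New York.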
Read more

Filed under brain activity frontal cortex EEG learning psychology neuroscience science

Face-blind people can learn to tell similar shapes apart
Study could support theory that the brain has specialized mechanisms for recognizing faces
People who are unable to recognize faces can still learn to distinguish between other types of very similar objects, researchers report. The finding provides fresh support for the idea that the brain mechanisms that process face images are specialized for that task. It also offers evidence against an ‘expertise’ hypothesis, in which the same mechanisms are responsible for recognition of faces and other highly similar objects we have learned to tell apart — the way bird watchers can recognize birds after years of training.
Constantin Rezlescu, a psychologist at Harvard University in Cambridge, Massachusetts, and his colleagues worked with two volunteers nicknamed Florence and Herschel, who had acquired prosopagnosia, or face blindness, following brain damage. The condition renders people unable to recognize and distinguish between faces — in some cases, even those of their own family members.
Read more

Filed under prosopagnosia face recognition face blindness psychology neuroscience science

Understanding Binge Eating and Obesity
Researchers at the University of Cambridge have developed a novel method for evaluating the treatment of obesity-related food behavior. In an effort to further scientific understanding of the underlying problem, they have published the first peer-reviewed video of their technique in JoVE, the Journal of Visualized Experiments.
In the video, the authors demonstrate their means of objectively studying the drivers and mechanisms of overconsumption in humans. To do this, they assess their subjects’ willingness to work or pay for food, and they simultaneously track the corresponding brain activity using an MRI scanner.
“We present alternative ways of exploring attitudes to food by using indirect, objective measures—such as measuring the amount of energy exerted to obtain or view different foods, as well as determining brain responses during the anticipation and consumption of desirable foods,” said the lab’s principal investigator, Dr. Paul Fletcher. He and his colleagues use participant hand-grip intensity (referred to as “grip force” in the video) to calculate the motivation for a given food reward.
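The grip-force logic can be sketched in a few lines; the normalization scheme and all numbers below are hypothetical, not the Cambridge group's actual protocol. Each trial's grip force is expressed as a fraction of the participant's maximum voluntary contraction (MVC), so that harder squeezing for a food cue reads as higher motivation:

```python
def motivation_index(trial_forces, max_voluntary_contraction):
    """Normalize each trial's grip force to the participant's maximum
    voluntary contraction (MVC). Values near 1.0 mean the participant
    squeezed almost as hard as they could for that reward."""
    if max_voluntary_contraction <= 0:
        raise ValueError("MVC must be positive")
    return [f / max_voluntary_contraction for f in trial_forces]

# Hypothetical peak grip forces (newtons) while viewing two food cues
mvc = 400.0
chocolate_trials = [320.0, 350.0, 300.0]
celery_trials = [80.0, 120.0, 100.0]

choc = motivation_index(chocolate_trials, mvc)
cel = motivation_index(celery_trials, mvc)
print(sum(choc) / len(choc))  # about 0.81: high motivation
print(sum(cel) / len(cel))    # 0.25: low motivation
```

Because the measure is physical effort rather than a self-report, it sidesteps the pressure to give a "correct" answer that Fletcher describes below.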
According to Dr. Fletcher, typical approaches for evaluating anti-obesity drugs rely on more subjective methods—like having test subjects self-report their ratings of hunger and cravings.
“When a person is asked how much they subjectively desire a food, they may feel pressured to give a ‘correct’ rather than a true answer,” said Dr. Fletcher. “[Our] grip force task may, under certain circumstances, present a more accurate reflection of what they really want.”
Dr. Fletcher and his colleagues brought the technique to JoVE after using it in their earlier publication, “Food images engage subliminal motivation to seek food,” published in 2011. They decided to publish a video capturing the protocol “because it offered the opportunity to demonstrate the methods more fully,” he said.
In the video, Dr. Fletcher expands on the purpose of publishing the method with JoVE. “Individuals new to the technique may struggle because there aren’t many examples of grip-force tasks published in the literature, and there are no full and clear descriptions of how to design and set up the tasks,” he said.
With rising concerns surrounding obesity, researchers can use the technique presented in the JoVE video to evaluate the efficacy of emerging anti-obesity treatments.

Filed under binge eating obesity motivation reward processing food reward decision making psychology neuroscience science

Overlapping Neural Systems Represent Cognitive Effort and Reward Anticipation
Anticipating a potential benefit and how difficult it will be to obtain are valuable skills in a constantly changing environment. In the human brain, the anticipation of reward is encoded by the anterior cingulate cortex (ACC) and striatum. Naturally, potential rewards have an incentive quality, resulting in a motivational effect that improves performance. Recently it has been proposed that an upcoming task requiring effort induces an anticipation mechanism similar to that for reward, relying on the same cortico-limbic network. However, this overlapping anticipatory activity for reward and effort has only been investigated in a perceptual task; whether it generalizes to high-level cognitive tasks remains to be investigated. To this end, an fMRI experiment was designed to investigate anticipation of reward and effort in cognitive tasks. A mental arithmetic task was implemented, manipulating effort (difficulty), reward, and delay in reward delivery to control for temporal confounds. The goal was to test for the motivational effect induced by the expectation of larger reward and higher effort. The results showed that the activation elicited by an upcoming difficult task overlapped with that for higher reward prospect in the ACC and striatum, highlighting a pivotal role of this circuit in sustaining motivated behavior.
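The design described in the abstract can be sketched as a fully crossed factorial. The factor levels below are hypothetical placeholders, since the paper's actual levels are not given here:

```python
from itertools import product

# Hypothetical two-level versions of the three manipulated factors
effort_levels = ["easy", "hard"]            # arithmetic difficulty
reward_levels = ["low", "high"]             # payout magnitude
delay_levels = ["immediate", "delayed"]     # reward delivery, controlling temporal confounds

# Fully crossed: every combination of effort x reward x delay appears
conditions = list(product(effort_levels, reward_levels, delay_levels))
print(len(conditions))  # 8 cells
```

Crossing the factors lets the analysis separate anticipatory activity due to expected effort from that due to expected reward, since each varies independently of the others.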
Full article

Filed under neuroimaging reward processing reward motivation performance cingulate cortex psychology neuroscience science

Computers See Through Faked Expressions of Pain Better Than People
A joint study by researchers at the University of California, San Diego and the University of Toronto has found that a computer system can distinguish real from faked expressions of pain more accurately than people can.
The work, titled “Automatic Decoding of Deceptive Pain Expressions,” is published in the latest issue of Current Biology.
“The computer system managed to detect distinctive dynamic features of facial expressions that people missed,” said Marian Bartlett, research professor at UC San Diego’s Institute for Neural Computation and lead author of the study. “Human observers just aren’t very good at telling real from faked expressions of pain.”
Senior author Kang Lee, professor at the Dr. Eric Jackman Institute of Child Study at the University of Toronto, said, “Humans can simulate facial expressions and fake emotions well enough to deceive most observers. The computer’s pattern-recognition abilities prove better at telling whether pain is real or faked.”
The research team found that humans could not discriminate real from faked expressions of pain better than chance – and, even after training, improved only to a modest 55 percent accuracy. The computer system attained 85 percent accuracy.
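To see why 55 percent is modest, it helps to ask how likely such a score is under pure guessing. The sketch below uses an exact binomial tail with a hypothetical 100 judgments; the study's actual trial counts are not given here:

```python
from math import comb

def p_at_least(k, n, p=0.5):
    """Exact binomial tail: probability of getting at least k of n
    judgments right if the observer is guessing with success rate p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical 100 real-vs-faked judgments at chance level p = 0.5
print(round(p_at_least(55, 100), 3))  # roughly 0.18: 55% is unremarkable
print(p_at_least(85, 100) < 1e-12)    # True: 85% is essentially impossible by luck
```

With these assumed numbers, a human scoring 55 percent could plausibly still be guessing, while a system scoring 85 percent could not.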
“In highly social species such as humans,” said Lee, “faces have evolved to convey rich information, including expressions of emotion and pain. And, because of the way our brains are built, people can simulate emotions they’re not actually experiencing – so successfully that they fool other people. The computer is much better at spotting the subtle differences between involuntary and voluntary facial movements.”
“By revealing the dynamics of facial action through machine vision systems,” said Bartlett, “our approach has the potential to elucidate ‘behavioral fingerprints’ of the neural-control systems involved in emotional signaling.”
The single most predictive feature of falsified expressions, the study shows, is the mouth, and how and when it opens. Fakers’ mouths open with less variation and too regularly.
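The "too regular" cue can be made concrete with a toy feature; the durations below are invented, and the real system used a far richer facial-action representation. The coefficient of variation of mouth-opening durations is lower when openings are suspiciously uniform:

```python
from statistics import mean, stdev

def opening_variability(durations):
    """Coefficient of variation (stdev / mean) of mouth-opening durations.
    Lower values mean more regular, machine-like openings, the kind of
    cue the classifier reportedly exploited."""
    return stdev(durations) / mean(durations)

# Hypothetical mouth-opening durations in seconds, one value per opening
genuine = [0.41, 0.90, 0.33, 1.20, 0.55]  # irregular timing
faked = [0.60, 0.62, 0.58, 0.61, 0.59]    # suspiciously regular timing

print(opening_variability(genuine) > opening_variability(faked))  # True
```

A single scalar like this would be only one input among many in a real machine-vision pipeline, but it captures the study's headline observation: fakers' mouths open with too little variation.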
“Further investigations,” said the researchers, “will explore whether over-regularity is a general feature of fake expressions.”
In addition to detecting pain malingering, the computer-vision system might be used to detect other real-world deceptive actions in the realms of homeland security, psychopathology, job screening, medicine, and law, said Bartlett.
“As with causes of pain, these scenarios also generate strong emotions, along with attempts to minimize, mask, and fake such emotions, which may involve ‘dual control’ of the face,” she said. “In addition, our computer-vision system can be applied to detect states in which the human face may provide important clues as to health, physiology, emotion, or thought, such as drivers’ expressions of sleepiness, students’ expressions of attention and comprehension of lectures, or responses to treatment of affective disorders.”

Filed under pain emotion facial expressions computer-vision system psychology neuroscience science

(Image caption: Illustration of the mirror neuron system in the human brain. Credit: Jan Brascamp)
Brain mapping confirms patients with schizophrenia have impaired ability to imitate
According to George Bernard Shaw, “Imitation is not just the sincerest form of flattery – it’s the sincerest form of learning.” According to psychologists, imitation is something that we all do whenever we learn a new skill, whether it is dancing or how to behave in specific social situations.
Now, the results of a brain-mapping experiment conducted by a team of neuroscientists at Vanderbilt University strengthen the theory that an impaired ability to imitate may underlie the profound and enduring difficulty with social interactions that characterize schizophrenia. In a paper published online on Mar. 14 by the American Journal of Psychiatry, the researchers report that when patients with schizophrenia were asked to imitate simple hand movements, their brains exhibited abnormal brain activity in areas associated with the ability to imitate.
“The fact that patients with schizophrenia show abnormal brain activity when they imitate simple hand gestures is important because action imitation is a primary building block of social abilities,” said first author Katharine Thakkar, who conducted much of the research while completing her doctoral program at Vanderbilt and is now a post-doctoral fellow at the University Medical Center in Utrecht. “The ability to imitate is present early in life and is crucial for learning how to navigate the social world. According to current theory, covert imitation is also the most fundamental way that we understand the intentions and feelings of other people.”
Read more

Filed under brain mapping mirror neuron system schizophrenia brain activity imitation psychology neuroscience science

Contagious Yawning May Not Be Linked to Empathy; Still Largely Unexplained
While previous studies have suggested a connection between contagious yawning and empathy, new research from the Duke Center for Human Genome Variation finds that contagious yawning may decrease with age and is not strongly related to variables like empathy, tiredness and energy levels.
The study, published March 14 in the journal PLOS ONE, is the most comprehensive look at factors influencing contagious yawning to date.
The researchers said a better understanding of the biology involved in contagious yawning could ultimately shed light on illnesses such as schizophrenia or autism.
“The lack of association in our study between contagious yawning and empathy suggests that contagious yawning is not simply a product of one’s capacity for empathy,” said study author Elizabeth Cirulli, Ph.D., assistant professor of medicine at the Center for Human Genome Variation at Duke University School of Medicine.
Contagious yawning is a well-documented phenomenon that occurs only in humans and chimpanzees in response to hearing, seeing or thinking about yawning. It differs from spontaneous yawning, which occurs when someone is bored or tired. Spontaneous yawning is first observed in the womb, while contagious yawning does not begin until early childhood.
Why certain individuals are more susceptible to contagious yawning remains poorly understood. Previous research, including neuroimaging studies, has shown a relationship between contagious yawning and empathy, or the ability to recognize or understand another’s emotions. Other studies have shown correlations between contagious yawning and intelligence or time of day.
Interestingly, people with autism or schizophrenia, both of which involve impaired social skills, demonstrate less contagious yawning despite still yawning spontaneously. A deeper understanding of contagious yawning could lead to insights on these diseases and the general biological functioning of humans.
The current study aimed to better define how certain factors affect someone’s susceptibility to contagious yawning. The researchers recruited 328 healthy volunteers, who completed cognitive testing, a demographic survey, and a comprehensive questionnaire that included measures of empathy, energy levels and sleepiness.
The participants then watched a three-minute video of people yawning, and recorded the number of times they yawned while watching the video.
The researchers found that certain individuals were less susceptible to contagious yawns than others, with participants yawning between zero and 15 times during the video. Of the 328 people studied, 222 contagiously yawned at least once. When verified across multiple testing sessions, the number of yawns was consistent, demonstrating that contagious yawning is a very stable trait.
In contrast to previous studies, the researchers did not find a strong connection between contagious yawning and empathy, intelligence or time of day. The only independent factor that significantly influenced contagious yawning was age: as age increased, participants were less likely to yawn. However, age explained only 8 percent of the variability in the contagious yawn response.
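The "8 percent of the variability" figure corresponds to the R² of a regression of yawn count on age. This is not the study's actual analysis code, just an illustrative sketch of how variance explained by a single predictor is computed:

```python
import numpy as np

def variance_explained(age, yawns):
    # Fit a simple linear regression of yawn count on age,
    # then compute R^2 = 1 - SS_residual / SS_total.
    slope, intercept = np.polyfit(age, yawns, 1)
    predicted = slope * np.asarray(age) + intercept
    residuals = np.asarray(yawns) - predicted
    ss_res = float(np.sum(residuals ** 2))
    ss_tot = float(np.sum((np.asarray(yawns) - np.mean(yawns)) ** 2))
    return 1.0 - ss_res / ss_tot
```

An R² of 0.08 means that, as Cirulli notes, the vast majority of the person-to-person variation is left unexplained by age.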
“Age was the most important predictor of contagious yawning, and even age was not that important. The vast majority of variation in the contagious yawning response was just not explained,” Cirulli said.
Because most variability in contagious yawning remains unexplained, the researchers are now looking to see whether there are genetic influences that contribute to contagious yawning. Their long-term goal in characterizing variability in contagious yawning is to better understand human diseases like schizophrenia and autism, as well as general human functioning, by identifying the genetic basis of this trait.
“It is possible that if we find a genetic variant that makes people less likely to have contagious yawns, we might see that variant or variants of the same gene also associated with schizophrenia or autism,” Cirulli said. “Even if no association with a disease is found, a better understanding of the biology behind contagious yawning can inform us about the pathways involved in these conditions.”

Filed under empathy contagious yawning autism schizophrenia social interaction psychology neuroscience science

184 notes

Nicotine Withdrawal Weakens Brain Connections Tied to Self-Control Over Cigarette Cravings

People who try to quit smoking often say that kicking the habit makes the inner voice telling them to light up even louder, but why people succumb to those cravings so often has never been fully understood. Now, a new brain imaging study in this week’s JAMA Psychiatry from scientists at Penn Medicine and the National Institute on Drug Abuse (NIDA) Intramural Research Program shows that smokers suffering from nicotine withdrawal may have more trouble shifting out of a key brain network, the default mode network, which is active when people are in a so-called “introspective” or “self-referential” state, and into the executive control network, which could help them exert more conscious self-control over cravings and focus on quitting for good.

The findings help validate a neurobiological basis behind why so many people trying to quit end up relapsing—up to 80 percent, depending on the type of treatment—and may lead to new ways to identify smokers at high risk for relapse who need more intensive smoking cessation therapy.  

The brain imaging study was conducted by researchers at the University of Pennsylvania’s new Brain and Behavior Change Program, directed by Caryn Lerman, PhD, who is also the deputy director of Penn’s Abramson Cancer Center, together with Elliot Stein, PhD, and collaborators at NIDA. They found that smokers who abstained from cigarettes showed weakened interconnectivity between certain large-scale networks in their brains: the default mode network, the executive control network, and the salience network. They posit that this weakened connectivity reduces smokers’ ability to shift into or maintain greater influence from the executive control network, which may ultimately help sustain a quit attempt.

“What we believe this means is that smokers who just quit have a more difficult time shifting gears from inward thoughts about how they feel to an outward focus on the tasks at hand,” said Lerman, who also serves as the Mary W. Calkins professor in the Department of Psychiatry. “It’s very important for people who are trying to quit to be able to maintain activity within the control network— to be able to shift from thinking about yourself and your inner state to focus on your more immediate goals and plan.”

Prior studies have looked at the effects of nicotine on brain interconnectivity in the resting state, that is, in the absence of any specific goal directed activity. This is the first study, however, to compare resting brain connectivity in an abstinent state and when people are smoking as usual, and then relate those changes to symptoms of craving and mental performance.

For the study, researchers conducted brain scans on 37 healthy smokers (those who smoke more than 10 cigarettes a day) ages 19 to 61 using functional magnetic resonance imaging (fMRI) in two different sessions: 24 hours after biochemically confirmed abstinence and after smoking as usual.

Imaging showed a significantly weaker connectivity between the salience network and default mode network during abstinence, compared with their sated state. Also, weakened connectivity during abstinence was linked with increases in smoking urges, negative mood, and withdrawal symptoms, suggesting that this weaker internetwork connectivity may make it more difficult for people to quit.
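In resting-state fMRI work like this, "connectivity" between two large-scale networks is commonly quantified as the Pearson correlation between their mean BOLD time series. The following is a generic illustrative sketch of that idea, not the authors' pipeline:

```python
import numpy as np

def network_timeseries(region_series):
    # Average the BOLD signals of a network's member regions
    # into a single representative time series.
    return np.mean(region_series, axis=0)

def internetwork_connectivity(net_a_regions, net_b_regions):
    # Pearson correlation between the two networks' mean time series;
    # values near +1 indicate strong coupling, near 0 weak coupling.
    a = network_timeseries(net_a_regions)
    b = network_timeseries(net_b_regions)
    return float(np.corrcoef(a, b)[0, 1])
```

Under this kind of measure, "weaker connectivity during abstinence" means the correlation between, say, salience-network and default-mode signals drops in the abstinent scan relative to the sated scan.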

Establishing the strength of the connectivity between these large-scale brain networks will be important in predicting people’s ability to quit and stay quit, the authors write. Also, such connectivity could serve as a clinical biomarker to identify smokers who are most likely to respond to a particular treatment.

“Symptoms of withdrawal are related to changes in smokers’ brains, as they adjust to being off of nicotine, and this study validates those experiences as having a biological basis,” said Lerman. “The next step will be to identify in advance those smokers who will have more difficulty quitting and target more intensive treatments, based on brain activity and network connectivity.”

(Source: uphs.upenn.edu)

Filed under default mode network smoking nicotine neuroimaging psychology neuroscience science

496 notes

Gesturing with hands is a powerful tool for children’s math learning
Children who use their hands to gesture during a math lesson gain a deep understanding of the problems they are taught, according to new research from the University of Chicago’s Department of Psychology.
Previous research has found that gestures can help children learn. This study in particular was designed to answer whether abstract gesture can support generalization beyond a particular problem and whether abstract gesture is a more effective teaching tool than concrete action.
“We found that acting gave children a relatively shallow understanding of a novel math concept, whereas gesturing led to deeper and more flexible learning,” explained the study’s lead author, Miriam A. Novack, a PhD student in psychology.
The study, “From action to abstraction: Using the hands to learn math,” is published online by Psychological Science.
The researchers taught third-grade children a strategy for solving one type of mathematical equivalence problem, for example, 4 + 2 + 6 = ____ + 6. They then tested the students on similar mathematical equivalence problems to determine how well they understood the underlying principle.
The researchers randomly assigned 90 children to conditions in which they learned using different kinds of physical interaction with the material. In one group, children picked up magnetic number tiles and put them in the proper place in the formula. For example, for the problem 4 + 2 + 6 = ___ + 6, they picked up the 4 and 2 and placed them on a magnetic whiteboard. Another group mimed that action without actually touching the tiles, and a third group was taught to use abstract gestures with their hands to solve the equations. In the abstract gesture group, children were taught to produce a V-point gesture with their fingers under two of the numbers, metaphorically grouping them, followed by pointing a finger at the blank in the equation.
The children were tested before and after solving each problem in the lesson, including problems that required children to generalize beyond what they had learned in grouping the numbers. For example, they were given problems that were similar to the original one, but had different numbers on both sides of the equation.
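The grouping strategy the children were taught has a simple arithmetic core: the blank equals the left-hand total minus the addend that repeats on the right. A small illustrative sketch (not part of the study):

```python
def solve_equivalence(left_addends, right_known):
    # Grouping strategy: sum the left side, then subtract the
    # addend already shown on the right to find the blank.
    return sum(left_addends) - right_known

# Taught problem:          4 + 2 + 6 = __ + 6  ->  blank is 6
# Generalization problem:  3 + 5 + 7 = __ + 7  ->  blank is 8
```

The generalization tests probed whether children had internalized this principle, rather than a rote procedure tied to the specific numbers of the lesson.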
Children in all three groups learned the problems they had been taught during the lesson. But only children who gestured during the lesson were successful on the generalization problems.
“Abstract gesture was most effective in encouraging learners to generalize the knowledge they had gained during instruction, action least effective, and concrete gesture somewhere in between,” said senior author Susan Goldin-Meadow, the Beardsley Ruml Distinguished Service Professor in Psychology. “Our findings provide the first evidence that gesture not only supports learning a task at hand but, more importantly, leads to generalization beyond the task. Children appear to learn underlying principles from their actions only insofar as those actions can be interpreted symbolically.”

Filed under mathematics learning psychology neuroscience science