Neuroscience

Articles and news from the latest research reports.

Posts tagged morality

The Moral Brain
Consider a failed murder attempt. Or a simple mistake that causes another to die. Is one of these more acceptable than the other?
Neuroscientists don’t pretend to hold the answers as to how people know what is right and what is wrong. But studies show individual biology may influence the ways people process the actions of others.
It turns out we judge others not only for what they do, but also for what we perceive they are thinking while they do it.
Consider the following scenario: Grace and Sally are touring a chemical factory when Grace decides to grab a cup of coffee. Sally asks Grace to pour her a cup as well. Grace spots a container of white powder next to the coffee maker and, knowing that her friend takes sugar in her coffee, she pours some into Sally’s cup. As it turns out, the powder is poison, and Sally dies after a few sips.
Most of us would understand and maybe forgive Grace for accidentally poisoning — or even killing — her friend. But what would you think of Grace if you were to learn that she had a hunch that the powder was toxic, yet decided to add it to her friend’s cup anyway?
“Often, what determines moral blame is not what the outcome is, but what [we think] is going on in the mind of the person performing the act,” says Rebecca Saxe, a neuroscientist at the Massachusetts Institute of Technology who studies how the brain casts judgment.
Scientists are learning how the brain responds when we attempt to determine right from wrong. Ultimately, they hope such information will help show how the brain processes difficult situations.
What was she thinking?
One way scientists study how we make right-or-wrong judgments is to look at brain regions that are most active when people attempt to interpret the thoughts of others.
In some studies, participants read stories about characters who either accidentally or intentionally cause harm to others while scientists use functional magnetic resonance imaging (fMRI) to track how brain activity changes. Such studies show that thinking about another’s thoughts increases the activity of nerve cells in a brain region known as the right temporo-parietal junction, located just behind the right ear.
As it turns out, some of these cells respond differently when presented with an intentional harm versus an accident. By zeroing in on the distinct patterns of activity in these cells, Saxe’s group discovered that they could accurately predict how forgiving the participants would be.
“People who say accidents are forgivable have really different [activity] patterns” than those less willing to overlook the unintentional harm, Saxe says.
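The kind of pattern-based prediction Saxe’s group describes can be illustrated with a toy nearest-centroid classifier. Everything below (the activity vectors, the group labels, the numbers) is made up for illustration; it is a drastically simplified stand-in for real multivoxel pattern analysis, not the lab’s actual method:

```python
# Toy nearest-centroid classification, in the spirit of multivoxel
# pattern analysis: each participant's response is a vector of voxel
# activations, and we ask which group's average pattern a new,
# unseen response most resembles. All numbers are invented.

def centroid(vectors):
    """Average pattern (component-wise mean) of a group of vectors."""
    return [sum(col) / len(vectors) for col in zip(*vectors)]

def distance(a, b):
    """Euclidean distance between two activity patterns."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# Hypothetical activity patterns from two groups of participants:
forgiving  = [[0.9, 0.2, 0.4], [1.0, 0.1, 0.5], [0.8, 0.3, 0.4]]
condemning = [[0.2, 0.9, 0.7], [0.3, 1.0, 0.6], [0.1, 0.8, 0.8]]
centroids = {"forgiving": centroid(forgiving),
             "condemning": centroid(condemning)}

new_pattern = [0.85, 0.25, 0.45]   # an unseen participant's activity
prediction = min(centroids, key=lambda g: distance(new_pattern, centroids[g]))
print(prediction)   # → forgiving
```

The new pattern sits far closer to the “forgiving” group’s average than to the “condemning” group’s, so the classifier predicts a forgiving judgment, which is the basic logic behind predicting behavior from distinct activity patterns.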
Thinking about harm
Neuroscientists also study how people respond when asked how they themselves would act in morally challenging scenarios.
In one popular moral dilemma scenario, scientists ask participants to imagine the following: A runaway train is barreling down on five people. The only way to save these people is to hit a switch that would redirect the train onto tracks where it will kill only one person. Would you hit the switch?
What if, instead, you had to push a man off of a bridge to stop the train, knowing that doing so will kill him but save the lives of the others?
In several studies, scientists posed these scenarios to people with damage to the ventromedial prefrontal cortex, a region believed to be involved in processing emotions, and to people without such damage. Both groups equally support the decision to hit the switch and redirect the train to save more lives.
However, those with damage to the ventromedial prefrontal cortex are much more likely to endorse pushing the man in front of the train, a more direct and personal harm. These studies, led by neuroscientist Antonio Damasio of the University of Southern California, suggest the important role of emotion in the generation of such judgments.
To test how important the ventromedial prefrontal cortex is when we judge the actions of others, Damasio, along with neuroscientist Liane Young of Boston College, asked a small group of people with damage to this region to evaluate variations of the Grace and Sally story.
When told that Grace deliberately puts powder she believes is toxic into Sally’s cup, only to later learn the powder was sugar, healthy adults regularly condemn Grace’s failed attempt to harm her friend. However, people with ventromedial prefrontal cortex damage shrug off Grace’s action. As they see it, as long as Sally survives, Grace’s actions are no big deal.
Damasio says these results, along with others, reveal the role of the ventromedial prefrontal cortex and emotion in evaluating harmful intent.
That’s not fair
There is also evidence that changes in the chemistry of the brain influence how we behave when others treat us unfairly.
To measure how changes in brain chemistry affect people’s reactions to unfairness, University College London neuroscientist Molly Crockett and others gave study participants a drink to drive down levels of the neurotransmitter serotonin in the brain before asking them to play the ultimatum game.
In the ultimatum game, participants are paired with strangers they are told have been given a lump sum of money to share with them. The stranger determines how to divvy up the money, and proposes a split to the participant. The participant decides whether or not to accept the stranger’s offer. If the participant accepts, both players walk away with some money. However, a participant may reject the offer, believing it to be unfair, leaving both players empty-handed. Crockett found that people with lower levels of serotonin were more likely than others to reject offers they deemed to be unfair.
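The mechanics of the ultimatum game can be sketched in a few lines of Python. The dollar amounts, the responder’s fairness threshold, and the payoff rule below are illustrative assumptions for the sketch, not parameters from Crockett’s study:

```python
def play_ultimatum(total, offer, rejection_threshold):
    """One round of the ultimatum game.

    The proposer offers `offer` out of `total` to the responder.
    If the offer meets the responder's fairness threshold (a fraction
    of the total), both players are paid; otherwise the responder
    rejects the split and both walk away with nothing.
    """
    if offer >= rejection_threshold * total:
        return total - offer, offer   # (proposer payoff, responder payoff)
    return 0, 0                       # rejection: both players get nothing

# A tolerant responder accepts a lowball 20% offer...
print(play_ultimatum(10, 2, rejection_threshold=0.2))   # → (8, 2)
# ...while a stricter responder (as Crockett's serotonin-depleted
# participants tended to be) rejects the very same offer.
print(play_ultimatum(10, 2, rejection_threshold=0.3))   # → (0, 0)
```

The point of the game is visible in the payoff rule: rejecting an unfair offer is costly to the responder too, so rejection measures a willingness to pay in order to punish unfairness.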
When the scientists examined the brain activity of participants with depleted serotonin levels as they accepted or rejected the offers, they found that rejecting offers led to increased activity in the dorsal striatum — a region involved in processing reward. Crockett says the findings suggest that dips in serotonin can shift people’s motivations to punish unfairness. For instance, when you deplete serotonin, people who are normally more forgiving may become happier with revenge, she says.
Crockett notes that serotonin levels may fluctuate when people are hungry or stressed. The findings illustrate how individual differences in biology might influence the way people view, and respond to, the actions of others.

Filed under brain morality moral judgment intentions fairness psychology neuroscience science

Can Meditation Make You a More Compassionate Person?
Scientists have mostly focused on the benefits of meditation for the brain and the body, but a recent study by Northeastern University’s David DeSteno, published in Psychological Science, examines what impact meditation has on interpersonal harmony and compassion.
Several religious traditions have suggested that meditation does just that, but until now there has been little scientific evidence.
In this study, a team of researchers from Northeastern University and Harvard University examined the effects meditation would have on compassion and virtuous behavior, and the results were fascinating.
THE STUDY
This study—funded by the Mind and Life Institute—invited participants to complete eight-week trainings in two types of meditation. After the sessions, they were put to the test.
Two actors sat in a staged waiting room with three chairs. With one empty chair left, the participant sat down and waited to be called. Another actor, using crutches and appearing to be in great physical pain, would then enter the room. As she did, the actors in the chairs would ignore her, fiddling with their phones or opening a book.
The question DeSteno and Paul Condon – a graduate student in DeSteno’s lab who led the study – and their team wanted to answer was whether the subjects who took part in the meditation classes would be more likely to come to the aid of the person in pain, even in the face of everyone else ignoring her. “We know meditation improves a person’s own physical and psychological wellbeing,” said Condon. “We wanted to know whether it actually increases compassionate behavior.”
MEDITATION WORKS
Among the non-meditating participants, only about 15 percent acted to help. But among the participants who had taken the meditation sessions, “we were able to boost that up to 50 percent,” said DeSteno. This result held for both meditation groups, showing the effect to be consistent across different forms of meditation. “The truly surprising aspect of this finding is that meditation made people willing to act virtuous – to help another who was suffering – even in the face of a norm not to do so,” DeSteno said. “The fact that the other actors were ignoring the pain creates a ‘bystander effect’ that normally tends to reduce helping. People often wonder, ‘Why should I help someone if no one else is?’”
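A jump from roughly 15 percent to 50 percent helping can be checked for statistical reliability with a two-proportion z-test. The article reports only percentages, so the group sizes of 20 below are hypothetical, and this is a textbook test sketch rather than the analysis the authors actually ran:

```python
from math import sqrt, erf

def two_proportion_z(helped_a, n_a, helped_b, n_b):
    """One-sided two-proportion z-test: is helping rate B above rate A?"""
    p_a, p_b = helped_a / n_a, helped_b / n_b
    pooled = (helped_a + helped_b) / (n_a + n_b)      # pooled helping rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_one_sided = 1 - 0.5 * (1 + erf(z / sqrt(2)))    # upper-tail p-value
    return z, p_one_sided

# Hypothetical groups of 20: 3/20 helpers (15%) without meditation
# versus 10/20 helpers (50%) with meditation.
z, p = two_proportion_z(helped_a=3, n_a=20, helped_b=10, n_b=20)
print(f"z = {z:.2f}, one-sided p = {p:.4f}")
```

Even with these modest assumed sample sizes, the difference clears the conventional 0.05 significance threshold, which gives a sense of how large a 15-to-50-percent effect is.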
These results appear to support what Buddhist theologians have long believed: that meditation leads to greater compassion and love for all sentient beings. And even for non-Buddhists, the findings offer scientific evidence that meditation techniques can alter the calculus of the moral mind.

Filed under meditation compassion compassionate behavior morality psychology neuroscience science

Are Babies Born Good?
Arber Tasimi is a 23-year-old researcher at Yale University’s Infant Cognition Center, where he studies the moral inclinations of babies: how the littlest children understand right and wrong before language and culture exert their deep influence. “What are we at our core, before anything, before everything?” he asks. His experiments draw on the work of Jean Piaget, Noam Chomsky, his own undergraduate thesis at the University of Pennsylvania, and what happened to him in New Haven, Connecticut, one Friday night last February.
It was about 9:45 p.m., and Tasimi and a friend were strolling home from dinner at Buffalo Wild Wings. Just a few hundred feet from his apartment building, he passed a group of young men in jeans and hoodies. Tasimi barely noticed them, until one landed a punch to the back of his head.
There was no time to run. The teenagers, ignoring his friend, wordlessly surrounded Tasimi, who had crumpled to the brick sidewalk. “It was seven guys versus one aspiring PhD,” he remembers. “I started counting punches, one, two, three, four, five, six, seven. Somewhere along the way, a knife came out.” The blade slashed through his winter coat, just missing his skin.
At last the attackers ran, leaving Tasimi prone and weeping on the sidewalk, his left arm broken. Police later said he was likely the random victim of a gang initiation.
After surgeons inserted a metal rod in his arm, Tasimi moved back home with his parents in Waterbury, Connecticut, about 35 minutes from New Haven, and became a creature much like the babies whose social lives he studies. He couldn’t shower on his own. His mom washed him and tied his shoes. His sister cut his meat.
Spring came. One beautiful afternoon, the temperature soared into the 70s and Tasimi, whose purple and yellow bruises were still healing, worked up the courage to stroll outside by himself for the first time. He went for a walk on a nearby jogging trail. He tried not to notice the two teenagers who seemed to be following him. “Stop catastrophizing,” he told himself again and again, up until the moment the boys demanded his headphones.
The mugging wasn’t violent but it broke his spirit. Now the whole world seemed menacing. When he at last resumed his morality studies at the Infant Cognition Center, he parked his car on the street, feeding the meter every few hours rather than risking a shadowy parking garage.
“I’ve never been this low in life,” he told me when we first met at the baby lab a few weeks after the second crime. “You can’t help wonder: Are we a failed species?”
At times, he said, “only my research gives me hope.”

Filed under evolution infant morality cognition morality psychology neuroscience science

New Studies Show Moral Judgments Quicker, More Extreme than Practical Ones—But Also Flexible
Judgments we make with a moral underpinning are made more quickly and are more extreme than those same judgments based on practical considerations, a new set of studies finds. However, the findings, which appear in the journal PLOS ONE, also show that judgments based on morality can be readily shifted and made with other considerations in mind.
“Little work has been done on how attaching morality to a particular judgment or decision may affect that outcome,” explains Jay Van Bavel, an assistant professor in New York University’s Department of Psychology and one of the study’s co-authors. “Our findings show that we make and see decisions quite differently if they are made with a morality frame. But, despite these differences, there is now evidence that we can shift judgments so they are based on practical, rather than moral, considerations—and vice versa.”
“Our findings suggest that deciding to frame any issue as moral or not may have important consequences,” said co-author Ingrid Haas, an assistant professor of political science at the University of Nebraska-Lincoln. “Once an issue is declared moral, people’s judgments about that issue become more extreme, and they are more likely to apply those judgments to others.”
“Ultimately, the way that people make decisions is likely to affect their behavior,” said co-author Dominic Packer, an assistant professor at Lehigh University. “People may act in ways that violate their moral values when they make decisions in terms of pragmatic concerns (dollars and cents) rather than in a moral frame. In ongoing research, we are examining factors that can trigger moral forms of decision making, so that people are more likely to behave in line with their values.”

Filed under moral judgments moral reasoning morality decision-making neuroscience psychology science

The brain of OCD sufferers is more active when faced with a moral dilemma
Patients with obsessive-compulsive disorder are characterised by persistent thoughts and repetitive behaviours. A new study reveals that sufferers worry considerably more than the general population when faced with moral problems.
With the help of experts from Barcelona’s Hospital del Mar and the University of Melbourne in Australia, researchers at the Hospital de Bellvitge in Barcelona have shown that patients with obsessive-compulsive disorder (OCD) are more morally sensitive.
"Faced with a problem of this type, people suffering from this type of anxiety disorder show that they worry considerably more," as Carles Soriano, a researcher at the Catalan hospital and one of the lead authors of the work published in the journal Archives of General Psychiatry, explained to SINC.

Filed under brain OCD anxiety morality neuroscience psychology science
