Neuroscience

Articles and news from the latest research reports.

Laughter perception networks in brain different for mocking, joyful or ticklish laughter
A laugh may signal mockery, humor, joy or simply be a response to tickling, but each kind of laughter conveys a wealth of auditory and social information. These different kinds of laughter also spark different connections within the “laughter perception network” in the human brain depending on their context, according to research published May 8 in the open access journal PLOS ONE by Dirk Wildgruber and colleagues from the University of Tuebingen, Germany.
Laughter in animals is a form of social bonding based on a primordial reflex to tickling, but human laughter has come a long way from these playful roots. Though many people laugh when they’re tickled, ‘social laughter’ in humans can be used to communicate happiness, taunts or other conscious messages to peers. Here, researchers studied participants’ neural responses as they listened to three kinds of laughter: joy, taunt and tickling.
"Laughing at someone and laughing with someone leads to different social consequences," says Wildgruber. "Specific cerebral connectivity patterns during perception of these different types of laughter presumably reflect modulation of attentional mechanisms and processing resources.
The researchers found that brain regions sensitive to processing more complex social information were activated when people heard joyous or taunting laughter, but not when they heard the ‘tickling laughter’. However, ‘tickling laughter’ is more complex than the other types at the acoustic level, and consequently activated brain regions sensitive to this higher degree of acoustic complexity. These dynamic changes activated and connected different regions depending on the kind of laughter participants heard. Patterns of brain connectivity can impact cognitive function in health and disease. Though some previous research has examined how speech can influence these patterns, this study is among the first few to examine non-verbal vocal cues like laughter.
(Image: Bigstock)

Filed under brain laughter neural response cognitive functioning psychology neuroscience science

Food commercials excite teen brains
Watching TV commercials of people munching on hot, crispy French fries or sugar-laden cereal resonates more with teens than advertisements about cell phone plans or the latest car.
A new University of Michigan study found that regardless of body weight, teens had high brain activity during food commercials compared to nonfood commercials.
"It appears that food advertising is better at getting into the mind and memory of kids," said Ashley Gearhardt, U-M assistant professor of psychology and the study’s lead author. "This makes sense because our brains are hard-wired to get excited in response to delicious foods."
Children see thousands of commercials each year designed to increase their desire for foods high in sugar, fat and salt. Researchers from U-M, the Oregon Research Institute and Yale University analyzed how the advertising onslaught affects the brain.
Thirty teenagers (ages 14-17) ranging from normal weight to obese watched a television show with commercial breaks. Their brain activity was measured with a functional magnetic resonance imaging scanner.
The video showed 20 food commercials and 20 nonfood commercials featuring major brands such as McDonald’s, Cheerios, AT&T and Allstate Insurance. Study participants were asked to list five commercials they saw and to rate how much they liked the product or company featured in the ads.
Regions of the brain linked to attention, reward and taste were active for all participants, especially when food commercials aired. Overall, they recalled and liked food commercials better than nonfood commercials.
Teens whose weight was considered normal had greater reward-related brain activity when viewing the food commercials compared to obese adolescents. Gearhardt said this suggests that all teenagers, even those who are not currently overweight, are affected by food advertising and that exposure could lead to future weight gain in normal weight youth.
The study concluded that obese participants may attempt to control their response to food commercials, which might alter the way their brain responds. But if these teens are bombarded with frequent food cues, their self-control might falter—especially if they feel stressed, hungry or depressed.
Gearhardt said brain regions that are more responsive in lean adolescents during food commercials have been linked with future weight gain. These findings, which appear in the current issue of Social Cognitive and Affective Neuroscience, may inform the current debates about the impact of food advertising on minors.

Filed under food commercials brain activity teenagers adolescents fMRI neuroscience psychology science

Women’s, men’s brains respond differently to hungry infant’s cries
Researchers at the National Institutes of Health have uncovered firm evidence for what many mothers have long suspected: women’s brains appear to be hard-wired to respond to the cries of a hungry infant.
Researchers asked men and women to let their minds wander, then played a recording of white noise interspersed with the sounds of an infant crying. Brain scans showed that, in the women, patterns of brain activity abruptly switched to an attentive mode when they heard the infant cries, whereas the men’s brains remained in the resting state.
“Previous studies have shown that, on an emotional level, men and women respond differently to the sound of an infant crying,” said study co-author Marc H. Bornstein, Ph.D., head of the Child and Family Research Section of the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD), the institute that conducted the study. “Our findings indicate that men and women show marked differences in terms of attention as well.”
The earlier studies showed that women are more likely than men to feel sympathy when they hear an infant cry, and are more likely to want to care for the infant.
Dr. Bornstein collaborated with Nicola De Pisapia, Ph.D., Paola Rigo, Simona DeFalco, Ph.D., and Paola Venuti, Ph.D., all of the Observation, Diagnosis and Education Lab at the University of Trento, Italy, and Gianluca Esposito, Ph.D., of RIKEN Brain Science Institute, Japan.
Their findings appear in NeuroReport.
Previous studies have shown differences in patterns of brain activity between when an individual’s attention is focused and when the mind wanders. The pattern of unfocused activity is referred to as default mode, Dr. Bornstein explained. When individuals focus on something in particular, their brains disengage from the default mode and activate other brain networks.
For about 15 minutes, participants listened to white noise interspersed with short periods of silence and with the sounds of a hungry infant crying. The patterns of their brain activity were recorded by a technique known as functional magnetic resonance imaging.
The researchers analyzed brain images from 18 adults, parents and nonparents. The researchers found that when participants listened to the typical infant cries, the brain activity of men and women differed. When hearing a hungry infant cry, women’s brains were more likely to disengage from the default mode, indicating that they focused their attention on the crying. In contrast, the men’s brains tended to remain in default mode during the infant crying sounds. The brain patterns did not vary between parents and nonparents.
Infants cry because they are distressed, hungry, or in need of physical closeness. To determine if adults respond differently to different types of cries, the researchers also played the cries of infants who were later diagnosed with autism. An earlier study by Dr. Bornstein and the same Italian group found that the cries of infants who develop autism spectrum disorder (ASD) tend to be higher pitched than those of other infants and that the pauses between cries are shorter. When these atypical cries were played in the present study, both men and women tended to interrupt their mind wandering.
“Adults have many-layered responses to the things infants do,” said Dr. Bornstein. “Determining whether these responses differ between men and women, by age, and by parental status, helps us understand instincts for caring for the very young.”
In an earlier study, Dr. Bornstein and his colleagues found that patterns of brain activity in men and women also changed when they viewed an image of an infant face and that the patterns were indicative of a predisposition to relate to and care for the infant.
Such studies documenting the brain activity patterns of adults represent the first stages of neuroscience research into how adults relate to and care for infants, Dr. Bornstein explained. It is possible that not all adults exhibit the brain patterns seen in these studies.

Filed under brain scans brain activity infant cries infants women fMRI psychology neuroscience science

Nerve stimulation for severe depression changes brain function 
For nearly a decade, doctors have used implanted electronic stimulators to treat severe depression in people who don’t respond to standard antidepressant therapy.
Now, preliminary brain scan studies conducted by researchers at Washington University School of Medicine in St. Louis are beginning to reveal the processes occurring in the brain during stimulation and may provide some clues about how the device improves depression. They found that vagus nerve stimulation brings about changes in brain metabolism weeks or even months before patients begin to feel better.
The findings will appear in an upcoming issue of the journal Brain Stimulation and are now available online.
“Previous studies involving large numbers of people have demonstrated that many with treatment-resistant depression improve with vagus nerve stimulation,” said first author Charles R. Conway, MD, associate professor of psychiatry. “But little is known about how this stimulation works to relieve depression. We focused on specific brain regions known to be connected to depression.”
Conway’s team followed 13 people with treatment-resistant depression. Their symptoms had not improved after many months of treatment with as many as five different antidepressant medications. Most had been depressed for at least two years, but some patients had been clinically depressed for more than 20 years.
All of the participants had surgery to insert a device to electronically stimulate the left vagus nerve, which runs down the side of the body from the brainstem to the abdomen. Once activated, the device delivers a 30-second electronic stimulus to the vagus nerve every five minutes.
To establish the nature of the treatment’s effects on brain activity, the researchers performed positron emission tomography (PET) brain imaging before the initiation of stimulation, and again three and 12 months after stimulation had begun.
Eventually, nine of the 13 subjects experienced improvements in depression with the treatment. However, in most cases it took several months for improvement to occur. Remarkably, in those who responded, the scans showed significant changes in brain metabolism following three months of stimulation, which typically preceded improvements in symptoms of depression by several months.
“We saw very large changes in brain metabolism occurring far in advance of any improvement in mood,” Conway said. “It’s almost as if there’s an adaptive process that occurs. First, the brain begins to function differently. Then, the patient’s mood begins to improve.”
Although the patients remained on antidepressants for several months after their stimulators were implanted, Conway says many of those who responded to the device eventually were able to stop taking medication.
“Sometimes the antidepressant drugs work in concert with the stimulator, but it appears to us that when people get better, it is the vagus nerve stimulator that is doing the heavy lifting,” Conway explained. “Stimulation seems to be responsible for most of the improvement we see.”
Additionally, the PET scans demonstrated that structures deeper in the brain also begin to change several months after nerve stimulation begins. Many of those structures have high concentrations of brain cells that release dopamine, a neurotransmitter that helps control the brain’s reward and pleasure centers and also helps regulate emotional responses.
There is a consensus forming among depression researchers that problems in dopamine pathways may be particularly important in treatment-resistant depression, according to Conway. And he said the finding that vagus nerve stimulators influence those pathways may explain why the therapy can help and why, when it works, its effects are not transient. Patients who respond to vagus nerve stimulation tend to get better and stay better.
“We hypothesized that something significant had to be occurring in the brain, and our research seems to back that up,” he said.

Filed under nerve stimulation depression brain activity brain metabolism psychology neuroscience science

Children’s brain processing speed indicates risk of psychosis

New research from Bristol and Cardiff universities shows that children whose brains process information more slowly than their peers are at greater risk of psychotic experiences.

These can include hearing voices, seeing things that are not present or holding unrealistic beliefs that other people don’t share. These experiences can often be distressing and frightening and interfere with their everyday life.

Children with psychotic experiences are more likely to develop psychotic illnesses like schizophrenia later in life.

Using data gathered from 6,784 participants in Children of the 90s, researchers from the MRC Centre for Neuropsychiatric Genetics and Genomics in Cardiff University and the School of Social and Community Medicine in the University of Bristol examined whether performance in a number of cognitive tests conducted at ages 8, 10 and 11 was related to the risk of having psychotic experiences at age 12.

The tests assessed how quickly the children could process information, as well as their attention, memory, reasoning, and ability to solve problems.

Among those interviewed, 787 (11.6 per cent) had suspected or definite psychotic experiences at age 12. Children who scored less well in the various tests at ages 8, 10 and 11 were more likely to have psychotic experiences at age 12.

This was particularly the case for the test that assessed how quickly the children processed information. Furthermore, children whose speed of processing information became slower between ages 8 and 11 had greater risk of having psychotic experiences at age 12.

These findings did not change when other factors, including the parents’ psychiatric history and the children’s own developmental delay, were taken into account. The study’s findings could have important implications for identifying children at risk of psychosis, who could then benefit from early treatment.

Speaking about the findings, lead author and PhD student, Miss Maria Niarchou from Cardiff University’s School of Medicine, said:

‘Previous research has shown a link between the slowing down of information processing and schizophrenia and this was found to be at least in part the result of anti-psychotic medication.

‘However, this study shows that impaired information processing speed can already be present in childhood and associated with higher risk of psychotic experiences, irrespective of medication.

‘Our findings improve our understanding of the brain processes that are associated with high risk of psychotic experiences in childhood and in turn high risk of psychotic disorder later in life.’

Senior author, Dr Marianne van den Bree of Cardiff University’s School of Medicine, said:

‘Schizophrenia is a complex and relatively rare mental health condition, occurring at a rate of 1 per cent in the general population. Not every child with impaired information processing speed is at risk of psychosis later in life. Further research is needed to determine whether interventions to improve processing speed in at-risk children can lead to decreased transition to psychotic disorders.’

Ruth Coombs, Manager for Influence and Change at Mind Cymru, said:

‘This is a very interesting piece of research, which could help young people at risk of developing mental health problems in later life build resilience and benefit from early intervention. It is important to remember that people can and do recover from mental health problems and we also welcome further research which supports resilience building in young people.’

(Source: bristol.ac.uk)

Filed under brain psychotic experiences schizophrenia children child development psychology neuroscience science

The woman who can’t recognise her face

"I’ve been in a crowded elevator with mirrors all around, and a woman will move and I’ll go to get out the way and then realise: ‘oh that woman is me’."

Heather Sellers has prosopagnosia, more commonly known as face blindness. “I can’t remember any image of the human face. It’s simply not special to me,” she says. “I don’t process them like I do a car or a dog. It’s not a visual problem, it’s a perception problem.”

Heather knew from a young age that something was different about the way she navigated her world, but her condition wasn’t diagnosed until she was in her 30s. “I always knew something was wrong – it was impossible for me to trust my perceptions of the world. I was diagnosed as anxious. My parents thought I was crazy.”

The condition is estimated to affect around 2.5 per cent of the population, and it’s common for those who have it not to realise that anything is wrong. “In many ways it’s a subtle disorder,” says Heather. “It’s easy for your brain to compensate because there are so many other things you can use to identify a person: hair colour, gait or certain clothes. But meet that person out of context and it’s socially devastating.”

As a child, she was once separated from her mum at a grocery store. Store staff reunited the pair, but it was confusing for Heather, since she didn’t initially recognise her mother. “But I didn’t know that I wasn’t recognising her.”

Chaos explained

Heather was 36 when she stumbled across the phrase face blindness in a psychology textbook. “When I saw those two words I knew instantly that was exactly what I had – that explained all the chaos.”

She found her way to Harvard neuroscientist Brad Duchaine who diagnosed her as having one of the three worst cases of the disorder that he had ever seen.

So what’s it like to not recognise anyone you know? Heather says the biggest difficulty with the disorder is recognising the people she is close to – the people it is most important to recognise. In the school where she teaches English, she is fine, because she recognises people by their clothes or hair and asks her students to wear name badges.

But it can be harder in social settings. Once she went up to the wrong person at a party and put her arm around him thinking he was her partner. And at college men would phone her angry that she had walked straight past them after they had had a date. “At the time I was thinking ‘I didn’t see you, why is everyone making my life so difficult?’”

It’s not just other people Heather doesn’t recognise – she can’t identify her own face either. “A few times I have been in a crowded elevator with mirrors all around and a woman will move, and I will go to get out the way and then realise ‘oh that woman is me’.” She also finds it unsettling to see photos and not recognise herself in them.

Face processing

To try and understand the condition, Duchaine and his colleagues recorded brain activity while 12 people with prosopagnosia looked at famous and non-famous faces. The team found that part of the brain responsible for stored visual memory was activated in six people when they saw the famous faces.

But another component of brain activity thought to represent a later stage of face processing wasn’t triggered. “Some part of their brain was recognising the face,” says Duchaine, but the brain was failing to pass this information into higher-level consciousness (Brain).

"There may be training where we give people feedback and say ‘look you recognise that face even though you’re not aware of it’," says Duchaine.

Now Zaira Cattaneo at the University of Milano-Bicocca in Italy and colleagues have identified the specific brain areas that allow us to recognise our friends. The team used transcranial magnetic stimulation to block two vital aspects of face processing in people without prosopagnosia. Targeting the left prefrontal cortex blocked the ability to distinguish individual features like the nose and eyes, and blocking the right prefrontal cortex impaired the ability to distinguish the location of those features from one another (NeuroImage).

"We made performance worse," says Cattaneo. "We want to make it better." Now the team are trying to activate these areas of the brain. "The aim is to enhance face recognition abilities by directly modulating excitability in the prefrontal cortices," says Cattaneo.

Would Heather want a cure, should one be found? “I can’t imagine what you see when you see a face, and it’s scary,” she says. “I go back and forth on what I’d do. I’ve done so much work in figuring out how to chart my world, I’d need to do a whole new rewrite. But it would be fascinating.”

Filed under prosopagnosia face blindness visual perception visual memory psychology neuroscience science

71 notes

A little brain training goes a long way
People who use a ‘brain-workout’ program for just 10 hours have a mental edge over their peers even a year later, researchers report today in PLoS ONE.
The search for a regimen of mental callisthenics to stave off age-related cognitive decline is a booming area of research — and a multimillion-dollar business. But critics argue that even though such computer programs can improve performance on specific mental tasks, there is scant proof that they have broader cognitive benefits.
For the study, adults aged 50 and older played a computer game designed to boost the speed at which players process visual stimuli. Processing speed is thought to be “the first domino that falls in cognitive decline”, says Fredric Wolinsky, a public-health researcher at the University of Iowa in Iowa City, who led the research.
The game was developed by academic researchers but is now sold under the name Double Decision by Posit Science, based in San Francisco, California. (Posit did not fund the study.) Players are timed on how fast they click on an image in the centre of the screen and on others that appear around the periphery. The program ratchets up the difficulty as a player’s performance improves.
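The difficulty ratchet described above is usually implemented in psychophysics as an adaptive staircase. Below is a minimal sketch of a generic 3-down/1-up staircase in Python – an illustration of the general technique, not Posit Science’s actual algorithm (the class name, step sizes and durations are invented for the example):

```python
# A generic 3-down/1-up adaptive staircase: a common scheme for
# keeping a task pitched just above a player's current ability.
# Illustrative sketch only -- NOT Double Decision's real algorithm.

class Staircase:
    def __init__(self, start_ms=500.0, step_ms=25.0, floor_ms=17.0):
        self.duration_ms = start_ms   # how long the stimuli are shown
        self.step_ms = step_ms        # size of each difficulty adjustment
        self.floor_ms = floor_ms      # shortest displayable duration
        self.correct_streak = 0

    def update(self, correct: bool) -> float:
        """Shorten the display after 3 correct answers in a row;
        lengthen it after any error. Returns the next duration."""
        if correct:
            self.correct_streak += 1
            if self.correct_streak == 3:
                self.correct_streak = 0
                self.duration_ms = max(self.floor_ms,
                                       self.duration_ms - self.step_ms)
        else:
            self.correct_streak = 0
            self.duration_ms += self.step_ms
        return self.duration_ms
```

A 3-down/1-up rule converges on the difficulty level where the player succeeds roughly 79% of the time, keeping the task challenging without being discouraging.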
Participants played the training game for 10 hours on site, some with an extra 4-hour ‘booster’ session later, or for 10 hours at home. A control group worked on computerized crossword puzzles for 10 hours on site. Researchers measured the mental agility of all 621 subjects before the brain training began, and again one year later, using eight well-established tests of cognitive performance.
The control group’s scores did not increase over the course of that year, but all the brain-training groups significantly improved their scores on the Useful Field of View test — which requires a subject to identify items in a scene with just a quick glance — and four others. When they compared the study participants’ scores with those expected for people their age, the researchers found improvements that translated to 3–4.1 years of protection against age-related decline on the field-of-view test and 1.5–6.6 years on the other tasks.
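The “years of protection” figures come from comparing score changes against typical age-related decline. As a rough sketch of that conversion – assuming a simple linear decline model, with made-up numbers rather than the study’s actual regression – the arithmetic looks like this:

```python
# Back-of-the-envelope version of the "years of protection" idea:
# express a training-related score gain as the number of years of
# typical age-related decline it would offset. The function name and
# the numbers below are invented for illustration, not taken from
# the study.

def years_of_protection(score_gain: float, decline_per_year: float) -> float:
    """Assume scores fall linearly with age; a gain of G points on a
    test that loses D points per year offsets G / D years of decline."""
    if decline_per_year <= 0:
        raise ValueError("decline_per_year must be positive")
    return score_gain / decline_per_year

# e.g. a 12-point gain on a test that typically drops 3 points/year:
print(years_of_protection(12.0, 3.0))  # 4.0 years
```

The study’s actual estimates would come from fitted age norms for each of the eight tests, but the underlying logic is this ratio.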
“It was interesting that it didn’t matter whether you were on site at the clinic or just did this at home — you got basically the same bang for your buck,” says Frederick Unverzagt, a neuropsychologist at the Indiana University School of Medicine in Indianapolis, who was not involved with the study.
But Peter Snyder, a neuropsychologist at Brown University in Providence, Rhode Island, points out that players’ performance could have improved simply because they were familiar with the game — not because their cognitive skills improved. “To me, that makes it hard to interpret the results with the same degree of certainty” that the authors have, he says.
Snyder also doubts that 10 hours of training could affect brain wiring enough to provide long-lasting general benefits, but Henry Mahncke, chief executive of Posit Science, disagrees. “If you’ve never played piano before and spend 10 hours practising, a year later you will be better than when you started,” he says. “The new study shows that there’s science to be done here. Some things you can do with your brain are highly productive and others are not.”

Filed under cognitive training aging cognitive decline visual processing performance psychology neuroscience science

82 notes

Monkey Math: Baboons Show Brain’s Ability To Understand Numbers 
Opposing thumbs, expressive faces, complex social systems: it’s hard to miss the similarities between apes and humans. Now a new study with a troop of zoo baboons and lots of peanuts shows that a less obvious trait—the ability to understand numbers—also is shared by man and his primate cousins.
“The human capacity for complex symbolic math is clearly unique to our species,” says co-author Jessica Cantlon, assistant professor of brain and cognitive sciences at the University of Rochester. “But where did this numeric prowess come from? In this study we’ve shown that non-human primates also possess basic quantitative abilities. In fact, non-human primates can be as accurate at discriminating between different quantities as a human child.”
“This tells us that non-human primates have in common with humans a fundamental ability to make approximate quantity judgments,” says Cantlon. “Humans build on this talent by learning number words and developing a linguistic system of numbers, but in the absence of language and counting, complex math abilities do still exist.”
Cantlon, her research assistant Allison Barnard, postdoctoral fellow Kelly Hughes, and other colleagues at the University of Rochester and the Seneca Park Zoo in Rochester, N.Y., reported their findings online May 2 in the open-access journal Frontiers in Psychology.
The study tracked eight olive baboons, ages 4 to 14, in 54 separate trials of guess-which-cup-has-the-most-treats. Researchers placed one to eight peanuts into each of two cups, varying the numbers in each container. The baboons received all the peanuts in the cup they chose, whether it was the cup with the most goodies or not. The baboons guessed the larger quantity roughly 75 percent of the time on easy pairs when the relative difference between the quantities was large, for example two versus seven. But when the ratios were more difficult to discriminate, say six versus seven, their accuracy fell to 55 percent.
That pattern, argue the authors, helps to resolve a standing question about how animals understand quantity. Scientists have speculated that animals may use two different systems for evaluating numbers: one based on keeping track of discrete objects—a skill known to be limited to about three items at a time—and a second approach based on comparing the approximate differences between counts.
The baboons’ choices, conclude the authors, clearly relied on this latter “more than” or “less than” cognitive approach, known as the analog system. The baboons were able to consistently discriminate pairs with numbers larger than three as long as the relative difference between the peanuts in each cup was large. Research has shown that children who have not yet learned to count also depend on such comparisons to discriminate between number groups, as do human adults when they are required to quickly estimate quantity.
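The ratio-dependence of the analog system – easy at two versus seven, hard at six versus seven – is the signature of Weber’s law: noise in the internal estimate grows with the quantity being estimated. A toy Python simulation, using an assumed Weber fraction rather than one fitted to the baboon data, reproduces the qualitative pattern:

```python
import random

# Toy model of the "analog" quantity system: each set size is
# represented with Gaussian noise proportional to its magnitude
# (Weber's law), so discrimination depends on the RATIO of the two
# quantities, not their difference. The Weber fraction is an
# illustrative assumption, not a value estimated from the study.

def choose_larger(a: int, b: int, weber=0.3, rng=random) -> bool:
    """Return True if a noisy comparator correctly picks the larger set."""
    est_a = rng.gauss(a, weber * a)   # noise scales with magnitude
    est_b = rng.gauss(b, weber * b)
    return (est_a > est_b) == (a > b)

def accuracy(a: int, b: int, trials=100_000, seed=1) -> float:
    rng = random.Random(seed)
    hits = sum(choose_larger(a, b, rng=rng) for _ in range(trials))
    return hits / trials

print(accuracy(2, 7))  # easy pair: well above chance
print(accuracy(6, 7))  # hard pair: much closer to 50%
```

With magnitude-scaled noise, 2-versus-7 is discriminated almost perfectly while 6-versus-7 hovers nearer chance – the same easy/hard asymmetry the baboons showed.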
Studies with other animals, including birds, lemurs, chimpanzees, and even fish, have also revealed a similar ability to estimate relative quantity, but scientists have been wary of the findings because much of this research is limited to animals trained extensively in experimental procedures. The concern is that the results could reflect more about the experimenters than about the innate ability of the animals.
“We want to make sure we are not creating a ‘Clever Hans effect,’” cautions Cantlon, referring to the horse whose alleged aptitude for math was shown to rest instead on the ability to read the unintentional body language of his human trainer. To rule out such influence, the study relied on zoo baboons with no prior exposure to experimental procedures. Additionally, a control condition tested for human bias by using two experimenters—each blind to the contents of the other cup—and found that the choice patterns remained unchanged.
A final experiment tested two baboons over 130 more trials. The monkeys showed little improvement in their choice rate, indicating that learning did not play a significant role in understanding quantity.
“What’s surprising is that without any prior training, these animals have the ability to solve numerical problems,” says Cantlon. The results indicate that baboons not only use comparisons to understand numbers, but that these abilities occur naturally and in the wild, the authors conclude.
Finding a functioning baboon troop for cognitive research was serendipitous, explains study co-author Jenna Bovee, the elephant handler at the Seneca Park Zoo who is also the primary keeper for the baboons. The African monkeys are hierarchical, with an alpha male at the top of the social ladder and lots of jockeying for status among the other members of the group. Many zoos have to separate baboons that don’t get along, leaving only a handful of zoos with functioning troops, Bovee explained.
Involvement in this study and ongoing research has been enriching for the 12-member troop, she said, noting that several baboons participate in research tasks about three days a week. “They enjoy it,” she says. “We never have to force them to participate. If they don’t want to do it that day, no big deal.
“It stimulates our animals in a new way that we hadn’t thought of before,” Bovee adds. “It kind of breaks up their routine during the day, gets them thinking. It gives them time by themselves to get the attention focused on them for once. And it reduces fighting among the troop. So it’s good for everybody.”
The zoo has actually adapted some of the research techniques, like a matching game with a touch-screen computer that dispenses treats, and taken it to the orangutans. “They’re using an iPad,” she says.
She also enjoys documenting the intelligence of her charges. “A lot of people don’t realize how smart these animals are. Baboons can show you that five is more than two. That’s as accurate as a typical three-year-old, so you have to give them that credit.”
Cantlon extends those insights to young children: “In the same way that we underestimate the cognitive abilities of non-human animals, we sometimes underestimate the cognitive abilities of preverbal children. There are quantitative abilities that exist in children prior to formal schooling or even being able to use language.”

Filed under primates evolution numerosity math cognition psychology neuroscience science

287 notes

The science of magic: it’s not all hocus pocus 
Think of your favourite magic trick. Is it as grandiose as David Copperfield’s Death Saw, or is it as simple as making a coin disappear in front of your very eyes?
These two very different tricks have the same effect; they delight and astound, leaving the audience to ponder (usually unsuccessfully):

How did they do that?

But while magic has entertained us for thousands of years, it also has a long and colourful history of informing areas of scientific research, from cognitive psychology to treatment of paralysis.
How could such a seemingly innocuous form of entertainment affect such diverse areas?
Uncovering magic’s secrets
In 1893, French psychologist Alfred Binet managed to co-opt five of the country’s most prominent magicians to help him understand illusions.
His interest in the development of cinema led him to record and view their performances frame by frame.
He was able to analyse the movement of the magicians as an animated sequence with the hope of understanding how audiences could be deceived by the magic performed right in front of them.
In his 1894 article La Psychologie de la Prestidigitation, Binet concluded that magical illusions were created by so many little optical tricks that:

to perceive them could be quite as difficult as to count with the naked eye the grains of sand on the seashore.

A 2008 article by a group of research psychologists argued that it was time to acknowledge magic’s influence on the cognitive sciences, opening a new field called the “science of magic”.
In 2010, neuroscientists Stephen Macknik and Susana Martinez-Conde coined the term “neuromagic” in their book Sleights of Mind.
The pair published some of their research findings in Nature, co-authored with not one, but four of the world’s leading magicians.
Like Binet more than a century before, they saw the value of working directly with magicians.
Perceiving blindness
Magic has finally emerged from the box labelled “entertainment” and now shines a light on one of the most perplexing areas of mind studies – perception.
Perception is key in many magic techniques. Audience members will follow a magician’s hand when he or she gestures in a curved line – but not when the line is straight, to give just one example.
Scientific attempts to understand perceptual processes have largely relied on functional Magnetic Resonance Imaging (fMRI) – medical imaging techniques that identify brain activity through changes in its blood flow.
Scientists also study eye movements using head-mounted eye trackers to ascertain objects of visual focus.
But much of our visual perception cannot be understood as a direct fit between seeing something and that thing registering in our attention.
Looking but not seeing
Our everyday perception is littered with episodes that psychologists call “inattentional blindness” and “change blindness”.
In other words, something happens in front of us but because our attention is elsewhere, we don’t register having seen it.
When we fail to notice that something in a scene has changed – particularly when the change happens gradually – it is referred to as change blindness, and one of the best examples of this is British psychologist Richard Wiseman’s colour-changing card trick.
When we fail to notice something in plain view because our attention is engaged elsewhere, it’s called inattentional blindness.
An experiment by American psychologists Daniel Simons and Christopher Chabris is by far the most famous illustration of this, and won them the Ig Nobel Prize in 2004.
But while the colour card changing “trick” and Simons and Chabris’ experiment aren’t technically magic tricks, magic provides an arena for observing how our visual perception is often at odds with the objects and events happening before our very eyes.
Misdirection is a standard technique in the magician’s palette. It demonstrates the perceptual rift between looking at something and attending to it – and it is this rift that fascinates neuroscientists and neuropsychologists.
Commonly thought to be about speed – isn’t the hand quicker than the eye? – misdirection is actually more about leading us to focus only on a particular area.
When a magician throws a ball into the air and it seemingly vanishes, the trick works because the audience is following the magician’s gaze – not his hand.
After really throwing the ball into the air numerous times and then simply performing the same movement in every way but without the ball, most people will see a ball fly into the air and disappear.
The magician has misdirected your gaze into following his and deployed a combination of inattentional and change blindness.
A neurological perspective
What we also learn from this neurologically is that implied movement stimulates brain functioning in much the same way as watching an actual movement.
That your gaze can differ from your attention is something that magicians have long exploited.
So now neurologists are looking to magic to help answer questions such as:

Why don’t we always see something that is right in front of us?
Why do our eyes more easily follow curved rather than straight gestures across space?

Magic, which has exploited such aspects of the visual for centuries, offers an intriguing framework for exploring perception. The potential for understanding our perceptual system by investigating how magic exploits its blind spots and gaps is enormous.
It has become a sophisticated research method and field helping to create more intuitive human-computer interface designs and advance rehabilitation techniques for people physically impaired by neurological conditions like strokes.
It is even being used to study problems in social responsiveness across the autism spectrum.
All we need to do now is convince more magicians to give up their secrets – but how easy that will be remains to be seen.

The science of magic: it’s not all hocus pocus

Think of your favourite magic trick. Is it as grandiose as David Copperfield’s Death Saw, or is it as simple as making a coin disappear in front of your very eyes?

These two very different tricks have the same effect; they delight and astound, leaving the audience to ponder (usually unsuccessfully):

How did they do that?

But while magic has entertained us for thousands of years, it also has a long and colourful history of informing areas of scientific research, from cognitive psychology to treatment of paralysis.

How could such a seemingly innocuous form of entertainment affect such diverse areas?

Uncovering magic’s secrets

In 1893, French psychologist Alfred Binet managed to co-opt five of the country’s most prominent magicians to help him understand illusions.

His interest in the development of cinema led him to record and view their performances frame by frame.

He was able to analyse the movement of the magicians as an animated sequence with the hope of understanding how audiences could be deceived by the magic performed right in front of them.

In his 1894 article La Psychologie de la Prestidigitation, Binet concluded that magical illusions were created by so many little optical tricks that:

to perceive them could be quite as difficult as to count with the naked eye the grains of sand on the seashore.

A 2008 article by a group of research psychologists argued that it was time to acknowledge magic’s influence on the cognitive sciences, opening a new field called the “science of magic”.

In 2010, neuroscientists Stephen Macknik and Susana Martinez-Conde coined the term “neuromagic” in their book Sleights of Mind.

The pair published some of their research findings in Nature, co-authored with not one, but four of the world’s leading magicians.

Like Binet more than a century before, they saw the value of working directly with magicians.

Perceiving blindness

Magic has finally emerged from the box labelled “entertainment” and now shines a light on one of the most perplexing areas of mind studies – perception.

Perception is key in many magic techniques. Audience members will follow a magician’s hand when he or she gestures in a curved line – but not when the line is straight, to give just one example.

Scientific attempts to understand perceptual processes have largely relied on functional Magnetic Resonance Imaging (fMRI) – medical imaging techniques that identify brain activity through changes in its blood flow.

Scientists also study eye movements using head-mounted eye trackers to ascertain objects of visual focus.

But much of our visual perception cannot be understood as a direct fit between seeing something and that thing registering in our attention.

Looking but not seeing

Our everyday perception is littered with episodes that psychologists call “inattentional blindness” and “change blindness”.

In other words, something happens in front of us but because our attention is elsewhere, we don’t register having seen it.

Neurologically speaking, when change occurs gradually it is referred to as change blindness, and one of the best examples of this is British psychologist Richard Wiseman’s colour card changing trick.

If the change occurs abruptly, it’s called inattentional blindness.

An experiment by American psychologists Daniel Simons and Christopher Chabris is by far the most famous illustration of this, and won them the Ig Nobel Prize in 2005.

But while the colour card changing “trick” and Simons and Chabris’ experiment aren’t technically magic tricks, magic provides an arena for observing how our visual perception is often at odds with the objects and events happening before our very eyes.

Misdirection is a standard technique of the magician’s palette and demonstrates the perceptual rift between looking at something and attending to it and it is this rift that fascinates neuroscientists and neuropsychologists.

Commonly thought to be about speed – isn’t the hand quicker than the eye? – misdirection is actually more about leading us to focus only on a particular area.

When a magician throws a ball into the air and it seemingly vanishes, the trick works because the audience is following the magician’s gaze – not his hand.

After really throwing the ball into the air numerous times and then simply performing the same movement in every way but without the ball, most people will see a ball fly into the air and disappear.

The magician has misdirected your gaze into following his and deployed a combination of inattentional and change blindness.

A neurological perspective

Neurologically, this also tells us that implied movement stimulates the brain in much the same way as watching actual movement does.

That your gaze can differ from your attention is something that magicians have long exploited.

So now neurologists are looking to magic to help answer questions such as:

Why don’t we always see something right in front of us?

Why do our eyes more easily follow curved rather than straight gestures across space?

Magic, which has exploited such aspects of vision for centuries, offers an intriguing framework for exploring perception. The potential for understanding our perceptual system by investigating how magic exploits its blind spots and gaps is enormous.

It has become a sophisticated research field, helping to create more intuitive human-computer interface designs and to advance rehabilitation techniques for people impaired by neurological conditions such as stroke.

It is even being used to study problems in social responsiveness across the autism spectrum.

All we need to do now is convince more magicians to give up their secrets – but how easy that will be remains to be seen.

Filed under perception magic tricks neuroimaging inattentional blindness change blindness psychology neuroscience science

90 notes

Monkey see, monkey do
A new experimental method allows the spontaneous synchronization of arm motions by pairs of Japanese macaques to be observed under controlled conditions
Humans often synchronize their movements when, for example, we cooperate to move a piece of furniture. We also synchronize gestures and facial expressions when we interact. Coordinated actions are in fact surprisingly common in the animal kingdom, as exemplified by the flocking of birds and the schooling of fish. Such behaviors, however, have to date only been observed in the wild. Yasuo Nagasaka and colleagues from the Laboratory for Adaptive Intelligence at the RIKEN Brain Science Institute have now devised the first method for observing coordination under experimental conditions.
The researchers individually trained three Japanese macaque monkeys to press two buttons repeatedly and alternately with one hand. They then recorded the monkeys performing this task with a video camera and motion capture device.
Nagasaka and his colleagues later paired the monkeys and had them perform the task again while facing each other. Initially, each monkey in a pair pressed the buttons at different speeds. However, after a certain amount of time, the two monkeys spontaneously synchronized their button presses by altering the speed of their actions so that their button presses became harmonized with those of their partner.  
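The mutual tempo adjustment described above can be sketched as a toy simulation (my own illustration, not the authors’ model): two “tappers” start at different button-press intervals, and on every cycle each nudges its own interval a fraction of the way toward its partner’s, so the gap between their tempos shrinks until they are effectively in step.

```python
def synchronize(interval_a, interval_b, coupling=0.2, steps=30):
    """Simulate two agents mutually adjusting their press intervals.

    interval_a, interval_b: starting button-press intervals (seconds).
    coupling: fraction of the tempo difference closed on each cycle.
    Returns the history of (interval_a, interval_b) pairs.
    """
    history = []
    for _ in range(steps):
        history.append((interval_a, interval_b))
        # each agent moves part of the way toward the other's tempo
        interval_a += coupling * (interval_b - interval_a)
        interval_b += coupling * (interval_a - interval_b)
    return history

history = synchronize(0.40, 0.65)  # hypothetical starting tempos
first_gap = abs(history[0][0] - history[0][1])
last_gap = abs(history[-1][0] - history[-1][1])
print(f"tempo gap shrank from {first_gap:.3f}s to {last_gap:.6f}s")
```

With symmetric coupling the gap between the two tempos shrinks by a constant factor each cycle, which is one simple way the harmonization reported in the study could arise; the real monkeys, of course, also showed individual biases such as a preference for slowing down.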
The speed of repeated button presses differed among the three pairs of monkeys, as did the timing of the synchrony. In one pair, the button presses were synchronized but one monkey was always delayed by 1 millisecond, while in another the delay was 13 milliseconds. In all cases, however, the timing of the actions became closely matched, and the delay seemed to be dependent on exactly which monkeys had been paired together. 
The researchers then played back the video recordings of the monkeys performing the task at different speeds while a monkey watched. The monkeys sped up or slowed down their button presses to harmonize their actions with those of the ‘virtual’ monkey, and they seemed to prefer to slow down their button presses, perhaps to save energy. 
In a final set of experiments, the research team allowed the real monkeys to either see or hear the video recordings, and found that visual information is far more important than auditory information for synchronization. 
“We believe that this spontaneous synchronization plays an important role in the building of social bonds, and we are now looking for the brain areas responsible,” says Nagasaka. “This could be fundamental to understanding the brain itself, and also the social interaction deficits in conditions such as autism.”
A video showing the spontaneous synchronization of monkey actions can be found here.


Filed under synchronization motion capture macaques animal behavior neuroscience psychology science
