Neuroscience

Articles and news from the latest research reports.

The secrets of children’s chatter: research shows boys and girls learn language differently
Experts believe language uses both a mental dictionary and a mental grammar. The mental ‘dictionary’ stores sounds, words and common phrases, while mental ‘grammar’ involves the real-time composition of longer words and sentences, such as forming the longer word ‘walked’ from the smaller ‘walk’.
However, most research into understanding how these processes work has been carried out with adults.
“Most researchers agree that the way we use language in our minds involves both storing and real-time composition,” said lead researcher Dr Cristina Dye, a specialist in child language development at Newcastle University. “But a lot of the specifics about how this happens are unclear, such as identifying exactly which parts of language are stored and which are composed.
“Most research on this topic has concentrated on adults and we wanted to see if studying children could help us learn more about these processes.”
A test based around 29 irregular verbs and 29 regular verbs was presented to the young participants. Only verbs which would be known by eight-year-olds were used.
They were presented with two sentences. One featured the verb in the context of the sentence, with the second sentence containing a blank to allow the children to produce the past-tense form. For example: Every day I walk to school. Just like every day, yesterday I ____ to school.
The children were asked to produce the missing word as quickly and as accurately as possible and their response times were recorded. The results were then analysed to discover which words were stored or created in real-time.
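The logic of that analysis can be sketched in a few lines of Python. Everything here (the trial data, the function name, the millisecond values) is invented purely for illustration; it is not the study's materials or code:

```python
import statistics

# Hypothetical trials: each records the verb type and the child's
# response time in milliseconds for producing the past-tense form.
trials = [
    ("regular", 910), ("regular", 980), ("regular", 875),
    ("irregular", 760), ("irregular", 820), ("irregular", 790),
]

def mean_rt(trials, verb_type):
    """Average response time for one verb type."""
    return statistics.mean(rt for kind, rt in trials if kind == verb_type)

regular_rt = mean_rt(trials, "regular")
irregular_rt = mean_rt(trials, "irregular")

# Stored (memorised) forms tend to be retrieved quickly, while composed
# forms carry an extra assembly cost, so comparing response-time profiles
# across verb types hints at which route a child is relying on.
```

Irregular past tenses (like ‘ran’) must be stored, so a child whose regular forms pattern with them in speed is plausibly retrieving rather than composing.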
Results showed girls were more likely to memorise words and phrases, drawing on their mental dictionary, while boys more often used mental grammar, assembling forms from smaller parts.
The findings could have implications in the way youngsters are taught in the classroom, believes Dr Dye, who is based in the Centre for Research in Linguistics and Language Sciences.
She said: “What we found as we carried out the study was that girls were far more likely to remember forms like ‘walked’ while boys relied much more on their mental grammar to compose ‘walked’ from ‘walk’ and ‘ed’. This fits in with previous research which has identified differences between the sexes when it comes to memorising facts and events, where girls also seem to have an advantage compared to boys.
“One interesting aside to this is that as girls often outperform boys at school, it could be that the curriculum is put together in a way which benefits the way girls learn. It may be worth further investigation to see if this is the case and if so, is there a way lessons could be changed so boys can get the most out of them too.”
Paper: Children’s Computation of Complex Linguistic Forms: A Study of Frequency and Imageability Effects
(Image: Getty Images)

Neuroscience: The man who saw time stand still

One day, a man saw time itself stop, and as David Robson discovers, unpicking what happened is revealing that we can all experience temporal trickery too. 
It started as a headache, but soon became much stranger. Simon Baker entered the bathroom to see if a warm shower could ease his pain. “I looked up at the shower head, and it was as if the water droplets had stopped in mid-air,” he says. “They came into hard focus rapidly, over the course of a few seconds.” Where you’d normally perceive the streams as more of a blur of movement, he could see each one hanging in front of him, distorted by the pressure of the air rushing past. The effect, he recalls, was very similar to the way the bullets travelled in the Matrix movies. “It was like a high-speed film, slowed down.”
The next day, Baker went to hospital, where doctors found that he had suffered an aneurysm. The experience was soon overshadowed by the more immediate threat to his health, but in a follow-up appointment he happened to mention the episode to his neurologist, Fred Ovsiew at Northwestern University in Chicago, who was struck by the vivid descriptions. “He was a very bright guy, and very eloquent,” says Ovsiew, who recently wrote about Baker in the journal NeuroCase. (Baker’s identity was anonymised, which is typical for such studies, so this is not his real name.)

Monkeys also believe in winning streaks
Humans have a well-documented tendency to see winning and losing streaks in situations that, in fact, are random. But scientists disagree about whether the “hot-hand bias” is a cultural artifact picked up in childhood or a predisposition deeply ingrained in the structure of our cognitive architecture.
Now in the first study in non-human primates of this systematic error in decision making, researchers find that monkeys also share our unfounded belief in winning and losing streaks. The results suggest that the penchant to see patterns that actually don’t exist may be inherited—an evolutionary adaptation that may have provided our ancestors a selective advantage when foraging for food in the wild, according to lead author Tommy Blanchard, a doctoral candidate in brain and cognitive sciences at the University of Rochester.
The cognitive bias may be difficult to override even in situations that are truly random. This inborn tendency to feel that we are on a roll or in a slump may help explain why gambling can be so alluring and why the stock market is so prone to wild swings, said coauthor Benjamin Hayden, assistant professor of brain and cognitive sciences at the University of Rochester.
Hayden, Blanchard, and Andreas Wilke, an assistant professor of psychology at Clarkson University, reported their findings in the July issue of the Journal of Experimental Psychology: Animal Learning and Cognition.
To measure whether monkeys actually believe in winning streaks, the researchers had to create a computerized game that was so captivating monkeys would want to play for hours. “Luckily, monkeys love to gamble,” said Blanchard. So the team devised a fast-paced task in which each monkey could choose right or left and receive a reward when they guessed correctly.
The researchers created three types of play, two with clear patterns (the correct answer tended to repeat on one side or to alternate from side to side) and a third in which the lucky pick was completely random. Where clear patterns existed, the three rhesus monkeys in the study quickly guessed the correct sequence. But in the random scenarios, the monkeys continued to make choices as if they expected a “streak”. In other words, even when rewards were random, the monkeys favored one side.
The monkeys showed the hot-hand bias consistently over weeks of play and an average of 1,244 trials per condition. “They had lots and lots of opportunities to get over this bias, to learn and change, and yet they continued to show the same tendency,” said Blanchard.
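A toy simulation makes the bias measurable: even when payoffs are completely random, a chooser that tends to stay on a side after a win shows an above-chance repeat rate. All names and numbers below are illustrative assumptions, not the study's task code:

```python
import random

random.seed(0)  # deterministic toy run

def streaky_chooser(n_trials, stickiness=0.7):
    """Simulate a player with a hot-hand bias: after a rewarded choice it
    repeats the same side with probability `stickiness`, even though the
    payoffs themselves are completely random."""
    choices, rewards = [], []
    choice = random.choice(["L", "R"])
    for _ in range(n_trials):
        choices.append(choice)
        rewarded = random.random() < 0.5        # truly random payoff
        rewards.append(rewarded)
        if not (rewarded and random.random() < stickiness):
            choice = random.choice(["L", "R"])  # otherwise pick afresh
    return choices, rewards

def repeat_after_win_rate(choices, rewards):
    """Fraction of post-win trials where the same side was chosen again;
    values above 0.5 indicate a hot-hand-style bias despite random rewards."""
    repeats = wins = 0
    for i in range(len(choices) - 1):
        if rewards[i]:
            wins += 1
            repeats += choices[i + 1] == choices[i]
    return repeats / wins

choices, rewards = streaky_chooser(10_000)
rate = repeat_after_win_rate(choices, rewards)  # well above the 0.5 chance level
```

Measuring a statistic like this repeat rate over thousands of trials is, in spirit, how a persistent side preference can be detected in a truly random condition.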
So why do monkeys and humans share this false belief in a run of luck even when faced over and over with evidence that the results are random? The authors speculate that the distribution of food in the wild, which is not random, may be the culprit. “If you find a nice juicy beetle on the underside of a log, this is pretty good evidence that there might be a beetle in a similar location nearby, because beetles, like most food sources, tend to live near each other,” explained Hayden.
Evolution has also primed our brains to look for patterns, added Hayden. “We have this incredible drive to see patterns in the world, and we also have this incredible drive to learn. I think it’s very related to why we like music, and why we like to do crossword puzzles, Sudoku, and things like that. If there’s a pattern there, we’re on top of it. And if there may or may not be a pattern there, that’s even more interesting.”
Understanding the hot-hand bias could inform treatment for gambling addiction and provide insights for investors, said Hayden. “If a belief in winning streaks is hardwired, then we may want to look for more rigorous retraining for individuals who cannot control their gambling. And investors should keep in mind that humans have an inherited bias to believe that if a stock goes up one day, it will continue to go up.”
The results also could provide nuance to our understanding of free will, said Blanchard, who was drawn to the study of decision making during prior graduate training in philosophy. “Biases in our decision-making mechanisms, like this bias towards belief in winning and losing streaks, say something really deep about what sorts of creatures we are. We often like to think we make decisions based only on the information we’re conscious of. But we’re not always aware of why we make certain decisions or believe certain things.
“We’re a complex mix of biases and heuristics and statistical reasoning. When you put it all together, that’s how you get sophisticated behavior. We don’t know where a lot of these biases come from, but this study—and others like it—suggest many of them are due to cognitive mechanisms we share with our primate relatives,” said Blanchard.

Little or poor sleep may be associated with worse brain function when aging
Research published today in PLOS ONE by researchers at the University of Warwick indicates that sleep problems are associated with worse memory and executive function in older people.
Analysis of sleep and cognitive (brain function) data from 3,968 men and 4,821 women who took part in the English Longitudinal Study of Ageing (ELSA) was conducted in a study funded by the Economic and Social Research Council (ESRC). Respondents reported on the quality and quantity of sleep over the period of a month.
The study showed that there is an association between both quality and duration of sleep and brain function which changes with age.
In adults aged 50 to 64, short sleep (<6 hrs per night) and long sleep (>8 hrs per night) were associated with lower brain function scores. By contrast, in older adults (65-89 years), lower brain function scores were observed only in long sleepers.
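The three duration bands can be written down as a tiny classifier; the cut-offs (under six and over eight hours) come from the article, while the function and label names are illustrative only:

```python
def sleep_band(hours_per_night):
    """Classify self-reported nightly sleep into the study's duration bands:
    short (<6 h), long (>8 h), or the 6-8 h range associated with the best
    brain function scores in the 50-64 age group."""
    if hours_per_night < 6:
        return "short"
    if hours_per_night > 8:
        return "long"
    return "6-8h"

# e.g. grouping respondents before comparing cognitive scores across bands
bands = [sleep_band(h) for h in (5.5, 7.0, 9.0)]
```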
Dr Michelle A Miller says: “6-8 hours of sleep per night is particularly important for optimum brain function in younger adults. These results are consistent with our previous research, which showed that 6-8 hours of sleep per night was optimal for physical health, including the lowest risk of developing obesity, hypertension, diabetes, heart disease and stroke.”
Interestingly, in the younger pre-retirement aged adults, sleep quality did not have any significant association with brain function scores, whereas in the older adults (>65 years), there was a significant relationship between sleep quality and the observed scores.
“Sleep is important for good health and mental wellbeing,” says Professor Francesco Cappuccio. “Optimising sleep at an older age may help to delay the decline in brain function seen with age, or indeed may slow or prevent the rapid decline that leads to dementia.”
Dr Miller concludes that “if poor sleep is causative of future cognitive decline, non-pharmacological improvements in sleep may provide an alternative low-cost and more accessible public health intervention to delay or slow the rate of cognitive decline”.

Gestures that speak
When you gesticulate you don’t just add a “note of colour” that makes your speech more pleasant: you convey information on sentence structure and make your meanings clearer. A study carried out at SISSA in Trieste demonstrates that gestures and “prosody” (the intonation and rhythm of spoken language) form a single “communication system” at the cognitive level, and that we speak using our “whole body” and not only our vocal tract.
Have you ever found yourself gesticulating and felt a bit stupid for it while talking on the phone?
You’re not alone: it happens very often that people accompany their speech with hand gestures, sometimes even when no one can see them. Why can’t we keep still while speaking? “Because gestures and words very probably form a single ‘communication system’, which ultimately serves to enhance expression intended as the ability to make oneself understood,” explains Marina Nespor, a neuroscientist at the International School for Advanced Studies (SISSA) of Trieste. Nespor, together with Alan Langus, a SISSA research fellow, and Bahia Guellai from the Université Paris Ouest Nanterre La Défense, who conducted the investigation at SISSA, has just published a study in Frontiers in Psychology which demonstrates the role of gestures in speech “prosody”.
Linguists define prosody as the intonation and rhythm of spoken language, features that help to highlight sentence structure and therefore make the message easier to understand. For example, without prosody, nothing would distinguish the declarative statement “this is an apple” from the surprise question “this is an apple?” (in this case the difference lies in the intonation).
According to Nespor and colleagues, even hand gestures are part of prosody: “the prosody that accompanies speech is not ‘modality specific’,” explains Langus. “Prosodic information, for the person receiving the message, is a combination of auditory and visual cues. The ‘superior’ aspects (at the cognitive processing level) of spoken language are mapped to the motor programs responsible for the production of both speech sounds and accompanying hand gestures.”
Nespor, Langus and Guellai had 20 Italian speakers listen to a series of “ambiguous” utterances, which could be said with different prosodies corresponding to two different meanings. One example was “come sicuramente hai visto la vecchia sbarra la porta”, where, depending on meaning, “vecchia” can be the subject of the main verb (sbarrare, to block) or an adjective qualifying the subject (sbarra, bar) (‘As you have surely seen, the old lady blocks the door’ versus ‘As you have surely seen, the old bar carries it’). The utterances could be simply listened to (“audio only” modality) or be presented in a video, where the participants could both listen to the sentences and see the accompanying gestures. In the “video” stimuli, the condition could be “matched” (gestures corresponding to the meaning conveyed by speech prosody) or “mismatched” (gestures matching the alternative meaning).
“In the matched conditions there was no improvement ascribable to gestures: the participants’ performance was very good both in the video and in the ‘audio only’ sessions. It’s in the mismatched condition that the effect of hand gestures became apparent,” explains Langus. “With these stimuli the subjects were much more likely to make the wrong choice (that is, they’d choose the meaning indicated in the gestures rather than in the speech) compared to matched or audio-only conditions. This means that gestures affect how meaning is interpreted, and we believe this points to the existence of a common cognitive system for gestures, intonation and rhythm of spoken language.”
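The design can be mimicked with a toy tabulation. The trial records below are hypothetical, invented only to show how accuracy per condition would be computed; they are not the SISSA data:

```python
from collections import defaultdict

# Hypothetical (condition, followed_prosody) records, invented for illustration.
trials = [
    ("audio_only", True), ("audio_only", True), ("audio_only", True), ("audio_only", False),
    ("matched", True), ("matched", True), ("matched", True), ("matched", True),
    ("mismatched", False), ("mismatched", False), ("mismatched", True), ("mismatched", False),
]

def accuracy_by_condition(trials):
    """Per condition, the proportion of trials on which the listener chose the
    meaning carried by speech prosody rather than by the gestures."""
    hits, totals = defaultdict(int), defaultdict(int)
    for condition, followed_prosody in trials:
        totals[condition] += 1
        hits[condition] += followed_prosody
    return {c: hits[c] / totals[c] for c in totals}

acc = accuracy_by_condition(trials)
# A markedly lower score in the mismatched condition is the signature the
# study reports: gestures pull interpretation away from the spoken prosody.
```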
“In human communication, voice is not sufficient: even the torso and in particular hand movements are involved, as are facial expressions”, concludes Nespor.

Study shows puzzle games can improve mental flexibility
A recent study by Nanyang Technological University (NTU) scientists showed that adults who played the physics-based puzzle video game Cut the Rope regularly, for as little as an hour a day, had improved executive functions.
The executive functions in your brain are important for making decisions in everyday life when you have to deal with sudden changes in your environment – better known as thinking on your feet. An example would be when the traffic light turns amber and a driver has to decide in an instant whether they can brake in time or whether it is safer to continue through the intersection.
The video game study by Assistant Professor Michael D. Patterson and his PhD student Mr Adam Oei tested four different games on the mobile platform, as their previous research had shown that different games trained different skills.
The games varied in their genres, which included a first person shooter (Modern Combat); arcade (Fruit Ninja); real-time strategy (StarFront Collision); and a complex puzzle (Cut the Rope).
NTU undergraduates, who were non-gamers, were then selected to play an hour a day, 5 days a week on their iPhone or iPod Touch. This video game training lasted for 4 weeks, a total of 20 hours.
Prof Patterson said students who played Cut the Rope showed significant improvement on executive function tasks, while no significant improvements were observed in those playing the other three games.
“This finding is important because previously, no video games have demonstrated this type of broad improvement to executive functions, which are important for general intelligence, dealing with new situations and managing multitasking,” said Prof Patterson, an expert in the psychology of video games.
“This indicates that while some games may help to improve mental abilities, not all games give you the same effect. To improve the specific ability you are looking for, you need to play the right game,” added Mr Oei.
The abilities tested in this study included how fast the players could switch between tasks (an indicator of mental flexibility); how fast they could adapt to a new situation instead of relying on the same strategy (the ability to inhibit prepotent or predominant responses); and how well they could focus on relevant information while blocking out distractors or inappropriate responses (measured with what is known in cognitive psychology as the Flanker task).
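The first of these measures, task-switching speed, is conventionally scored as a "switch cost": the extra time a response takes when the task changes compared with when it repeats. The sketch below is purely illustrative — the trial data and scoring details are invented, not taken from the NTU study:

```python
# Hypothetical sketch of how a task-switching measure might be scored:
# switch cost = mean reaction time on trials where the task changed
# minus mean reaction time on trials where it repeated.
from statistics import mean

def switch_cost(trials):
    """trials: list of (task_label, reaction_time_ms) in presentation order."""
    switch_rts, repeat_rts = [], []
    for prev, curr in zip(trials, trials[1:]):
        (switch_rts if curr[0] != prev[0] else repeat_rts).append(curr[1])
    return mean(switch_rts) - mean(repeat_rts)

# Invented trial sequence alternating between two simple tasks:
trials = [("letter", 520), ("letter", 480), ("digit", 610),
          ("digit", 500), ("letter", 640), ("letter", 470)]
print(switch_cost(trials))  # a positive cost means slower on switch trials
```

A smaller switch cost after training would correspond to the faster task-switching the study reports.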
Prof Patterson said the reason Cut the Rope improved executive function in their players was probably due to the game’s unique puzzle design. Strategies which worked for earlier levels would not work in later levels, and regularly forced the players to think creatively and try alternate solutions. This is unlike most other video games which keep the same general mechanics and goals, and just speed up or increase the number of items to keep track of. 
After 20 hours of game play, players of Cut the Rope could switch between tasks 33 per cent faster, were 30 per cent faster in adapting to new situations, and 60 per cent better in blocking out distractions and focusing on the tasks at hand than before training.
All three tests were done one week after the 52 students had finished playing their assigned game, to ensure that these were not temporary gains due to motivation or arousal effects.
The study will be published in the academic journal Computers in Human Behavior this August and is already available online. It is the first study to show broad transfer to several different executive functions, providing further evidence that video games can be effective in training human cognition.
“This result could have implications in many areas such as educational, occupational and rehabilitative settings,” Prof Patterson said.
“In future, with more studies, we will be able to know what type of games improves specific abilities, and prescribe games that will benefit people aside from just being entertainment.”
In their previous study published last year in PloS One, a top academic journal, Prof Patterson and Mr Oei studied the effects mobile gaming had on 75 NTU undergraduates.
The non-gamers were instructed to play one of the following games: “match three” game Bejeweled, virtual life simulation game The Sims, and action shooter Modern Combat.
The study findings showed that adults who played the action game improved their ability to track multiple objects over a short span of time, useful when driving during a busy rush hour, while other games improved the participants’ performance on visual search tasks, useful when picking out an item on a crowded supermarket shelf.
Moving forward, Prof Patterson is keen to examine whether experienced adult gamers show similar improvement from playing such games, and how much improvement can be gained through play.

Filed under executive function video games cognition psychology neuroscience science

99 notes

Researchers publish one of the longest longitudinal studies of cognition in MS
Researchers at Kessler Foundation and the Cleveland Clinic have published one of the longest longitudinal studies of cognition in multiple sclerosis (MS). The article, “Cognitive impairment in multiple sclerosis: An 18-year follow-up study,” was published online by Multiple Sclerosis and Related Disorders on April 13, 2014. The results provide insight into the natural evolution of cognitive changes over time, an important consideration for researchers and clinicians. The authors are Lauren B. Strober, PhD, of Kessler Foundation and Stephen M. Rao, PhD, Jar-Chi Lee, Elizabeth Fisher, PhD, and Richard Rudick, MD, of the Cleveland Clinic.
“While cognitive impairment is known to affect 40 to 65% of individuals with MS, few studies have followed the pattern of cognitive decline over time, which is important for understanding long-term care and outcomes associated with MS,” said Dr. Strober, senior research scientist at Kessler Foundation. “Our study was based on a unique sample of 22 patients who underwent neuropsychological testing at entry into the original phase 3 clinical trial of intramuscular interferon beta-1a, and again at 18-year follow-up.”
At baseline, 9 patients (41%) had cognitive impairment; at 18-year follow-up, 13 patients (59%) were found to be impaired. Significant declines over time were found in information processing speed, auditory attention, memory, episodic learning and visual construction. Decline on the Symbol Digit Modalities Test (SDMT) was steeper in the initially unimpaired group than in the impaired group.
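Over an 18-year interval, longitudinal change on a test like the SDMT is often expressed as an annualized rate. The sketch below uses invented scores for illustration only; the paper reports group-level results, not these numbers:

```python
# Hypothetical sketch: annualized change on a timed test such as the SDMT,
# computed from a baseline score and a follow-up score. Scores are invented.
def annualized_change(baseline_score, followup_score, years):
    """Points gained (positive) or lost (negative) per year of follow-up."""
    return (followup_score - baseline_score) / years

# e.g. an invented participant dropping from 55 to 37 correct over 18 years
print(annualized_change(55, 37, 18))  # -1.0 points per year
```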
"These longitudinal data contribute substantially to our knowledge of the course of cognitive decline in MS,” noted John DeLuca, PhD, VP of Research &amp; Training at Kessler Foundation. “In light of the young age at diagnosis, this perspective is fundamental to the development of rehabilitation strategies that meet the needs of people dealing with the cognitive effects of MS.”
The study was funded by Biogen Idec.

Filed under MS cognitive impairment cognition psychology neuroscience science

169 notes

Neural sweet talk: Taste metaphors emotionally engage the brain

So accustomed are we to metaphors related to taste that when we hear a kind smile described as “sweet,” or a resentful comment as “bitter,” we most likely don’t even think of those words as metaphors. But while it may seem to our ears that “sweet” by any other name means the same thing, new research shows that taste-related words actually engage the emotional centers of the brain more than literal words with the same meaning.

Researchers from Princeton University and the Free University of Berlin report in the Journal of Cognitive Neuroscience the first study to experimentally show that the brain processes these everyday metaphors differently than literal language. In the study, participants read 37 sentences that included common metaphors based on taste while the researchers recorded their brain activity. Each taste-related word was then swapped with a literal counterpart so that, for instance, “She looked at him sweetly” became “She looked at him kindly.”

The researchers found that the sentences containing words that invoked taste activated areas known to be associated with emotional processing, such as the amygdala, as well as the areas known as the gustatory cortices that allow for the physical act of tasting. Interestingly, the metaphorical and literal words only resulted in brain activity related to emotion when part of a sentence, but stimulated the gustatory cortices both in sentences and as stand-alone words.

Metaphorical sentences may spark increased brain activity in emotion-related regions because they allude to physical experiences, said co-author Adele Goldberg, a Princeton professor of linguistics in the Council of the Humanities. Human language frequently uses physical sensations or objects to refer to abstract domains such as time, understanding or emotion, Goldberg said. For instance, people liken love to a number of afflictions including being “sick” or shot through the heart with an arrow. Similarly, “sweet” has a much clearer physical component than “kind.” The new research suggests that these associations go beyond just being descriptive to engage our brains on an emotional level and potentially amplify the impact of the sentence, Goldberg said.

"You begin to realize when you look at metaphors how common they are in helping us understand abstract domains," Goldberg said. "It could be that we are more engaged with abstract concepts when we use metaphorical language that ties into physical experiences."

If metaphors in general elicit an emotional response from the brain that is similar to that caused by taste-related metaphors, then that could mean that figurative language presents a “rhetorical advantage” when communicating with others, explained co-author Francesca Citron, a postdoctoral researcher of psycholinguistics at the Free University’s Languages of Emotion research center.

"Figurative language may be more effective in communication and may facilitate processes such as affiliation, persuasion and support," Citron said. "Further, as a reader or listener, one should be wary of being overly influenced by metaphorical language."

Colloquially, metaphors seem to be employed precisely to evoke an emotional reaction, yet the actual emotional effect of figurative phrases on the person hearing them has not before been deeply explored, said Benjamin Bergen, an associate professor of cognitive science at the University of California-San Diego who studies language comprehension as well as metaphorical language and thought.

"There’s a lot of research on the conceptual effects of metaphors, such as how they allow people to think about new or abstract concepts in terms of concrete things they’re familiar with. But there’s very little work on the emotional impact of metaphor," said Bergen, who had no role in the research but is familiar with it.

"Emotional impact seems to be one of the main reasons people use metaphors to begin with. For instance, a senator might describe a bill as ‘job-killing’ to evoke an emotional reaction," he said. "These results suggest that using certain metaphorical expressions induces more of an emotional reaction than saying the same thing literally. Those expressions that have this property are likely to have the effects on reasoning, inference, judgment and decision-making that emotion is known to have."

The brain areas that taste-related words did not stimulate are also an important outcome of the study, Citron said. Existing research on metaphors and neural processing has shown that figurative language generally requires more brainpower than literal language, Citron and Goldberg wrote. But these bursts of neural activity have been related to higher-order processing from thinking through an unfamiliar metaphor.

The brain activity Citron and Goldberg observed did not correlate with this process. In order to create the metaphorical- and literal-sentence stimuli, they had a group of people separate from the study participants rate sentences for familiarity, apparent arousal, imageability — which is how easily a phrase can be imagined in the reader’s mind — and how positive or negative each sentence was interpreted as being. The metaphorical and literal sentences were equal on all of these factors. In addition, each metaphorical phrase and its literal counterpart were rated as being highly similar in meaning.
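The norming step described above amounts to verifying that the two stimulus sets do not differ on any rated dimension. A minimal sketch of that kind of check, with invented ratings and a crude tolerance criterion standing in for the proper statistical tests the researchers used:

```python
# Hypothetical sketch of stimulus norming: check that metaphorical and
# literal sentence sets are matched on a rated dimension (e.g. familiarity).
# Ratings below are invented; the study used formal statistical comparisons.
from statistics import mean

def matched(ratings_a, ratings_b, tolerance=0.5):
    """Crude criterion: call the sets matched if their mean ratings
    differ by less than `tolerance` points on the rating scale."""
    return abs(mean(ratings_a) - mean(ratings_b)) < tolerance

familiarity_metaphor = [5.1, 4.8, 5.3, 4.9, 5.0]   # invented 1-7 ratings
familiarity_literal  = [5.0, 5.2, 4.9, 5.1, 4.8]
print(matched(familiarity_metaphor, familiarity_literal))  # True
```

Running the same check for arousal, imageability and valence would mirror the matching procedure the paragraph describes.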

These steps helped ensure that the metaphorical and literal sentences were equally easy to comprehend. Thus, the brain activity the researchers recorded was unlikely to reflect any additional difficulty participants had in understanding the metaphors.

"It is important to rule out possible effects of familiarity, since less familiar items may require more processing resources to be understood and elicit enhanced brain responses in several brain regions," Citron said.

Citron and Goldberg plan to follow up on their results by examining if figurative language is remembered more accurately than literal language, if metaphors are more physically stimulating, and if metaphors related to other senses also provoke an emotional response from the brain.

Filed under brain activity taste metaphorical expressions amygdala emotions psychology neuroscience science

228 notes

Those with episodic amnesia are not ‘stuck in time,’ says philosopher Carl Craver
In 1981, a motorcycle accident left Toronto native Kent Cochrane with severe brain damage and dramatically impaired episodic memory. Following the accident, Cochrane could no longer remember events from his past. Nor could he predict specific events that might happen in the future.
When neuroscientist Endel Tulving, PhD, asked him to describe what he would do tomorrow, Cochrane could not answer and described his state of mind as “blank.”
Psychologists and neuroscientists came to know Cochrane, who passed away earlier this year, simply as “KC.” Many scientists have described KC as “stuck in time,” or trapped in a permanent present.
It has generally been assumed that people with episodic amnesia experience time much differently than those with more typical memory function. 
However, a recent paper in Neuropsychologia co-authored by Carl F. Craver, PhD, professor of philosophy and of philosophy-neuroscience-psychology, both in Arts & Sciences at Washington University in St. Louis, disputes this type of claim.
“It’s our whole way of thinking about these people that we wanted to bring under pressure,” Craver said. “There are sets of claims that sound empirical, like ‘These people are stuck in time.’ But if you ask, ‘Have you actually tested what they know about time?’ the answer is no.”
Time and consciousness
A series of experiments convinced Craver and his co-authors that although KC could not remember specific past experiences, he did in fact have an understanding of time and an appreciation of its significance to his life.
Interviews with KC by Craver and his colleagues revealed that KC retained much of what psychologists refer to as “temporal consciousness.” KC could order significant events from his life on a timeline, and he seemed to have complete mastery of central temporal concepts.
For example, KC understood that events in the past have already happened, that they influence the future, and that once they happen, they cannot be changed. 
He also knew that events in the future don’t remain in the future, but eventually become present. Even more interestingly, KC’s understanding of time influenced his decision-making.
If KC truly had no understanding of time, Craver argues, then he and others with his type of amnesia would act as if only the present mattered. Without understanding that present actions have future consequences or rewards, KC would have based his actions only upon immediate outcomes. However, this was not the case.
On a personality test, KC scored as low as possible on measures of hedonism, or the tendency to be a self-indulgent pleasure-seeker.
In systematic tests of his decision-making, carried out with WUSTL’s Len Green, PhD, professor of psychology, and Joel Myerson, PhD, research professor of psychology, and researchers at York University in Toronto, KC also showed that he was willing to trade a smaller, sooner reward for a larger, later reward.
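The smaller-sooner versus larger-later choices described here are standard delay-discounting trials, commonly modeled with a hyperbolic function that assigns a present value to a delayed reward. The sketch below is a generic illustration of that model, not the analysis from the KC study; the discounting parameter k is invented:

```python
# Hypothetical sketch of the choice modeled in delay-discounting tasks:
# a hyperbolic model V = A / (1 + k * D) gives the present value of an
# amount A delayed by D days. The value of k here is invented.
def present_value(amount, delay_days, k=0.01):
    return amount / (1 + k * delay_days)

# Smaller-sooner ($50 now) vs. larger-later ($100 in 30 days):
sooner = present_value(50, 0)
later = present_value(100, 30)
print(later > sooner)  # with this k, the larger-later reward is preferred
```

Choosing the larger-later option, as KC did, implies a relatively shallow discounting of the future — the opposite of the present-bound behavior one would expect from someone with no grasp of time.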
In other words, KC’s inability to remember past events did not affect his ability to appreciate the value of future rewards. 
‘Questions are now wide open’
KC’s case reveals how much is left to discover about memory and how it relates to human understanding of time.
“If you think about memory long enough it starts to sound magical,” Craver said. “How is it that we can replay these events from our lives? And what’s going on in our brains that allows us to re-experience these events from our past?”
Craver hopes that this article — the last to be published about KC during his lifetime — brings these types of questions to the forefront. 
“These findings open up a whole new set of questions about people with amnesia,” Craver said. “Things that we previously thought were closed questions are now wide open.”

Filed under amnesia episodic memory consciousness time perception psychology neuroscience science

77 notes

Hormones affect voting behavior
Researchers from the University of Nebraska at Omaha (UNO), the University of Nebraska-Lincoln (UNL) and Rice University have released a study that shows hormone levels can affect voter turnout.
As witnessed by recent turnout in primary elections, participation in U.S. national elections is low relative to other western democracies. In fact, voter turnout in biennial national elections ranges from only 40 to 60 percent of eligible voters.
The study, published June 22 in Physiology and Behavior, reports that while participation in electoral politics is affected by a host of social and demographic variables, biological factors may play a role as well. Specifically, the paper points to low levels of the stress hormone cortisol as a strong predictor of actual voting behavior, determined via voting records maintained by the Secretary of State.
"Politics and political participation is an inherently stressful activity," explained the paper&#8217;s lead author, Jeff French, Varner Professor of Psychology and Biology and director of UNO&#8217;s neuroscience program. "It would logically follow that those individuals with low thresholds for stress might avoid engaging in that activity and our study confirmed that hypothesis."
Additional authors on the paper are Adam Guck and Andrew K. Birnie from UNO&#8217;s Department of Psychology; Kevin B. Smith and John R. Hibbing from UNL&#8217;s Department of Political Science; and John R. Alford from the Department of Political Science at Rice University.
The study is part of a larger body of research exploring connections between biology and political orientation, led by Smith and Hibbing. Previous studies have involved twins, eye-tracking equipment and skin conductance in their efforts to identify physical and genetic links to political beliefs.
"It&#8217;s one more piece of solid evidence that there are biological markers for political attitudes and behavior," said Smith. "It&#8217;s long been known that cortisol levels are associated with your willingness to interact socially – that&#8217;s something fairly well established in the research literature. The big contribution here is that nobody really looked at politics and voting behaviors before."
"This research shows that cortisol is related to a willingness to participate in politics," he said.
To reach their conclusion, researchers collected the saliva of over 100 participants who identified themselves as highly conservative, highly liberal or disinterested in politics altogether and analyzed the levels of cortisol found.
Cortisol was measured in saliva collected from the participants before and during activities designed to raise and lower stress. These data were then compared against the participants&#8217; earlier responses regarding involvement in political activities (voting and nonvoting) and religious participation.
"Not only did the study show, expectedly, that high-stress activities led to higher levels of cortisol production, but that political participation was significantly correlated with low baseline levels of cortisol," French explained. "Participation in another group-oriented activity, specifically religious participation, was not as strongly associated with cortisol levels. Involvement in nonvoting political activities, such as volunteering for a campaign, financial political contributions, or correspondence with elected officials, was not predicted by levels of stress hormones."
According to the study, the only other factor that was predictive of voting behavior was age; older adults were likely to have voted more often than younger adults. Research from other groups has also pointed to education, income, and race as important predictors of voting behavior.
In explaining why elevated cortisol could be linked with lower rates of participation in elections, French cited previous experiments in which high levels of afternoon cortisol are linked to major depressive disorder, social withdrawal, separation anxiety and enhanced memory for fearful stimuli.
"High afternoon cortisol is reflective of a variety of social, cognitive, and emotional processes, and may also influence a trait as complex as voting behavior," French suggested.
"The key takeaway from this research, I believe, is that while social scientists have spent decades trying to predict voting behavior based on demographic information, there is much to be learned from looking at biological differences as well," he said. "Many factors influence the decision to participate in the most important political activity in our democracy, and our study demonstrates that stress physiology is an important biological factor in this decision. Our experiment helps to more fully explain why some people engage in electoral politics and others do not."

Hormones affect voting behavior

Researchers from the University of Nebraska at Omaha (UNO), the University of Nebraska-Lincoln (UNL) and Rice University have released a study that shows hormone levels can affect voter turnout.

As witnessed by recent voter turnout in primary elections, participation in U.S. national elections is low relative to other Western democracies. In fact, turnout in biennial national elections ranges from only 40 to 60 percent of eligible voters.

The study, published June 22 in Physiology and Behavior, reports that while participation in electoral politics is affected by a host of social and demographic variables, biological factors may also play a role. Specifically, the paper points to low levels of the stress hormone cortisol as a strong predictor of actual voting behavior, determined via voting records maintained by the Secretary of State.

"Politics and political participation is an inherently stressful activity," explained the paper’s lead author, Jeff French, Varner Professor of Psychology and Biology and director of UNO’s neuroscience program. "It would logically follow that those individuals with low thresholds for stress might avoid engaging in that activity and our study confirmed that hypothesis."

Additional authors on the paper are Adam Guck and Andrew K. Birnie from UNO’s Department of Psychology; Kevin B. Smith and John R. Hibbing from UNL’s Department of Political Science; and John R. Alford from the Department of Political Science at Rice University.

The study is part of a larger body of research exploring connections between biology and political orientation, led by Smith and Hibbing. Previous studies have involved twins, eye-tracking equipment and skin conductance in their efforts to identify physical and genetic links to political beliefs.

"It’s one more piece of solid evidence that there are biological markers for political attitudes and behavior," said Smith. "It’s long been known that cortisol levels are associated with your willingness to interact socially – that’s something fairly well established in the research literature. The big contribution here is that nobody really looked at politics and voting behaviors before."

"This research shows that cortisol is related to a willingness to participate in politics," he said.

To reach their conclusion, the researchers collected saliva from more than 100 participants who identified themselves as highly conservative, highly liberal or disinterested in politics altogether, and analyzed the cortisol levels in each sample.

Cortisol was measured in saliva collected from the participants before and during activities designed to raise and lower stress. These data were then compared against the participants’ earlier responses regarding involvement in political activities (voting and nonvoting) and religious participation.
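The core of the analysis described above is a correlation between baseline cortisol and a measure of past voting. A minimal sketch of that kind of calculation is below; the `pearson_r` helper and all numbers are invented for illustration and are not data from the study.

```python
# Hypothetical sketch: correlating baseline cortisol with voting frequency,
# in the spirit of the analysis described above. All values are made up.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Fabricated example: baseline salivary cortisol (ug/dL) vs. number of
# recent elections each participant voted in (from voting records).
baseline_cortisol = [0.10, 0.15, 0.22, 0.30, 0.41]
elections_voted = [6, 5, 4, 2, 1]

r = pearson_r(baseline_cortisol, elections_voted)
print(f"Pearson r = {r:.2f}")  # negative r: higher cortisol, fewer votes
```

In practice such a study would use a statistics package (and control for age and other covariates), but the sign of a simple correlation like this captures the headline finding: lower baseline cortisol going with more frequent voting.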

"Not only did the study show, expectedly, that high-stress activities led to higher levels of cortisol production, but that political participation was significantly correlated with low baseline levels of cortisol," French explained. "Participation in another group-oriented activity, specifically religious participation, was not as strongly associated with cortisol levels. Involvement in nonvoting political activities, such as volunteering for a campaign, financial political contributions, or correspondence with elected officials, was not predicted by levels of stress hormones."

According to the study, the only other factor that was predictive of voting behavior was age; older adults were likely to have voted more often than younger adults. Research from other groups has also pointed to education, income, and race as important predictors of voting behavior.

In explaining why elevated cortisol could be linked with lower rates of participation in elections, French cited previous experiments in which high levels of afternoon cortisol are linked to major depressive disorder, social withdrawal, separation anxiety and enhanced memory for fearful stimuli.

"High afternoon cortisol is reflective of a variety of social, cognitive, and emotional processes, and may also influence a trait as complex as voting behavior," French suggested.

"The key takeaway from this research, I believe, is that while social scientists have spent decades trying to predict voting behavior based on demographic information, there is much to be learned from looking at biological differences as well," he said. "Many factors influence the decision to participate in the most important political activity in our democracy, and our study demonstrates that stress physiology is an important biological factor in this decision. Our experiment helps to more fully explain why some people engage in electoral politics and others do not."

