Neuroscience

Articles and news from the latest research reports.

Posts tagged attention

67 notes

Staying On Task in the Automated Cockpit

Automation in the cockpit is traditionally believed to free pilots’ attention from mundane flight tasks and allow them to focus on the big picture or prepare for any unexpected events during flight. However, a new study published in Human Factors indicates that pilots may have a hard time concentrating on the automated systems that now carry out many of the tasks once completed by humans.

“The automated systems in today’s cockpits assume many of the tasks formerly performed by human pilots and do it with impressive reliability,” says Stephen Casner, coauthor of “Thoughts in Flight: Automation Use and Pilots’ Task-Related and Task-Unrelated Thought” and research psychologist at NASA’s Ames Research Center. “This leaves pilots to watch over the automation as it does its work, but people can only concentrate on something uneventful for so long. Humans aren’t robots. We can’t stare at a green light for hours at a stretch without getting tired, bored, or going crazy.”

Researchers Casner and coauthor Jonathan Schooler designed a flight simulation study in which they asked pilots to follow a published arrival procedure into New York’s busy John F. Kennedy International Airport. As the pilots navigated the flight, they were asked what they were thinking at various levels of automation and to assign their thoughts to one of three categories: the specific task at hand, higher-level thoughts (e.g., planning ahead), or thoughts unrelated to the flight (e.g., what’s for dinner).

The pilots reported an increase in big-picture flight-related thoughts when using higher levels of automation, but when the flight was progressing according to plan and pilots were not interacting with the automation, their thoughts were more likely to wander.

“The mind is restless,” says Schooler, a professor of psychological and brain sciences at the University of California, Santa Barbara. “When we’re not given something specific to think about, we come up with something else to think about.”

“Pilots limited their off-task thoughts to times in which the automation was doing the flying and all was going according to plan,” adds Casner. “Nevertheless, there seem to be potential costs to situations in which pilots disengage from a highly-automated task. What happens when something suddenly goes amiss after long periods of uneventful flight?”

The study’s authors concluded that although automation frees pilots’ minds from tedious tasks and enables them to focus on the overall flight, it might inadvertently encourage them to devote time to unrelated thoughts. Casner notes that on the basis of these findings, researchers studying cockpit automation might consider rethinking the interaction between humans and machines.

“As technology grows in capability, we seem to be taking the approach of using humans as safety nets for computers,” he says. “We need to sort out the strengths and weaknesses of both humans and computers and think of work environments that combine and exploit the best features of both to keep humans meaningfully engaged in their work.”

(Source: hfes.org)

Filed under attention cockpit automation mind wandering awareness psychology neuroscience science

203 notes

Study: People Pay More Attention to the Upper Half of Field of Vision

A new study from North Carolina State University and the University of Toronto finds that people pay more attention to the upper half of their field of vision – a finding which could have ramifications for everything from traffic signs to software interface design.

“Specifically, we tested people’s ability to quickly identify a target amidst visual clutter,” says Dr. Jing Feng, an assistant professor of psychology at NC State and lead author of a paper on the work. “Basically, we wanted to see where people concentrate their attention at first glance.”

Researchers had participants fix their eyes on the center of a computer screen, and then flashed a target and distracting symbols onto the screen for 10 to 80 milliseconds. The screen was then replaced by an unconnected “mask” image to disrupt their train of thought. Participants were asked to indicate where the target had been located on the screen.

Researchers found that people were 7 percent better at finding the target when it was located in the upper half of the screen.

“It doesn’t mean people don’t pay attention to the lower field of vision, but they were demonstrably better at paying attention to the upper field,” Feng says.

“A difference of 7 percent could make a significant difference for technologies that are safety-related or that we interact with on a regular basis,” Feng says. “For example, this could make a difference in determining where to locate traffic signs to make them more noticeable to drivers, or where to place important information on a website to highlight that information for users.”

The paper, “Upper Visual Field Advantage in Localizing a Target among Distractors,” is published online in the open-access journal i-Perception. The paper was co-authored by Dr. Ian Spence of the University of Toronto. The work was supported, in part, by the Natural Sciences and Engineering Research Council of Canada.

Filed under attention spatial attention vision visual field psychology neuroscience science

216 notes

ADHD Drug May Help Preserve Our Self-Control Resources

Methylphenidate, also known as Ritalin, may prevent the depletion of self-control, according to research published in Psychological Science, a journal of the Association for Psychological Science.

Self-control can be difficult — sticking with a diet or trying to focus attention on a boring textbook are hard things to do. Considerable research suggests one potential explanation for this difficulty: Exerting self-control for a long period seems to “deplete” our ability to exert self-control effectively on subsequent tasks.

“It is as if self-control is a limited resource that ‘runs out’ if it is used too much,” says lead researcher Chandra Sripada of the University of Michigan. “If we could figure out the brain mechanisms that cause regulatory depletion, then maybe we could find a way to prevent it.”

Previous research has implicated the neurotransmitters dopamine and norepinephrine in regulatory processing. Sripada and University of Michigan collaborators Daniel Kessler and John Jonides decided to see whether manipulating levels of these transmitters might affect regulatory depletion.

The researchers tested 108 adult participants, all of whom took a drug capsule 60 minutes prior to testing. Half of the participants received a capsule that contained methylphenidate, a medication used to treat ADHD that increases brain dopamine and norepinephrine. The other half received a placebo capsule. The study was double-blind, so neither the participants nor the researchers knew at the time of testing who had received which capsule.

The participants then completed a computer-based task in which they were required to press a button when a word containing the letter e appeared on screen. Some were given modified instructions that asked them to refrain from pressing the button if the letter e was adjacent to, or one letter away from, another vowel — this version of the task was designed to tax participants’ self-control.

All of the participants then completed a second computer task aimed at testing their ability to process competing information and exert regulatory control in order to make a correct response.

In line with the researchers’ hypotheses, participants who received the placebo and performed the taxing version of the first task showed greater variability in how quickly they responded in the second task, compared to those whose self-control hadn’t been depleted in the first task.

But for those participants who took the methylphenidate capsule, the first task didn’t have an effect on later performance — the methylphenidate seemed to counteract the self-regulatory depletion incurred by the harder version of the first task.

“These results indicate that depletion of self-control due to prior effort can be fully blocked pharmacologically,” says Sripada. “The task we give people to deplete their self-control is pretty cognitively demanding, so we were surprised at how effective methylphenidate was in blocking depletion of self-control.”

Sripada and colleagues suggest that methylphenidate may help to boost performance of the specific circuits in the brain’s prefrontal cortex that are normally compromised after sustained exertion of self-control.

This doesn’t mean, however, that those of us looking to boost our self-control should go out and get some Ritalin:

“Methylphenidate is a powerful psychotropic medicine that should only be taken with a prescription,” says Sripada. “We want to use this research to better understand the brain mechanisms that lead to depletion of self-control, and what interventions — pharmacological or behavioral — might prevent this.”

Filed under ADHD methylphenidate self-control cognitive control attention psychology neuroscience science

148 notes

Paying closer attention to attention

Ellen’s (not her real name) adoptive parents weren’t surprised when the school counselor suggested that she might have attention deficit hyperactivity disorder (ADHD).

Several professionals had made this suggestion over the years. Given that homework led to one explosion after another, and that at school Ellen, who is eleven, spent her days jiggling up and down in her seat, unable to concentrate for more than ten minutes, it seemed a reasonable assumption. Yet her parents always felt that ADHD didn’t quite capture the extent of Ellen’s issues. Fortunately the school counselor was familiar with fetal alcohol spectrum disorder (FASD). When she learned that Ellen’s birth mother had consumed alcohol during pregnancy, she raised the possibility that Ellen’s problems could be attributable to FASD and referred her for further assessment.

It’s a familiar story, and most of us reading about Ellen would assume that she did indeed suffer from ADHD.

But now researchers from McGill have suggested that attention problems may be overreported in children with FASD, simply because parents and teachers are using a misplaced basis for comparison: children with FASD are being tested against and compared with children of the same chronological age, rather than with children of the same mental age, which is often quite a lot younger.

“Because the link between fetal alcohol syndrome and ADHD is so commonly described in the literature, both parents and teachers are more likely to expect these children to have attention problems,” says Prof. Jacob Burack, a professor in McGill’s Dept. of Educational and Counselling Psychology and the senior author on a recent study on the subject. “But what teachers often don’t recognize is that although the child they are dealing with is eleven years old in chronological terms, they are actually functioning at the developmental age of an eight-year-old. That’s a pretty big difference. And when you use mental age as the basis of comparison, many of the attention problems that have been described in children with FASD no longer seem of primary importance.”

The researchers recruited children with FASD whose average chronological age was just under twelve years old. But their average mental age, determined by standard tests, was actually closer to nine-and-a-half years old. (The children were recruited through the Asante Centre for Fetal Alcohol Syndrome in British Columbia, and though the number of children studied may appear small, this is a fairly typical size for studies on FASD, given the difficulties of the diagnostic process.)

These children were then compared with children who were developing typically and whose average chronological age was about eight-and-a-half years old and whose average mental age was similar to that of the group of children diagnosed with FASD.

Using tests that measure specific aspects of attention, the researchers compared the performance of children with FASD with the results of children of the same mental age. What they found was that while children like Ellen had difficulties with certain kinds of attention skills, notably in shifting attention from one object to another, there were other areas, such as focus, where they had no significant difficulties at all. To borrow a hockey analogy: these children would typically have no difficulty focusing on the puck in the arena, but would have problems following the puck as it is passed from one player to another.

This suggests to Dr. Kimberly Lane, the PhD student who conducted the research, that there is a need to develop a more nuanced understanding of attention skills. “We use words like attention loosely, but it’s really an umbrella term that covers various aspects of attending to different people or events or environments,” says Dr. Lane. “By using more complex assessment techniques of various aspects of attention it will be possible to get a better picture of the attention difficulties faced by children with FASD,” she adds.

“But no matter what the tests say, it’s important for teachers and parents to understand that the difficulties these children have with attention may be less important than their more general problems, and we need to work with them as they are.”

(Source: mcgill.ca)

Filed under attention disorders FASD selective attention attention neuroscience science

510 notes

Scientists discover brain’s anti-distraction system

Two Simon Fraser University psychologists have made a brain-related discovery that could revolutionize doctors’ perception and treatment of attention-deficit disorders.

This discovery opens up the possibility that environmental and/or genetic factors may hinder or suppress a specific brain activity that the researchers have identified as helping us prevent distraction.

The Journal of Neuroscience has just published a paper about the discovery by John McDonald, an associate professor of psychology, and his doctoral student John Gaspar, who made the discovery during his master’s thesis research.

This is the first study to reveal that our brains rely on an active suppression mechanism to avoid being distracted by salient irrelevant information when we want to focus on a particular item or task.

McDonald, a Canada Research Chair in Cognitive Neuroscience, and other scientists first discovered the existence of the specific neural index of suppression in his lab in 2009. But, until now, little was known about how it helps us ignore visual distractions.

“This is an important discovery for neuroscientists and psychologists because most contemporary ideas of attention highlight brain processes that are involved in picking out relevant objects from the visual field. It’s like finding Waldo in a Where’s Waldo illustration,” says Gaspar, the study’s lead author.

“Our results show clearly that this is only one part of the equation and that active suppression of the irrelevant objects is another important part.”

Given the proliferation of distracting consumer devices in our technology-driven, fast-paced society, the psychologists say their discovery could help scientists and health care professionals better treat individuals with distraction-related attentional deficits.

“Distraction is a leading cause of injury and death in driving and other high-stakes environments,” notes McDonald, the study’s senior author. “There are individual differences in the ability to deal with distraction. New electronic products are designed to grab attention. Suppressing such signals takes effort, and sometimes people can’t seem to do it.

“Moreover, disorders associated with attention deficits, such as ADHD and schizophrenia, may turn out to be due to difficulties in suppressing irrelevant objects rather than difficulty selecting relevant ones.”

The researchers are now turning their attention to understanding how we deal with distraction. They’re looking at when and why we can’t suppress potentially distracting objects, whether some of us are better at doing so and why that is the case.

“There’s evidence that attentional abilities decline with age and that women are better than men at certain visual attentional tasks,” says Gaspar, the study’s first author.

The study was based on three experiments in which 47 students (mean age 21) performed an attention-demanding visual search task. The researchers studied their neural processes related to attention, distraction and suppression by recording electrical brain signals from sensors embedded in a cap the participants wore.

Filed under attention disorders attention distraction EEG psychology neuroscience science

204 notes

How the brain pays attention

Neuroscientists identify a brain circuit that’s key to shifting our focus from one object to another.

Picking out a face in the crowd is a complicated task: Your brain has to retrieve the memory of the face you’re seeking, then hold it in place while scanning the crowd, paying special attention to finding a match.

A new study by MIT neuroscientists reveals how the brain achieves this type of focused attention on faces or other objects: A part of the prefrontal cortex known as the inferior frontal junction (IFJ) controls visual processing areas that are tuned to recognize a specific category of objects, the researchers report in the April 10 online edition of Science.

Scientists know much less about this type of attention, known as object-based attention, than spatial attention, which involves focusing on what’s happening in a particular location. However, the new findings suggest that these two types of attention have similar mechanisms involving related brain regions, says Robert Desimone, the Doris and Don Berkey Professor of Neuroscience, director of MIT’s McGovern Institute for Brain Research, and senior author of the paper.

“The interactions are surprisingly similar to those seen in spatial attention,” Desimone says. “It seems like it’s a parallel process involving different areas.”

In both cases, the prefrontal cortex — the control center for most cognitive functions — appears to take charge of the brain’s attention and control relevant parts of the visual cortex, which receives sensory input. For spatial attention, that involves regions of the visual cortex that map to a particular area within the visual field.

In the new study, the researchers found that IFJ coordinates with a brain region that processes faces, known as the fusiform face area (FFA), and a region that interprets information about places, known as the parahippocampal place area (PPA). The FFA and PPA were first identified in the human cortex by Nancy Kanwisher, the Walter A. Rosenblith Professor of Cognitive Neuroscience at MIT.

The IFJ has previously been implicated in a cognitive ability known as working memory, which is what allows us to gather and coordinate information while performing a task — such as remembering and dialing a phone number, or doing a math problem.

For this study, the researchers used magnetoencephalography (MEG) to scan human subjects as they viewed a series of overlapping images of faces and houses. Unlike functional magnetic resonance imaging (fMRI), which is commonly used to measure brain activity, MEG can reveal the precise timing of neural activity, down to the millisecond. The researchers presented the overlapping streams at two different rhythms — two images per second and 1.5 images per second — allowing them to identify brain regions responding to those stimuli.

“We wanted to frequency-tag each stimulus with different rhythms. When you look at all of the brain activity, you can tell apart signals that are engaged in processing each stimulus,” says Daniel Baldauf, a postdoc at the McGovern Institute and the lead author of the paper.
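The frequency-tagging idea Baldauf describes can be illustrated with a toy simulation (this is not the study’s actual analysis pipeline, and the amplitudes and noise level here are made up): when two stimulus streams flicker at different rates, the brain responses they drive show up as separate peaks in the power spectrum of a single combined recording, so the two signals can be told apart.

```python
# Toy illustration of frequency tagging: two simulated responses,
# one entrained at 2 Hz and one at 1.5 Hz, are mixed with noise,
# and a spectral analysis recovers a distinct peak at each rate.
import numpy as np

fs = 250.0                    # sampling rate in Hz (illustrative choice)
t = np.arange(0, 40, 1 / fs)  # 40 seconds of simulated recording

rng = np.random.default_rng(0)
faces = 1.0 * np.sin(2 * np.pi * 2.0 * t)   # stream tagged at 2 Hz
houses = 0.8 * np.sin(2 * np.pi * 1.5 * t)  # stream tagged at 1.5 Hz
signal = faces + houses + rng.normal(0.0, 1.0, t.size)  # mixed + noise

# Power spectrum of the combined signal
spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(t.size, 1 / fs)

# Each tagged frequency stands out as a clear peak
for f in (1.5, 2.0):
    idx = np.argmin(np.abs(freqs - f))
    print(f"power at {f} Hz: {spectrum[idx]:.0f}")
```

In the real experiment the analogous peaks are measured in MEG sensor data, and attention modulates which tagged response synchronizes with the IFJ.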
Each subject was told to pay attention to either faces or houses; because the houses and faces were in the same spot, the brain could not use spatial information to distinguish them. When the subjects were told to look for faces, activity in the FFA and the IFJ became synchronized, suggesting that they were communicating with each other. When the subjects paid attention to houses, the IFJ synchronized instead with the PPA.

The researchers also found that the communication was initiated by the IFJ and the activity was staggered by 20 milliseconds — about the amount of time it would take for neurons to electrically convey information from the IFJ to either the FFA or PPA. The researchers believe that the IFJ holds onto the idea of the object that the brain is looking for and directs the correct part of the brain to look for it.

The MEG scanner, as well as the study’s “elegant design,” were critical to discovering this relationship, says Robert Knight, a professor of psychology and neuroscience at the University of California at Berkeley who was not part of the research team.

“Functional MRI gives hints of connectivity,” Knight says, “but the time course is way too slow to show these millisecond-scale frequencies and to establish what they show, which is that the inferior frontal lobe is the prime driver.”

Further bolstering this idea, the researchers used an MRI-based method to measure the white matter that connects different brain regions and found that the IFJ is highly connected with both the FFA and PPA.

Members of Desimone’s lab are now studying how the brain shifts its focus between different types of sensory input, such as vision and hearing. They are also investigating whether it might be possible to train people to better focus their attention by controlling the brain interactions involved in this process.

“You have to identify the basic neural mechanisms and do basic research studies, which sometimes generate ideas for things that could be of practical benefit,” Desimone says. “It’s too early to say whether this training is even going to work at all, but it’s something that we’re actively pursuing.”

How the brain pays attention

Neuroscientists identify a brain circuit that’s key to shifting our focus from one object to another.

Picking out a face in the crowd is a complicated task: Your brain has to retrieve the memory of the face you’re seeking, then hold it in place while scanning the crowd, paying special attention to finding a match.

A new study by MIT neuroscientists reveals how the brain achieves this type of focused attention on faces or other objects: A part of the prefrontal cortex known as the inferior frontal junction (IFJ) controls visual processing areas that are tuned to recognize a specific category of objects, the researchers report in the April 10 online edition of Science.

Scientists know much less about this type of attention, known as object-based attention, than spatial attention, which involves focusing on what’s happening in a particular location. However, the new findings suggest that these two types of attention have similar mechanisms involving related brain regions, says Robert Desimone, the Doris and Don Berkey Professor of Neuroscience, director of MIT’s McGovern Institute for Brain Research, and senior author of the paper.

“The interactions are surprisingly similar to those seen in spatial attention,” Desimone says. “It seems like it’s a parallel process involving different areas.”

In both cases, the prefrontal cortex — the control center for most cognitive functions — appears to take charge of the brain’s attention and control relevant parts of the visual cortex, which receives sensory input. For spatial attention, that involves regions of the visual cortex that map to a particular area within the visual field.

In the new study, the researchers found that IFJ coordinates with a brain region that processes faces, known as the fusiform face area (FFA), and a region that interprets information about places, known as the parahippocampal place area (PPA). The FFA and PPA were first identified in the human cortex by Nancy Kanwisher, the Walter A. Rosenblith Professor of Cognitive Neuroscience at MIT.  

The IFJ has previously been implicated in a cognitive ability known as working memory, which is what allows us to gather and coordinate information while performing a task — such as remembering and dialing a phone number, or doing a math problem.

For this study, the researchers used magnetoencephalography (MEG) to scan human subjects as they viewed a series of overlapping images of faces and houses. Unlike functional magnetic resonance imaging (fMRI), which is commonly used to measure brain activity, MEG can reveal the precise timing of neural activity, down to the millisecond. The researchers presented the overlapping streams at two different rhythms — two images per second and 1.5 images per second — allowing them to identify brain regions responding to those stimuli.

“We wanted to frequency-tag each stimulus with different rhythms. When you look at all of the brain activity, you can tell apart signals that are engaged in processing each stimulus,” says Daniel Baldauf, a postdoc at the McGovern Institute and the lead author of the paper.

Each subject was told to pay attention to either faces or houses; because the houses and faces were in the same spot, the brain could not use spatial information to distinguish them. When the subjects were told to look for faces, activity in the FFA and the IFJ became synchronized, suggesting that they were communicating with each other. When the subjects paid attention to houses, the IFJ synchronized instead with the PPA.

The researchers also found that the communication was initiated by the IFJ and the activity was staggered by 20 milliseconds — about the amount of time it would take for neurons to electrically convey information from the IFJ to either the FFA or PPA. The researchers believe that the IFJ holds onto the idea of the object that the brain is looking for and directs the correct part of the brain to look for it.

The MEG scanner, as well as the study’s “elegant design,” were critical to discovering this relationship, says Robert Knight, a professor of psychology and neuroscience at the University of California at Berkeley who was not part of the research team.

“Functional MRI gives hints of connectivity,” Knight says, “but the time course is way too slow to show these millisecond-scale frequencies and to establish what they show, which is that the inferior frontal lobe is the prime driver.”

Further bolstering this idea, the researchers used an MRI-based method to measure the white matter that connects different brain regions and found that the IFJ is highly connected with both the FFA and PPA.

Members of Desimone’s lab are now studying how the brain shifts its focus between different types of sensory input, such as vision and hearing. They are also investigating whether it might be possible to train people to better focus their attention by controlling the brain interactions involved in this process.

“You have to identify the basic neural mechanisms and do basic research studies, which sometimes generate ideas for things that could be of practical benefit,” Desimone says. “It’s too early to say whether this training is even going to work at all, but it’s something that we’re actively pursuing.”

Filed under inferior frontal junction attention object-based attention prefrontal cortex fusiform face area neuroscience science

256 notes

Dog watch - How attention changes in the course of a dog’s life
Dogs are known to be man’s best friend. No other pet has adjusted to the human lifestyle as well as this four-legged animal. Scientists at the Messerli Research Institute at the Vetmeduni Vienna have been the first to investigate how dogs’ attentiveness develops over the course of their lives and to what extent it resembles that of humans. The outcome: the developmental trajectories of dogs’ attentional and sensorimotor control are very similar to those found in humans. The results were published in the journal Frontiers in Psychology.
Dogs are individual personalities, possess awareness, and are particularly known for their learning capabilities, or trainability. To learn successfully, they must display a sufficient degree of attention and concentration. However, the attentiveness of dogs changes over the course of their lives, as it does in humans. Lead author Lisa Wallis and her colleagues investigated 145 Border Collies aged 6 months to 14 years in the Clever Dog Lab at the Vetmeduni Vienna and determined, for the first time, how attentiveness changes over a dog’s entire life using a cross-sectional study design.
Humans are more interesting for dogs than objects
To determine how rapidly dogs of various age groups pay attention to objects or humans, the scientists performed two tests. In the first situation the dogs were confronted with a child’s toy that suddenly appeared, suspended from the ceiling. The scientists measured how rapidly each dog reacted to this occurrence and how quickly the dogs became accustomed to it. Initially all dogs reacted with similar speed to the stimulus, but older dogs lost interest in the toy more rapidly than younger ones did.
In the second test situation, a person known to the dog entered the room and pretended to paint the wall. All dogs reacted by watching the person and the paint roller in the person’s hands for a longer duration than the toy hanging from the ceiling. 
Wallis’ conclusion: “So-called social attentiveness was more pronounced in all dogs than ‘non-social’ attentiveness. The dogs generally tended to react by watching the person with the object for longer than an object on its own. We found that older dogs - like older human beings - demonstrated a certain calmness. They were less affected by new items in the environment and thus showed less interest than younger dogs.”
Selective attention is highest in mid-adulthood
In a further test the scientists investigated so-called selective attention. The dogs participated in an alternating attention task in which they had to perform two tasks consecutively. First, they needed to find a food reward thrown onto the floor by the experimenter; then, after the dog ate the food, the experimenter waited for it to establish eye contact with her. These tasks were repeated for a further twenty trials. The establishment of eye contact was marked by a clicking sound produced by a “clicker”, and small pieces of hot dog were used as a reward. The times taken to find the food and to look up at the experimenter’s face were measured. On both measures, middle-aged dogs (3 to 6 years) reacted most rapidly.
Under these test conditions, sensorimotor abilities were highest among middle-aged dogs. Younger dogs probably fared more poorly because of their general lack of experience. Motor abilities deteriorate with age in dogs as in humans. “Humans between the ages of 20 and 39 experience a similar peak in sensorimotor abilities,” says Wallis.
Adolescent dogs have the steepest learning curve
Dogs also go through a difficult phase during adolescence (1-2 years) which affects their ability to pay attention. This phase of hormonal change may be compared to puberty in humans. As a result, young dogs occasionally reacted with some delay in the clicker test. However, Wallis found that adolescent dogs improved their performance more rapidly than other age groups after several repetitions of the clicker test. In other words, the learning curve was steepest in puberty. “Thus, dogs in puberty have great potential for learning and therefore trainability,” says Wallis.
Dogs as a model for ADHD and Alzheimer’s disease
As the development of attentiveness in the course of a dog’s life is similar to human development in many respects, dogs make appropriate animal models for various human psychological diseases. For instance, the course of diseases like ADHD (attention deficit/hyperactivity disorder) or Alzheimer’s can be studied by observing the behavior of dogs. In her current project Wallis is investigating the effects of diet on cognition in older dogs together with her colleague Durga Chapagain. The scientists are still looking for dog owners who would like to participate in a long-term study.

Filed under attention learning social attentiveness dogs aging animal model psychology neuroscience science

97 notes

Detecting Unidentified Changes
Does becoming aware of a change to a purely visual stimulus necessarily cause the observer to be able to identify or localise the change, or can change detection occur in the absence of identification or localisation? Several theories of visual awareness stress that we are aware of more than just the few objects to which we attend. In particular, it is clear that to some extent we are also aware of the global properties of the scene, such as the mean luminance or the distribution of spatial frequencies. It follows that we may be able to detect a change to a visual scene by detecting a change to one or more of these global properties. However, detecting a change to a global property may not supply us with enough information to accurately identify or localise which object in the scene has been changed. Thus, it may be possible to reliably detect the occurrence of changes without being able to identify or localise what has changed. Previous attempts to show that this can occur with natural images have produced mixed results. Here we use a novel analysis technique to provide additional evidence that changes can be detected in natural images without also being identified or localised. It is likely that this occurs by the observers monitoring the global properties of the scene.
Full Article
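The core idea — that a change can be flagged by a global scene statistic without being localised — can be sketched in a few lines. The toy example below uses assumed images and an assumed detection criterion, not the study's stimuli or method: it detects a change in mean luminance caused by brightening a small patch at an unknown location.

```python
import random

# Toy demonstration: a change is detected from a global statistic (mean
# luminance) without localising it. Images and thresholds are assumptions.
random.seed(4)
W = H = 64

def noise_image():
    return [[random.random() for _ in range(W)] for _ in range(H)]

def mean_luminance(img):
    return sum(sum(row) for row in img) / (W * H)

original = noise_image()
changed = [row[:] for row in original]
# Brighten an 8x8 patch at a location the "observer" never inspects.
for y in range(20, 28):
    for x in range(40, 48):
        changed[y][x] = min(1.0, changed[y][x] + 0.5)

delta = abs(mean_luminance(changed) - mean_luminance(original))
CRITERION = 0.001  # assumed detection threshold on the global statistic
print(delta > CRITERION)  # change detected, though not identified or localised
```

The comparison uses only a single summary number per image, so it can say *that* something changed while carrying no information about *what* or *where* — the dissociation the abstract describes.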

Filed under attention blindness visual awareness eye movements visual perception psychology neuroscience science

137 notes

Ever-So-Slight Delay Improves Decision-Making Accuracy
Columbia University Medical Center (CUMC) researchers have found that decision-making accuracy can be improved by postponing the onset of a decision by a mere fraction of a second. The results could further our understanding of neuropsychiatric conditions characterized by abnormalities in cognitive function and lead to new training strategies to improve decision-making in high-stakes environments. The study was published in the March 5 online issue of the journal PLoS One.
“Decision making isn’t always easy, and sometimes we make errors on seemingly trivial tasks, especially if multiple sources of information compete for our attention,” said first author Tobias Teichert, PhD, a postdoctoral research scientist in neuroscience at CUMC at the time of the study and now an assistant professor of psychiatry at the University of Pittsburgh. “We have identified a novel mechanism that is surprisingly effective at improving response accuracy.”
The mechanism requires that decision-makers do nothing—just briefly. “Postponing the onset of the decision process by as little as 50 to 100 milliseconds enables the brain to focus attention on the most relevant information and block out irrelevant distractors,” said last author Jack Grinband, PhD, associate research scientist in the Taub Institute and assistant professor of clinical radiology (physics). “This way, rather than working longer or harder at making the decision, the brain simply postpones the decision onset to a more beneficial point in time.”
In making decisions, the brain integrates many small pieces of potentially contradictory sensory information. “Imagine that you’re coming up to a traffic light—the target—and need to decide whether the light is red or green,” said Dr. Teichert. “There is typically little ambiguity, and you make the correct decision quickly, in a matter of tens of milliseconds.”
The decision process itself, however, does not distinguish between relevant and irrelevant information. Hence, a task is made more difficult if irrelevant information—a distractor—interferes with the processing of the target. Distractors are present all the time; in this case, it might be in the form of traffic lights regulating traffic in other lanes. Though the brain is able to enhance relevant information and filter out distractions, these mechanisms take time. If the decision process starts while the brain is still processing irrelevant information, errors can occur.
Studies have shown that response accuracy can be improved by prolonging the decision process, to allow the brain time to collect more information. Because accuracy is increased at the cost of longer reaction times, this process is referred to as the “speed-accuracy trade-off.” The researchers thought that a more effective way to reduce errors might be to delay the decision process so that it starts out with better information.
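A toy accumulator model makes the intuition concrete. The simulation below is our own illustration, not the authors' model: evidence drifts the wrong way while a distractor dominates the first 100 milliseconds, so starting integration after a short delay skips the contaminated period and raises accuracy without prolonging the decision itself.

```python
import random

# Toy evidence-accumulation simulation (our own illustration, not the
# authors' model). During the first DISTRACTOR_MS the drift pushes the
# wrong way; delaying integration onset skips that contaminated period.
random.seed(2)
DT = 0.001           # 1 ms time steps
DISTRACTOR_MS = 100  # distractor dominates early processing (assumed)
THRESHOLD = 1.0

def trial(onset_delay_ms):
    x = 0.0
    for t_ms in range(2000):          # up to 2 s per trial
        if t_ms < onset_delay_ms:
            continue                  # decision process not yet started
        drift = -4.0 if t_ms < DISTRACTOR_MS else 2.0
        x += drift * DT + random.gauss(0, 0.04)
        if x >= THRESHOLD:
            return True               # correct response
        if x <= -THRESHOLD:
            return False              # error
    return x > 0                      # forced choice at timeout

def accuracy(delay_ms, n_trials=500):
    return sum(trial(delay_ms) for _ in range(n_trials)) / n_trials

acc_immediate = accuracy(0)
acc_delayed = accuracy(100)
print(acc_delayed > acc_immediate)  # briefly doing nothing improves accuracy
```

Note the contrast with the speed-accuracy trade-off: the delayed condition does not integrate for longer, it simply starts integrating from cleaner evidence.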
The research team conducted two experiments to test this hypothesis. In the first, subjects were shown what looked like a swarm of randomly moving dots (the target stimulus) on a computer monitor and were asked to judge whether the overall motion was to the left or right. A second and brighter set of moving dots (the distractor) appeared simultaneously in the same location, obscuring the motion of the target. When the distractor dots moved in the same direction as the target dots, subjects performed with near-perfect accuracy, but when the distractor dots moved in the opposite direction, the error rate increased. The subjects were asked to perform the task either as quickly or as accurately as possible; they were free to respond at any time after the onset of the stimulus.
The second experiment was similar to the first, except that the subjects also heard regular clicks, indicating when they had to respond. The time allowed for viewing the dots varied between 17 and 500 milliseconds. This condition simulates real-life situations, such as driving, where the time to respond is beyond the driver’s control. “Manipulating how long the subject viewed the stimulus before responding allowed us to determine how quickly the brain is able to block out the distractors and focus on the target dots,” said Dr. Grinband.
“In this situation, it takes about 120 milliseconds to shift attention from one stimulus (the bright distractors) to another (the darker targets),” said Dr. Grinband. “To our knowledge, that’s something that no one has ever measured before.”
The experiments also revealed that it’s more beneficial to delay rather than prolong the decision process. The delay allows attention to be focused on the target stimulus and helps prevent irrelevant information from interfering with the decision process. “Basically, by delaying decision onset—simply by doing nothing—you are more likely to make a correct decision,” said Dr. Teichert.
Finally, the results showed that decision onset is, to some extent, under cognitive control. “The subjects automatically used this mechanism to improve response accuracy,” said Dr. Teichert. “However, we don’t think that they were aware that they were doing so. The process seems to go on behind the scenes. We hope to devise training strategies to bring the mechanism under conscious control.”
“This might be the first scientific study to justify procrastination,” Dr. Teichert said. “On a more serious note, our study provides important insights into fundamental brain processes and yields clues as to what might be going wrong in diseases such as ADHD and schizophrenia. It also could lead to new training strategies to improve decision making in complex high-stakes environments, such as air traffic control towers and military combat.”

Filed under decision making attention cognition psychology neuroscience science

898 notes

Patient in ‘vegetative state’ not just aware, but paying attention
Research raises possibility of devices in the future to help some patients in a vegetative state interact with the outside world.
A patient in a seemingly vegetative state, unable to move or speak, showed signs of attentive awareness that had not been detected before, a new study reveals. This patient was able to focus on words signalled by the experimenters as auditory targets as successfully as healthy individuals. If this ability can be developed consistently in certain patients who are vegetative, it could open the door to specialised devices in the future and enable them to interact with the outside world.
The research, by scientists at the Medical Research Council Cognition and Brain Sciences Unit (MRC CBSU) and the University of Cambridge, is published today, 31 October, in the journal Neuroimage: Clinical.
For the study, the researchers used electroencephalography (EEG), which non-invasively measures the electrical activity over the scalp, to test 21 patients diagnosed as vegetative or minimally conscious, and eight healthy volunteers. Participants heard a series of different words - one word a second over 90 seconds at a time - while being asked to attend alternately to either the word ‘yes’ or the word ‘no’, each of which appeared 15% of the time. (Some examples of the words used include moss, moth, worm and toad.) This was repeated several times over a period of 30 minutes to detect whether the patients were able to attend to the correct target word.
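The analysis logic of such an auditory oddball design can be sketched simply: average the responses evoked by each word type and test whether the instructed target stands out. The simulation below is illustrative only — the word probabilities follow the article, but the evoked amplitudes, noise, and block count are assumptions.

```python
import random

# Illustrative oddball simulation. Word probabilities (~15% 'yes', ~15% 'no')
# follow the article; evoked amplitudes and noise are assumptions.
random.seed(3)
FILLERS = ["moss", "moth", "worm", "toad"]

def make_stream(n_words=90):
    """One block of one word per second: 'yes' and 'no' at ~15% each."""
    stream = []
    for _ in range(n_words):
        r = random.random()
        if r < 0.15:
            stream.append("yes")
        elif r < 0.30:
            stream.append("no")
        else:
            stream.append(random.choice(FILLERS))
    return stream

def evoked_amplitude(word, attended):
    """Simulated single-trial response: attended targets get a boost."""
    return random.gauss(2.0 if word == attended else 0.0, 1.0)

amplitudes = {"yes": [], "no": []}
for block in range(10):  # repeated blocks over the ~30-minute session
    for word in make_stream():
        if word in amplitudes:
            amplitudes[word].append(evoked_amplitude(word, attended="yes"))

mean_yes = sum(amplitudes["yes"]) / len(amplitudes["yes"])
mean_no = sum(amplitudes["no"]) / len(amplitudes["no"])
print(mean_yes > mean_no)  # the attended word's average response stands out
```

A patient who can selectively attend should show a reliably larger averaged response to the instructed word than to the other word, which is the signature the EEG analysis looks for.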
They found that one of the vegetative patients was able to filter out unimportant information and home in on relevant words they were being asked to pay attention to. Using brain imaging (fMRI), the scientists also discovered that this patient could follow simple commands to imagine playing tennis. They also found that three other minimally conscious patients reacted to novel but irrelevant words, but were unable to selectively pay attention to the target word.
These findings suggest that some patients in a vegetative or minimally conscious state might in fact be able to direct attention to the sounds in the world around them.
Dr Srivas Chennu at the University of Cambridge, said: “Not only did we find the patient had the ability to pay attention, we also found independent evidence of their ability to follow commands – information which could enable the development of future technology to help patients in a vegetative state communicate with the outside world.
“In order to try and assess the true level of brain function and awareness that survives in the vegetative and minimally conscious states, we are progressively building up a fuller picture of the sensory, perceptual and cognitive abilities in patients. This study has added a key piece to that puzzle, and provided a tremendous amount of insight into the ability of these patients to pay attention.”
Dr Tristan Bekinschtein at the MRC Cognition and Brain Sciences Unit said: “Our attention can be drawn to something by its strangeness or novelty, or we can consciously decide to pay attention to it. A lot of cognitive neuroscience research tells us that we have distinct patterns in the brain for both forms of attention, which we can measure even when the individual is unable to speak. These findings mean that, in certain cases of individuals who are vegetative, we might be able to enhance this ability and improve their level of communication with the outside world.”
This study builds on a joint programme of research at the University of Cambridge and MRC CBSU where a team of researchers have been developing a series of diagnostic and prognostic tools based on brain imaging techniques since 1998. Famously, in 2006 the group was able to use fMRI imaging techniques to establish that a patient in a vegetative state could respond to yes or no questions by indicating different, distinct patterns of brain activity.

Filed under consciousness vegetative state neuroimaging attention brain mapping neuroscience science
