Posts tagged cognition

‘Sticky synapses’ can impair new memories by holding on to old ones
University of British Columbia researchers have discovered that so-called “sticky synapses” in the brain can impair new learning by excessively hard-wiring old memories and inhibiting our ability to adapt to our changing environment.
Memories are formed by strong synaptic connections between nerve cells. Now a team of UBC neuroscientists has found that synapses that are too strong or “sticky” can actually hinder our capacity to learn new things by affecting cognitive flexibility, the ability to modify our behaviours to adjust to circumstances that are similar, but not identical, to previous experiences.
“We tend to think that strong retention of memories is always a good thing,” says Fergil Mills, UBC PhD candidate and the study’s first author. “But our study shows that cognitive flexibility involves actively weakening old memory traces. In certain situations, you have to be able to ‘forget’ to learn.”
The study, published today in the Proceedings of the National Academy of Sciences, shows that mice with excessive beta-catenin – a protein that is part of the “molecular glue” that holds synapses together – can learn a task just as well as normal mice, but lack the mental dexterity to adapt if the task is altered.
“Increased levels of beta-catenin have previously been reported in disorders such as Alzheimer’s disease and Huntington’s disease, and, intriguingly, patients with these diseases have been shown to have deficits in cognitive flexibility similar to those we observed in this study,” says Shernaz Bamji, an associate professor in UBC’s Dept. of Cellular and Physiological Sciences.
“Now, we see that changes in beta-catenin levels can dramatically affect learning and memory, and may indeed play a role in the cognitive deficits associated with these diseases,” she adds. “This opens up many exciting new avenues for research into these diseases and potential therapeutic approaches.”
BACKGROUND
To test cognitive flexibility in mice, researchers conducted an experiment where the mice were placed in a pool of water and had to learn to find a submerged hidden platform. The mice with excessive beta-catenin could learn to find the platform just as well as normal mice. However, if the platform was moved to a different location in the pool, these mice kept swimming to the platform’s previous location. Even after many days of training, the ‘sticky synapses’ in their brains made them unable to effectively learn to find the new platform.
Children with profound deafness who received a cochlear implant had as much as five times the risk of delays in areas of working memory, controlled attention, planning and conceptual learning as children with normal hearing, according to Indiana University research published May 22 in JAMA Otolaryngology-Head & Neck Surgery.

The authors evaluated 73 children implanted before age 7 and 78 children with normal hearing to determine the risk of deficits in executive functioning behaviors in everyday life.
Executive functioning, a set of mental processes involved in regulating and directing thinking and behavior, is important for focusing and attaining goals in daily life. All children in the study had average to above-average IQ scores. The results, reported in “Neurocognitive Risk in Children With Cochlear Implants,” are the first from a large-scale study to compare real-world executive functioning behavior in children with cochlear implants and those with normal hearing.
A cochlear implant device consists of an external component that processes sound into electrical signals that are sent to an internal receiver and electrodes that stimulate the auditory nerve. Although the device restores the ability to perceive many sounds to children who are born deaf, some details and nuances of hearing are lost in the process.
First author William Kronenberger, Ph.D., professor of clinical psychology in psychiatry at the IU School of Medicine and a specialist in neurocognitive and executive function testing, said that delays in executive functioning have been commonly reported by parents and others who work with children with cochlear implants. Based on these observations, his group sought to evaluate whether elevated risks of delays in executive functioning in children with cochlear implants exist, and what components of executive functioning were affected.
"In this study, about one-third to one-half of children with cochlear implants were found to be at-risk for delays in areas of parent-rated executive functioning such as concept formation, memory, controlled attention and planning. This rate was 2 to 5 times greater than that seen in normal-hearing children," reported Dr. Kronenberger, who also is co-chief of the ADHD-Disruptive Behavior Disorders Clinic and directs the psychology testing clinic at Riley Hospital for Children at IU Health.
"This is really innovative work," said co-author David B. Pisoni, Ph.D., director of the Speech Research Laboratory in the IU Department of Psychological and Brain Sciences. "Almost no one has looked at these issues in these children. Most audiologists, neuro-otologists, surgeons and speech-language pathologists — the people who work in this field — focus on the hearing deficit as a medical condition and have been less focused on the important discoveries in developmental science and cognitive neuroscience." Dr. Pisoni also is a Chancellor’s Professor of Psychological and Brain Sciences at IU Bloomington.
Richard Miyamoto, M.D., chair of the IU School of Medicine Department of Otolaryngology-Head and Neck Surgery and a pioneer in the field of cochlear implantation in children and adults, said this finding augments other research on interventions to help children with cochlear implants perform at a level similar to children without hearing deficits.
"The ultimate goal of our department’s research with cochlear implants has always been to influence higher-level neurocognitive functioning," Dr. Miyamoto said. "Much of the success we have seen to date clearly relates to the brain’s ability to process an incomplete signal. The current research will further assist in identifying gaps in our knowledge."
One possible answer may lie in earlier implantation, Dr. Miyamoto said. The age at which children are implanted has been steadily decreasing, which has produced significant improvement in spoken language outcomes. Research shows the early implantation is related to better outcomes in speech and understanding, and it is reasonable to believe that there may be less of a deficit in executive functioning with earlier implantation, said Dr. Miyamoto, who is the Arilla Spence DeVault Professor of Otolaryngology-Head and Neck Surgery and medical director of audiology and speech language pathology at the IU School of Medicine.
Preschoolers in the IU study were implanted at an average age of 18 months, and they had fewer executive function delays than the school-age children, who had been implanted at an average age of 28 months, 10 months later.
Children in the study were divided into two age groups: preschool (3 to 5 years) and school-age (7 to 17 years). Using an established rating scale, parents rated executive function in everyday life for children with cochlear implants and for the control group with normal hearing.
"We compared parent ratings and looked at the percentage of children in each group who scored above a cut-off value that indicates at least a mild delay in executive functioning," Dr. Kronenberger said. "In the critical areas of controlled attention, working memory, planning and solving new problems, about 30 to 45 percent of the children with cochlear implants scored above the cut-off value, compared to about 15 percent or less of the children in the normal-hearing sample."
Dr. Kronenberger said the research also shows that many children develop average or better executive functioning skills after cochlear implantation.
"These results show that half or more of our group with cochlear implants did not have significant delays in executive functioning," Dr. Kronenberger said. "Cochlear implants produce remarkable gains in spoken language and other neurocognitive skills, but there is a certain amount of learning and catch-up that needs to take place with children who have experienced a hearing loss prior to cochlear implantation. So far, most of the interventions to help with this learning have focused on speech and language. Our findings show a need to identify and help some children in certain domains of executive functioning as well."
"We are now looking for early markers in children who are at risk before they get implants," Dr. Pisoni said. "It will be beneficial to identify as early as possible which children might be at risk for poor outcomes, and we need to understand the variability in the outcome and what can be done about it."
(Source: news.medicine.iu.edu)

(Image caption: In Greek mythology, Clotho – the eponym for the anti-aging factor klotho – is the Fate who spins the thread of life. Here, the goddess spins the metaphorical thread of life that is DNA, influencing lifespan and cognition. Illustration by Michael Griffin Kelley)
Better Cognition Seen with Gene Variant Carried by 1 in 5 People
A scientific team led by the Gladstone Institutes and UC San Francisco has discovered that a common form of a gene already associated with long life also improves learning and memory, a finding that could have implications for treating age-related diseases like Alzheimer’s.
The researchers found that people who carry a single copy of the KL-VS variant of the KLOTHO gene perform better on a wide variety of cognitive tests. When the researchers modeled the effects in mice, they found it strengthened the connections between neurons that make learning possible – what is known as synaptic plasticity – by increasing the action of a cell receptor critical to forming memories.
The discovery is a major step toward understanding how genes improve cognitive ability and could open a new route to treating diseases like Alzheimer’s. Researchers have long suspected that some people may be protected from the disease because of their greater cognitive capacity, or reserve. Since elevated levels of the klotho protein appear to improve cognition throughout the lifespan, raising klotho levels could build cognitive reserve as a bulwark against the disease.
“As the world’s population ages, cognitive frailty is our biggest biomedical challenge,” said Dena Dubal, MD, PhD, assistant professor of neurology, the David A. Coulter Endowed Chair in Aging and Neurodegeneration at UCSF and lead author of the study, published May 8 in Cell Reports. “If we can understand how to enhance brain function, it would have a huge impact on people’s lives.”
First to Link Klotho Variant to Better Cognition
Klotho was discovered in 1997 and named after the Fate from Greek mythology who spins the thread of life.
The investigators found that people who carry a single copy of the KL-VS variant of the KLOTHO gene, roughly 20 percent of the population, have more klotho protein in their blood than non-carriers. Besides increasing the secretion of klotho, the KL-VS variant may also change the action of the protein and is known to lessen age-related cardiovascular disease and promote longevity.
The team’s report is the first to link the KL-VS variant, or allele, to better cognition in humans, and buttresses these findings with genetic, electrophysiological, biochemical and behavioral experiments in mice.
The researchers tested the associations between the allele and age-related human cognition in three separate studies involving more than 700 people without dementia between the ages of 52 and 85. Altogether, it took about three years to conduct the work.
“These surprising results pave a promising new avenue of research,” said Roderick Corriveau, PhD, program director at NIH’s National Institute of Neurological Disorders and Stroke (NINDS). “Although preliminary, they suggest klotho could be used to bump up cognition for people suffering from dementia.”
Learning Better at All Stages of Life
Having the KL-VS allele did not seem to protect people from age-related cognitive decline. But overall the effect was to boost cognition, so that the middle-aged study participants began their decline from a higher point.
“Based on what was known about klotho, we expected it to affect the brain by changing the aging process,” said senior author Lennart Mucke, MD, who directs neurological research at the Gladstone Institutes and is a professor of neurology and the Joseph B. Martin Distinguished Professor of Neuroscience at UCSF. “But this is not what we found, which suggested to us that we were on to something new and different.”
To get a closer look at how the gene variant operates, the researchers used mice that were engineered to produce more of the mouse version of klotho and found that these mice learned better at all stages of life. Put through mazes, these transgenic mice were more likely to try different routes, an indication that they had superior working memory. In a test of spatial learning and memory, the mice with extra klotho performed twice as well.
Researchers then analyzed the mouse brain tissue and found that the mice with elevated klotho had twice as many GluN2B subunits within synaptic connections. GluN2B is part of the N-methyl-D-aspartate receptor, or NMDAR, a key receptor involved in synaptic plasticity.
The researchers found more GluN2B-containing receptors in the hippocampus and frontal cortex, brain regions that support cognitive functions. When the researchers gave the mice a drug that blocks the action of these receptors, the klotho-enhanced mice lost their cognitive advantage.
Premature menopause is associated with long-term negative effects on cognitive function, suggests a new study published today (7 May) in BJOG: An International Journal of Obstetrics and Gynaecology (BJOG).

The average age of menopause is around 50 years in the Western world. Premature menopause refers to menopause at or before 40 years of age; this can be due to a bilateral ovariectomy (surgically induced menopause) or non-surgical loss of ovarian function (sometimes referred to as ‘natural’ menopause).
The study, based on a sample of 4,868 women, used cognitive tests and clinical dementia diagnosis at baseline and after two, four and seven years, and aimed to determine whether premature menopause can have an effect on later-life cognitive function. The effects of the type of menopause, whether natural or surgical, and of the use of hormone treatment were also examined.
Of the 4,868 women in this study, 79% reported a natural menopause, 10% a surgical menopause and 11% a menopause due to other causes, such as radiation or chemotherapy. Around 7.6% of the women in the study had a premature menopause and a further 12.8% an early menopause (between the ages of 41 and 45 years). Over a fifth of the women used hormone treatment during the menopause.
Results show that, in comparison with women who experienced menopause after the age of 50, those with a premature menopause had a more than 40% increased risk of poor performance on tasks assessing verbal fluency and visual memory. Premature menopause was also associated with a 35% increased risk of decline in psychomotor speed (the coordination between the brain and the muscles that brings about movement) and in overall cognitive function over seven years. There was no significant association with the risk of dementia.
Furthermore, both premature ovarian failure and premature surgical menopause were associated with a more than two-fold risk of poor verbal fluency. In terms of visual memory, premature ovarian failure was associated with a significantly increased risk of poor performance, and there was a similar trend for premature surgical menopause.
When the potential modifying effect of using hormone treatment at the time of premature menopause was examined, there was some evidence that it may be beneficial for visual memory, but it could increase the risk of poor verbal fluency.
Dr Joanne Ryan, Postdoctoral Research Fellow, Neuropsychiatry: Epidemiological and Clinical Research, Hospital La Colombiere, Montpellier, said:
“Both premature surgical menopause and premature ovarian failure were associated with long-term negative effects on cognitive function, which are not entirely offset by menopausal hormone treatment.
“In terms of surgical menopause, our results suggest that the potential long-term effects on cognitive function should form part of the decision-making process when considering ovariectomy in younger women.”
Pierre Martin Hirsch, BJOG deputy editor-in-chief added:
“With the ageing population it is important to have a better understanding of the long term effects of a premature menopause on later-life cognitive function and the potential benefit from using menopausal hormone treatment.
“This study adds to the existing evidence base suggesting that premature menopause can have a significant impact on cognitive function in later life, of which healthcare professionals must be aware.”
(Source: eu.wiley.com)
DHA during pregnancy does not appear to improve cognitive outcomes for children
Although there are recommendations for pregnant women to increase their intake of the omega-3 fatty acid docosahexaenoic acid (DHA) to improve fetal brain development, a randomized trial finds that prenatal DHA supplementation did not result in improved cognitive, problem-solving or language abilities for children at four years of age, according to the study in the May 7 issue of JAMA, a theme issue on child health. This issue is being released early to coincide with the Pediatric Academic Societies Annual Meeting.
Maria Makrides, B.Sc., B.N.D., Ph.D., of the South Australian Health and Medical Research Institute, Adelaide, Australia, and colleagues conducted longer-term follow-up of a previously published study in which pregnant women received 800 mg/d of DHA or placebo. In the initial study, the researchers found that average cognitive, language, and motor scores did not differ between the groups when the children were 18 months of age. For the follow-up study, outcomes were assessed at 4 years, a time point when any subtle effects on development should have emerged and can be more reliably assessed.
The majority (91.9 percent) of eligible families (DHA group, n = 313; control group, n = 333) participated in the follow-up. The authors found that measures of cognition (the ability to perform complex mental processing), language, and executive functioning (such as memory, reasoning and problem solving) did not differ significantly between groups.
"Our data do not support prenatal DHA supplementation to enhance early childhood development," the authors conclude.
(Image: Shutterstock)
Research Shows Strategic Thinking Strengthens Intellectual Capacity
Strategy-based cognitive training has the potential to enhance cognitive performance and spill over to real-life benefit according to a data-driven perspective article by the Center for BrainHealth at The University of Texas at Dallas published in the open-access journal Frontiers in Systems Neuroscience. The research-based perspective highlights cognitive, neural and real-life changes measured in randomized clinical trials that compared a gist-reasoning strategy-training program to memory training in populations ranging from teenagers to healthy older adults, individuals with brain injury to those at-risk for Alzheimer’s disease.
“Our brains are wired to be inspired,” said Dr. Sandra Bond Chapman, founder and chief director of the Center for BrainHealth and Dee Wyly Distinguished University Chair at The University of Texas at Dallas. “One of the key differences in our studies from other interventional research aimed at improving cognitive abilities is that we did not focus on specific cognitive functions such as speed of processing, memory, or learning isolated new skills. Instead, the gist reasoning training program encouraged the use of a common set of multi-dimensional thinking strategies to synthesize information and the elimination of toxic habits that impair efficient brain performance.”
The training across the studies was short, ranging from 8 to 12 sessions delivered over one to two months in 45- to 60-minute periods. The protocol focused on three cognitive strategies — strategic attention, integrated reasoning and innovation. These strategies are hierarchical in nature and can be broadly applied to most complex daily life mental activities.
At a basic level, research participants were encouraged to filter out irrelevant, competing information and focus only on important information. At more advanced levels, participants were instructed to generate interpretations, themes or generalized statements from information they wanted or needed to read, for example. Each strategy built on the previous ones, and research participants were challenged to integrate all steps when tackling mental activities both inside and outside of training.
“Cognitive gains were documented in trained areas such as abstracting, reasoning, and innovating,” said Chapman. “And benefits also spilled over to untrained areas such as memory for facts, planning, and problem solving. What’s exciting about this work is that in randomized trials comparing gist reasoning training to memory training, we found that it was not learning new information that engaged widespread brain networks and elevated cognitive performance, but rather actually deeper processing of information and using that information in new ways that augmented brain performance.
“Strengthening intellectual capacity is no longer science fiction; what used to seem improbable is now in the realm of reality.”
Positive physical changes within the brain and cognitive improvement across populations in response to strategy-based mental training demonstrate the neuro-regenerative potential of the brain.
“The ability to recognize, synthesize and create the essence of complex ideas and problems to solve are fundamental skills for academic, occupational and real-life success,” Chapman said. “The capacity to enhance cognition and complex neural networks in health, after injury or disease diagnosis will have major implications to preventing, diagnosing and treating cognitive decline and enhancing cognitive performance in youth to prepare them for an unknown future and in middle age to older adults who want to remain mentally robust.”
First brain images of African infants enable research into cognitive effects of nutrition
Brain activity of babies in developing countries could be monitored from birth to reveal the first signs of cognitive dysfunction, using a new technique piloted by a London-based university collaboration.
The cognitive function of infants can be visualised and tracked more quickly, more accurately and more cheaply using the method, called functional near-infrared spectroscopy (fNIRS), than with the behavioural assessments that researchers in Western settings have relied upon for decades.
Professor Clare Elwell, Professor of Medical Physics at University College London (UCL), said: “Brain activity soon after birth has barely been studied in low-income countries, because of the lack of transportable brain imaging facilities needed to do this at any reasonable scale. We have high hopes of building on these promising findings to develop functional near infra-red spectroscopy into an assessment tool for investigating cognitive function of infants who may be at risk of malnutrition or childhood diseases associated with low income settings.”
The pioneering study, published this week in Scientific Reports, was performed by a collaboration of researchers from UCL; the London School of Hygiene and Tropical Medicine; the Babylab at Birkbeck, University of London; and the Medical Research Council unit in The Gambia. It aimed to investigate the impact of nutrition in resource-poor regions on infant brain development, and was funded by the Bill and Melinda Gates Foundation.
Professor Clare Elwell (UCL Medical Physics & Bioengineering), said: “This is the first use of brain imaging methods to investigate localised brain activity in African infants.
"Until now, much of our understanding of brain development in low income countries has relied upon behavioural assessments which need careful cultural and linguistic translations to ensure they are accurate. Our technology, functional near infrared spectroscopy, can provide a more objective marker of brain activity."
For the studies in The Gambia, babies aged 4 to 8 months were played sounds and shown videos of adults performing specific movements, such as playing ‘peek-a-boo’. The fNIRS system monitored changes in blood flow in the babies’ brains and showed that distinct brain regions responded to visual-social prompts, while others responded to auditory-social stimuli. Comparison of the results with those obtained from babies in the UK showed that the responses were similar in both groups.
fNIRS has previously been used to study brain development in UK infants and most recently to investigate early markers of autism during the first few months of life.
Professor Andrew Prentice (Medical Research Council International Nutrition Group, London School of Hygiene and Tropical Medicine) said: “Humans have evolved to survive and succeed on the basis of their large brain and intelligence, but nutritional deficits in early life can limit this success. In order to plan the best interventions to maximise brain function we need tools that can give us an early read out. fNIRS is showing great promise in this respect.”
Cognitive scientists use ‘I spy’ to show spoken language helps direct children’s eyes
In a new study, Indiana University cognitive scientists Catarina Vales and Linda Smith demonstrate that children spot objects more quickly when prompted by words than if they are only prompted by images.
Language, the study suggests, is transformative: more so than images, spoken language taps into children’s cognitive system, enhancing their ability to learn and to navigate cluttered environments. As such, the study, published last week in the journal Developmental Science, opens up new avenues for research into the way language might shape the course of attention-related developmental problems such as ADHD and difficulties at school.
In the experiment, children played a series of “I spy” games, widely used to study attention and memory in adults. Asked to look for one image in a crowded scene on a computer screen, the children were shown a picture of the object they needed to find — a bed, for example, hidden in a group of couches.
"If the name of the target object was also said, the children were much faster at finding it and less distracted by the other objects in the scene," said Vales, a graduate student in the Department of Psychological and Brain Sciences.
"What we’ve shown is that in 3-year-old children, words activate memories that then rapidly deploy attention and lead children to find the relevant object in a cluttered array," said Smith, Chancellor’s Professor in the Department of Psychological and Brain Sciences. "Words call up an idea that is more robust than an image and to which we more rapidly respond. Words have a way of calling up what you know that filters the environment for you.”
The study, she said, “is the first clear demonstration of the impact of words on the way children navigate the visual world and is a first step toward understanding the way language influences visual attention, raising new testable hypotheses about the process.”
Vales said the use of language can change how people inspect the world around them.
"We also know that language will change the way people perform in a lot of different laboratory tasks," she said. "And if you have a child with ADHD who has a hard time focusing, one of the things parents are told to do is to use words to walk the child through what she needs to do. So there is this notion that words change cognition. The question is ‘how?’"
Vales said their research results “begin to tell us precisely how words help, the kinds of cognitive processes words tap into to change how children behave. For instance, the difference between search times with and without naming the target object indicates a key role for a kind of brief visual memory known as working memory, which helps us remember what we just saw as we look at something new. Words put ideas in working memory faster than images.”
For this reason, language may play an important role in a number of developmental disabilities.
"Limitations in working memory have been implicated in almost every developmental disability, especially those concerned with language, reading and negative outcomes in school," Smith said. "These results also suggest the culprit for these difficulties may be language in addition to working memory.
"This study changes the causal arrow a little bit. People have thought that children have difficulty with language because they don’t have enough working memory to learn language. This turns it around because it suggests that language may also make working memory more effective."
How does this matter to child development?
"Children learn in the real world, and the real world is a cluttered place," Smith said. "If you don’t know where to look, chances are you don’t learn anything. The words you know are a driving force behind attention. People have not thought about it as important or pervasive, but once children acquire language, it changes everything about their cognitive system."
"Our results suggest that language has huge effects, not just on talking, but on attention — which can determine how children learn, how much they learn and how well they learn," Vales said.
Study says we’re over the hill at 24
It’s a hard pill to swallow, but if you’re over 24 years of age you’ve already reached your peak in terms of your cognitive motor performance, according to a new Simon Fraser University study.
SFU’s Joe Thompson, a psychology doctoral student, associate professor Mark Blair, Thompson’s thesis supervisor, and Andrew Henrey, a statistics and actuarial science doctoral student, deliver the news in a just-published PLOS ONE Journal paper.
In one of the first social science experiments to rest on big data, the trio investigates when we start to experience an age-related decline in our cognitive motor skills and how we compensate for that.
The researchers analyzed the digital performance records of 3,305 StarCraft 2 players, aged 16 to 44. StarCraft 2 is a ruthlessly competitive intergalactic computer war game that players often play for serious money.
Their performance records, which can be readily replayed, constitute big data because they represent thousands of hours’ worth of strategic, real-time, cognitive-based moves performed at varied skill levels.
Using complex statistical modeling, the researchers distilled meaning from this colossal compilation of information about how players responded to their opponents and more importantly, how long they took to react.
“After around 24 years of age, players show slowing in a measure of cognitive speed that is known to be important for performance,” explains Thompson, the lead author of the study, which is his thesis. “This cognitive performance decline is present even at higher levels of skill.”
But there’s a silver lining in this earlier-than-expected slippery slope into old age. “Our research tells a new story about human development,” says Thompson.
“Older players, though slower, seem to compensate by employing simpler strategies and using the game’s interface more efficiently than younger players, enabling them to retain their skill, despite cognitive motor-speed loss.”
For example, older players more readily use shortcuts and sophisticated command keys to compensate for declining speed in executing real-time decisions.
The findings, says Thompson, suggest “that our cognitive-motor capacities are not stable across our adulthood, but are constantly in flux, and that our day-to-day performance is a result of the constant interplay between change and adaptation.”
Thompson says this study doesn’t inform us about how our increasingly distracting computerized world may ultimately affect our use of adaptive behaviours to compensate for declining cognitive motor skills.
But he does say our increasingly digitized world is providing a growing wealth of big data that will be a goldmine for future social science studies such as this one.
Ever-So-Slight Delay Improves Decision-Making Accuracy
Columbia University Medical Center (CUMC) researchers have found that decision-making accuracy can be improved by postponing the onset of a decision by a mere fraction of a second. The results could further our understanding of neuropsychiatric conditions characterized by abnormalities in cognitive function and lead to new training strategies to improve decision-making in high-stakes environments. The study was published in the March 5 online issue of the journal PLOS ONE.
“Decision making isn’t always easy, and sometimes we make errors on seemingly trivial tasks, especially if multiple sources of information compete for our attention,” said first author Tobias Teichert, PhD, a postdoctoral research scientist in neuroscience at CUMC at the time of the study and now an assistant professor of psychiatry at the University of Pittsburgh. “We have identified a novel mechanism that is surprisingly effective at improving response accuracy.”
The mechanism requires that decision-makers do nothing—just briefly. “Postponing the onset of the decision process by as little as 50 to 100 milliseconds enables the brain to focus attention on the most relevant information and block out irrelevant distractors,” said last author Jack Grinband, PhD, associate research scientist in the Taub Institute and assistant professor of clinical radiology (physics). “This way, rather than working longer or harder at making the decision, the brain simply postpones the decision onset to a more beneficial point in time.”
In making decisions, the brain integrates many small pieces of potentially contradictory sensory information. “Imagine that you’re coming up to a traffic light—the target—and need to decide whether the light is red or green,” said Dr. Teichert. “There is typically little ambiguity, and you make the correct decision quickly, in a matter of tens of milliseconds.”
The decision process itself, however, does not distinguish between relevant and irrelevant information. Hence, a task is made more difficult if irrelevant information—a distractor—interferes with the processing of the target. Distractors are present all the time; in this case, it might be in the form of traffic lights regulating traffic in other lanes. Though the brain is able to enhance relevant information and filter out distractions, these mechanisms take time. If the decision process starts while the brain is still processing irrelevant information, errors can occur.
Studies have shown that response accuracy can be improved by prolonging the decision process, to allow the brain time to collect more information. Because accuracy is increased at the cost of longer reaction times, this process is referred to as the “speed-accuracy trade-off.” The researchers thought that a more effective way to reduce errors might be to delay the decision process so that it starts out with better information.
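The contrast between prolonging a decision and delaying its onset can be illustrated with a toy evidence-accumulation simulation. This is not the authors’ model; the drift rates, noise level, and 120-millisecond attention-switch time below are all assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

DT = 0.001               # 1 ms time steps
SWITCH = 0.120           # attention shifts off the distractor after ~120 ms (assumed)
TARGET_DRIFT = 1.0       # evidence rate favoring the correct answer
DISTRACTOR_DRIFT = -1.5  # early evidence pulled the wrong way by the distractor
NOISE = 1.0              # diffusion noise

def accuracy(onset, duration, n_trials=20000):
    """Fraction of correct choices when evidence integration starts at
    `onset` seconds and runs for a fixed `duration` seconds."""
    t = np.arange(0, onset + duration, DT)
    window = t >= onset                      # only evidence after onset counts
    drift = np.where(t < SWITCH, DISTRACTOR_DRIFT, TARGET_DRIFT)
    evidence = drift * DT + rng.normal(0, NOISE * np.sqrt(DT), (n_trials, t.size))
    total = evidence[:, window].sum(axis=1)
    return (total > 0).mean()                # positive total = correct choice

# Same 300 ms of integration, started immediately vs. delayed past the switch.
print("immediate onset:", accuracy(onset=0.0, duration=0.300))
print("delayed onset:  ", accuracy(onset=0.120, duration=0.300))
```

With the same 300 milliseconds of integration, starting after the attention switch yields markedly higher accuracy, because none of the distractor-driven evidence enters the decision — the toy counterpart of delaying onset rather than working longer.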
The research team conducted two experiments to test this hypothesis. In the first, subjects were shown what looked like a swarm of randomly moving dots (the target stimulus) on a computer monitor and were asked to judge whether the overall motion was to the left or right. A second and brighter set of moving dots (the distractor) appeared simultaneously in the same location, obscuring the motion of the target. When the distractor dots moved in the same direction as the target dots, subjects performed with near-perfect accuracy, but when the distractor dots moved in the opposite direction, the error rate increased. The subjects were asked to perform the task either as quickly or as accurately as possible; they were free to respond at any time after the onset of the stimulus.
The second experiment was similar to the first, except that the subjects also heard regular clicks, indicating when they had to respond. The time allowed for viewing the dots varied between 17 and 500 milliseconds. This condition simulates real-life situations, such as driving, where the time to respond is beyond the driver’s control. “Manipulating how long the subject viewed the stimulus before responding allowed us to determine how quickly the brain is able to block out the distractors and focus on the target dots,” said Dr. Grinband.
“In this situation, it takes about 120 milliseconds to shift attention from one stimulus (the bright distractors) to another (the darker targets),” said Dr. Grinband. “To our knowledge, that’s something that no one has ever measured before.”
The experiments also revealed that it’s more beneficial to delay rather than prolong the decision process. The delay allows attention to be focused on the target stimulus and helps prevent irrelevant information from interfering with the decision process. “Basically, by delaying decision onset—simply by doing nothing—you are more likely to make a correct decision,” said Dr. Teichert.
Finally, the results showed that decision onset is, to some extent, under cognitive control. “The subjects automatically used this mechanism to improve response accuracy,” said Dr. Teichert. “However, we don’t think that they were aware that they were doing so. The process seems to go on behind the scenes. We hope to devise training strategies to bring the mechanism under conscious control.”
“This might be the first scientific study to justify procrastination,” Dr. Teichert said. “On a more serious note, our study provides important insights into fundamental brain processes and yields clues as to what might be going wrong in diseases such as ADHD and schizophrenia. It also could lead to new training strategies to improve decision making in complex high-stakes environments, such as air traffic control towers and military combat.”