Posts tagged psychology

UCSF Team Reveals How the Brain Recognizes Speech Sounds
UC San Francisco researchers are reporting a detailed account of how speech sounds are identified by the human brain, offering an unprecedented insight into the basis of human language.
The finding, they said, may add to our understanding of language disorders, including dyslexia.
Scientists have known for some time the location in the brain where speech sounds are interpreted, but little has been discovered about how this process works.
Now, in the Jan. 30 edition of Science Express, the fast-tracked online version of the journal Science, the UCSF team reports that the brain does not respond to the individual sound segments known as phonemes – such as the b sound in “boy” – but is instead exquisitely tuned to detect simpler elements, which are known to linguists as “features.”
This organization may give listeners an important advantage in interpreting speech, the researchers said, since the articulation of phonemes varies considerably across speakers, and even in individual speakers over time.
Head first: reshaping how traumatic brain injury is treated
Traumatic brain injury affects 10 million people a year worldwide and is the leading cause of death and disability in children and young adults. A new study will identify how to match treatments to patients, to achieve the best possible outcome for recovery.
The human brain – despite being encased snugly within its protective skull – is terrifyingly vulnerable to traumatic injury. A severe blow to the head can set in train a series of events that continue to play out for months, years and even decades ahead. First, there is bleeding, clotting and bruising at the site of impact. If the blow is forceful enough, the brain is thrust against the far side of the skull, where bony ridges cause blood vessels to lacerate. Sliding of grey matter over white matter can irreparably shear nerve fibres, causing damage that has physical, cognitive and behavioural consequences. As response mechanisms activate, the brain then swells, increasing intracranial pressure, and closing down parts of the microcirculatory network, reducing the passage of oxygen from blood vessels into the tissues, and causing further tissue injury.
It is the global nature of the damage – involving many parts of the brain – that defines these types of traumatic brain injuries (TBIs), which might result from transport accidents, assaults, falls or sporting injuries. Unfortunately, both the pattern of damage and the eventual outcome are extremely variable from patient to patient.
“This variability has meant that TBI is often considered as the most complex disease in our most complex organ,” said Professor David Menon, Co-Chair of the Acute Brain Injury Programme at the University of Cambridge. “Despite advances in care, the sad truth is that we are no closer to knowing how to navigate past this variability to the point where we can link the particular characteristics of a TBI to the best treatment and outcome.”
The iPod in the head: How the brain processes musical hallucinations
A woman with an “iPod in her head” has helped scientists at Newcastle University and University College London identify the areas of the brain that are affected when patients experience a rare condition called musical hallucinations.
Sufferers persistently perceive music, as if they were hearing it with their ears, when no music is actually being played. Initially, they often mistake the experience for real music playing nearby. While musical hallucinations can occasionally be a symptom of a neurological or psychiatric disorder, the condition is usually caused by hearing loss in people who are otherwise in normal physical and mental health.
Dr Sukhbinder Kumar from the Institute of Neuroscience at Newcastle University, lead author of the paper published in Cortex said: “We found that a network of brain areas, that are usually involved in processing of melodies and retrieval of memory of music, were particularly active during hallucinations of music in the absence of any sound or music being played externally.”
Nearly one in ten people suffer from tinnitus, which is technically an auditory hallucination in which tones or buzzing noises are heard following hearing loss. In a small number of people with hearing loss, however, these hallucinations take the form of music, and until now the brain mechanisms underlying this process were poorly understood.
This study by researchers at Newcastle University and University College London and funded by the Wellcome Trust has looked in depth at one sufferer of the condition and pinpointed the regions of the brain involved in producing the hallucinations. These findings could lead to a better understanding of the condition and possibly treatments in the future.
Musical hallucination
Sylvia, 69, a maths teacher who is also a musician with perfect pitch, started to go deaf about 20 years ago after a viral infection. Then about eleven years later she experienced a sudden acute hearing loss and severe tinnitus and her musical hallucinations developed after this. Due to her musical knowledge Sylvia was able to notate what she was hearing.
Initially the condition was irritating and affected Sylvia’s sleep, but she learnt to live with it. “I did everything I could to get rid of them but they persisted, always in a minor key and therefore a bit depressing,” she said.
“Eventually the number of notes increased until they seemed to be parts of tunes. One day I recognized something and, once I had done so, more and more phrases from classical music appeared in my brain.”
Among the pieces of music that Sylvia was hearing in her hallucinations was Gilbert and Sullivan’s HMS Pinafore, as well as music by Bach. Amazingly Sylvia found that by playing music herself, she was able to alter the music in her hallucinations.
“I can change the hallucination playing in my head to the music I am practising. This is particularly the case with the music of Bach - the hallucination will pause and then a whole page will start to play in my head, gradually curtailing itself until just a phrase remains and is repeated. That might then repeat a thousand times a day. It is as if I have my own internal iPod.”
Sylvia’s experience is fairly typical, though the condition occurs just as often in non-musicians, and sometimes starts abruptly rather than slowly developing as in her case.
How we hear
Because Sylvia’s hallucinations could be manipulated by playing an external piece of music, the researchers were able to study what was happening in her brain during hallucinations. They first identified pieces of music that suppressed her hallucinations; these pieces were then played to her while her brain activity was monitored using magnetoencephalography (MEG), which measures magnetic fields around the scalp as the brain processes information.
During normal perception of music, what we actually ‘hear’ is a complex interplay of the sound entering the ear and our brain’s interpretations and predictions. Normally the strength and quality of the input from the ear is so high that it dominates what we perceive; however, the brain fills in the gaps when the ears do not provide enough input.
“With hearing loss, as in Sylvia’s case, the signal from the ear becomes weak and noisy, like a poorly-tuned radio. The brain’s predictive mechanisms therefore have to work very hard to make sense of what we are hearing. What we have found is that these processes sometimes end up running away with themselves to cause hallucinations,” said author Dr William Sedley also of Newcastle University.
Dr Kumar added: “This also explains why listening to an external piece of music suppresses hallucinations. When external music is playing the signal entering her brain is much stronger and more reliable, which constrains the aberrant communication going on in the brain areas during hallucinations.”
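The intuition the researchers describe can be illustrated with a toy precision-weighting calculation. This is a minimal sketch of the general predictive-processing idea, not the authors’ actual model; the function name and all values are hypothetical:

```python
def percept(sensory, prior, sensory_precision, prior_precision):
    """Blend sensory evidence with the brain's prediction, weighting each
    source by how reliable (precise) it is."""
    w = sensory_precision / (sensory_precision + prior_precision)
    return w * sensory + (1 - w) * prior

# Healthy hearing: a precise input signal dominates the percept.
print(round(percept(sensory=1.0, prior=0.0,
                    sensory_precision=10.0, prior_precision=1.0), 2))  # 0.91

# Hearing loss: the input is weak and noisy, so the internal
# prediction (the hallucinated tune) takes over.
print(round(percept(sensory=1.0, prior=0.0,
                    sensory_precision=0.5, prior_precision=10.0), 2))  # 0.05
```

In this picture, playing external music corresponds to restoring high sensory precision, which pulls the percept back toward the real input and suppresses the hallucination.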
This new understanding of musical hallucinations may lead to better treatments in the future, as Professor Tim Griffiths, Professor of Cognitive Neurology at Newcastle University, who led the study, explained: “It might be possible to disrupt the abnormal communication between the brain areas using brain stimulation, or to use pharmacological treatments to disrupt chemical transmitters that drive communication between them.
“Better hearing aids also appear to help suppress hallucinations, so we would advise people experiencing musical hallucinations to seek medical attention, if for nothing more than to ensure they have the best available hearing aids.”
Dr John Williams, Head of Neuroscience and Mental Health at the Wellcome Trust, says: “This case is extremely fascinating, but the condition is relatively rare. However, it is unusual cases such as this that can give us profound insights into how the brain works and, one hopes, lead to potential new treatments to improve the patient’s life.”
Assessing structural and functional changes in the brain may predict future memory performance in healthy children and adolescents, according to a study appearing January 29 in The Journal of Neuroscience. The findings shed new light on cognitive development and suggest MRI and other tools may one day help identify children at risk for developmental challenges earlier than current testing methods allow.

Working memory capacity — the ability to hold onto information for a short period of time — is one of the strongest predictors of future achievements in math and reading. While previous studies showed that MRI could predict current working memory performance in children, scientists were unsure if MRI could predict their future cognitive capacity.
In the current study, Henrik Ullman, Rita Almeida, PhD, and Torkel Klingberg, MD, PhD, at the Karolinska Institutet in Sweden evaluated the cognitive abilities of a group of healthy children and adolescents and measured each child’s brain structure and function using MRI. Based on the MRI data collected during this initial testing, the researchers found they could predict the children’s working memory performance two years later, a prediction that was not possible using the cognitive tests.
“Our results suggest that future cognitive development can be predicted from anatomical and functional information offered by MRI above and beyond that currently achieved by cognitive tests,” said Ullman, the lead author of the study. “This has wide implications for understanding the neural mechanisms of cognitive development.”
The scientists recruited 62 children and adolescents between the ages of 6 and 20 years to the lab, where they completed working memory and reasoning tests. They also received multiple MRI scans to assess brain structure and changes in brain activity as they performed a working memory task. Two years later, the group returned to the lab to perform the same cognitive tests.
Using a statistical model, the researchers evaluated whether MRI data obtained during the initial tests correlated with the children’s working memory performance during the follow-up visit. They found that while brain activity in the frontal cortex correlated with children’s working memory at the time of the initial tests, activity in the basal ganglia and thalamus predicted how well children scored on the working memory tests two years later.
“This study is another contribution to the growing body of neuroimaging research that yields insights into unraveling present and predicting future cognitive capacity in development,” said Judy Illes, PhD, a neuroethicist at the University of British Columbia. “However, the appreciation of this important new knowledge is simpler than its application to everyday life. How a child performs today and tomorrow relies on multiple positive and negative life events that cannot be assessed by today’s technology alone.”
(Source: alphagalileo.org)
Study Examines the Development of Children’s Prelife Reasoning
Most people, regardless of race, religion or culture, believe they are immortal. That is, people believe that part of themselves–some indelible core, soul or essence–will transcend the body’s death and live forever. But what is this essence? Why do we believe it survives? And why is this belief so unshakable?
A new Boston University study led by postdoctoral fellow Natalie Emmons and published in the January 16, 2014 online edition of Child Development sheds light on these profound questions by examining children’s ideas about “prelife,” the time before conception. By interviewing 283 children from two distinct cultures in Ecuador, Emmons’s research suggests that our bias toward immortality is a part of human intuition that naturally emerges early in life. And the part of us that is eternal, we believe, is not our skills or ability to reason, but rather our hopes, desires and emotions. We are, in fact, what we feel.
Emmons’s study fits into a growing body of work examining the cognitive roots of religion. Although religion is a dominant force across cultures, science has made little headway in examining whether religious belief–such as the human tendency to believe in a creator–may actually be hard-wired into our brains.
“This work shows that it’s possible for science to study religious belief,” said Deborah Kelemen, an Associate Professor of Psychology at Boston University and co-author of the paper. “At the same time, it helps us understand some universal aspects of human cognition and the structure of the mind.”
Most studies on immortality or “eternalist” beliefs have focused on people’s views of the afterlife. Studies have found that both children and adults believe that bodily needs, such as hunger and thirst, end when people die, but mental capacities, such as thinking or feeling sad, continue in some form. But these afterlife studies leave one critical question unanswered: where do these beliefs come from? Researchers have long suspected that people develop ideas about the afterlife through cultural exposure, like television or movies, or through religious instruction. But perhaps, thought Emmons, these ideas of immortality actually emerge from our intuition. Just as children learn to talk without formal instruction, maybe they also intuit that part of their mind could exist apart from their body.
Emmons tackled this question by focusing on “prelife,” the period before conception, since few cultures have beliefs or views on the subject. “By focusing on prelife, we could see if culture causes these beliefs to appear, or if they appear spontaneously,” said Emmons.
“I think it’s a brilliant idea,” said Paul Bloom, a Professor of Psychology and Cognitive Science at Yale who was not involved with the study. “One persistent belief is that children learn these ideas through school or church. That’s what makes the prelife research so cool. It’s a very clever way to get at children’s beliefs on a topic where they aren’t given answers ahead of time.”
Emmons interviewed children from an indigenous Shuar village in the Amazon Basin of Ecuador. She chose the group because they have no cultural prelife beliefs, and she suspected that indigenous children, who have regular exposure to birth and death through hunting and farming, would have a more rational, biologically-based view of the time before they were conceived. For comparison, she also interviewed children from an urban area near Quito, Ecuador. Most of the urban children were Roman Catholic, a religion that teaches that life begins only at conception. If cultural influences were paramount, reasoned Emmons, both urban and indigenous children should reject the idea of life before birth.
Emmons showed the children drawings of a baby, a young woman, and the same woman while pregnant, then asked a series of questions about the child’s abilities, thoughts and emotions during each period: as babies, in the womb, and before conception.
The results were surprising. Both groups gave remarkably similar answers, despite their radically different cultures. The children reasoned that their bodies didn’t exist before birth, and that they didn’t have the ability to think or remember. However, both groups also said that their emotions and desires existed before they were born. For example, while children generally reported that they didn’t have eyes and couldn’t see things before birth, they often reported being happy that they would soon meet their mother, or sad that they were apart from their family.
“They didn’t even realize they were contradicting themselves,” said Emmons. “Even kids who had biological knowledge about reproduction still seemed to think that they had existed in some sort of eternal form. And that form really seemed to be about emotions and desires.”
Why would humans have evolved this seemingly universal belief in the eternal existence of our emotions? Emmons said that this human trait might be a by-product of our highly developed social reasoning. “We’re really good at figuring out what people are thinking, what their emotions are, what their desires are,” she said. We tend to see people as the sum of their mental states, and desires and emotions may be particularly helpful when predicting their behavior. Because this ability is so useful and so powerful, it flows over into other parts of our thinking. We sometimes see connections where potentially none exist, we hope there’s a master plan for the universe, we see purpose when there is none, and we imagine that a soul survives without a body.
These ideas, while nonscientific, are natural and deep-seated. “I study these things for a living but even find myself defaulting to them. I know that my mind is a product of my brain but I still like to think of myself as something independent of my body,” said Emmons.
“We have the ability to reflect and reason scientifically, and we have the ability to reason based on our gut and intuition,” she added. “And depending on the situation, one may be more useful than the other.”
Honesty beats dishonesty for making you feel good
A University of Toronto report based on two neural imaging studies that monitored brain activity has found a reward given for telling the truth gives people greater satisfaction than the same reward given for deceit.
These studies were published recently in the neuroscience journals Neuropsychologia and NeuroImage.
“Our findings together show that people typically find truth-telling to be more rewarding than lying in different types of deceptive situations,” said Professor Kang Lee, whose research is funded in part by the Social Sciences and Humanities Research Council.
The findings are based on two studies of Chinese participants using a new neuroimaging method called near-infrared spectroscopy. The studies are among the first to address the question of whether lying makes people feel better or worse than telling the truth.
The studies explored two different types of deception. In first-order deception, the recipient does not know the deceiver is lying. In second-order deception, the deceivers are fully aware that the recipient knows their intention, such as bluffing in poker.
The researchers were surprised to find that a liar’s cortical reward system was more active when a reward was gained through truth-telling than lying. This was true in both types of deception.
Researchers also found that in both types of deception, telling a lie produced greater brain activations than telling the truth in the frontal lobe, suggesting lying is cognitively more taxing than truth-telling and uses more neural resources.
The researchers hope this study will advance understanding of the neural mechanisms underlying lying, a ubiquitous and frequent human behaviour, and help to diagnose pathological liars who may have different neural responses when lying or telling the truth.
‘Love hormone’ oxytocin carries unexpected side effect
The love hormone, the monogamy hormone, the cuddle hormone, the trust-me drug: oxytocin has many nicknames. That’s because this naturally occurring human hormone has recently been shown to help people with autism and schizophrenia overcome social deficits.
As a result, certain psychologists prescribe oxytocin off-label, to treat mild social unease in patients who don’t suffer from a diagnosed disorder. But that’s not such a good idea, according to researchers at Concordia’s Centre for Research in Human Development. Their recent study — published in Emotion, a journal of the American Psychological Association — shows that in healthy young adults, too much oxytocin can actually result in oversensitivity to the emotions of others.
With the help of psychology professor Mark Ellenbogen, PhD candidates Christopher Cardoso and Anne-Marie Linnen recruited 82 healthy young adults who showed no signs of schizophrenia, autism or related disorders. Half of the participants were given measured doses of oxytocin, while the rest were offered a placebo.
The participants then completed an emotion identification accuracy test in which they compared different facial expressions showing various emotional states. As expected, the test subjects who had taken oxytocin saw greater emotional intensity in the faces they were rating.
“For some, typical situations like dinner parties or job interviews can be a source of major social anxiety,” says Cardoso, the study’s lead author. “Many psychologists initially thought that oxytocin could be an easy fix in overcoming these worries. Our study proves that the hormone ramps up innate social reasoning skills, resulting in an emotional oversensitivity that can be detrimental in those who don’t have any serious social deficiencies.”
As Cardoso explains, “If your potential boss grimaces because she’s uncomfortable in her chair and you think she’s reacting negatively to what you’re saying, or if the guy you’re talking to at a party smiles to be friendly and you think he’s coming on to you, it can lead you to overreact — and that can be a real problem. That’s why we’re cautioning against giving oxytocin to people who don’t really need it.”
Ultimately, however, oxytocin does have the potential to help people with diagnosed disorders like autism to overcome social deficits.
But, says Cardoso, “The potential social benefits of oxytocin in most people may be countered by unintended negative consequences, like being too sensitive to emotional cues in everyday life.”
TAU researcher finds that adults still think about numbers like kids

Children understand numbers differently than adults. For kids, one and two seem much further apart than 101 and 102, because two is twice as big as one, and 102 is just a little bigger than 101. It’s only after years of schooling that we’re persuaded to see the numbers in both sets as only one integer apart on a number line.
Now Dror Dotan, a doctoral student at Tel Aviv University’s School of Education and Sagol School of Neuroscience and Prof. Stanislas Dehaene of the Collège de France, a leader in the field of numerical cognition, have found new evidence that educated adults retain traces of their childhood, or innate, number sense — and that it’s more powerful than many scientists think.
"We were surprised when we saw that people never completely stop thinking about numbers as they did when they were children," said Dotan. "The innate human number sense has an impact, even on thinking about double-digit numbers." The findings, a significant step forward in understanding how people process numbers, could contribute to the development of methods to more effectively educate or treat children with learning disabilities and people with brain injuries.
Digital proof of a primal sense
Educated adults understand numbers “linearly,” based on the familiar number line from 0 to infinity. But children and uneducated adults, like tribespeople in the Amazon, understand numbers “logarithmically” — in terms of what percentage one number is of another. To analyze how educated adults process numbers in real time, Dotan and Dehaene asked the participants in their study to place numbers on a number line displayed on an iPad using a finger.
Previous studies showed that people who understand numbers linearly perform the task differently than people who understand numbers logarithmically. For example, linear thinkers place the number 20 in the middle of a number line marked from 0 to 40. But logarithmic thinkers like children may place the number 6 in the middle of the number line, because 1 is about the same percentage of 6 as 6 is of 40.
On the iPad used in the study, the participants were shown a number line marked only with “0” on one end and “40” on the other. Numbers popped up one at a time at the top of the iPad screen, and the participants dragged a finger from the middle of the screen down to the place on the number line where they thought each number belonged. Software tracked the path the finger took.
Changing course
Statistical analysis of the results showed that the participants placed the numbers on the number line in a linear way, as expected. But surprisingly — for only a few hundred milliseconds — they appeared to be influenced by their innate number sense. In the case of 20, for example, the participants drifted slightly rightward with their finger — toward where 20 would belong in a ratio-based number line — and then quickly corrected course. The results provide some of the most direct evidence to date that the innate number sense remains active, even if largely dormant, in educated adults.
"It really looks like the two systems in the brain compete with each other," said Dotan.
Significantly, the drift effect was found with two-digit as well as one-digit numbers. Many researchers believe that people can only convert two-digit numbers into quantities using the learned linear numerical system, which processes the quantity of each digit separately — for example, 34 is processed as 3 tens plus 4 ones. But Dotan and Dehaene’s research showed that the innate number sense is, in fact, capable of handling the complexity of two-digit numbers as well.
(Source: aftau.org)
Using a simple study of eye movements, Johns Hopkins scientists report evidence that people who are less patient tend to move their eyes with greater speed. The findings, the researchers say, suggest that the weight people give to the passage of time may be a trait consistently used throughout their brains, affecting the speed with which they make movements, as well as the way they make certain decisions.

Caption: Despite claims to the contrary, the eyes of the Mona Lisa do not make saccades. Credit: Leonardo da Vinci
In a summary of the research to be published Jan. 21 in The Journal of Neuroscience, the investigators note that a better understanding of how the human brain evaluates time when making decisions might also shed light on why malfunctions in certain areas of the brain make decision-making harder for those with neurological disorders like schizophrenia, or for those who have experienced brain injuries.
Principal investigator Reza Shadmehr, Ph.D., professor of biomedical engineering and neuroscience at The Johns Hopkins University, and his team set out to understand why some people are willing to wait and others aren’t. “When I go to the pharmacy and see a long line, how do I decide how long I’m willing to stand there?” he asks. “Are those who walk away and never enter the line also the ones who tend to talk fast and walk fast, perhaps because of the way they value time in relation to rewards?”
To address the question, the Shadmehr team used very simple eye movements, known as saccades, to stand in for other bodily movements. Saccades are the motions that our eyes make as we focus on one thing and then another. “They are probably the fastest movements of the body,” says Shadmehr. “They occur in just milliseconds.” Human saccades are fastest when we are teenagers and slow down as we age, he adds.
In earlier work, using a mathematical theory, Shadmehr and colleagues had shown that, in principle, the speed at which people move could be a reflection of the way the brain calculates the passage of time to reduce the value of a reward. In the current study, the team wanted to test the idea that differences in how subjects moved were a reflection of differences in how they evaluated time and reward.
For the study, the team first asked healthy volunteers to look at a screen upon which dots would appear one at a time –– first on one side of the screen, then on the other, then back again. A camera recorded their saccades as they looked from one dot to the other. The researchers found a lot of variability in saccade speed among individuals but very little variation within individuals, even when tested at different times and on different days. Shadmehr and his team concluded that saccade speed appears to be an attribute that varies from person to person. “Some people simply make fast saccades,” he says.
To determine whether saccade speed correlated with decision-making and impulsivity, the volunteers were told to watch the screen again. This time, they were given visual commands to look to the right or to the left. When they responded incorrectly, a buzzer sounded.
After becoming accustomed to that part of the test, they were forewarned that during the following round of testing, if they followed the command right away, they would be wrong 25 percent of the time. In those instances, after an undetermined amount of time, the first command would be replaced by a second command to look in the opposite direction.
To pinpoint exactly how long each volunteer was willing to wait to improve his or her accuracy on that phase of the test, the researchers adjusted the delay between the two commands based on each volunteer’s previous decision. If a volunteer chose to wait for the second command, the delay was lengthened on the next trial, until the researchers found the maximum time that volunteer was willing to wait (only 1.5 seconds for the most patient volunteer). If a volunteer chose to act immediately, the delay was shortened, to find the minimum time the volunteer was willing to wait to improve his or her accuracy.
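The adaptive procedure described above resembles a classic one-up/one-down staircase. A minimal sketch (illustrative only; the paper’s exact step sizes and stopping rules are not given here):

```python
def staircase(responses, start=0.5, step=0.1):
    """Toy one-up/one-down staircase over inter-command delays (seconds).
    responses: True if the volunteer waited for the second command on that
    trial. Returns the sequence of delays presented."""
    delay = start
    delays = [delay]
    for waited in responses:
        # Waiting lengthens the next delay; acting at once shortens it.
        delay = round(max(0.0, delay + step if waited else delay - step), 2)
        delays.append(delay)
    return delays

print(staircase([True, True, False, True]))  # [0.5, 0.6, 0.7, 0.6, 0.7]
```

The delays oscillate around the point where the volunteer switches from waiting to acting, and those reversal points estimate the maximum tolerable wait.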
When the speed of the volunteers’ saccades was compared to their impulsivity during the patience test, there was a strong correlation. “It seems that people who make quick movements, at least eye movements, tend to be less willing to wait,” says Shadmehr. “Our hypothesis is that there may be a fundamental link between the way the nervous system evaluates time and reward in controlling movements and in making decisions. After all, the decision to move is motivated by a desire to improve one’s situation, which is a strong motivating factor in more complex decision-making, too.”
(Source: eurekalert.org)
Forget about forgetting – The elderly know more and use it better
What happens to our cognitive abilities as we age? If you think our brains go into a steady decline, research reported this week in the journal Topics in Cognitive Science may make you think again. The work, headed by Dr. Michael Ramscar of Tübingen University, takes a critical look at the measures usually thought to show that our cognitive abilities decline across adulthood. Instead of finding evidence of decline, the team discovered that most standard cognitive measures, which date back to the early twentieth century, are flawed. “The human brain works slower in old age,” says Ramscar, “but only because we have stored more information over time.”
In the study, computers were trained, like humans, to read a certain amount each day and to learn new things. When the researchers let a computer “read” only so much, its performance on cognitive tests resembled that of a young adult. But if the same computer was exposed to the experiences we might encounter over a lifetime – with reading simulated over decades – its performance now looked like that of an older adult. Often it was slower, but not because its processing capacity had declined. Rather, increased “experience” had caused the computer’s database to grow, giving it more data to process – which takes time.
Technology now allows researchers to make quantitative estimates of the number of words an adult can be expected to learn across a lifetime, enabling the Tübingen team to separate the challenge that increasing knowledge poses to memory from the actual performance of memory itself. “Imagine someone who knows two people’s birthdays and can recall them almost perfectly. Would you really want to say that person has a better memory than a person who knows the birthdays of 2000 people, but can ‘only’ match the right person to the right birthday nine times out of ten?” asks Ramscar.
The answer appears to be “no.” When Ramscar’s team trained their computer models on huge linguistic datasets, they found that standardized vocabulary tests, which are used to take account of the growth of knowledge in studies of ageing, massively underestimate the size of adult vocabularies. It takes computers longer to search databases of words as their sizes grow, which is hardly surprising but may have important implications for our understanding of age-related slowdowns. The researchers found that to get their computers to replicate human performance in word recognition tests across adulthood, they had to keep their capacities the same. “Forget about forgetting,” explained Tübingen researcher Peter Hendrix, “if I wanted to get the computer to look like an older adult, I had to keep all the words it learned in memory and let them compete for attention.”
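The scaling argument here can be illustrated with a toy model: the average work a naive search does grows in step with the size of the set being searched. This is only an illustration of that point, not the team's actual model:

```python
def comparisons_to_find(word, vocabulary):
    """Count how many candidates a naive linear search inspects
    before finding `word` -- a stand-in for retrieval effort."""
    for i, candidate in enumerate(vocabulary, start=1):
        if candidate == word:
            return i
    return len(vocabulary)

small = [f"w{i}" for i in range(1_000)]   # a "young" vocabulary
large = [f"w{i}" for i in range(5_000)]   # an "experienced" vocabulary

avg_small = sum(comparisons_to_find(w, small) for w in small) / len(small)
avg_large = sum(comparisons_to_find(w, large) for w in large) / len(large)

# The search mechanism is identical; only the amount of stored
# knowledge differs, yet average retrieval effort is five times higher.
print(avg_small, avg_large)  # 500.5 2500.5
```

The "slowdown" comes entirely from the larger database, with no degradation of the search mechanism itself, which mirrors the researchers' finding that they had to keep the models' capacities fixed to match human data.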
The research shows that studies of the problems older people have with recalling names suffer from a similar blind spot: there is a far greater variety of given names today than there was two generations ago. This cultural shift toward greater name diversity means the number of different names anyone learns over their lifetime has increased dramatically. The work shows how this makes locating a name in memory far harder than it used to be. Even for computers.
Ramscar and his colleagues’ work provides more than an explanation of why, in the light of all the extra information they have to process, we might expect older brains to seem slower and more forgetful than younger brains. Their work also shows how changes in test performance that have been taken as evidence of declining cognitive abilities in fact demonstrate older adults’ greater mastery of the knowledge they have acquired.
Take “paired-associate learning,” a commonly used cognitive test that involves learning to connect words like “up” to “down” or “necktie” to “cracker” in memory. Using big data sets to quantify how often different words appear together in English, the Tübingen team shows that younger adults do better when asked to pair “up” with “down” than “necktie” with “cracker” because “up” and “down” appear in close proximity to one another far more frequently. Older adults, however, are also sensitive to which words don’t usually go together, something younger adults register less. When the researchers examined performance on this test across a range of word pairs that co-occur more and less often in English, they found older adults’ scores to be far more closely attuned to the actual statistics of hundreds of millions of words of English than those of their younger counterparts.
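Co-occurrence statistics of the kind this test relies on can be estimated from any corpus. A minimal sketch over a toy corpus; the window size, corpus, and function names are illustrative, not the team's data or method:

```python
from collections import Counter

def cooccurrence_counts(sentences, window=2):
    """Count how often two words appear within `window` positions
    of each other across a list of tokenized sentences."""
    counts = Counter()
    for tokens in sentences:
        for i, w1 in enumerate(tokens):
            for w2 in tokens[i + 1 : i + 1 + window]:
                counts[frozenset((w1, w2))] += 1
    return counts

corpus = [
    "up and down the stairs".split(),
    "prices went up then down".split(),
    "he wore a necktie".split(),
    "she ate a cracker".split(),
]
counts = cooccurrence_counts(corpus)

# "up" and "down" co-occur repeatedly; "necktie" and "cracker" never do,
# which is what makes the second pair harder for a knowledgeable learner.
print(counts[frozenset(("up", "down"))])          # 2
print(counts[frozenset(("necktie", "cracker"))])  # 0
```

Run over a corpus of hundreds of millions of words, counts like these are what let the researchers predict which pairs older and younger adults find easy or hard to learn.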
As Prof. Harald Baayen, who heads the Alexander von Humboldt Quantitative Linguistics research group where the work was carried out, puts it, “If you think linguistic skill involves something like being able to choose one word given another, younger adults seem to do better in this task. But, of course, proper understanding of language involves more than this. You also have to avoid putting plausible but wrong pairs of words together. The fact that older adults find nonsense pairs – but not connected pairs – harder to learn than young adults simply demonstrates older adults’ much better understanding of language. They have to make more of an effort to learn unrelated word pairs because, unlike the youngsters, they know a lot about which words don’t belong together.”
The Tübingen researchers conclude that we need different tests for the cognitive abilities of older people – tests that take into account the nature and amount of information our brains process. “The brains of older people do not get weak,” says Michael Ramscar. “On the contrary, they simply know more.”