Posts tagged cognitive function

Study examines change in cognitive function following physical, mental activity in older adults
A randomized controlled trial finds that 12 weeks of combined physical and mental activity in inactive older adults with cognitive complaints was associated with significant improvement in cognitive function, but there was no difference between intervention and control groups, according to a report published Online First by JAMA Internal Medicine, a JAMA Network publication.
An epidemic of dementia worldwide is anticipated during the next 40 years because of longer life expectancies and demographic changes. Behavioral interventions are a potential strategy to prevent or delay dementia in asymptomatic individuals, but few randomized controlled trials have studied the effects of physical and mental activity together, according to the study background.
"We found that cognitive scores improved significantly over the course of 12 weeks, but there were no significant differences between the intervention and active control groups. These results may suggest that in this study population, the amount of activity is more important than the type of activity, because all groups participated in both mental activity and exercise for [60 minutes per day, three days per week] for 12 weeks. Alternatively, the cognitive improvements observed may be due to practice effects," the authors note.
The study by Deborah E. Barnes, Ph.D., M.P.H., of the University of California, San Francisco, and colleagues included 126 inactive, community-dwelling older adults with cognitive complaints. All participants engaged in home-based mental activity (1 hour per day, 3 days per week) plus class-based physical activity (1 hour per day, 3 days per week) for 12 weeks, and were randomly assigned to either a mental activity intervention (MA-I; intensive computer work) or a mental activity control (MA-C; educational DVDs), crossed with either an exercise intervention (EX-I; aerobic) or an exercise control (EX-C; stretching and toning). This 2 x 2 design yielded four groups: MA-I/EX-I, MA-I/EX-C, MA-C/EX-I and MA-C/EX-C.
Global cognitive scores improved significantly over time but did not differ between groups in the comparison between MA-I and MA-C (ignoring exercise), the comparison between EX-I and EX-C (ignoring mental activity), or across all four randomization groups, according to the study results.
"The prevalence of cognitive impairment and dementia is projected to rise dramatically during the next 40 years, and strategies for maintaining cognitive function with age are critically needed. Physical or mental activity alone results in small, domain-specific improvements in cognitive function in older adults; combined interventions may have more global effects," the study concludes.

New urgency in battle against ‘bound legs’ disease
The harm done by konzo – a disease overshadowed by the war and drought it tends to accompany – goes beyond its devastating physical effects to impair children’s memory, problem solving and other cognitive functions.
Even children without physical symptoms of konzo appear to lose cognitive ability when exposed to the toxin that causes the disease, researchers report in the journal Pediatrics.
“That’s what’s especially alarming,” said lead author Michael Boivin, a Michigan State University associate professor of psychiatry and of neurology and ophthalmology. “We found subtle effects that haven’t been picked up before. These kids aren’t out of the woods, even if they don’t have the disease.”
Konzo means “bound legs” in the African Yaka language, a reference to how its victims walk with feet bent inward after the disease strips away motor control in their lower limbs. Its onset is rapid, and the damage is permanent.
People contract konzo by consuming poorly processed bitter cassava, a drought-resistant staple food in much of sub-Saharan Africa. Typically, the plant’s tuber is soaked for a few days, then dried in the sun and ground into flour – a process that degrades naturally occurring cyanide.
“As long as they do that, the food’s pretty safe,” said Boivin, who began studying konzo in 1990 as a Fulbright researcher in the Democratic Republic of Congo. “But in times of war, famine, displacement and hardship, people take shortcuts. If they’re subsisting on poorly processed cassava and they don’t have other sources of protein, it can cause permanent damage to the nervous system.
“Konzo doesn’t make many headlines because it usually follows other geopolitical aspects of human suffering,” he added. “Still, there are potentially tens of millions of kids at risk throughout central and western Africa. The public health scope is huge.”
To find out if the disease affects cognitive function, Boivin and colleagues from Oregon Health and Science University turned to the war-torn Congo. They randomly selected 123 children with konzo and 87 neighboring children who showed no signs of the disease but whose blood and urine samples indicated elevated levels of the toxin.
Using cognitive tests, the researchers found that children with konzo had a much harder time using working memory to solve problems and organize visual and spatial information.
They also found that konzo and non-konzo children from the outbreak area showed poor working memory and impaired fine-motor skills when compared to a reference group of children from a part of the region unaffected by the disease.
Konzo’s subtler impacts might seem minor compared to its striking physical symptoms, but Boivin noted that the cognitive damage is similar to that caused by chronic low-grade exposures to other toxic substances such as lead.
Scientists eventually may be able to prevent such damage by creating nontoxic cassava varieties and introducing other resilient crops to affected regions, Boivin said. Meanwhile, public health education programs are under way to help stop outbreaks.
“For now,” he said, “if we could just avoid the worst of it – the full-blown konzo disease that has such devastating effects for children and families – that’s a good start.”
Unraveling the molecular roots of Down syndrome
Researchers discover that the extra chromosome inherited in Down syndrome impairs learning and memory because it leads to low levels of SNX27 protein in the brain.
What is it about the extra chromosome inherited in Down syndrome—chromosome 21—that alters brain and body development? Researchers have new evidence that points to a protein called sorting nexin 27, or SNX27. SNX27 production is inhibited by a molecule encoded on chromosome 21. The study, published March 24 in Nature Medicine, shows that SNX27 is reduced in human Down syndrome brains. The extra copy of chromosome 21 means a person with Down syndrome produces less SNX27 protein, which in turn disrupts brain function. What’s more, the researchers showed that restoring SNX27 in Down syndrome mice improves cognitive function and behavior.
“In the brain, SNX27 keeps certain receptors on the cell surface—receptors that are necessary for neurons to fire properly,” said Huaxi Xu, Ph.D., Sanford-Burnham professor and senior author of the study. “So, in Down syndrome, we believe lack of SNX27 is at least partly to blame for developmental and cognitive defects.”
SNX27’s role in brain function
Xu and colleagues started out working with mice that lack one copy of the snx27 gene. They noticed that the mice were mostly normal, but showed some significant defects in learning and memory. So the team dug deeper to determine why SNX27 would have that effect. They found that SNX27 helps keep glutamate receptors on the cell surface in neurons. Neurons need glutamate receptors in order to function correctly. With less SNX27, these mice had fewer active glutamate receptors and thus impaired learning and memory.
SNX27 levels are low in Down syndrome
Then the team got thinking about Down syndrome. The SNX27-deficient mice shared some characteristics with Down syndrome, so they took a look at human brains with the condition. This confirmed the clinical significance of their laboratory findings—humans with Down syndrome have significantly lower levels of SNX27.
Next, Xu and colleagues wondered how Down syndrome and low SNX27 are connected—could the extra chromosome 21 encode something that affects SNX27 levels? They suspected microRNAs, small pieces of genetic material that don’t code for protein, but instead influence the production of other genes. It turns out that chromosome 21 encodes one particular microRNA called miR-155. In human Down syndrome brains, the increase in miR-155 levels correlates almost perfectly with the decrease in SNX27.
Xu and his team concluded that, because of the extra copy of chromosome 21, the brains of people with Down syndrome produce extra miR-155, which indirectly decreases SNX27 levels, in turn reducing surface glutamate receptors. Through this mechanism, learning, memory, and behavior are impaired.
Restoring SNX27 function rescues Down syndrome mice
If people with Down syndrome simply have too much miR-155 or not enough SNX27, could that be fixed? The team explored this possibility. They used a noninfectious virus as a delivery vehicle to introduce new human SNX27 in the brains of Down syndrome mice.
“Everything goes back to normal after SNX27 treatment. It’s amazing—first we see the glutamate receptors come back, then memory deficit is repaired in our Down syndrome mice,” said Xin Wang, a graduate student in Xu’s lab and first author of the study. “Gene therapy of this sort hasn’t really panned out in humans, however. So we’re now screening small molecules to look for some that might increase SNX27 production or function in the brain.”
Cognitive impairments are disabling for individuals with schizophrenia, and no satisfactory treatments currently exist. These impairments affect a wide range of cognition, including memory, attention, verbal and motor skills, and IQ. They appear in the earliest stages of the disease and disrupt or even prevent normal day-to-day functioning.
Scientists are exploring a variety of strategies to reduce these impairments, including “exercising the brain” with specially designed computer games and medications that might improve the function of brain circuits.
In this issue of Biological Psychiatry, Dr. Mera Barr and her colleagues at the University of Toronto provide new evidence that stimulating the brain using repetitive transcranial magnetic stimulation (rTMS) may be an effective strategy to improve cognitive function.
“In a randomized controlled trial, we evaluated whether rTMS can improve working memory in schizophrenia,” said Barr and senior author Dr. Zafiris Daskalakis. “Our results showed that rTMS resulted in a significant improvement in working memory performance relative to baseline.”
Transcranial magnetic stimulation is a non-invasive procedure that uses magnetic fields to stimulate nerve cells. It does not require sedation or anesthesia and so patients remain awake, reclined in a chair, while treatment is administered through coils placed near the forehead.
“TMS can have lasting effects on brain circuit function because this approach not only changes the activity of the circuit that is being stimulated, but it also may change the plasticity of that circuit, i.e., the capacity of the circuit to remodel itself functionally and structurally to support cognitive functions,” explained Dr. John Krystal, Editor of Biological Psychiatry.
Previous work has shown that rTMS improves working memory in healthy individuals, and a recent open-label trial showed promising findings for verbal memory in schizophrenia patients. These findings prompted the present study, which tested whether high-frequency rTMS could improve working memory in individuals with schizophrenia.
They recruited medicated schizophrenia patients who completed a working memory task before and after 4 weeks of treatment. Importantly, this was a double-blind study: neither the patients nor the researchers knew who was receiving real rTMS and who was receiving a sham treatment designed to mimic the procedure without actually delivering brain stimulation.
rTMS not only improved working memory in patients after 4 weeks, but the improvement was to a level comparable to healthy subjects. These findings suggest that rTMS may be a novel, efficacious, and safe treatment for working memory deficits in schizophrenia.
In 2008, rTMS was FDA-approved to treat depression for individuals who don’t respond to pharmacotherapy. The hope is that additional research will replicate these findings and finally provide an approved treatment for cognitive impairments in schizophrenia.
The authors concluded: “Working memory is an important predictor of functional outcome. Developing novel treatments aimed at improving these deficits may ultimately translate into meaningful changes in the lives of patients suffering from this debilitating disorder.”
(Source: elsevier.com)
Monday’s medical myth: alcohol kills brain cells
Do you ever wake up with a raging hangover and picture the row of brain cells that you suspect have started to decay? Or wonder whether that final glass of wine was too much for those tiny cells, and pushed you over the line?
Well, it’s true that alcohol can harm the brain in many ways. But directly killing off brain cells isn’t one of them.
The brain is made up of nerve cells (neurons) and glial cells. These cells communicate with each other, sending signals from one part of the brain to the other, telling your body what to do. Brain cells enable us to learn, imagine, experience sensation, feel emotion and control our body’s movement.
Alcohol’s effects can be seen on our brain even after a few drinks, causing us to feel tipsy. But these symptoms are temporary and reversible. The available evidence suggests alcohol doesn’t kill brain cells directly.
There is some evidence that moderate drinking is linked to improved mental function. A 2005 Australian study of 7,500 people in three age cohorts (early 20s, early 40s and early 60s) found moderate drinkers (up to 14 drinks for men and seven drinks for women per week) had better cognitive functioning than non-drinkers, occasional drinkers and heavy drinkers.
But there is also evidence that even moderate drinking may impair brain plasticity and cell production. Researchers in the United States gave rats alcohol over a two-week period, raising their blood alcohol concentration to about 0.08. While this level did not impair the rats’ motor skills or short-term learning, it impaired the brain’s ability to produce and retain new cells, reducing new brain cell production by almost 40%. Therefore, we need to protect our brains as best we can.
Excessive alcohol undoubtedly damages brain cells and brain function. Heavy consumption over long periods can damage the connections between brain cells, even if the cells are not killed. It can also affect the way your body functions. Long-term drinking can cause brain atrophy or shrinkage, as seen in brain diseases such as stroke and Alzheimer’s disease.
There is debate about whether permanent brain damage is caused directly or indirectly.
We know, for example, that severe alcoholic liver disease has an indirect effect on the brain. When the liver is damaged, it’s no longer effective at processing toxins to make them harmless. As a result, poisonous toxins reach the brain, and may cause hepatic encephalopathy (decline in brain function). This can result in changes to cognition and personality, sleep disruption and even coma and death.
Alcoholism is also associated with nutritional and absorptive deficiencies. A lack of Vitamin B1 (thiamine) causes brain disorders called Wernicke’s encephalopathy (which manifests as confusion, unsteadiness and paralysis of eye movements) and Korsakoff’s syndrome (where patients lose their short-term memory and coordination).
So, how much alcohol is okay?
To reduce the lifetime risk of harm from alcohol-related disease or injury, the National Health and Medical Research Council recommends healthy adults drink no more than two standard drinks on any day. Drinking less frequently (such as weekly rather than daily) and drinking less on each occasion will reduce your lifetime risk.
To avoid alcohol-related injuries, adults shouldn’t drink more than four standard drinks on a single occasion. This applies to both sexes because while women become intoxicated with less alcohol, men tend to take more risks and experience more harmful effects.
For pregnant women and young people under the age of 18, the guidelines say not drinking is the safest option.
So while alcohol may not kill brain cells, if this myth encourages us to rethink that third beer or glass of wine, I won’t mind if it hangs around.
The Hidden Costs of Cognitive Enhancement
Gentle electrical zaps to the brain can accelerate learning and boost performance on a wide range of mental tasks, scientists have reported in recent years. But a new study suggests there may be a hidden price: Gains in one aspect of cognition may come with deficits in another.
Researchers who study transcranial electrical stimulation, which uses electrodes placed on the scalp, see it as a potentially promising way to enhance cognition in neurological patients, struggling students, and perhaps even ordinary people. Scientists have used it to speed up rehab in people whose speech or movement has been affected by a stroke, and DARPA has studied it as a way to accelerate learning in intelligence analysts or soldiers on the lookout for bad guys and bombs.
Until now, the papers coming out of this field have reported one good-news finding after another.
“This is the first paper to my knowledge to show a cost associated with the gains in cognitive function,” said neuropsychologist Rex Jung of the University of New Mexico, who was not associated with the study. “It’s a really nice demonstration.”
Cognitive neuroscientist Roi Cohen Kadosh of the University of Oxford, who led the study, has been investigating brain stimulation to boost mathematical abilities. He has applied for a patent on a brain stimulator he hopes could help math-challenged students get a better grip on the basics, or even help the mathematically inclined perform even better.
Cohen Kadosh and his colleague Teresa Iuculano investigated 19 volunteers as they learned a new numerical system by trial and error. The new system was based on arbitrary symbols: A cylinder represented the number five, for example, and a triangle represented the number nine. In several training sessions the volunteers viewed pairs of symbols on a computer screen and pressed a key to indicate which one represented a bigger quantity. At first they had to guess, but they eventually learned which symbols corresponded with which numbers.
All of the volunteers wore electrodes on their scalps during these training sessions. Some received mild electrical stimulation targeting the posterior parietal cortex, an area implicated in previous studies of numerical cognition. Others received stimulation of the dorsolateral prefrontal cortex, an area involved in a wide range of functions, including learning and memory. A third group received sham stimulation that caused a slight tingling of the skin but no change in brain activity.
Those who had the parietal area involved in numerical cognition stimulated learned the new number system more quickly than those who got sham stimulation, the researchers report in the Journal of Neuroscience. But at the end of the weeklong study their reaction times were slower when they had to apply their newfound knowledge to a new task they hadn’t seen during the training sessions. “They had trouble accessing what they’d learned,” Cohen Kadosh said.
The volunteers who had the prefrontal area involved in learning and memory stimulated showed the opposite pattern. They were slower than the control group to learn the new numerical system, but they performed faster on the new test at the end of the experiment. The bottom line, says Cohen Kadosh, is that stimulating either brain region had both benefits and drawbacks. “Just like with drugs, there seem to be side effects,” he said.
Going forward, Cohen Kadosh says, more work is needed on how to maximize the benefits and minimize the costs of electrical brain stimulation. He thinks the approach has promise, but only when it’s used strategically, by picking the right brain regions to target and stimulating them while a person is training on the skill they want to improve. “I think it’s going to be useless unless you pair it with some type of cognitive training,” he said.
But that’s not stopping some people from giving it a try on their own. Although it should be obvious that DIY brain stimulation is a bad idea, both Jung and Cohen Kadosh say there seems to be growing interest in the general public in using it for cognitive enhancement.
“There are some do it yourself websites I’ve stumbled across that are pretty frightening,” Jung said. “People are definitely tinkering around with this in their garage.”
The new study suggests one way that could backfire. And that’s not all, said Jung. “You can burn yourself if nothing else.”
Solving the ‘Cocktail Party Problem’
Many smartphones claim to filter out background noise, but they’ve got nothing on the human brain. We can tune in to just one speaker at a noisy cocktail party with little difficulty—an ability that has been a scientific mystery since the early 1950s. Now, researchers argue that the competing noise of other partygoers is filtered out in the brain before it reaches regions involved in higher cognitive functions, such as language and attention control. Their experiments were the first to demonstrate this process.
The scientists didn’t do anything as social as attend a noisy party. Instead, Charles Schroeder, a psychiatrist at the Columbia University College of Physicians and Surgeons in New York City, and colleagues recorded the brain activity of six people with intractable epilepsy who required brain surgery. To identify the part of their brains responsible for seizures, the patients underwent 1 to 4 weeks of observation via electrocorticography (ECoG), a technique that provides precise neural recordings through electrodes placed directly on the surface of the brain. Using these ECoG recordings, Schroeder and his team conducted their experiments during this observation period.
The researchers showed the patients two videos simultaneously, each of a person telling a 9- to 12-second story; they were asked to concentrate on just one speaker. To determine which neural recordings corresponded to the “ignored” and “attended” speech, the team reconstructed speech patterns from the brain’s electrical activity using a mathematical model. The scientists then matched the reconstructed patterns with the original patterns coming from the ignored and attended speakers.
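The matching step described here is essentially a template-comparison problem. As a loose illustration of that idea (not the authors' actual reconstruction model), the sketch below treats a noisy copy of one speech envelope as the "reconstruction" and labels the recording by whichever candidate envelope it correlates with more strongly. All data here are synthetic and invented for the example.

```python
import math
import random

def pearson(a, b):
    """Pearson correlation between two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

def classify_attention(reconstructed, attended_env, ignored_env):
    """Label the recording 'attended' if the reconstruction correlates
    more strongly with the attended speaker's envelope."""
    r_att = pearson(reconstructed, attended_env)
    r_ign = pearson(reconstructed, ignored_env)
    return ("attended" if r_att > r_ign else "ignored"), r_att, r_ign

# Synthetic envelopes: the "reconstruction" is a noisy copy of the
# attended speaker's envelope, mimicking a preferential neural trace.
random.seed(1)
attended = [random.random() for _ in range(500)]
ignored = [random.random() for _ in range(500)]
reconstructed = [a + random.gauss(0, 0.3) for a in attended]

label, r_att, r_ign = classify_attention(reconstructed, attended, ignored)
```

In the study itself, the reconstruction was derived from the neural recordings via a fitted mathematical model; this toy skips that step and only shows the matching logic.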
The patients’ brains had registered both attended and ignored speech, though they showed some preference for the attended speech, the researchers report online in Neuron. Because the researchers were able to record several regions of the patients’ brains, they saw that regions associated with “higher-order” abilities—like the inferior frontal cortex, which is involved with language—had only representations of attended speech. Moreover, this representation of attended speech improved as the speaker’s story unfolded. These findings support a continuous model of attention—called the “selective entrainment hypothesis”—in which the brain tracks and becomes increasingly selective to a particular voice.
The research supports the selective entrainment hypothesis, agrees Jason Bohland, director of Boston University’s Quantitative Neuroscience Laboratory, but it “doesn’t necessarily tell us how that happens. That’s a really hard question, and is still left very much up in the air.”
Though a less-invasive technology than ECoG would be needed, Bohland and Schroeder agree that this research could help provide good clinical markers for people with certain social disorders. People with attention deficit disorder, for example, may struggle to track specific voices or to filter out unwanted neural representations of sounds. And those problems should be reflected in their brain activity.
Schroeder explained that this study was a part of a new wave of research that aims to “approximate a map of the total brain circuit that’s involved in [complex] things like speech and music perception, which people consider—rightly or wrongly—to be uniquely human.”
New gene variant may explain psychotic features in bipolar disorder
Researchers at Karolinska Institutet have found an explanation for why the level of kynurenic acid (KYNA) is higher in the brains of people with schizophrenia or bipolar disorder with psychosis. The study, published in the scientific journal Molecular Psychiatry, identifies a gene variant associated with increased production of KYNA.
The discovery contributes to the further understanding of the link between inflammation and psychosis, and might pave the way for improved therapies. Kynurenic acid (KYNA) is a substance that affects several signalling pathways in the brain and is integral to cognitive function. Earlier studies of cerebrospinal fluid have shown that KYNA levels are elevated in the brains of patients with schizophrenia or bipolar disorder with psychotic features. The reason for this, however, has not been fully understood.
KMO is an enzyme involved in the production of KYNA, and the Karolinska Institutet team has now shown that some individuals carry a particular genetic variant of KMO that affects the enzyme's quantity, resulting in higher levels of KYNA. The study also shows that patients with bipolar disorder who carry this gene variant were almost twice as likely to develop psychotic episodes.
KYNA is produced during inflammation, such as when the body is exposed to stress or infection. It is also known that stress and infection may trigger psychotic episodes. The present study offers a plausible account of this process, one that is more likely to occur in individuals who carry the gene variant associated with higher KYNA production. The researchers also believe the discovery can help explain certain features of schizophrenia and the development of other psychotic conditions.
"Psychosis related to bipolar disease has a very high degree of heredity, up to 80 per cent, but we don’t know which genes and which mechanisms are involved," says Martin Schalling, Professor of medical genetics at Karolinska Institutet’s Department of Molecular Medicine and Surgery, also affiliated to the Center for Molecular Medicine (CMM). "This is where our study comes in, with a new explanation that can be linked to signal systems activated by inflammation. This has consequences for diagnostics, and paves the way for new therapies, since there is a large arsenal of already approved drugs that modulate inflammation."

Study looks to distinguish cognitive functioning in centenarians
As life expectancy continues to increase, more and more people will reach and surpass the century mark. But little is known about what constitutes normal levels of cognitive function in the second century of life.
Led by Adam Davey, associate professor in Temple’s Department of Public Health, a group of researchers used a new method called factor mixture analysis — a statistical technique for identifying different groups within a population — to identify the prevalence of cognitive impairment in centenarians and try to understand the cognitive changes that are part of extreme aging. They published their findings, “Profiles of Cognitive Functioning in a Population-Based Sample of Centenarians Using Factor Mixture Analysis,” in the journal Experimental Aging Research.
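Factor mixture analysis proper combines a factor model with latent-class mixture modelling and is well beyond a few lines of code, but the mixture half of the idea (finding hidden subgroups within one distribution of scores) can be sketched with a two-component Gaussian mixture fitted by expectation-maximization. This is an illustrative toy under invented score distributions, not the authors' method.

```python
import math
import random

def em_two_gaussians(xs, iters=200):
    """Fit a two-component 1-D Gaussian mixture by EM.

    Returns (weights, means, stds). Initialisation splits the sorted
    data at the median so the two components start apart.
    """
    xs = sorted(xs)
    half = len(xs) // 2
    mu = [sum(xs[:half]) / half, sum(xs[half:]) / (len(xs) - half)]
    sd = [1.0, 1.0]
    w = [0.5, 0.5]

    def pdf(x, m, s):
        return math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))

    for _ in range(iters):
        # E-step: responsibility of each component for each point.
        resp = []
        for x in xs:
            p = [w[k] * pdf(x, mu[k], sd[k]) for k in range(2)]
            tot = p[0] + p[1]
            resp.append([p[0] / tot, p[1] / tot])
        # M-step: re-estimate weights, means and standard deviations.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(xs)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, xs)) / nk
            sd[k] = max(math.sqrt(var), 1e-6)
    return w, mu, sd

# Synthetic "cognitive scores": a larger unimpaired subgroup and a
# smaller impaired subgroup, with invented means and spreads.
random.seed(0)
scores = [random.gauss(25, 2) for _ in range(160)] + \
         [random.gauss(14, 3) for _ in range(80)]
weights, means, stds = em_two_gaussians(scores)
```

The fitted component weights play the role of subgroup prevalence estimates: the smaller component's weight approximates the fraction of the sample in the lower-scoring group, without imposing a single fixed cutoff score.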
“One of the motivations for studying centenarians is that they are very close to the upper limit of human life expectancy right now,” said Davey. “By looking at their cognitive functioning we can learn a lot in terms of how common or prevalent cognitive impairment is among that age group.”
Using voter registration lists and nursing home records in 44 counties in northern Georgia, the researchers identified 244 people between the ages of 98 and 108 — approximately 20 percent of all centenarians living in that region — who participated in the study. Participants were assessed based on a series of standard tests used to measure cognitive functioning.
“As people get into later life and the prevalence of cognitive impairment becomes relatively high, we need some way of distinguishing between those people who are aging normally and the people who have cognitive impairment, which could indicate dementia,” said Davey.
The researchers found that even though approximately two-thirds of centenarians were at or below the threshold for cognitive impairment by one commonly used measure, only one-third of centenarians were identified as cognitively impaired using their new approach.
“That’s consistent with the level of cognitive impairment found in another study that looked at people up to the age of 85-plus,” said Davey. “But even the normal folks have had cognitive declines to the point that they are functioning at a level that would indicate impairment at younger ages.”
The researchers found that characteristics such as age, race and educational attainment can help to distinguish those in the lower cognitive performance group.
“This is the first study that I’m aware of that allows us to distinguish between these two groups of centenarians, so that we can start to develop benchmarks for what is normal cognitive functioning among members of this age group,” said Davey. “These people have lived so long that even their normal cognitive function could be mistaken for a form of dementia if a physician were to treat them as they would someone who was merely old.”
IQ loss linked to schizophrenia genes
People at greater genetic risk of schizophrenia could see a fall in IQ as they age, study shows.
Scientists at the University of Edinburgh say IQ decline in those at risk could happen even if they do not develop schizophrenia.
The findings could lead to new research into how different genes for schizophrenia affect brain function over time. Schizophrenia - a severe mental disorder characterised by delusions and hallucinations - is in part caused by genetic factors.
The researchers used the latest genetic analysis techniques to reach their conclusion on how thinking skills change with age.
Retaining our thinking skills as we grow older is important for living well and independently. If nature has loaded a person’s genes towards schizophrenia, then there is a slight but detectable worsening in cognitive functions between childhood and old age. -Professor Ian Deary (Director of the University of Edinburgh’s Centre for Cognitive Ageing and Cognitive Epidemiology)
Historical data
They compared the IQ scores of more than 1,000 people from Edinburgh.
The people were tested for general cognitive functions in 1947, aged 11, and again when they were around 70 years old.
The researchers were able to examine people’s genes and calculate each subject’s genetic likelihood of developing schizophrenia, even though none of the group had ever developed the illness.
They then compared the IQ scores of people with a high and low risk of developing schizophrenia.
Scientists found that there was no difference at age 11, but people with a greater genetic risk of schizophrenia had slightly lower IQs at age 70.
Those people who had more genes linked to schizophrenia also had a greater estimated fall in IQ over their lifetime than those at lower risk.
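A genetic risk score of the kind compared here is, at its core, a weighted sum of risk-allele counts. The sketch below is a simplified synthetic illustration of that idea (invented variants, deliberately exaggerated effect sizes), not the study's actual analysis: it scores simulated individuals, splits them at the median score, and compares mean lifetime IQ change between the two halves.

```python
import random
from statistics import mean

def polygenic_score(genotype, effect_sizes):
    """Weighted sum of risk-allele counts (0, 1 or 2 per variant)."""
    return sum(g * beta for g, beta in zip(genotype, effect_sizes))

random.seed(2)
N_VARIANTS = 50
# Hypothetical per-variant effect sizes; real GWAS weights are far smaller.
betas = [random.uniform(0.01, 0.05) for _ in range(N_VARIANTS)]

people = []
for _ in range(1000):
    genotype = [random.choice([0, 1, 2]) for _ in range(N_VARIANTS)]
    prs = polygenic_score(genotype, betas)
    # Simulated lifetime IQ change: a higher risk score produces a larger
    # decline. The slope is exaggerated so the group difference is visible
    # in a small simulation.
    iq_change = random.gauss(-2.0, 2.0) - 10.0 * prs
    people.append((prs, iq_change))

# Split at the median polygenic score and compare mean IQ change.
people.sort(key=lambda p: p[0])
low_risk = [change for _, change in people[:500]]
high_risk = [change for _, change in people[500:]]
```

Under this toy model, the higher-risk half shows a larger mean decline, mirroring the direction of the reported finding: no built-in difference at baseline, but a risk-related difference in lifetime change.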
Cognitive impact
With further research into how these genes affect the brain, it could become possible to understand how genes linked to schizophrenia affect people’s cognitive functions as they age. -Professor Andrew McIntosh (Centre for Clinical Brain Sciences)
Schizophrenia affects around 1 per cent of the population, often in the teenage or early adult years, and is associated with problems in mental ability and memory.
The study, which was funded by the BBSRC, Age UK, and the Chief Scientist Office, is published in the journal Biological Psychiatry.