Posts tagged cognitive function

Does drinking alcohol during pregnancy affect learning and memory in offspring?
Maternal alcohol consumption before and during pregnancy has detrimental effects on fetal central nervous system development and significantly affects cognitive function in offspring. These effects may be related to changes in cyclin-dependent kinase 5, which modulates synaptic plasticity and is implicated in impaired learning and memory. Prof. Ruiling Zhang and team from Xinxiang Medical University explored the correlation between cyclin-dependent kinase 5 expression in the hippocampus and neurological impairments following prenatal ethanol exposure, and found that prenatal ethanol exposure altered cyclin-dependent kinase 5 and its activator p35 in the hippocampus of offspring rats. These findings, reported in Neural Regeneration Research (Vol. 8, No. 18, 2013), offer new insights into the mechanisms by which ethanol exposure injures the central nervous system, and suggest a new strategy for treating the consequences of prenatal ethanol exposure.
Unique Epigenomic Code Identified During Human Brain Development
Changes in the epigenome, including chemical modifications of DNA, can act as an extra layer of information in the genome, and are thought to play a role in learning and memory, as well as in age-related cognitive decline. The results of a new study by scientists at the Salk Institute for Biological Studies show that the landscape of DNA methylation, a particular type of epigenomic modification, is highly dynamic in brain cells during the transition from birth to adulthood. The findings help explain how information in the genomes of brain cells is controlled from fetal development to adulthood. The brain is far more complex than any other organ in the body, and this discovery opens the door to a deeper understanding of how the intricate patterns of connectivity in the brain are formed.
“These results extend our knowledge of the unique role of DNA methylation in brain development and function,” says senior author Joseph R. Ecker, professor and director of Salk’s Genomic Analysis Laboratory and holder of the Salk International Council Chair in Genetics. “They offer a new framework for testing the role of the epigenome in healthy function and in pathological disruptions of neural circuits.”
A healthy brain is the product of a long process of development. The front-most part of our brain, called the frontal cortex, plays a key role in our ability to think, decide and act. The brain accomplishes all of this through the interaction of special cells such as neurons and glia. We know that these cells have distinct functions, but what gives these cells their individual identities? The answer lies in how each cell expresses the information contained in its DNA. Epigenomic modifications, such as DNA methylation, can control which genes are turned on or off without changing letters of the DNA alphabet (A-T-C-G), and thus help distinguish different cell types.
In this new study, published July 4 in Science, the scientists found that the patterns of DNA methylation undergo widespread reconfiguration in the frontal cortex of mouse and human brains during a time of development when synapses, or connections between nerve cells, are growing rapidly. The researchers identified the exact sites of DNA methylation throughout the genome in brains from infants through adults. They found that one form of DNA methylation is present in neurons and glia from birth. Strikingly, a second form of “non-CG” DNA methylation that is almost exclusive to neurons accumulates as the brain matures, becoming the dominant form of methylation in the genome of human neurons. These results help us to understand how the intricate DNA landscape of brain cells develops during the key stages of childhood.
The genetic code in DNA is made up of four chemical bases: adenine (A), guanine (G), cytosine (C), and thymine (T). DNA methylation typically occurs at so-called CpG sites, where C (cytosine) sits next to G (guanine) in the DNA alphabet. About 80 to 90 percent of CpG sites are methylated in human DNA. Salk researchers previously discovered that in human embryonic stem cells and induced pluripotent stem cells, a type of artificially derived stem cell, DNA methylation can also occur when G does not follow C, hence “non-CG methylation.” Originally, they thought that this type of methylation disappeared when stem cells differentiated into specific tissue types such as lung or fat cells. The current study finds this is not the case in the brain, where non-CG methylation appears after cells differentiate, usually during childhood and adolescence when the brain is maturing.
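The CpG terminology above can be made concrete with a short sketch. This toy function (an illustration, not code from the study) scans a DNA string for C-followed-by-G dinucleotides, the sites where methylation typically occurs:

```python
def cpg_sites(seq):
    """Return the indices of CpG dinucleotides: positions where a C
    is immediately followed by a G in the DNA sequence."""
    return [i for i in range(len(seq) - 1) if seq[i:i + 2] == "CG"]

# In "ACGTTCGGCA", a C-then-G pair occurs at positions 1 and 5.
print(cpg_sites("ACGTTCGGCA"))
```

Non-CG methylation, by contrast, marks cytosines that are not part of such a pair, which is why its abundance in mature neurons was unexpected.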
By sequencing the genomes of mouse and human brain tissue as well as neurons and glia (from the frontal cortex of the brain) during early postnatal, juvenile, adolescent and adult stages, the Salk team found that non-CG methylation accumulates in neurons through early childhood and adolescence, and becomes the dominant form of DNA methylation in mature human neurons. “This shows that the period during which the neural circuits of the brain mature is accompanied by a parallel process of large-scale reconfiguration of the neural epigenome,” says Ecker, who is a Howard Hughes Medical Institute and Gordon and Betty Moore Foundation investigator.
The study provides the first comprehensive maps of how DNA methylation patterns change in the mouse and human brain during development, forming a critical foundation to now explore whether changes in methylation patterns may be linked to human diseases, including psychiatric disorders. Recent studies have demonstrated a possible role for DNA methylation in schizophrenia, depression, suicide and bipolar disorder. “Our work will let us begin to ask more detailed questions about how changes in the epigenome sculpt the complex identities of brain cells through life,” says co-first author Eran Mukamel, from Salk’s Computational Neurobiology Laboratory.
“The human brain has been called the most complex system that we know of in the universe,” says Ryan Lister, co-corresponding author on the new paper, previously a postdoctoral fellow in Ecker’s laboratory at Salk and now a group leader at The University of Western Australia. “So perhaps we shouldn’t be so surprised that this complexity extends to the level of the brain epigenome. These unique features of DNA methylation that emerge during critical phases of brain development suggest the presence of previously unrecognized regulatory processes that may be critically involved in normal brain function and brain disorders.”
At present, there is consensus among neuroscientists that many mental disorders have a neurodevelopmental origin and arise from an interaction between genetic predisposition and environmental influences (for example, early-life stress or drug abuse), the outcome of which is altered activity of brain networks. The building and shaping of these brain networks requires a long maturation process in which central nervous system cell types (neurons and glia) need to fine-tune the way they express their genetic code.
“DNA methylation fulfills this role,” says study co-author Terrence J. Sejnowski, a Howard Hughes Medical Institute Investigator, holder of the Francis Crick Chair and head of Salk’s Computational Neurobiology Laboratory. “We found that patterns of methylation are dynamic during brain development, in particular for non-CG methylation during early childhood and adolescence, which changes the way that we think about normal brain function and dysfunction.”
By disrupting the transcriptional expression of neurons, adds co-corresponding author M. Margarita Behrens, a staff scientist in the Computational Neurobiology Laboratory, “the alterations of these methylation patterns will change the way in which networks are formed, which could, in turn, lead to the appearance of mental disorders later in life.”

Nicotinic receptor essential for cognition and mental health
The ability to maintain mental representations of ourselves and the world — the fundamental building block of human cognition — arises from the firing of highly evolved neuronal circuits, a process that is weakened in schizophrenia. In a new study, researchers at Yale University School of Medicine pinpoint key molecular actions of proteins that allow the creation of mental representations necessary for higher cognition that are genetically altered in schizophrenia. The study was released July 1 in the Proceedings of the National Academy of Sciences.
Working memory, the mind’s mental sketch pad, depends upon the proper functioning of a network of pyramid-shaped brain cells in the prefrontal cortex, the seat of higher order thinking in humans. To keep information in the conscious mind, these pyramidal cells must stimulate each other through a special group of receptors. The Yale team discovered this stimulation requires the neurotransmitter acetylcholine to activate a specific protein in the nicotinic family of receptors — the alpha7 nicotinic receptor.
Acetylcholine is released when we are awake — but not in deep sleep. These receptors allow prefrontal circuits to come “online” when we awaken, allowing us to perform complex mental tasks. This process is enhanced by caffeine in coffee, which increases acetylcholine release. As their name suggests, nicotinic alpha-7 receptors are also activated by nicotine, which may help to explain why smoking can focus attention and calm behavior, functions of the prefrontal cortex.
The results also intrigued researchers because alpha7 nicotinic receptors are genetically altered in schizophrenia, a disease marked by disorganized thinking. “Prefrontal networks allow us to form and hold coherent thoughts, a process that is impaired in schizophrenia,” said Amy Arnsten, professor of neurobiology, investigator for Kavli Institute, and one of the senior authors of the paper. “A great majority of schizophrenics smoke, which makes sense because stimulation of the nicotinic alpha7 receptors would strengthen mental representations and lessen thought disorder.”
Arnsten said that new medications that stimulate alpha-7 nicotinic receptors may hold promise for treating cognitive disorders.
Publication of the PNAS paper comes on the eve of the 10th anniversary of the death of Yale neurobiologist Patricia Goldman-Rakic, who was hit by a car in Hamden, Conn., on July 31, 2003. Goldman-Rakic first identified the central role of prefrontal cortical circuits in working memory.
“Patricia’s work has provided the neural foundation for current studies of molecular influences on cognition and their disruption in cognitive disorders,” said Arnsten. “Our ability to apply a scientific approach to perplexing disorders such as schizophrenia is due to her groundbreaking research.”
Past Brain Activation Revealed in Scans
Weizmann Institute scientists discover that spontaneously emerging brain activity patterns preserve traces of previous cognitive activity
What if experts could dig into the brain, like archaeologists, and uncover the history of past experiences? This ability might reveal what makes each of us a unique individual, and it could enable the objective diagnosis of a wide range of neuropsychological diseases. New research at the Weizmann Institute hints that such a scenario is within the realm of possibility: It shows that spontaneous waves of neuronal activity in the brain bear the imprints of earlier events for at least 24 hours after the experience has taken place.
The new research stems from earlier findings in the lab of Prof. Rafi Malach of the Institute’s Neurobiology Department and others that the brain never rests, even when its owner is resting. When a person is resting with closed eyes – that is, no visual stimulus is entering the brain – the normal bursts of nerve cell activity associated with incoming information are replaced by ultra-slow patterns of neuronal activity. Such spontaneous or “resting” waves travel in a highly organized and reproducible manner through the brain’s outer layer – the cortex – and the patterns they create are complex, yet periodic and symmetrical.
Like hieroglyphics, it seemed that these patterns might have some meaning, and research student Tal Harmelech, under the guidance of Malach and Dr. Son Preminger, set out to uncover their significance. Their idea was that the patterns of resting brain waves may constitute “archives” of earlier experiences. As we add new experiences, the activation of our brain’s networks leads to long-term changes in the links between brain cells, a capacity referred to as plasticity. As our experiences become embedded in these connections, they create “expectations” that come into play before we perform any type of mental task, enabling us to anticipate the result. The researchers hypothesized that information about earlier experiences would thus be incorporated into the links between networks of nerve cells in the cortex, and would show up in the brain’s spontaneously emerging wave patterns.
In the experiment, the researchers had volunteers undertake a training exercise that would strongly activate a well-defined network of nerve cells in the frontal lobes. While undergoing scans of their brain activity in the Institute’s functional magnetic resonance imaging (fMRI) scanner, the subjects were asked to imagine a situation in which they had to make rapid decisions. The subjects received auditory feedback in real time, based on the information obtained directly from their frontal lobe, which indicated the level of neuronal activity in the trained network. This “neurofeedback” strategy proved highly successful in activating the frontal network – a part of the brain that is notoriously difficult to activate under controlled conditions.
To test whether the connections created in the brain during this exercise would leave traces in the patterns formed by the resting brain waves, the researchers performed fMRI scans on the resting subjects before the exercise, immediately afterward, and 24 hours later. Their findings, which appeared in the Journal of Neuroscience, showed that the activation of the specific areas in the cortex did indeed remodel the resting brain wave patterns. Surprisingly, the new patterns not only remained the next day, they were significantly strengthened. These observations fit the classic learning principles proposed by Donald Hebb in the mid-20th century: the co-activation of two linked nerve cells leads to long-term strengthening of their link, while uncoordinated activity weakens it. The fMRI images of the resting brain waves showed that brain areas activated together during the training sessions exhibited an increase in their functional link a day after the training, while areas deactivated by the training showed weakened functional connectivity.
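Hebb’s principle can be illustrated with a minimal simulation (the activity traces and learning rate below are arbitrary toy values, not parameters from the study): a synapse onto a partner cell that fires together with its input strengthens, while one onto an uncorrelated partner barely changes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy activity traces: 'post_corr' tends to fire whenever 'pre' does,
# while 'post_indep' fires on its own schedule.
steps = 1000
pre = rng.random(steps)
post_corr = 0.9 * pre + 0.1 * rng.random(steps)
post_indep = rng.random(steps)

lr = 0.01                    # illustrative learning rate
w_corr, w_indep = 0.5, 0.5   # starting synaptic weights

# Covariance-style Hebbian rule: a weight grows when pre- and postsynaptic
# activity fluctuate together, and stays flat (or weakens) when they do not.
for t in range(steps):
    w_corr += lr * (pre[t] - pre.mean()) * (post_corr[t] - post_corr.mean())
    w_indep += lr * (pre[t] - pre.mean()) * (post_indep[t] - post_indep.mean())

print(w_corr, w_indep)  # the correlated synapse ends up much stronger
```

After the run, the coactive pair’s weight has grown well above its starting value, while the uncorrelated pair’s weight hovers near where it began, mirroring the strengthened and weakened functional links seen in the fMRI data.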
This research suggests a number of future possibilities for exploring the brain. For example, spontaneously emerging brain patterns could be used as a “mapping tool” for unearthing cognitive events from an individual’s recent past. Or, on a wider scale, each person’s unique spontaneously emerging activity patterns might eventually reveal a sort of personal profile – highlighting each individual’s abilities, shortcomings, biases, learning skills, etc. “Today, we are discovering more and more of the common principles of brain activity, but we have not been able to account for the differences between individuals,” says Malach. “In the future, spontaneous brain patterns could be the key to obtaining unbiased individual profiles.” Such profiles could be especially useful in diagnosing or learning the brain pathologies associated with a wide array of cognitive disabilities.
Testosterone therapy linked to improved verbal learning and memory in postmenopausal women

In a new study, post-menopausal women on testosterone therapy showed a significant improvement in verbal learning and memory, offering a promising avenue for research into memory and ageing.

Led by the Director of the Women’s Health Research Program at Monash University, Professor Susan Davis, and presented at ENDO 2013, the research is the first large, randomised, placebo-controlled investigation into the effects of testosterone on cognitive function in postmenopausal women.
Testosterone has been implicated as being important for brain function in men and these results indicate that it has a role in optimising learning and memory in women.
Dementia, which was estimated to affect more than 35 million people worldwide in 2010, is more common in women than men. There are no effective treatments to prevent memory decline.
In the study, 96 postmenopausal women recruited from the community were randomly allocated to receive a testosterone gel or a visually identical placebo gel to be applied to the skin. Participants underwent a comprehensive series of cognitive tests at the beginning of the study and 26 weeks later.
All women performed in the normal range for their age at the beginning of the trial. There was a statistically significant and clinically meaningful improvement in verbal learning and memory amongst the women using the testosterone gel after 26 weeks.
Professor Davis said the results indicated that testosterone played an important role in women’s health.
"Much of the research on testosterone in women to date has focused on sexual function. But testosterone has widespread effects in women, including, it appears, significant favourable effects on verbal learning and memory," Professor Davis said.
"Our findings provide compelling evidence for the conduct of larger clinical studies to further investigate the role of testosterone in cognitive function in women.
Androgen levels did increase in the cohort on testosterone therapy, but on average, remained in the normal female range. No negative side-effects of the therapy were observed.
“Forrest Gump” mice further understanding of acetylcholine

A line of genetically modified mice that Western University scientists call “Forrest Gump” because, like the movie character, they can run far but they aren’t smart, is furthering the understanding of a key neurotransmitter called acetylcholine (ACh). Marco Prado, PhD, and his team at Robarts Research Institute say the mice show what happens when too much of this neurotransmitter becomes available in the brain. Boosting ACh is a therapeutic target for Alzheimer’s disease because it’s found in reduced amounts when there’s cognitive failure. Prado’s research is published in the Journal of Neuroscience.
“We wanted to know what happens if you have more of the gene which controls how much acetylcholine is secreted by neurons,” says Prado, a Robarts scientist and professor in the Departments of Physiology and Pharmacology and Anatomy and Cell Biology at Western’s Schulich School of Medicine & Dentistry. “The response was the complete opposite of what we expected. It’s not a good thing. Acetylcholine release was increased threefold in these mice, which seemed to disturb cognitive function. But put them on a treadmill and they can run twice as far as normal mice before tiring. They’re super-athletes.” In addition to its function in modulating cognitive abilities, ACh drives muscle contraction which allowed for the marked improvement in motor endurance.
One of the tests the scientists, including first author Benjamin Kolisnyk, used is called the touch screen test for mice, which uses technology similar to a tablet’s. After initiating the test, the mice have to scan five different spots on the touch screen to see a light flash, and then run and touch that area. If they get it right, they get a reward. Compared to the control mice, the “Forrest Gump” mice failed miserably at the task. The researchers found the mice, whose scientific name is ChAT-ChR2-EYFP, had terrible attention spans, as well as dysfunction in working memory and spatial memory.
Prado interprets the research as showing ACh is very important for differentiating cues. So if your brain is presented with a lot of simultaneous information, it helps to pick what’s important. But when you flood the brain with ACh, your brain loses the ability to discern what’s relevant. This study was funded mainly by the Canadian Institutes of Health Research.
(Source: communications.uwo.ca)

Alzheimer’s and Low Blood Sugar in Diabetes May Trigger a Vicious Cycle
A new UC San Francisco-led study looks at the close link between diabetes and dementia, which can create a vicious cycle.
Diabetes-associated episodes of low blood sugar may increase the risk of developing dementia, while having dementia or even milder forms of cognitive impairment may increase the risk of experiencing low blood sugar, according to the study published online Monday in JAMA Internal Medicine.
Researchers analyzed data from 783 diabetic participants and found that hospitalization for severe hypoglycemia among the diabetic, elderly participants in the study was associated with a doubled risk of developing dementia later. Similarly, study participants with dementia were twice as likely to experience a severe hypoglycemic event.
The study results suggest some patients risk entering a downward spiral in which hypoglycemia and cognitive impairment fuel one another, leading to worse health, said Kristine Yaffe, MD, senior author and principal investigator for the study, and a UCSF professor of psychiatry, neurology and epidemiology based at the San Francisco Veterans Affairs Medical Center.
“Older patients with diabetes may be especially vulnerable to a vicious cycle in which poor diabetes management may lead to cognitive decline and then to even worse diabetes management,” she said.
Cognitive Function a Factor in Managing Diabetes
The researchers analyzed hospital records of patients from Memphis and Pittsburgh, ages 70 to 79 at the time of enrollment, who participated in the federally funded Health, Aging and Body Composition (Health ABC) study, begun in 1997. The UCSF results are based on an average of 12 years of follow-up study. Participants in the Health ABC study periodically underwent tests to measure cognitive function.
Nearly half of participants included in the newly published analysis were black, and the rest were white. None had dementia at the start of the study, and all either had diabetes at the beginning of the study or were diagnosed during the course of the study.
“Individuals with dementia or even those with milder forms of cognitive impairment may be less able to effectively manage complex treatment regimens for diabetes and less able to recognize the symptoms of hypoglycemia and to respond appropriately, increasing their risk of severe hypoglycemia,” Yaffe said. “Physicians should take cognitive function into account in managing diabetes in elderly individuals.”
Certain medications known to carry a higher risk for hypoglycemia — such as insulin secretagogues and certain sulfonylureas — may be inappropriate for older adults with dementia or who are at risk for cognitive impairment, according to Yaffe.
Previous studies in which researchers investigated hypoglycemia and cognitive function have had inconsistent findings. A strength of the current study is that individuals were tracked from baseline over a relatively long time, and the older age of participants may also have been a factor in the highly statistically significant outcome, Yaffe said.
Weird: Nuclear Bomb Tests Reveal Adults Grow New Brain Cells
Aboveground nuclear bomb testing in the 1950s and 1960s inadvertently gave modern scientists a way to prove the adult brain regularly creates new neurons, research reveals.
Researchers used to believe that the brain changed little once it finished maturing. That view is now considered out of date, as studies have revealed how changeable — or plastic — the adult brain can be.
Much of this plasticity is related to the brain’s organization; brain cells can alter their connections and communications with other brain cells. What has been less clear is whether, and to what extent, the human brain grows brand-new neurons in adulthood.
"There was a lot in the literature showing there was neurogenesis in rodents and every animal studied," said study researcher Kirsty Spalding, a biologist at the Karolinska Institute in Sweden, "But there was very little evidence of whether this happens in humans."
Tantalizing clues
Scientists had reason to believe it does. In adult mice, the hippocampus, a structure deep in the brain involved in memory and navigation, turns over cells all the time. Some of the biological markers linked to this turnover are seen in the human hippocampus. But the only direct evidence of new brain cells forming in the region came from a 1998 study in which researchers looked at the brains of five people who had been injected with a compound called BrdU that cells take up into their DNA. (The compound was once used in experimental cancer studies, but is not used anymore for safety reasons.)
The BrdU study revealed that neurons in the hippocampuses of the participants contained the compound in their DNA, indicating these brain cells had formed after the injections. The oldest person in the study was 72, suggesting new neuron creation, known as neurogenesis, continues well into old age.
The 1998 study was the only direct evidence of such neurogenesis in the human hippocampus, however. Spalding and her colleagues wanted to change that. Ten years ago, they began a project to track the age of neurons in the human brain using an unusual tool: spare molecules left over from Cold War-era nuclear bomb tests.
Learning to love the bomb
Between 1945 and 1962, the United States conducted hundreds of aboveground nuclear bomb tests. These tests largely stopped with the Limited Test Ban Treaty of 1963, but their effects remained in the atmosphere. The neutrons sent flying by the bombs reacted with nitrogen in the atmosphere, creating a spike in carbon 14, an isotope (or variation) of carbon.
This carbon 14, in turn, did what carbon in the atmosphere does. It combined with oxygen to form carbon dioxide, and was then taken in by plants, which use carbon dioxide in photosynthesis. Humans ate some of these plants, along with some of the animals that also ate these plants, and the carbon 14 inside ended up in their bodies.
When a cell divides, it incorporates this carbon 14 into the DNA of the newly forming cells. Atmospheric carbon 14 levels have fallen at a known rate since testing stopped, so scientists can match the carbon 14 concentration in a cell’s DNA against historical atmospheric records to pinpoint when the cells were born.
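The dating logic can be sketched in a few lines. The atmospheric values below are rough illustrative numbers only, not the measured records used in the research; real analyses interpolate against published atmospheric carbon 14 datasets.

```python
import numpy as np

# Illustrative atmospheric 14C levels, as a fraction of the pre-bomb
# baseline (1.0): a sharp rise to a peak around 1963, then a slow decline
# as the isotope was absorbed by oceans and the biosphere. These numbers
# only sketch the shape of the "bomb pulse"; they are not measured data.
years = np.array([1950, 1955, 1960, 1963, 1970, 1980, 1990, 2000, 2010])
ratio = np.array([1.00, 1.05, 1.25, 1.90, 1.55, 1.25, 1.15, 1.08, 1.04])

def birth_year_from_c14(measured_ratio):
    """Date cells born on the declining limb of the pulse (after 1963)
    by interpolating their DNA 14C level against the atmospheric curve."""
    decline_years, decline_ratio = years[3:], ratio[3:]
    # np.interp requires increasing x values, so reverse the declining curve.
    return float(np.interp(measured_ratio,
                           decline_ratio[::-1], decline_years[::-1]))

print(birth_year_from_c14(1.25))  # cells with this level date to ~1980 here
```

A higher carbon 14 level in a cell’s DNA maps to an earlier birth year on the declining limb, which is how neurons formed in adulthood can be told apart from those formed around birth.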
Over the past decade, Spalding and her colleagues have used the technique in a variety of cells, including fat cells, refining it along the way until it became sensitive enough to measure tiny amounts of carbon 14 in small hippocampus samples. The researchers collected samples, with family permission, from autopsies in Sweden.
They found the tantalizing 1998 evidence was correct: Human hippocampuses do grow new neurons. In fact, about a third of the brain region is subject to cell turnover, with about 700 new neurons being formed each day in each hippocampus (humans have two, a mirror-image set on either side of the brain). Hippocampus neurons die each day, too, keeping the overall number more or less in balance, with some slow loss of cells with aging, Spalding said.
This turnover occurs at a ridge in the hippocampus known as the dentate gyrus, a spot known to contribute to the formation of new memories. Researchers aren’t sure what the function of this constant renewal is, but it could relate to allowing the brain to cope with novel situations, Spalding told LiveScience.
"Neurogenesis gives a particular kind of plasticity to the brain, a cognitive flexibility," she said.
Spalding and her colleagues had used the same techniques in other regions of the brain, including the cortex, the cerebellum and the olfactory bulb, and found no evidence of newborn neurons being integrated into those areas. The researchers now plan to study whether there are any links between neurogenesis and psychiatric conditions such as depression.
The new findings are detailed in the journal Cell.
Clouds in the Head: New Model of Brain’s Thought Processes
A new model of the brain’s thought processes explains the apparently chaotic activity patterns of individual neurons. They do not correspond to a simple stimulus/response linkage, but arise from the networking of different neural circuits. Scientists funded by the Swiss National Science Foundation (SNSF) propose that the field of brain research should expand its focus.
Many brain researchers cannot see the forest for the trees. When they use electrodes to record the activity patterns of individual neurons, the patterns often appear chaotic and difficult to interpret. “But when you zoom out from looking at individual cells, and observe a large number of neurons instead, their global activity is very informative,” says Mattia Rigotti, a scientist at Columbia University and New York University who is supported by the SNSF and the Janggen-Pöhn-Stiftung. Publishing in Nature together with colleagues from the United States, he has shown that these difficult-to-interpret patterns in particular are especially important for complex brain functions.
What goes on in the heads of monkeys
The researchers focussed their attention on the activity patterns of 237 neurons that had been recorded some years previously using electrodes implanted in the frontal lobes of two rhesus monkeys. At that time, the monkeys had been taught to recognise images of different objects on a screen. Around one third of the observed neurons demonstrated activity that Rigotti describes as “mixed selectivity.” A mixed selective neuron does not always respond to the same stimulus (the flowers or the sailing boat on the screen) in the same way. Rather, its response differs as it also takes account of the activity of other neurons. The cell adapts its response according to what else is going on in the monkey’s brain.
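Mixed selectivity can be illustrated with a toy sketch (the stimuli, contexts and function names here are hypothetical, not taken from the study): a classically selective neuron responds to a stimulus the same way regardless of what else is happening, while a mixed selective neuron responds to a particular conjunction of stimulus and context.

```python
def pure_selective(stimulus, context):
    """Toy neuron: responds to its preferred stimulus identically
    in every context."""
    return 1.0 if stimulus == "flower" else 0.2

def mixed_selective(stimulus, context):
    """Toy neuron: fires strongly only for a specific combination of
    stimulus and context, i.e. its tuning mixes both variables."""
    return 1.0 if (stimulus == "flower" and context == "recall") else 0.2

# The pure cell cannot distinguish seeing a flower from recalling one;
# the mixed cell can.
print(pure_selective("flower", "view"), pure_selective("flower", "recall"))
print(mixed_selective("flower", "view"), mixed_selective("flower", "recall"))
```

One argument made in this literature is that populations containing such conjunction-coding cells let a simple downstream readout separate combinations of task variables that purely selective populations cannot, which would explain why a higher proportion of mixed selectivity went hand in hand with better task performance.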
Chaotic patterns revealed in context
Just as individual computers are networked to pool processing and storage capacity in cloud computing, networked interactions between neurons play a key role in the complex cognitive processes that take place in the prefrontal cortex. The denser the network in the brain, in other words the greater the proportion of mixed selectivity in the neurons’ activity patterns, the better the monkeys were able to recall the images on the screen, as Rigotti demonstrated in his analysis. Given that the brain and cognitive capabilities of rhesus monkeys are similar to those of humans, mixed selective neurons should also be important in our own brains. For Rigotti, this is reason enough for brain research to no longer content itself with simple activity patterns, but to also consider the apparently chaotic patterns that can only be revealed in context.
Mediterranean diet seems to boost ageing brain power
A Mediterranean diet with added extra virgin olive oil or mixed nuts seems to improve the brain power of older people better than advising them to follow a low-fat diet, indicates research published online in the Journal of Neurology, Neurosurgery & Psychiatry.
The authors from the University of Navarra in Spain base their findings on 522 men and women aged between 55 and 80 without cardiovascular disease but at high vascular risk because of underlying disease/conditions.
These included either type 2 diabetes or three of the following: high blood pressure; an unfavourable blood fat profile; overweight; a family history of early cardiovascular disease; and being a smoker.
Participants, who were all taking part in the PREDIMED trial looking at how best to ward off cardiovascular disease, were randomly allocated to a Mediterranean diet with added olive oil or mixed nuts, or to a control group receiving advice to follow the low-fat diet typically recommended to prevent heart attack and stroke.
A Mediterranean diet is characterised by the use of virgin olive oil as the main culinary fat; high consumption of fruits, nuts, vegetables and pulses; moderate to high consumption of fish and seafood; low consumption of dairy products and red meat; and moderate intake of red wine.
Participants had regular check-ups with their family doctor and quarterly checks on their compliance with their prescribed diet.
After an average of 6.5 years, they were tested for signs of cognitive decline using a Mini Mental State Exam and a clock drawing test, which assess higher brain functions, including orientation, memory, language, visuospatial and visuoconstruction abilities and executive functions such as working memory, attention span, and abstract thinking.
At the end of the study period, 60 participants had developed mild cognitive impairment: 18 on the olive oil-supplemented Mediterranean diet; 19 on the diet with added mixed nuts; and 23 in the control group.
A further 35 people developed dementia: 12 on the added olive oil diet; six on the added nut diet; and 17 on the low fat diet.
The average scores on both tests were significantly higher for those following either of the Mediterranean diets compared with those on the low fat option.
These findings held true irrespective of other influential factors, including age, family history of cognitive impairment or dementia, the presence of the ApoE protein (associated with Alzheimer’s disease), educational attainment, exercise levels, vascular risk factors, energy intake and depression.
The authors acknowledge that their sample size was relatively small, and that because the study involved a group at high vascular risk, it doesn’t necessarily follow that their findings are applicable to the general population.
But they say theirs is the first long-term trial to look at the impact of the Mediterranean diet on brain power, and that it adds to the increasing body of evidence suggesting that a high-quality dietary pattern seems to protect cognitive function in the ageing brain.