Scientists have identified a key molecule responsible for triggering the chemical processes in our brain linked to our formation of memories. The findings, published in the journal Frontiers in Neural Circuits, reveal a new target for therapeutic interventions to reverse the devastating effects of memory loss.

The BBSRC-funded research, led by scientists at the University of Bristol, aimed to better understand the mechanisms that enable us to form memories by studying the molecular changes in the hippocampus — the part of the brain involved in learning.
Previous studies have shown that our ability to learn and form memories is due to an increase in synaptic communication called long-term potentiation (LTP). This communication is initiated through a chemical process triggered by calcium entering brain cells and activating a key enzyme called ‘Ca2+-responsive kinase’ (CaMKII). Once this protein is activated by calcium, it triggers a switch in its own activity, enabling it to remain active even after the calcium has gone. This special ability of CaMKII to maintain its own activity has been termed ‘the molecular memory switch’.
Until now, however, the question of what triggers this chemical process in our brain, allowing us to learn and form long-term memories, had remained unanswered. The research team, comprising scientists from the University’s School of Physiology and Pharmacology, conducted experiments using the common fruit fly (Drosophila) to analyse and identify the molecular mechanisms behind this switch. Using advanced molecular genetic techniques that allowed them to temporarily inhibit the flies’ memory, the team were able to identify a gene called CASK as the synaptic molecule regulating this ‘memory switch’.
Dr James Hodge, the study’s lead author, said: “Fruit flies are remarkably compatible for this type of study as they possess similar neuronal function and neural responses to humans. Although small they are very smart, for instance, they can land on the ceiling and detect that the fruit in your fruit bowl has gone off before you can.”
“In experiments whereby we tested the flies’ learning and memory ability, involving two odours presented to the flies with one associated with a mild shock, we found that around 90 per cent were able to learn the correct choice, remembering to avoid the odour associated with the shock. Five lessons of the odour with punishment made the fly remember to avoid that odour for between 24 hours and a week, which is a long time for an insect that only lives a couple of months.”
By localising the function of the key molecules CASK and CaMKII to the flies’ equivalent brain area to the human hippocampus, the team found that the flies lacking these genes showed disrupted memory formation. In repeat memory tests those lacking these key genes were shown to have no ability to remember at three hours (mid-term memory) and 24 hours (long-term memory) although their initial learning or short-term memory wasn’t affected.
Finally, the team introduced a copy of the human CASK gene — it is 80 per cent identical to the fly CASK gene — into the genome of a fly that completely lacked its own CASK gene and was therefore not usually able to remember. The researchers found that flies which had a copy of the human CASK gene could remember like a normal wildtype fly.
Dr Hodge, from the University’s School of Physiology and Pharmacology, said: “Research into memory is particularly important as it gives us our sense of identity, and deficits in learning and memory occur in many diseases, injuries and during aging”.
“CASK’s control of the CaMKII ‘molecular memory switch’ is clearly a critical step in how memories are written into neurons in the brain. These findings not only pave the way for developing new therapies which reverse the effects of memory loss but also prove the suitability of Drosophila for modelling these diseases in the lab and screening for new drugs to treat them. Furthermore, this work provides an important insight into how brains have evolved their huge capacity to acquire and store information.”
These findings clearly demonstrate that the neuronal function of CASK is conserved between flies and humans, validating the use of Drosophila to understand CASK function in both the healthy and diseased brain. Mutations in the human CASK gene have been associated with neurological and cognitive defects, including severe learning difficulties.
New research from the Massachusetts Eye and Ear, Harvard Medical School and Harvard Program in Speech and Hearing Bioscience and Technology may have discovered a key piece in the puzzle of how hearing works by identifying the role of the olivocochlear efferent system in protecting ears from hearing loss. The findings could eventually lead to screening tests to determine who is most susceptible to hearing loss. Their paper is published today in the Journal of Neuroscience.
It is common knowledge that exposure to a noisy environment (a concert, an iPod, mechanical tools, a firearm, etc.) can lead to permanent or temporary hearing loss. Most audiologists assess the damage caused by this type of exposure by measuring hearing thresholds, the lowest level at which one starts to detect a sound at a particular frequency (pitch). Drs. Sharon Kujawa and Charles Liberman, both researchers at Mass. Eye and Ear, showed in 2009 that noise exposures leading to a temporary hearing loss in mice (when hearing thresholds return to what they were before exposure) can in fact be associated with cochlear neuropathy, a condition in which, despite a normal threshold, a portion of auditory nerve fibers is missing.
The inner ear, the organ that converts sounds into messages that will be conveyed to and decoded by the brain, receives in turn fibers from the central nervous system. Those fibers are known as the olivocochlear efferent system. Up to now, the involvement of this efferent system in the protection from acoustic injury – although clearly demonstrated – has been a matter of debate because all the previous experiments were probing its protective effects following noise exposures very unlikely to be found in nature.
Stephane Maison, Ph.D., investigator at the Eaton-Peabody Laboratory at Mass. Eye and Ear and lead author, explains. “Humans are currently exposed to the type of noise used in those experiments but it’s hard to conceive that some vertebrates, thousands of years ago, were submitted to stimuli similar to those delivered by speakers. So many researchers believed that the protective effects of the efferent system were an epiphenomenon – not its true function.”
“Instead of using loud noise exposures evoking a change in hearing threshold, we used a moderate noise exposure at a level similar to those found in restaurants, conferences, malls, and also in nature (some frogs emit vocalizations at similar or higher levels), and instead of looking at thresholds, we looked for signs of cochlear neuropathy,” Dr. Maison continued.
The researchers demonstrated that such moderate exposure leads to cochlear neuropathy (a loss of auditory nerve fibers), which causes difficulty hearing in noisy environments.
"This is tremendously important because all of us are submitted to such acoustic environments and it takes a lot of auditory nerve fiber loss before it gets to be detected by simply measuring thresholds as it’s done when preforming an audiogram," Dr. Maison said. "The second important discovery is that, in mice where the efferent system has been surgically removed, cochlear neuropathy is tremendously exacerbated. That second piece proves that the efferent system does play a very important role in protecting the ear from cochlear neuropathy and we may have found its main function."
The researchers say they are excited about this discovery because the strength of the efferent system can be recorded non-invasively in humans. A non-invasive assay of efferent strength has already been developed and shown to predict vulnerability to acoustic injury (Maison and Liberman, "Predicting vulnerability to acoustic injury with a noninvasive assay of olivocochlear reflex strength," Journal of Neuroscience, 20:4701-4707, 2000).
"One could envision applying this assay or a modified version of it to human populations to screen for individuals most at risk in noise environments," Dr. Maison concluded.
Novel intercellular transportation system may have potential for delivering RNAi and other gene-based therapeutics
Important new research from UMass Medical School demonstrates how exosomes shuttle proteins from neurons to muscle cells where they take part in critical signaling mechanisms, an exciting discovery that means these tiny vehicles could one day be loaded with therapeutic agents, such as RNA interference (RNAi), and directly target disease-carrying cells. The study, published this month in the journal Neuron, is the first evidence that exosomes can transfer membrane proteins that play an important role in cell-to-cell signaling in the nervous system.

“There has been a long-held belief that certain cellular materials, such as integral membrane proteins, are unable to pass from one cell to another, essentially trapping them in the cell where they are made,” said Vivian Budnik, PhD, professor of neurobiology and lead author of the study. “What we’ve shown in this study is that these cellular materials can actually move between different cell types by riding in the membrane of exosomes.”
“What is so exciting about this discovery is that these exosomes can deliver materials from one cell, over a distance, to a very specific and different cell,” said Dr. Budnik. “Once inside the recipient cell, the materials contained in the exosome can influence or perform processes in the new cell. This raises the enticing possibility that exosomes can be packed with gene therapies, such as RNAi, and delivered to diseased cells where they could have a therapeutic effect for people.”
Discovered in the mid-80s, exosomes have only recently attracted the attention of scientists at large, according to Budnik. Exosomes are small vesicles containing cellular materials such as microRNA, messenger RNAs (mRNAs) and proteins, packaged inside larger, membrane-bound bodies called multivesicular bodies (MVBs) inside cells. When MVBs containing exosomes fuse with the cell plasma membrane, they release these exosome vesicles into the extracellular space. Once outside the cell, exosomes can then travel to other cells, where they are taken up. The recipient cells can then use the materials contained within exosomes, influencing cellular function and allowing the recipient cell to carry out certain processes that it might not be able to complete otherwise.
Budnik and colleagues made this startling discovery while investigating how the synapses at the end of neurons and nearby muscle cells communicate in the developing Drosophila fruit fly to form the neuromuscular junction (NMJ). The NMJ is essential for transmitting electrical signals between neurons and muscles, allowing the organism to move and control important physiological processes. Alterations of the NMJ can lead to devastating diseases, such as muscular dystrophy and Amyotrophic lateral sclerosis (ALS). Understanding how the NMJ develops and is maintained is important for human health.
As organisms develop, the synapse and muscle cell need to grow in concert. If one or the other grows too quickly or not quickly enough, it could have dire consequences for the ability of the organism to move and survive. To coordinate development, signals are sent from the neuron to the muscle cell (anterograde signals) and from the muscle cell to the neuron (retrograde signals). However, the identity of these signals and how their release is coordinated is poorly understood.
Normally, the vesicle protein Synaptotagmin 4 (Syt4) is found in both the synapse and the muscle cells. Previous knockout experiments eliminating the Syt4 protein from Drosophila have resulted in stunted NMJs. Suspecting that Syt4 played an important role in retrograde signaling at the developing NMJ, Budnik and colleagues used knockdown experiments to decrease Syt4 protein levels in either the neurons or the muscle cells. Surprisingly, when RNAi was used to knock down Syt4 in the neurons alone, Syt4 protein was eliminated in both neurons and muscles. The opposite was not the case: when Syt4 was knocked down in muscle cells only, there was no change in the levels of Syt4 in either muscles or neurons.
To confirm this, Budnik and colleagues inserted a Syt4 gene into the neurons of a Drosophila mutant completely lacking the normal protein. This restored Syt4 in both neurons and muscle cells. Further experiments suggested that the only source of Syt4 is the neuron. These observations were consistent with the model that Syt4 is actually transferred from neurons to muscle cells. As a transmembrane protein, however, Syt4 was thought to be unable to move from one cell to another through traditional avenues. How the Syt4 protein was moving from neuron to muscle cell was unclear.
Knowing that exosomes had been observed to carry transmembrane proteins in other systems and from their own work on the Drosophila NMJ, Budnik and colleagues began testing whether exosomes could be the vehicle responsible for carrying Syt4 from neurons to muscles. “We had previously observed that it was possible to transfer transmembrane proteins across the NMJ through exosomes, a process also observed in the immune system,” said Budnik. “We suspect this was how Syt4 was making its way from the neuron to the muscle.”
When the researchers purified exosomes from cultured cells expressing Syt4, they found that the exosomes indeed contained the protein. In addition, when these purified exosomes were applied to cultured muscle cells from fly embryos, the muscle cells were able to take them up. Taken together, these findings indicate that Syt4 plays a critical role in the signaling process between synapse and muscle cell that allows for coordinated development of the NMJ. While Syt4 is required to release a retrograde signal from muscle to neuron, a component of this retrograde signal must be supplied from the neuron to the muscle. This establishes a positive feedback loop that ensures coordinated growth of the NMJ. Equally important is the finding that this feedback mechanism is enabled by the use of exosomes, which can shuttle transmembrane proteins between cells.
“While this discovery greatly enhances our understanding of how the neuromuscular junction develops and works, it also has tremendous promise as a potential vector for targeted genetic therapies,” said Budnik. “More work needs to be done, but this study significantly supports the possibility that exosomes could be loaded with therapeutic agents and delivered to specific cells in patients.”
Neurobiologists at the Friedrich Miescher Institute have been able to dissect a mechanism in the retina that facilitates our ability to see both in the dark and in the light. They identified a cellular switch that activates distinct neuronal circuits at a defined light level. The switch cells of the retina act quickly and reliably to turn on and off computations suited specifically for vision in low and high light levels, thus facilitating the transition from night to day vision. The scientists have published their results online in Neuron.

"It was fascinating to see how modern neurobiological methods allowed us to answer a question about vision that has been controversially discussed for the last 50 years", said Karl Farrow, postdoctoral fellow in Botond Roska’s group at the Friedrich Miescher Institute for Biomedical Research. Since the late 1950 scientists debated how the retina handles the different visual processes at low and high light intensities, at starlight and at daylight. Farrow and his colleagues have now identified a cellular switch in the retina that controls perception during these two settings.
At first glance, everything seems clear. The interplay of two photoreceptor types in the retina, the rods and the cones, allows us to see across a wide range of light intensities. The rods are highly sensitive and spring into action in the dark; the cones are activated during the day and in humans come in three varieties, allowing us to see color. The rods help us detect objects during the night, while the cones allow us to discriminate the fine details of those objects during the day. The plethora of initial signals originating from the photoreceptors is computed in a system of only approximately 20 neuronal channels that transport information to the brain. The relay stations are the roughly 20 types of ganglion cells in the retina. How they manage the transition from light to dark and enable vision in the different light regimes has remained unclear.
In the retina, several cell layers are stacked on top of each other. The photoreceptors are the first to be activated by light; they relay the information to bipolar cells, which in turn activate ganglion cells. The different types of ganglion cells take on distinct tasks during vision. These ganglion cells are embedded in a mesh of amacrine cells that modulate their activity. “Here is where our new genetic tools proved very helpful,” said Farrow, “because they allowed us to look at individual ganglion cell types and to specifically measure their activities at different light intensities.” Farrow and colleagues could thus show that the activity of one particular type of ganglion cell, called PV1, is modulated like a switch by amacrine cells. The amacrine cells inhibit the ganglion cell strongly at high light intensities and weakly at low ambient light levels. This switch is abrupt and reversible, and it occurs at the light intensities where cones start to be activated. “We were surprised to see how fast this switch occurs and how reliably we were able to switch between the two states at defined light intensities”, comments Farrow.
While the above experiments were done in a mouse model, the FMI neurobiologists could show that a similar switch operates in human vision. Their volunteers had to look at narrow and broader stripes at different light levels, and here again a switch was at work. While the general ability to see all striped patterns improved with increasing light intensity, at a certain light level the volunteers suddenly became much better at detecting the thinner patterns than the broader ones. Interestingly, this switch happened at precisely the light level where the volunteers also became able to discriminate between red and blue, hence where the cones spring into action. “We think we have found a regulatory principle that could apply to several processes in the brain”, said Roska. “This principle could explain some situations where gradual changes in the sensory environment lead to abrupt changes in brain computations and perception.”
A new study by University of Tennessee Health Science Center and Le Bonheur researchers, published in the March issue of Autism Research, makes the genetic connection between autism and Chromosome 15q Duplication Syndrome (Dup15q).
The Memphis researchers determined that maternally derived or inherited duplication of the region including the UBE3A gene (also known as the Angelman/Prader-Willi syndrome locus) is sufficient to produce a phenotype on the autism spectrum: all ten maternal duplication subjects were on the spectrum. The number of subjects was too small to determine whether paternal duplications do not cause autism. The team assembled the largest single cohort of interstitial 15q duplication subjects for phenotype/genotype analysis of the autism component of the syndrome.
Chromosome 15q Duplication Syndrome (Dup15q) results from duplications of chromosome 15q11-q13. Duplications that are maternal in origin often result in developmental problems. The larger 15q duplication syndrome, which includes individuals with idic15, manifests itself in a wide range of developmental disabilities including autism spectrum disorders; motor, cognitive and speech/language delays; and seizure disorders among others. While there is no specific treatment plan, therapies are available to address or manage symptoms.
Previous research suggests that as many as 1,000 genes may contribute to autism phenotypes, but as much as 1-3 percent of all autism spectrum disorder cases may be a result of 15q11-q13 duplication alone.
The researchers also found through EEG evaluations a pattern that resembles the signal seen when individuals take GABA-promoting drugs (benzodiazepines). The lead researcher on this study, Lawrence T. Reiter, PhD, says this signal gives clinicians a clue about what types of anti-seizure medication may be most useful in children with 15q duplications.
Reiter says genetic testing can help families connect to resources, like the Dup15q Alliance. Reiter is an associate professor in the Department of Neurology with an adjunct appointment in Pediatrics at UTHSC.
“If a pediatrician suspects autism due to hypotonia and developmental delay, I highly recommend they order an arrayCGH test. Duplication 15q is the second most common duplication in autism. The test will help families in future treatments specific to this sub-type of autism,” he said.
A new study suggests that migraines are related to brain abnormalities present at birth and others that develop over time. The research is published online in the journal Radiology.

Migraines are intense, throbbing headaches, sometimes accompanied by nausea, vomiting and sensitivity to light. Some patients experience auras, a change in visual or sensory function that precedes or occurs during the migraine. More than 300 million people suffer from migraines worldwide, according to the World Health Organization.
Previous research on migraine patients has shown atrophy of cortical regions in the brain related to pain processing, possibly due to chronic stimulation of those areas. Cortical refers to the cortex, or outer layer of the brain.
Much of that research has relied on voxel-based morphometry, which provides estimates of the brain’s cortical volume. In the new study, Italian researchers used a different approach: a surface-based MRI method to measure cortical thickness.
"For the first time, we assessed cortical thickness and surface area abnormalities in patients with migraine, which are two components of cortical volume that provide different and complementary pieces of information," said Massimo Filippi, M.D., director of the Neuroimaging Research Unit at the University Ospedale San Raffaele and professor of neurology at the University Vita-Salute’s San Raffaele Scientific Institute in Milan. "Indeed, cortical surface area increases dramatically during late fetal development as a consequence of cortical folding, while cortical thickness changes dynamically throughout the entire life span as a consequence of development and disease."
Dr. Filippi and colleagues used magnetic resonance imaging (MRI) to acquire T2-weighted and 3-D T1-weighted brain images from 63 migraine patients and 18 healthy controls. Using special software and statistical analysis, they estimated cortical thickness and surface area and correlated it with the patients’ clinical and radiologic characteristics.
Compared to controls, migraine patients showed reduced cortical thickness and surface area in regions related to pain processing. There was only minimal anatomical overlap of cortical thickness and cortical surface area abnormalities, with cortical surface area abnormalities being more pronounced and distributed than cortical thickness abnormalities. The presence of aura and white matter hyperintensities—areas of high intensity on MRI that appear to be more common in people with migraine—was related to the regional distribution of cortical thickness and surface area abnormalities, but not to disease duration and attack frequency.
"The most important finding of our study was that cortical abnormalities that occur in patients with migraine are a result of the balance between an intrinsic predisposition, as suggested by cortical surface area modification, and disease-related processes, as indicated by cortical thickness abnormalities," Dr. Filippi said. "Accurate measurements of cortical abnormalities could help characterize migraine patients better and improve understanding of the pathophysiological processes underlying the condition."
Additional research is needed to fully understand the meaning of cortical abnormalities in the pain processing areas of migraine patients, according to Dr. Filippi.
"Whether the abnormalities are a consequence of the repetition of migraine attacks or represent an anatomical signature that predisposes to the development of the disease is still debated," he said. "In my opinion, they might contribute to make migraine patients more susceptible to pain and to an abnormal processing of painful conditions and stimuli."
The researchers are conducting a longitudinal study of the patient group to see if their cortical abnormalities are stable or tend to worsen over the course of the disease. They are also studying the effects of treatments on the observed modifications of cortical folding and looking at pediatric patients with migraine to assess whether the abnormalities represent a biomarker of the disease.
The field of cell therapy, which aims to form new cells in the body in order to cure disease, has taken another important step in the development towards new treatments. A new report from researchers at Lund University in Sweden shows that it is possible to re-programme other cells to become nerve cells, directly in the brain.

Two years ago, researchers in Lund were the first in the world to re-programme human skin cells, known as fibroblasts, to dopamine-producing nerve cells – without taking a detour via the stem cell stage. The research group has now gone a step further and shown that it is possible to re-programme both skin cells and support cells directly to nerve cells, in place in the brain.
“The findings are the first important evidence that it is possible to re-programme other cells to become nerve cells inside the brain”, said Malin Parmar, research group leader and Reader in Neurobiology.
The researchers used genes designed to be activated or de-activated using a drug. The genes were inserted into two types of human cells: fibroblasts and glial cells – support cells that are naturally present in the brain. Once the researchers had transplanted the cells into the brains of rats, the genes were activated using a drug in the animals’ drinking water. The cells then began their transformation into nerve cells.
In a separate experiment on mice, where similar genes were injected into the mice’s brains, the research group also succeeded in re-programming the mice’s own glial cells to become nerve cells.
“The research findings have the potential to open the way for alternatives to cell transplants in the future, which would remove previous obstacles to research, such as the difficulty of getting the brain to accept foreign cells, and the risk of tumour development”, said Malin Parmar.
All in all, the new technique of direct re-programming in the brain could open up new possibilities to more effectively replace dying brain cells in conditions such as Parkinson’s disease.
“We are now developing the technique so that it can be used to create new nerve cells that replace the function of damaged cells. Being able to carry out the re-programming in vivo makes it possible to imagine a future in which we form new cells directly in the human brain, without taking a detour via cell cultures and transplants”, concluded Malin Parmar.
The research article, entitled ‘Generation of induced neurons via direct conversion in vivo’, has been published in the Proceedings of the National Academy of Sciences (PNAS).
The virus that causes cold sores, along with other viral or bacterial infections, may be associated with cognitive problems, according to a new study published in the March 26, 2013, print issue of Neurology®, the medical journal of the American Academy of Neurology.
The study found that people with higher levels of infection in their blood (measured by antibody levels), meaning they had been exposed over the years to various pathogens such as the herpes simplex type 1 virus that causes cold sores, were more likely to have cognitive problems than people with lower levels of infection in the blood. “We found the link was greater among women, those with lower levels of education and Medicaid or no health insurance, and most prominently, in people who do not exercise,” said author Mira Katan, MD, with the Northern Manhattan Study at Columbia University Medical Center in New York and a member of the American Academy of Neurology. The study was performed in collaboration with the Miller School of Medicine at the University of Miami in Miami, FL.
For the study, researchers tested thinking and memory in 1,625 people with an average age of 69 from northern Manhattan in New York. Participants gave blood samples that were tested for five common low-grade infections: three viruses (herpes simplex type 1 (oral), herpes simplex type 2 (genital), and cytomegalovirus), Chlamydia pneumoniae (a common respiratory infection) and Helicobacter pylori (a bacterium found in the stomach).
The results showed that the people who had higher levels of infection had a 25 percent increase in the risk of a low score on a common test of cognition called the Mini-Mental State Examination.
Memory and thinking skills were tested every year for an average of eight years, but infection was not associated with changes in memory and thinking abilities over time.
“While this association needs to be further studied, the results could lead to ways to identify people at risk of cognitive impairment and eventually lower that risk,” said Katan. “For example, exercise and childhood vaccinations against viruses could decrease the risk for memory problems later in life.” The study was supported by the National Institute of Neurological Disorders and Stroke (NINDS), the Swiss National Science Foundation and the Leducq Foundation.
Scientists at the University of Nevada, Reno have found a new mechanism for guiding the growth of nerves that involves cell-death machinery, a discovery that may bring advances in neurological medicine and research. The team obtained the evidence in studies of fruit flies and reported their discovery in an article published in the prestigious science journal Cell Reports.

"Although the fly is a relatively simple organism, almost every gene identified in this species appears to be carrying out similar functions in humans," said Thomas Kidd, associate professor in the University’s biology department in whose lab the work was performed.
The Kidd lab is part of a $10 million Center for Biomedical Research Excellence Project in Cell Biology of Signaling at the University, which is funded by the National Institutes of Health’s National Institute of General Medical Sciences. The project is also funded by the National Science Foundation.
"Flies are useful because the neural mechanisms we are studying are similar to those in mammals," said Gunnar Newquist, lead author of the Cell Reports article and a post-doctoral neuroscience researcher in Kidd’s lab. "We’ve found something no one has seen before, that blocking the cell-death pathway can make nerves deprived of guidance cues figure out the right way to connect with other neurons. This was completely unexpected and novel, but really exciting because it changes the way we look at nerve growth.
"Neurons have a natural ability to die, if they fail to make the right connections they usually die. Neurons, like most other cell types, have the capacity to commit suicide and many do so during the formation of the nervous system."
The wiring of nervous systems is composed of axons, specialized extensions of neurons that transmit electrical impulses. During development, axons navigate long distances to their targets by using signals in their environment. Netrin-B is one of those signals. Kidd, Newquist and colleagues have shown that Netrin-B also keeps neurons alive.
"Take away the Netrin-B and growth and cell death goes haywire," Newquist said.
This led them to the discovery that the cell-death machinery is active in growing nerves, and appears to be an integral part of the navigation mechanism.
"We use fruit fly genetics to study how these axons navigate these long distances correctly when developing," Kidd said. "Understanding the mechanisms they use to navigate is of great interest, not only for understanding how our brains form, but also as a starting point to devise ways to stimulate the re-growth of axons after injury, especially spinal cord injuries.
"Our work suggests that therapeutics designed to keep neurons alive after injury may be able to stimulate neurons to start re-growing or sprouting new connections."
"I am very pleased to see Tom’s and Gunnar’s hard work come to fruition," said Chris von Bartheld, director of the University’s cell-biology COBRE and a professor in the University of Nevada School of Medicine. "Linking axonal path finding and cell death signaling opens exciting new venues to better understand both topics. It also shows that our recently established center in cell biology is achieving its goals of producing top-level biomedical research."
A drug widely used to treat Parkinson’s disease can help to reverse age-related impairments in decision making in some older people, a study from researchers at the Wellcome Trust Centre for Neuroimaging has shown.
The study, published today in the journal Nature Neuroscience, also describes changes in the patterns of brain activity of adults in their seventies that help to explain why they are worse at making decisions than younger people.
Poorer decision-making is a natural part of the ageing process that stems from a decline in our brains’ ability to learn from our experiences. Part of the decision-making process involves learning to predict the likelihood of getting a reward from the choices that we make.
An area of the brain called the nucleus accumbens is responsible for interpreting the difference between the reward that we’re expecting to get from a decision and the reward that is actually received. These so called ‘prediction errors’, reported by a brain chemical called dopamine, help us to learn from our actions and modify our behaviour to make better choices the next time.
Dr Rumana Chowdhury, who led the study at the Wellcome Trust Centre for Neuroimaging at UCL, said: “We know that dopamine decline is part of the normal aging process so we wanted to see whether it had any effect on reward-based decision making. We found that when we treated older people who were particularly bad at making decisions with a drug that increases dopamine in the brain, their ability to learn from rewards improved to a level comparable to somebody in their twenties and enabled them to make better decisions.”
The team used a combination of behavioural testing and brain imaging techniques to investigate the decision-making process in 32 healthy volunteers in their early seventies, compared with 22 volunteers in their mid-twenties. Older participants were tested on and off L-DOPA, a drug that increases levels of dopamine in the brain. L-DOPA, more commonly known as levodopa, is widely used in the clinic to treat Parkinson’s disease.
The participants were asked to complete a behavioural learning task called the two-armed bandit, which mimics the decisions that gamblers make while playing slot machines. Players were shown two images and had to choose the one that they thought would give them the biggest reward. Their performance on and off the drug was assessed by the amount of money they won in the task.
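The reward learning that this task probes is often described with a simple prediction-error rule: the learner compares the reward received with the reward expected and nudges its expectation by a fraction of the difference. The sketch below is an illustrative toy model, not the study’s actual analysis; the learning-rate parameter `alpha` is a hypothetical stand-in for how strongly dopamine-reported prediction errors update expectations.

```python
import math
import random

def simulate_bandit(alpha, trials=200, probs=(0.7, 0.3), beta=3.0, seed=0):
    """Toy two-armed bandit learner driven by reward prediction errors.

    alpha : learning rate -- loosely analogous to dopamine-dependent
            learning efficacy (purely illustrative, not from the study).
    probs : reward probability of each arm (arm 0 is the better choice).
    Returns the total reward earned over all trials.
    """
    rng = random.Random(seed)
    values = [0.0, 0.0]              # expected reward for each arm
    total = 0
    for _ in range(trials):
        # Softmax choice: favour the arm currently valued higher.
        p0 = 1.0 / (1.0 + math.exp(-beta * (values[0] - values[1])))
        choice = 0 if rng.random() < p0 else 1
        reward = 1 if rng.random() < probs[choice] else 0
        # Prediction error: reward received minus reward expected.
        delta = reward - values[choice]
        values[choice] += alpha * delta   # Rescorla-Wagner-style update
        total += reward
    return total
```

In a model like this, a learner with a healthy learning rate gradually concentrates its choices on the better arm and earns more, while a learner whose prediction-error signal is blunted keeps choosing at random, which is loosely the deficit the study links to dopamine decline in older adults.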
"The older volunteers who were less able to predict the likelihood of a reward from their decisions, and so performed worst in the task, showed a significant improvement following drug treatment," Dr Chowdhury explains.
The team then looked at brain activity in the participants as they played the game using functional Magnetic Resonance Imaging (fMRI), and measured connections between areas of the brain that are involved in reward prediction using a technique called Diffusion Tensor Imaging (DTI).
The findings reveal that the older adults who performed best in the gambling game before drug treatment had greater integrity of their dopamine pathways. Older adults who performed poorly before drug treatment were not able to adequately signal reward expectation in the brain – this was corrected by L-DOPA and their performance improved on the drug.
Dr John Williams, Head of Neuroscience and Mental Health at the Wellcome Trust, said: “This careful investigation into the subtle cognitive changes that take place as we age offers important insights into what may happen at both a functional and anatomical level in older people who have problems with making decisions. That the team were able to reverse these changes by manipulating dopamine levels offers the hope of therapeutic approaches that could allow older people to function more effectively in the wider community.”
The strongest predictor of whether a man is developing dementia with Lewy bodies — the second most common form of dementia in the elderly — is whether he acts out his dreams while sleeping, Mayo Clinic researchers have discovered. Patients are five times more likely to have dementia with Lewy bodies if they experience a condition known as rapid eye movement (REM) sleep behavior disorder than if they have one of the risk factors now used to make a diagnosis, such as fluctuating cognition or hallucinations, the study found.
The findings were being presented at the annual meeting of the American Academy of Neurology in San Diego. REM sleep behavior disorder is caused by loss of the normal muscle paralysis that occurs during REM sleep. It can appear three decades or more before a diagnosis of dementia with Lewy bodies is made in males, the researchers say. The link between dementia with Lewy bodies and the sleep disorder is not as strong in women, they add.
"While it is, of course, true that not everyone who has this sleep disorder develops dementia with Lewy bodies, as many as 75 to 80 percent of men with dementia with Lewy bodies in our Mayo database did experience REM sleep behavior disorder. So it is a very powerful marker for the disease," says lead investigator Melissa Murray, Ph.D., a neuroscientist at Mayo Clinic in Florida.
The study’s findings could improve diagnosis of this dementia, which can lead to beneficial treatment, Dr. Murray says.
"Screening for the sleep disorder in a patient with dementia could help clinicians diagnose either dementia with Lewy bodies or Alzheimer’s disease," she says. "It can sometimes be very difficult to tell the difference between these two dementias, especially in the early stages, but we have found that only 2 to 3 percent of patients with Alzheimer’s disease have a history of this sleep disorder."
Once the diagnosis of dementia with Lewy bodies is made, patients can be given drugs that treat the cognitive symptoms, Dr. Murray says. No cure is currently available.
Researchers at Mayo Clinic in Minnesota and Florida, led by Dr. Murray, examined magnetic resonance imaging, or MRI, scans of the brains of 75 patients diagnosed with probable dementia with Lewy bodies. A low-to-high likelihood of the dementia was assigned on the basis of an autopsy examination of the brain.
The researchers checked the patients’ histories to see if the sleep disorder had been diagnosed while they were under Mayo care. Using these data and the brain scans, they found that a documented diagnosis of the sleep disorder matched a definite diagnosis of dementia with Lewy bodies five times more often than did the risk factors, such as loss of brain volume, now used to aid in the diagnosis. The researchers also showed that low-probability dementia with Lewy bodies patients who did not have the sleep disorder had findings characteristic of Alzheimer’s disease.
"When there is greater certainty in the diagnosis, we can treat patients accordingly. Dementia with Lewy bodies patients who lack Alzheimer’s-like atrophy on an MRI scan are more likely to respond to therapy — certain classes of drugs — than those who have some Alzheimer’s pathology," Dr. Murray says.