Long-term spinal cord stimulation stalls symptoms of Parkinson’s-like disease
Researchers at Duke Medicine have shown that continuing spinal cord stimulation appears to produce improvements in symptoms of Parkinson’s disease, and may protect critical neurons from injury or deterioration.
The study, performed in rats, is published online Jan. 23, 2014, in the journal Scientific Reports. It builds on earlier findings from the Duke team that stimulating the spinal cord with electrical signals temporarily eased symptoms of the neurological disorder in rodents.
"Finding novel treatments that address both the symptoms and progressive nature of Parkinson’s disease is a major priority," said the study’s senior author Miguel Nicolelis, M.D., Ph.D., professor of neurobiology at Duke University School of Medicine. "We need options that are safe, affordable, effective and can last a long time. Spinal cord stimulation has the potential to do this for people with Parkinson’s disease."
Parkinson’s disease, which affects movement, muscle control and balance, is caused by the progressive loss of neurons that produce dopamine, an essential signaling molecule in the brain.
L-dopa, the standard drug treatment for Parkinson’s disease, works by replacing dopamine. While L-dopa helps many people, it can cause side effects and lose its effectiveness over time. Deep brain stimulation, which emits electrical signals from an implant in the brain, has emerged as another valuable therapy, but less than 5 percent of those with Parkinson’s disease qualify for this treatment.
"Even though deep brain stimulation can be very successful, the number of patients who can take advantage of this therapy is small, in part because of the invasiveness of the procedure," Nicolelis said.
In 2009, Nicolelis and his colleagues reported in the journal Science that they developed a device for rodents that sends electrical stimulation to the dorsal column, a main sensory pathway in the spinal cord carrying information from the body to the brain. The device was attached to the surface of the spinal cord in rodents with depleted levels of dopamine, mimicking the biologic characteristics of someone with Parkinson’s disease. When the stimulation was turned on, the animals’ slow, stiff movements were replaced with the active behaviors of healthy mice and rats.
Because research on spinal cord stimulation in animals has been limited to the stimulation’s acute effects, in the current study Nicolelis and his colleagues investigated the long-term effects of the treatment in rats with Parkinson’s-like disease.
For six weeks, the researchers applied electrical stimulation to a particular location in the dorsal column of the rats’ spinal cords twice a week for 30-minute sessions. They observed a significant improvement in the rats’ symptoms, including improved motor skills and a reversal of severe weight loss.
In addition to the recovery in clinical symptoms, the stimulation was associated with better survival of neurons and a higher density of dopaminergic innervation in two brain regions controlling movement – the loss of which causes Parkinson’s disease in humans. The findings suggest that the treatment protects against the loss or damage of neurons.
Clinicians are currently using a similar application of dorsal column stimulation to manage certain chronic pain syndromes in humans. Electrodes implanted over the spinal cord are connected to a portable generator, which produces electrical signals that create a tingling sensation to relieve pain. Studies in a small number of humans worldwide have shown that dorsal column stimulation may also be effective in restoring motor function in people with Parkinson’s disease.
"This is still a limited number of cases, so studies like ours are important in examining the basic science behind the treatment and the potential mechanisms of why it is effective," Nicolelis said.
The researchers are continuing to investigate how spinal cord stimulation works, and are beginning to explore using the technology in other neurological motor disorders.
Filed under spinal cord parkinson's disease spinal cord stimulation dopamine neurons neuroscience science
Alzheimer’s drugs fail, but lessons are learned
After the failure of two novel drugs using antibodies to fight the buildup of brain plaque in Alzheimer’s patients, scientists said on Wednesday they have learned lessons for the future.
The biologic drugs solanezumab, by pharmaceutical giant Eli Lilly, and bapineuzumab, by Johnson & Johnson, made it to phase III trials and were taken by thousands of patients, according to a full report on the research published in the New England Journal of Medicine.
Filed under alzheimer's disease dementia solanezumab bapineuzumab drug trials medicine science
Scientists have identified a channel present in many pain detecting sensory neurons that acts as a ‘brake’, limiting spontaneous pain. It is hoped that the new research, published today [22 January] in the Journal of Neuroscience, will ultimately contribute to new pain relief treatments.
Spontaneous pain is ongoing pathological pain that occurs constantly (slow burning pain) or intermittently (sharp shooting pain) without any obvious immediate cause or trigger. The slow burning pain is the cause of much suffering and debilitation. Because the mechanisms underlying this type of slow burning pain are poorly understood, it remains very difficult to treat effectively.
Spontaneous pain of peripheral origin is pathological, and is associated with many types of disease, inflammation or damage of tissues, organs or nerves (neuropathic pain). Examples of neuropathic pain are nerve injury/crush, post-operative pain, and painful diabetic neuropathy.
Previous research has shown that this spontaneous burning pain is caused by continuous activity in small sensory nerve fibers, known as C-fiber nociceptors (pain neurons). Greater activity translates into greater pain, but what causes or limits this activity remained poorly understood.
Now, new research from the University of Bristol has identified a particular ion channel present exclusively in these C-fiber nociceptors. This ion channel, known as TREK2, is present in the membranes of these neurons, and the researchers showed that it provides natural, innate protection against this pain.
Ion channels are specialised proteins that are selectively permeable to particular ions. They form pores through the neuronal membrane. Leak potassium channels are unusual, in that they are open most of the time allowing positive potassium ions (K+) to leak out of the cell. This K+ leakage is the main cause of the negative membrane potentials in all neurons. TREK2 is one of these leak potassium channels. Importantly, the C-nociceptors that express TREK2 have much more negative membrane potentials than those that do not.
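The link between a K+ leak and a negative resting potential can be made concrete with the Nernst equation. The sketch below is illustrative only: the concentrations are textbook values for mammalian neurons, not measurements from this study.

```python
import math

def nernst_potential_mv(c_out_mm, c_in_mm, z=1, temp_c=37.0):
    """Equilibrium (Nernst) potential, in millivolts, for an ion of
    valence z, given its outside/inside concentrations in mM."""
    R = 8.314     # gas constant, J/(mol*K)
    F = 96485.0   # Faraday constant, C/mol
    T = temp_c + 273.15
    return 1000.0 * (R * T) / (z * F) * math.log(c_out_mm / c_in_mm)

# Textbook K+ concentrations: ~5 mM outside the cell, ~140 mM inside.
e_k = nernst_potential_mv(5.0, 140.0)
print(round(e_k))  # about -89 mV
```

The strongly negative result shows why a membrane dominated by open K+ leak channels, such as TREK2, settles at a more negative potential than one where such channels are scarce.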
Researchers showed that when TREK2 was removed from the proximity of the cell membrane, the potential in those neurons became less negative. In addition, when the neuron was prevented from synthesizing the TREK2, the membrane potential also became less negative.
They also found that spontaneous pain associated with skin inflammation, was increased by reducing the levels of synthesis of TREK2 in these C-fiber neurons.
They concluded that in these C-fiber nociceptors, TREK2 keeps the membrane potential more negative and more stable, reducing firing and thus limiting the amount of spontaneous burning pain.
Professor Sally Lawson, from the School of Physiology and Pharmacology at Bristol University, explained: “It became evident that TREK2 kept the C-fiber nociceptor membrane at a more negative potential. Despite the difficulties inherent in the study of spontaneous pain, and the lack of any drugs that can selectively block or activate TREK2, we demonstrated that TREK2 in C-fiber nociceptors is important for stabilizing their membrane potential and decreasing the likelihood of firing. It became apparent that TREK2 was thus likely to act as a natural innate protection against pain. Our data supported this, indicating that in chronic pain states, TREK2 is acting as a brake on the level of spontaneous pain.”
Dr Cristian Acosta, the first author on the paper and now working at the Institute of Histology and Embryology of Mendoza in Argentina, said: “Given the role of TREK2 in protecting against spontaneous pain, it is important to advance our understanding of the regulatory mechanisms controlling its expression and trafficking in these C-fiber nociceptors. We hope that this research will enable the development of methods of enhancing the actions of TREK2 that could, some years hence, provide relief for sufferers of ongoing spontaneous burning pain.”
(Source: eurekalert.org)
Filed under pain sensory neurons ion channels c-fiber nociceptors TREK2 neuroscience science
Using a simple study of eye movements, Johns Hopkins scientists report evidence that people who are less patient tend to move their eyes with greater speed. The findings, the researchers say, suggest that the weight people give to the passage of time may be a trait consistently used throughout their brains, affecting the speed with which they make movements, as well as the way they make certain decisions.

Caption: Despite claims to the contrary, the eyes of the Mona Lisa do not make saccades. Credit: Leonardo da Vinci
In a summary of the research to be published Jan. 21 in The Journal of Neuroscience, the investigators note that a better understanding of how the human brain evaluates time when making decisions might also shed light on why malfunctions in certain areas of the brain make decision-making harder for those with neurological disorders like schizophrenia, or for those who have experienced brain injuries.
Principal investigator Reza Shadmehr, Ph.D., professor of biomedical engineering and neuroscience at The Johns Hopkins University, and his team set out to understand why some people are willing to wait and others aren’t. “When I go to the pharmacy and see a long line, how do I decide how long I’m willing to stand there?” he asks. “Are those who walk away and never enter the line also the ones who tend to talk fast and walk fast, perhaps because of the way they value time in relation to rewards?”
To address the question, the Shadmehr team used very simple eye movements, known as saccades, to stand in for other bodily movements. Saccades are the motions that our eyes make as we focus on one thing and then another. “They are probably the fastest movements of the body,” says Shadmehr. “They occur in just milliseconds.” Human saccades are fastest when we are teenagers and slow down as we age, he adds.
In earlier work, using a mathematical theory, Shadmehr and colleagues had shown that, in principle, the speed at which people move could be a reflection of the way the brain calculates the passage of time to reduce the value of a reward. In the current study, the team wanted to test the idea that differences in how subjects moved were a reflection of differences in how they evaluated time and reward.
For the study, the team first asked healthy volunteers to look at a screen upon which dots would appear one at a time –– first on one side of the screen, then on the other, then back again. A camera recorded their saccades as they looked from one dot to the other. The researchers found a lot of variability in saccade speed among individuals but very little variation within individuals, even when tested at different times and on different days. Shadmehr and his team concluded that saccade speed appears to be an attribute that varies from person to person. “Some people simply make fast saccades,” he says.
To determine whether saccade speed correlated with decision-making and impulsivity, the volunteers were told to watch the screen again. This time, they were given visual commands to look to the right or to the left. When they responded incorrectly, a buzzer sounded.
After becoming accustomed to that part of the test, they were forewarned that during the following round of testing, if they followed the command right away, they would be wrong 25 percent of the time. In those instances, after an undetermined amount of time, the first command would be replaced by a second command to look in the opposite direction.
To pinpoint exactly how long each volunteer was willing to wait to improve his or her accuracy on that phase of the test, the researchers modified the length of time between the two commands based on a volunteer’s previous decision. For example, if a volunteer chose to wait until the second command, the researchers increased the time they had to wait each consecutive time until they determined the maximum time the volunteer was willing to wait — only 1.5 seconds for the most patient volunteer. If a volunteer chose to act immediately, the researchers decreased the wait time to find the minimum time the volunteer was willing to wait to improve his or her accuracy.
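The adaptive procedure described above resembles a simple staircase: each choice to wait lengthens the next delay, while each immediate response shortens it. A hypothetical sketch (the step size, starting delay, and floor are invented for illustration, not taken from the study):

```python
def staircase(decisions, start=0.5, step=0.1, floor=0.05):
    """Toy adaptive staircase for the delay between the two commands.

    decisions: True if the volunteer waited for the possible second
    command on that trial, False if they responded immediately.
    Waiting lengthens the next delay; acting at once shortens it.
    Returns the delay (in seconds) used on each trial.
    """
    delay = start
    history = [delay]
    for waited in decisions:
        delay = delay + step if waited else max(floor, delay - step)
        history.append(delay)
    return history

# A volunteer who waits three times, then responds immediately once:
print([round(d, 1) for d in staircase([True, True, True, False])])
# [0.5, 0.6, 0.7, 0.8, 0.7]
```

Iterating this converges on the longest delay the volunteer tolerates, which is the quantity the researchers used as the measure of patience.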
When the speed of the volunteers’ saccades was compared to their impulsivity during the patience test, there was a strong correlation. “It seems that people who make quick movements, at least eye movements, tend to be less willing to wait,” says Shadmehr. “Our hypothesis is that there may be a fundamental link between the way the nervous system evaluates time and reward in controlling movements and in making decisions. After all, the decision to move is motivated by a desire to improve one’s situation, which is a strong motivating factor in more complex decision-making, too.”
(Source: eurekalert.org)
Filed under eye movements saccades decision making patience psychology neuroscience science
The brain’s RAM: Rats, like humans, have a “working memory”
Thousands of times a day, the brain stores sensory information for very short periods of time in a working memory, to be able to use it later. A research study carried out with the collaboration of SISSA has shown, for the first time, that this function also exists in the brain of rodents, a finding that sheds light on the evolutionary origins of this cognitive mechanism.
In computers it’s called “RAM”, but the mechanism is conceptually similar to what scientists call a “working memory” in the brain of humans and primates: when we interact with the environment our senses gather information that a temporary memory system keeps fresh and readily accessible for a few minutes, so that the body can carry out operations (for example, an action). For the first time, a research team coordinated by Mathew Diamond of the International School for Advanced Studies (SISSA) in Trieste has shown that this memory system also exists in simpler mammals like rodents.
Working memory has been studied in detail in humans and primates, but little was known about its existence in other animals. “Knowing that a working memory also exists in the brain of evolutionarily simpler organisms helps us to understand the origins of this important cognitive mechanism”, explains Diamond. “Comparative psychology studies have historically helped scientists not only to trace the evolutionary roots of human brain functions but also to gain deeper insight into human cognitive processes themselves”.
The type of sensory memory studied by Diamond and co-workers in rats is tactile memory. The performance of rodents in tasks assessing recognition of vibratory stimuli was compared with that of humans performing similar tasks (rats used their whiskers and humans their fingertips). “Rats exhibited similar behaviour patterns to humans, demonstrating that these animals use a tactile working memory that enables them to recognise and interact with environmental stimuli”. The research paper has been published in The Proceedings of the National Academy of Sciences (PNAS).
In more detail…
“Working memory is an extraordinary cognitive mechanism”, comments Diamond. “It’s like a container where the brain stores little bits of very recent experience, to be able to assess the best course of action. Without this temporary memory, experience would slip away without any chance of being used”.
“Working memory can hold only a limited amount of information for a fairly short period of time. These limits are the result of a cost-benefit balance: the brain’s computational capacity is fixed, and decisions as to what action to take often need to be quick and effective at the same time. Our working memory’s capacity is therefore the best we can achieve in terms of accuracy and speed with our brain”.
“The brain regions responsible for working memory have not yet been identified in rats. Some believe that rats lack the brain centres known as the “prefrontal cortex”, which are involved in this function in primates”, continues Diamond. “To our surprise, we discovered that rodents implement working memory in a manner similar to humans. Now we are continuing our studies to understand how these mechanisms work in detail”.
Filed under working memory tactile memory rodents prefrontal cortex neuroscience science
A new brain-imaging technique enables people to ‘watch’ their own brain activity in real time and to control or adjust function in pre-determined brain regions. The study from the Montreal Neurological Institute and Hospital – The Neuro, McGill University and the McGill University Health Centre, published in NeuroImage, is the first to demonstrate that magnetoencephalography (MEG) can be used as a potential therapeutic tool to control and train specific targeted brain regions. This advanced brain-imaging technology has important clinical applications for numerous neurological and neuropsychiatric conditions.

MEG is a non-invasive imaging technology that measures magnetic fields generated by nerve cell circuits in the brain. MEG captures these tiny magnetic fields with remarkable accuracy and has unrivaled time resolution - a millisecond time scale across the entire brain. “This means you can observe your own brain activity as it happens,” says Dr. Sylvain Baillet, acting Director of the Brain Imaging Centre at The Neuro and lead investigator on the study. “We can use MEG for neurofeedback – a process by which people can see on-going physiological information that they aren’t usually aware of, in this case, their own brain activity, and use that information to train themselves to self-regulate. Our ultimate hope and aim is to enable patients to train specific regions of their own brain, in a way that relates to their particular condition. For example neurofeedback can be used by people with epilepsy so that they could train to modify brain activity in order to avoid a seizure.”
In this proof of concept study, participants had nine sessions in the MEG and used neurofeedback to reach a specific target. The target was to look at a coloured disc on a display screen and find their own strategy to change the disc’s colour from dark red to bright yellow white, and to maintain that bright colour for as long as possible. The disc colour was indexed on a very specific aspect of their ongoing brain activity: the researchers had set it up so that the experiment was accessing predefined regions of the motor cortex in the participants’ brain. The colour presented was changing according to a predefined combination of slow and faster brain activity within these regions. This was possible because the researchers combined MEG with MRI, which provides information on the brain’s structures, known as magnetic source imaging (MSI).
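The feedback display maps an index of ongoing brain activity onto the disc’s colour. The actual index was a predefined combination of slow and faster activity in motor-cortex regions; the sketch below simply assumes a single activity value already normalised to [0, 1] and interpolates linearly between the two endpoint colours, both of which are assumptions for illustration.

```python
def disc_color(activity):
    """Map a normalised activity index in [0, 1] to an (R, G, B) colour,
    interpolating from dark red (low) to a bright yellow-white (high).
    The linear mapping and endpoint colours are illustrative assumptions,
    not the mapping used in the study."""
    a = min(max(activity, 0.0), 1.0)  # clamp out-of-range values
    dark_red = (139, 0, 0)
    bright = (255, 255, 224)
    return tuple(round(lo + a * (hi - lo)) for lo, hi in zip(dark_red, bright))

print(disc_color(0.0))  # (139, 0, 0): far from the target
print(disc_color(1.0))  # (255, 255, 224): target reached
```

In a real neurofeedback loop this function would be called on every update of the measured activity, so the participant sees the colour shift within milliseconds of a change in their own brain state.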
“The remarkable thing is that with each training session, the participants were able to reach the target aim faster, even though we were raising the bar for the target objective in each session, the way you raise the bar each time in a high jump competition. These results showed that participants were successfully using neurofeedback to alter their pattern of brain activity according to a predefined objective in specific regions of their brain’s motor cortex, without moving any body part. This demonstrates that MEG source imaging can provide brain region-specific real time neurofeedback and that longitudinal neurofeedback training is possible with this technique.”
These findings pave the way for MEG as an innovative therapeutic approach for treating patients. To date, work with epilepsy patients has shown the most promise but there is great potential to use MEG to investigate other neurological syndromes and neuropsychiatric disorders (e.g., stroke, dementia, movement disorders, chronic depression, etc). MEG has potential to reveal dynamics of brain activity involved in perception, cognition and behaviour: it has provided unique insight on brain functions (language, motor control, visual and auditory perception, etc.) and dysfunctions (movement disorders, tinnitus, chronic pain, dementia, etc.).
Dr. Baillet and his team are currently collaborating with Prof. Isabelle Peretz at Université de Montréal to use this technique with people who have amusia, a disorder that makes them unable to process musical pitch. It is hypothesized that amusia results from poor connectivity between the auditory cortex and prefrontal regions of the brain. In an ongoing study, the team is measuring the intensity of functional connectivity between these brain regions in amusic patients and age-matched healthy controls. Using MEG neurofeedback, they hope to take advantage of the brain’s plasticity to reinforce the functional connectivity between the target brain regions. If the approach improves pitch discrimination in participants, that will demonstrate the clinical and rehabilitative applications of this approach. The baseline measurements have already been taken, and the training sessions will take place over this year.
(Source: mcgill.ca)
Filed under neurofeedback brain imaging MEG brain activity brain training amusia neuroscience science
Index Detects Early Signs of Deviation from Normal Brain Development
Researchers at Penn Medicine have generated a brain development index from MRI scans that captures the complex patterns of maturation during normal brain development. This index will allow clinicians and researchers for the first time to detect subtle, yet potentially critical, early signs of deviation from normal development from late childhood to early adulthood.
The study, published online in the journal Cerebral Cortex, shows a relationship between cognitive development and physical changes in the developing young brain (aged 8 to 21).
“Our findings suggest that brain imaging via sophisticated MRI scans may be a useful biomarker for the early detection of subtle developmental abnormalities,” said Guray Erus, PhD, a research associate in the department of Radiology at the Perelman School of Medicine at the University of Pennsylvania, and the study’s lead author. “The abnormalities may, in turn, be the first manifestations of subsequent neuropsychiatric problems.”
Among the key findings is the consistency of healthy brain development in young people. The study examined the cognitive performance of outliers – adolescents whose brains developed faster or slower than normal. Early maturers performed significantly better than those with delayed brain development on the speed at which they completed certain tasks. Improved speed of performance indicates increased efficiency in neuronal organization and communication. Slower performance on such tests, the research suggests, is a precursor to neuropsychiatric disorders, including adolescent-onset psychosis.
The 14 tests used in the Penn study evaluate a broad range of cognitive functions including abstraction and mental flexibility, attention, working memory, verbal memory, face memory, spatial memory, language reasoning, nonverbal reasoning, spatial processing, emotion identification, and sensorimotor speed.
Penn’s brain development index consolidates a number of complex visual maps derived from sophisticated analysis of MRI scans into a unified developmental template. By looking at an individual’s brain maps in relation to the consolidated findings, researchers can estimate the age of the subject. Subjects whose brain development index was higher than their chronological age had significantly superior cognitive processing speed as measured by the cognitive tests compared to subjects whose brain indices were lower than their actual age.
“This is analogous to producing growth charts used in pediatrics to screen for gross abnormalities of physical development,” said Christos Davatzikos, PhD, professor of Radiology and Electrical and Systems Engineering at Penn and one of the study’s co-senior authors. “We can assess individuals in terms of where they place in relation to the overall trends. While single image maps can be used for an accurate estimation of the age of the subject, the combination of all maps achieves a higher accuracy in age prediction than the accuracy of each map independently.”
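The claim that combining all maps predicts age better than any single map follows the familiar logic of averaging noisy estimators. A toy simulation, assuming each “map” errs independently (an assumption made here for illustration, not a property reported in the study):

```python
import random

random.seed(0)  # deterministic toy run

TRUE_AGE = 15.0

def map_estimate(noise_sd=3.0):
    """One hypothetical 'map' predicts age with independent Gaussian noise."""
    return TRUE_AGE + random.gauss(0.0, noise_sd)

# Error of a single map vs. the average of 10 maps, over 1000 simulated subjects.
single = [abs(map_estimate() - TRUE_AGE) for _ in range(1000)]
combined = [abs(sum(map_estimate() for _ in range(10)) / 10 - TRUE_AGE)
            for _ in range(1000)]

print(sum(single) / 1000 > sum(combined) / 1000)  # True: averaging shrinks error
```

To the extent that the individual maps carry partly independent information, pooling them cancels noise, which is one plausible reading of why the combined index predicts age more accurately than any map alone.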
Previous studies have outlined normative trajectories of growth for individual brain regions across the lifespan; the Penn study is the first to present a comprehensive index for the entire brain during late childhood, adolescence, and young adulthood — periods when the healthy human brain matures in a remarkably consistent way. Deviations from this pattern may signify later neuropsychiatric problems.
The Penn study used a sample of 621 participants in the Philadelphia Neurodevelopmental Cohort, a Grand Opportunity study funded by the National Institute of Mental Health, designed to understand how brain maturation mediates cognitive development and vulnerability to psychiatric illness and how genetics impacts this process.
“All of our young study participants have received a standardized neuropsychiatric evaluation at intake, and all agreed to be contacted for future studies. Some are followed up longitudinally,” said Ruben C. Gur, PhD, director of the Brain Behavior Laboratory at Penn and the study’s other co-senior author. “We can therefore follow those who score low on our index and examine whether interventions such as cognitive remediation can mitigate potential symptoms.”
Filed under brain development maturation cognitive development cognitive function brain imaging neuroscience science
Researchers discover an epigenetic lesion in the hippocampus of Alzheimer’s patients
Alzheimer’s disease could reach epidemic proportions in the coming decades as the average age of society increases. Two key issues surround Alzheimer’s disease: there is currently no effective treatment, and very few associated genetic changes (mutations) have been described, which reduces the number of targets for future therapies.
Alzheimer’s disease
Pathologically, Alzheimer’s disease is characterized by the accumulation of protein deposits in the brains of patients. These deposits are formed by plaques of a protein called amyloid-beta and by tangles of tau protein. The root cause of these lesions is in most cases unknown, but specific alterations in the regulation of gene expression might be involved.
Today, the journal Hippocampus publishes an article led by Manel Esteller, Director of Epigenetics and Cancer Biology at the Bellvitge Biomedical Research Institute (IDIBELL), ICREA researcher and Professor of Genetics at the University of Barcelona, carried out in collaboration with the IDIBELL Institute of Neuropathology led by Isidre Ferrer, demonstrating for the first time the existence of an epigenetic lesion in the hippocampus of patients with Alzheimer’s disease.
Switches in the hippocampus
"We started by studying 30,000 molecular switches that turn genes on and off in the hippocampal region of the brains of Alzheimer’s patients at different stages of the disease, and compared them with those of healthy individuals of the same age. We noticed that the DUSP22 gene switches off (becomes methylated) as the disease advances," explained Manel Esteller, director of the study.
"But more important," he continues, "was the discovery that this gene regulates the tau protein. Perhaps, therefore, the accumulation of tau protein in the brains of Alzheimer’s patients results from the epigenetic inactivation of DUSP22."
According to Esteller, “the finding is relevant not only for determining the causes of the disease, but also for testing potential future treatments that act on these epigenetic molecular switches”.
Filed under alzheimer's disease hippocampus epigenetic lesion dusp22 tau protein neuroscience science
Forget about forgetting – The elderly know more and use it better
What happens to our cognitive abilities as we age? If you think our brains go into a steady decline, research reported this week in the journal Topics in Cognitive Science may make you think again. The work, headed by Dr. Michael Ramscar of Tübingen University, takes a critical look at the measures usually thought to show that our cognitive abilities decline across adulthood. Instead of finding evidence of decline, the team discovered that most standard cognitive measures, which date back to the early twentieth century, are flawed. “The human brain works slower in old age,” says Ramscar, “but only because we have stored more information over time.”
Computers were trained, like humans, to read a certain amount each day, and to learn new things. When the researchers let a computer “read” only so much, its performance on cognitive tests resembled that of a young adult. But if the same computer was exposed to the experiences we might encounter over a lifetime – with reading simulated over decades – its performance now looked like that of an older adult. Often it was slower, but not because its processing capacity had declined. Rather, increased “experience” had caused the computer’s database to grow, giving it more data to process – which takes time.
Technology now allows researchers to make quantitative estimates of the number of words an adult can be expected to learn across a lifetime, enabling the Tübingen team to separate the challenge that increasing knowledge poses to memory from the actual performance of memory itself. “Imagine someone who knows two people’s birthdays and can recall them almost perfectly. Would you really want to say that person has a better memory than a person who knows the birthdays of 2000 people, but can ‘only’ match the right person to the right birthday nine times out of ten?” asks Ramscar.
The answer appears to be “no.” When Ramscar’s team trained their computer models on huge linguistic datasets, they found that standardized vocabulary tests, which are used to take account of the growth of knowledge in studies of ageing, massively underestimate the size of adult vocabularies. It takes computers longer to search databases of words as their sizes grow, which is hardly surprising but may have important implications for our understanding of age-related slowdowns. The researchers found that to get their computers to replicate human performance in word recognition tests across adulthood, they had to keep their capacities the same. “Forget about forgetting,” explained Tübingen researcher Peter Hendrix, “if I wanted to get the computer to look like an older adult, I had to keep all the words it learned in memory and let them compete for attention.”
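The effect the Tübingen team modelled (retrieval slowing as the lexicon grows, with no loss of capacity) can be mimicked with a deliberately naive linear scan. This is not their model; they trained learning models on large corpora. It only illustrates that a bigger store costs more search work even when nothing is forgotten:

```python
def comparisons_to_find(vocab, target):
    """Naive linear scan: number of comparisons until the target is found."""
    for i, word in enumerate(vocab, start=1):
        if word == target:
            return i
    return len(vocab)

# Toy lexicons: the sizes are invented, not estimates from the study.
young = [f"word{i}" for i in range(2_000)]
older = [f"word{i}" for i in range(20_000)]

def mean_lookup_cost(vocab):
    probes = vocab[::100]  # sample every 100th word as a lookup target
    return sum(comparisons_to_find(vocab, w) for w in probes) / len(probes)

ratio = mean_lookup_cost(older) / mean_lookup_cost(young)
print(round(ratio, 1))  # roughly 10x more work for the 10x larger lexicon
```

Every word in both lexicons is retrievable with perfect accuracy; the only difference is how long retrieval takes, which is the distinction between memory capacity and the challenge posed by accumulated knowledge.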
The research shows that studies of the problems older people have with recalling names suffer from a similar blind spot: there is a far greater variety of given names today than there were two generations ago. This cultural shift toward greater name diversity means the number of different names anyone learns over their lifetime has increased dramatically. The work shows how this makes locating a name in memory far harder than it used to be. Even for computers.
Ramscar and his colleagues’ work provides more than an explanation of why, in the light of all the extra information they have to process, we might expect older brains to seem slower and more forgetful than younger brains. Their work also shows how changes in test performance that have been taken as evidence for declining cognitive abilities in fact demonstrates older adults’ greater mastery of the knowledge they have acquired.
Take “paired-associate learning,” a commonly used cognitive test that involves learning to connect words like “up” to “down” or “necktie” to “cracker” in memory. Using Big Data sets to quantify how often different words appear together in English, the Tübingen team shows that younger adults do better when asked to learn to pair “up” with “down” than “necktie” and “cracker” because “up” and “down” appear in close proximity to one another more frequently. However, older adults also understand which words don’t usually go together, something young adults notice less. When the researchers examined performance on this test across a range of word pairs that go together more and less in English, they found older adults’ scores to be far more closely attuned to the actual information in hundreds of millions of words of English than those of their younger counterparts.
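Quantifying how often two words “go together” typically comes down to counting co-occurrences within a window of text. A minimal sketch of that idea on a toy corpus (the actual study used corpora of hundreds of millions of words; the window size and corpus here are illustrative assumptions):

```python
from collections import Counter

def cooccurrence_counts(tokens, window=2):
    """Count unordered word pairs that appear within `window` tokens of each other."""
    counts = Counter()
    for i, word in enumerate(tokens):
        for j in range(i + 1, min(i + 1 + window, len(tokens))):
            counts[frozenset((word, tokens[j]))] += 1
    return counts

corpus = "up and down the stairs then up and down again".split()
counts = cooccurrence_counts(corpus)
print(counts[frozenset(("up", "down"))])  # 2: "up" and "down" co-occur twice
```

Pairs like “up”/“down” accumulate high counts in real text, while pairs like “necktie”/“cracker” rarely or never co-occur; on this view, an experienced language user's difficulty in learning the latter reflects well-calibrated statistics, not decay.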
As Prof. Harald Baayen, who heads the Alexander von Humboldt Quantitative Linguistics research group where the work was carried out, puts it, “If you think linguistic skill involves something like being able to choose one word given another, younger adults seem to do better in this task. But, of course, proper understanding of language involves more than this. You also have to not put plausible but wrong pairs of words together. The fact that older adults find nonsense pairs – but not connected pairs – harder to learn than young adults simply demonstrates older adults’ much better understanding of language. They have to make more of an effort to learn unrelated word pairs because, unlike the youngsters, they know a lot about which words don’t belong together.”
The Tübingen researchers conclude that we need different tests for the cognitive abilities of older people – tests that take into account the nature and amount of information our brains process. “The brains of older people do not get weak,” says Michael Ramscar. “On the contrary, they simply know more.”
Scientists from the Montreal Neurological Institute and Hospital in Canada have discovered that two genes linked to hereditary Parkinson’s disease are involved in the early-stage quality control of mitochondria. The protective mechanism, which is reported in The EMBO Journal, removes damaged proteins that arise from oxidative stress from mitochondria.
“PINK1 and parkin are implicated in selectively targeting dysfunctional components of mitochondria to the lysosome under conditions of excessive oxidative damage within the organelle,” said Edward Fon, Professor at the McGill Parkinson Program at the Montreal Neurological Institute and Hospital. “Our study reveals a quality control mechanism where vesicles bud off from mitochondria and proceed to the lysosome for degradation. This pathway is distinct from the degradation pathway for damaged whole mitochondria, which has been known for some time. It is also an early response, proceeding on a timescale of hours instead of days.”
The deterioration of mechanisms designed to maintain the integrity and function of mitochondria throughout the lifetime of a cell has been suggested to underlie the progression of several neurodegenerative diseases, including Parkinson’s disease. When mitochondria, the “power plants” of the cell that provide energy, malfunction, they can contribute to Parkinson’s disease. If they are to survive and function, mitochondria need to degrade oxidized and damaged proteins.
In the study, immunofluorescence and confocal microscopy were used to observe how the vesicles “pinch off” from mitochondria with their damaged cargo. “Our conclusion is that the loss of this PINK1 and parkin-dependent trafficking system impairs the ability of mitochondria to selectively degrade oxidized and damaged proteins and leads, over time, to the mitochondrial dysfunction noted in hereditary Parkinson’s disease,” said Heidi McBride, Professor in the Neuromuscular Group in the Department of Neurology and Neurosurgery at the Montreal Neurological Institute and Hospital.
Both salvage pathways are operational in the cell. If the vesicular pathway, the first line of defense, is overwhelmed and the damage is irreversible then the entire organelle is targeted for degradation.
(Source: embo.org)