Neuroscience

Articles and news from the latest research reports.


Deep Brain Stimulation Improves Non-Motor Symptoms in Parkinson’s Disease as well as Motor Symptoms
Deep brain stimulation (DBS) has become a well-recognized non-pharmacologic treatment that improves motor symptoms of patients with early and advanced Parkinson’s disease. Evidence now indicates that DBS can decrease the number and severity of non-motor symptoms of patients with Parkinson’s disease (PD) as well, according to a review published in the Journal of Parkinson’s Disease.
“Non-motor features are common in PD patients, occur across all disease stages, and while well described, are still under-recognized when considering their huge impact on patients’ quality of life,” says Lisa Klingelhoefer, MD, a fellow at the National Parkinson Foundation International Centre of Excellence, Department of Neurology, King’s College Hospital and King’s College, London.
For example, DBS of the subthalamic nucleus (STN) is effective for alleviating sleep problems and fatigue associated with PD, producing noticeable long-term improvements in sleep efficiency and the quality and duration of continuous sleep. DBS also decreases nighttime and early morning dystonia and improves nighttime mobility. “DBS can contribute to better sleep, less daytime somnolence, improved mobility, and less need for dopamine replacement therapy,” says Dr. Klingelhoefer.
The effects of DBS on some other non-motor symptoms of PD are less clear-cut, and transient worsening of neuropsychological and psychiatric symptoms has been reported. For instance, behavioral disorders such as impulsivity (e.g., hypersexuality, pathological gambling, and excessive eating) can occur or worsen in PD patients after STN DBS. While pre-existing drug-induced psychotic symptoms like hallucinations often disappear after STN DBS, transient psychotic symptoms such as delirium may emerge in the immediate post-operative period. Similarly, conflicting reports have found that STN DBS improves, worsens, or does not change mood disorders such as depression, mania, or anxiety.
“Further work is required in order to fully understand the mechanisms and impact of DBS of the STN or other brain structures on the non-motor symptoms of PD,” concludes Dr. Klingelhoefer. She suggests that in the future, non-motor symptoms of PD may become an additional primary indication for DBS.
PD is the second most common neurodegenerative disorder in the United States, affecting approximately one million Americans and five million people worldwide. Its prevalence is projected to double by 2030. The most characteristic symptoms are movement-related, such as involuntary shaking and muscle stiffness. Non-motor symptoms, such as worsening depression, anxiety, olfactory dysfunction, sweating, bladder and bowel dysfunction, and sleep disturbances, can appear prior to the onset of motor symptoms.



3-D Computer Model May Help Refine Target for Deep Brain Stimulation Therapy for Dystonia

Although deep brain stimulation can be an effective therapy for dystonia – a potentially crippling movement disorder – the treatment isn’t always effective, or benefits may not be immediate. Precise placement of DBS electrodes is one of several factors that can affect results, but few studies have attempted to identify the “sweet spot,” where electrode placement yields the best results.


Researchers led by investigators at Cedars-Sinai, using a complex set of data from records and imaging scans of patients who have undergone successful DBS implantation, have created 3-D, computerized models that map the brain region involved in dystonia. The models identify an anatomical target for further study and provide information for neurologists and neurosurgeons to consider when planning surgery and making device programming decisions.

“We know DBS works as a treatment for dystonia, but we don’t know exactly how it works or why some patients have better, quicker results than others. Patient age, disease duration and other underlying factors have a role, and we believe electrode positioning and device programming are critical, but there is no consensus on ideal device placement and optimal programming strategies,” said Michele Tagliati, MD, director of the Movement Disorders Program in the Department of Neurology at Cedars-Sinai.

“This modeling paves the way for the construction of practical therapeutic and investigational targets,” added Tagliati, senior author of an article now available in the online edition of Annals of Neurology.

Medications usually are the first line of treatment for dystonia and several other movement disorders, but if drugs fail – as frequently happens – or side effects are excessive, neurologists and neurosurgeons may supplement them with deep brain stimulation. Electrical leads are implanted deep in the brain, and a pulse generator is placed near the collarbone. The device is later programmed with a remote, hand-held controller.

To calm the disorganized muscle contractions of dystonia, doctors generally target a brain structure called the globus pallidus, but studies on precise positioning of electrode contacts and the best programming parameters – such as the intensity and frequency of electrical stimulation – are rare and conflicting. Finding the most effective settings can take months of fine-tuning.

In this retrospective study, investigators examined a database of 94 patients with the most common genetic form of dystonia, DYT1, who had been treated with DBS for at least a year. They selected 21 patients who had good responses to treatment, compiled their demographic and treatment information, and used magnetic resonance imaging scans to create 3-D anatomical models with a fine grid to show the exact location of relevant brain structures.

The investigators then simulated the placement of electrodes as they were positioned in the patients’ brains and input the actual stimulation parameters into a computer program – a “volume of tissue activation” model – which calculated detailed information specific to each patient and each electrode. The model draws on principles of neurophysiology – the way nerve cells respond to DBS – the biophysics of voltage distribution from electrodes, and the anatomy of the globus pallidus and surrounding structures.

“We found that clinicians were applying relatively large amounts of energy to wide swaths of the globus pallidus, but the area in common among most individuals was much smaller. We interpret this as being the potential ‘target within the target,’ and if our results are validated in further research and clinical practice, computer modeling may offer a physiologically-based, data-driven, visualized approach to clinical decision-making,” Tagliati said.
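
The "volume of tissue activation" analysis described above can be sketched in a deliberately simplified form. The published model uses detailed biophysics (voltage spread from the electrode, nerve-fiber response); the toy version below stands in for it with a crude assumption, treating activation as a sphere whose radius grows with stimulation amplitude, then intersects the activated voxel sets of several hypothetical patients to find the shared core, mirroring the "target within the target" idea. All positions, amplitudes, and the scaling constant are illustrative, not values from the study.

```python
from itertools import product
from math import sqrt

def activated_voxels(grid, electrode, amplitude, k=1.5):
    """Toy stand-in for a volume-of-tissue-activation model: a voxel counts
    as activated if it lies within a radius that grows with stimulation
    amplitude. The spherical shape and the constant k are illustrative
    assumptions, not the biophysics of the published model."""
    radius = k * sqrt(amplitude)   # mm; heuristic amplitude-to-radius scaling
    return {v for v in grid
            if sqrt(sum((a - b) ** 2 for a, b in zip(v, electrode))) <= radius}

# A coarse 1 mm lattice standing in for the pallidal region.
grid = list(product(range(-5, 6), repeat=3))

# Hypothetical electrode positions (mm) and stimulation amplitudes (V).
patients = [((0, 0, 0), 3.0), ((1, 0, 0), 2.5), ((0, 1, 0), 3.5)]

vtas = [activated_voxels(grid, pos, amp) for pos, amp in patients]
core = set.intersection(*vtas)   # voxels activated in every patient
```

In the study's terms, each entry of `vtas` corresponds to the "wide swath" of tissue one patient's settings stimulate, while `core` plays the role of the much smaller region common to all good responders.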

(Source: newswise.com)



Neuroscientists explain how mutated X-linked mental retardation protein impairs neuronal function

There are new clues about malfunctions in brain cells that contribute to intellectual disability and possibly other developmental brain disorders.


(Image caption: False color image of a mouse hippocampal neuron (cell body is at lower right) with branchlike dendrites that provide surfaces at which projections from other neurons can connect, by forming synapses. Van Aelst and colleagues have shown that when the OPHN1 protein is mutated, interfering with its ability to interact with another protein called Homer1b/c, AMPA receptors don’t recycle to the surface at synapses at the rate they normally do. This adversely impacts synaptic plasticity, the process by which neurons adjust the strength of their connections. Such pathology may play a role in X-linked mental retardation.)

Professor Linda Van Aelst of Cold Spring Harbor Laboratory (CSHL) has been scrutinizing how the normal version of a protein called OPHN1 helps enable excitatory nerve transmission in the brain, particularly at nerve-cell docking ports containing AMPA receptors (AMPARs). Her team’s new work, published June 24 in the Journal of Neuroscience, provides new mechanistic insight into how OPHN1 defects can lead to impairments in the maturation and adjustment of synaptic strength of AMPAR-expressing neurons, which are ubiquitous in the brain and respond to the excitatory neurotransmitter glutamate.

Mutations in a gene called oligophrenin-1 (OPHN1) – located on the X chromosome – have previously been linked to X-linked intellectual disability (also known as X-linked mental retardation), a condition that affects boys disproportionately and could account for as much as one-fifth of all intellectual disability among males.

Several different mutations in the OPHN1 gene have been identified to date, all of which perturb nerve cells’ manufacture of OPHN1 protein. Previously, Van Aelst and colleagues demonstrated that OPHN1 has a vital role in synaptic plasticity, the process through which adjacent nerve cells adjust the strength of their connections. Cells in the brain are constantly adjusting connection strength as they respond to streams of stimuli.

The new discovery shows how OPHN1 is involved in the trafficking of AMPARs, an essential feature of plasticity in neurons. Neurons move receptors away from synapses into their interior and then back to the surface of synapses to control connection strength. At the synaptic surface, receptors provide an opportunity for the docking of neurotransmitters, in this case glutamate molecules. After a cell has fired, surface receptors are typically brought back into the interior, where they are recycled for future use.

When OPHN1 is misshapen or missing due to genetic mutation, the CSHL team demonstrated, it can no longer properly perform its role in receptor recycling, thus also impairing neurons’ ability to maintain strong long-term connections with their neighbors, called long-term potentiation. 

Van Aelst’s new experiments explain how OPHN1 in complex with another protein called Homer1b/c should normally interact with an area called the endocytic zone (EZ) to provide a pool of AMPARs to be brought to the synapse at a location called the post-synaptic density (PSD). When OPHN1 is mutated, the pool does not form and receptors needed for strengthening synapses are not available. Long-term potentiation is impaired.
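
The intuition behind the recycling deficit can be caricatured with a two-pool kinetic model: receptors move from an internal pool to the synaptic surface at some recycling rate and return by endocytosis, so the steady-state surface fraction depends on the ratio of the two rates. This is a generic kinetic cartoon for intuition only, not the CSHL group's model, and the rate constants are arbitrary.

```python
def surface_fraction(recycle_rate, endocytosis_rate, steps=20000, dt=0.01):
    """Toy two-pool model of AMPA receptor trafficking: receptors cycle
    between an internal pool and the synaptic surface. Simple Euler
    integration to steady state; rates are arbitrary illustrative numbers."""
    surface, internal = 0.0, 1.0                  # start fully internalized
    for _ in range(steps):
        out = recycle_rate * internal * dt        # internal -> surface
        back = endocytosis_rate * surface * dt    # surface -> internal
        surface += out - back
        internal += back - out
    return surface / (surface + internal)

normal = surface_fraction(recycle_rate=1.0, endocytosis_rate=0.5)
mutant = surface_fraction(recycle_rate=0.2, endocytosis_rate=0.5)  # impaired recycling
```

At steady state the surface fraction settles at recycle/(recycle + endocytosis), so cutting the recycling rate (the OPHN1-Homer1b/c deficit in the paper's account) shrinks the pool of surface receptors available to strengthen the synapse.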

“This suggests a previously unknown way in which genetic defects in OPHN1 can lead to dysfunctions in the glutamate system,” says Dr. Van Aelst. “Our earlier studies had already shown that OPHN1 is essential in stabilizing AMPA receptors at the synapse. Together, these two essential roles suggest how defective OPHN1 protein may contribute to pathology that underlies X-linked intellectual disability.”

(Source: cshl.edu)



Study shows how brain tumor cells move and damage tissue, points to possible therapy
Researchers at the University of Alabama at Birmingham have shed new light on how cells called gliomas migrate in the brain and cause devastating tumors. The findings, published June 19, 2014 in Nature Communications, show that gliomas — malignant glial cells — disrupt normal neural connections and hijack control of blood vessels.
The study provides insight into the mechanisms of how glioma cells spread throughout the brain as a devastating form of brain cancer, and potentially offers a tantalizing opportunity for therapy.
A hallmark of gliomas is that the cells can migrate away from a central tumor, invading healthy brain tissue. Even if a tumor mass is surgically removed, malignant cells that have migrated are left behind, and can grow into a new tumor.
To grow, glioma cells need access to nutrients in the blood supply, and it is known that gliomas travel along blood vessels within the brain. Now, researchers in the lab of neuroscientist Harald Sontheimer, Ph.D., professor in the UAB Department of Neurobiology, have discovered that, as they move, gliomas dislodge astrocytic endfeet, which play a critical role in regulating blood flow in the brain.
Astrocytes are star-shaped cells in the brain that surround blood vessels and connect to them through projections called endfeet, which extend from the astrocyte and latch onto the vessel wall. The surface of nearly every blood vessel in the brain is covered by endfeet, which regulate the smooth muscle cells on the walls of blood vessels. Through that connection, instructions can be given to the muscle cells to constrict the blood vessel and limit blood flow, or dilate the vessel and increase blood flow.
Sontheimer, director of the UAB Center for Glial Biology in Medicine, says that, as a person performs different neurological functions, blood flow needs to be increased to the areas responsible for that function and correspondingly decreased in other areas to maintain balance.
The arrival of a glioma cell changes all that.
“Glioma cells traveling along blood vessels literally cut the connection of astrocytic endfeet with the vessels and push them out of the way,” said Sontheimer. “By disrupting this important neural connection, adverse cognitive effects could be expected. Additionally, our study showed that gliomas then take control of the blood vessels for their own ends. And those ends are primarily to obtain nutrients from blood so that they can continue to grow and spread.”
Sontheimer’s team says the glioma cells tend to congregate at blood vessel junctions, almost as if camping alongside a stream where it joins a river. The ready supply of nutrients would allow the cell to grow into a larger tumor mass.
By traveling on the outside of a blood vessel, glioma cells are able to access nutrients from the bloodstream. As a side effect of that process, they damage the blood-brain barrier. The barrier, a layer of endothelial cells, protects the brain by restricting passage of harmful substances from the bloodstream into brain tissue.
“We found that, when gliomas push away the astrocytic endfeet, damage occurs to the integrity of the endothelial cells that make up the blood-brain barrier,” said Stefanie Robel, Ph.D., a postdoctoral researcher in Sontheimer’s lab and co-first author of the study. “The barrier becomes weakened, and begins to leak. A leak across the barrier can cause severe damage to brain tissue.”
“That leakage appears to be a consequence of glioma cells’ migrating along the blood vessels in their search for nutrients,” said Stacey Watkins, an M.D./Ph.D. student in Sontheimer’s lab and co-first author. “When glioma cells contact the vessels, they have direct access to nutrients.”
But amid the deleterious effects that Sontheimer’s team observed — shearing away the endfeet from their blood vessels, disrupting normal brain activity, hijacking control of blood vessels and causing leaks in the blood-brain barrier — he says there may be a silver lining. The idea that gliomas cause the blood-brain barrier to become porous and leak might open up a new avenue to kill the malignant cells as they migrate.
Chemotherapy, usually delivered intravenously, is not considered an effective strategy for killing gliomas. Chemotherapeutic agents are very effective in killing cancer cells elsewhere in the body, but the predominant belief is that such drugs will not pass the blood-brain barrier and thus will not reach their target.
“Chemotherapy is typically not tried in cases of glioma until after other therapies such as surgery and radiation have been employed,” Sontheimer said. “Our findings, which suggest that gliomas actually weaken the blood-brain barrier and cause leakage, might indicate that high-dose, intravenous chemotherapy used early on following a diagnosis of brain cancer would be beneficial.”
The study, funded by the National Institutes of Health and the American Brain Tumor Association, was conducted on a clinically relevant mouse model of human malignant glioma.
Sontheimer says logical next steps would be to further examine the cognitive impact of severing the astrocytic endfeet connection to blood vessels.




Those with episodic amnesia are not ‘stuck in time,’ says philosopher Carl Craver
In 1981, a motorcycle accident left Toronto native Kent Cochrane with severe brain damage and dramatically impaired episodic memory. Following the accident, Cochrane could no longer remember events from his past. Nor could he predict specific events that might happen in the future.
When neuroscientist Endel Tulving, PhD, asked him to describe what he would do tomorrow, Cochrane could not answer and described his state of mind as “blank.”
Psychologists and neuroscientists came to know Cochrane, who passed away earlier this year, simply as “KC.” Many scientists have described KC as “stuck in time,” or trapped in a permanent present.
It has generally been assumed that people with episodic amnesia experience time much differently than those with more typical memory function. 
However, a recent paper in Neuropsychologia co-authored by Carl F. Craver, PhD, professor of philosophy and of philosophy-neuroscience-psychology, both in Arts & Sciences at Washington University in St. Louis, disputes this type of claim.
“It’s our whole way of thinking about these people that we wanted to bring under pressure,” Craver said. “There are sets of claims that sound empirical, like ‘These people are stuck in time.’ But if you ask, ‘Have you actually tested what they know about time?’ the answer is no.”
Time and consciousness
A series of experiments convinced Craver and his co-authors that although KC could not remember specific past experiences, he did in fact have an understanding of time and an appreciation of its significance to his life.
Interviews with KC by Craver and his colleagues revealed that KC retained much of what psychologists refer to as “temporal consciousness.” KC could order significant events from his life on a timeline, and he seemed to have complete mastery of central temporal concepts.
For example, KC understood that events in the past have already happened, that they influence the future, and that once they happen, they cannot be changed. 
He also knew that events in the future don’t remain in the future, but eventually become present. Even more interestingly, KC’s understanding of time influenced his decision-making.
If KC truly had no understanding of time, Craver argues, then he and others with his type of amnesia would act as if only the present mattered. Without understanding that present actions have future consequences or rewards, KC would have based his actions only upon immediate outcomes. However, this was not the case.
On a personality test, KC scored as low as possible on measures of hedonism, or the tendency to be a self-indulgent pleasure-seeker.
In systematic tests of his decision-making, carried out with WUSTL’s Len Green, PhD, professor of psychology, and Joel Myerson, PhD, research professor of psychology, and researchers at York University in Toronto, KC also showed that he was willing to trade a smaller, sooner reward for a larger, later reward.
In other words, KC’s inability to remember past events did not affect his ability to appreciate the value of future rewards. 
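
The smaller-sooner versus larger-later choice is commonly modeled in this literature (including in Green and Myerson's work) with hyperbolic discounting, where a delayed reward's subjective value is V = A / (1 + kD). The sketch below is a generic illustration of that framework, not the specific task given to KC; the amounts, delays, and k values are made up.

```python
def discounted_value(amount, delay, k):
    """Hyperbolic discounting: V = A / (1 + k*D). A larger k means
    delayed rewards lose subjective value more steeply."""
    return amount / (1.0 + k * delay)

def choose(sooner, later, k):
    """Pick between two (amount, delay) options by discounted value."""
    return "later" if discounted_value(*later, k) > discounted_value(*sooner, k) else "sooner"

# A chooser with moderate discounting waits for the larger reward, as KC did;
# a chooser for whom only the present mattered (very large k) would not.
moderate = choose((40, 0), (100, 30), k=0.02)      # -> "later"
present_only = choose((40, 0), (100, 30), k=10.0)  # -> "sooner"
```

The point of the comparison: if KC were truly "stuck in time," his choices should have looked like the large-k chooser's, but his willingness to wait for the larger reward matches the moderate discounter.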
‘Questions are now wide open’
KC’s case reveals how much is left to discover about memory and how it relates to human understanding of time.
“If you think about memory long enough it starts to sound magical,” Craver said. “How is it that we can replay these events from our lives? And what’s going on in our brains that allows us to re-experience these events from our past?”
Craver hopes that this article — the last to be published about KC during his lifetime — brings these types of questions to the forefront. 
“These findings open up a whole new set of questions about people with amnesia,” Craver said. “Things that we previously thought were closed questions are now wide open.”

Those with episodic amnesia are not ‘stuck in time,’ says philosopher Carl Craver

In 1981, a motorcycle accident left Toronto native Kent Cochrane with severe brain damage and dramatically impaired episodic memory. Following the accident, Cochrane could no longer remember events from his past. Nor could he predict specific events that might happen in the future.

When neuroscientist Endel Tulving, PhD, asked him to describe what he would do tomorrow, Cochrane could not answer and described his state of mind as “blank.”

Psychologists and neuroscientists came to know Cochrane, who passed away earlier this year, simply as “KC.” Many scientists have described KC as “stuck in time,” or trapped in a permanent present.

It has generally been assumed that people with episodic amnesia experience time much differently than those with more typical memory function. 

However, a recent paper in Neuropsychologia co-authored by Carl F. Craver, PhD, professor of philosophy and of philosophy-neuroscience-psychology, both in Arts & Sciences at Washington University in St. Louis, disputes this type of claim.

“It’s our whole way of thinking about these people that we wanted to bring under pressure,” Craver said. “There are sets of claims that sound empirical, like ‘These people are stuck in time.’ But if you ask, ‘Have you actually tested what they know about time?’ the answer is no.”

Time and consciousness

A series of experiments convinced Craver and his co-authors that although KC could not remember specific past experiences, he did in fact have an understanding of time and an appreciation of its significance to his life.

Interviews with KC by Craver and his colleagues revealed that KC retained much of what psychologists refer to as “temporal consciousness.” KC could order significant events from his life on a timeline, and he seemed to have complete mastery of central temporal concepts.

For example, KC understood that events in the past have already happened, that they influence the future, and that once they happen, they cannot be changed. 

He also knew that events in the future don’t remain in the future, but eventually become present. Even more interestingly, KC’s understanding of time influenced his decision-making.

If KC truly had no understanding of time, Craver argues, then he and others with his type of amnesia would act as if only the present mattered. Without understanding that present actions have future consequences or rewards, KC would have based his actions only upon immediate outcomes. However, this was not the case.

On a personality test, KC scored as low as possible on measures of hedonism, or the tendency to be a self-indulgent pleasure-seeker.

In systematic tests of his decision-making, carried out with WUSTL’s Len Green, PhD, professor of psychology, and Joel Myerson, PhD, research professor of psychology, and researchers at York University in Toronto, KC also showed that he was willing to trade a smaller, sooner reward for a larger, later reward.

In other words, KC’s inability to remember past events did not affect his ability to appreciate the value of future rewards. 

‘Questions are now wide open’

KC’s case reveals how much is left to discover about memory and how it relates to human understanding of time.

“If you think about memory long enough it starts to sound magical,” Craver said. “How is it that we can replay these events from our lives? And what’s going on in our brains that allows us to re-experience these events from our past?”

Craver hopes that this article — the last to be published about KC during his lifetime — brings these types of questions to the forefront. 

“These findings open up a whole new set of questions about people with amnesia,” Craver said. “Things that we previously thought were closed questions are now wide open.”


Filed under amnesia episodic memory consciousness time perception psychology neuroscience science

154 notes

Study finds association between maternal exposure to agricultural pesticides, autism in offspring

Pregnant women who lived in close proximity to fields and farms where chemical pesticides were applied experienced a two-thirds increased risk of having a child with autism spectrum disorder or other developmental delay, a study by researchers with the UC Davis MIND Institute has found. The associations were stronger when the exposures occurred during the second and third trimesters of the women’s pregnancies.


The large, multisite California-based study examined associations between specific classes of pesticides, including organophosphates, pyrethroids and carbamates, applied during the study participants’ pregnancies and later diagnoses of autism and developmental delay in their offspring. It is published online today in Environmental Health Perspectives.

“This study validates the results of earlier research that has reported associations between having a child with autism and prenatal exposure to agricultural chemicals in California,” said lead study author Janie F. Shelton, a UC Davis graduate student who now consults with the United Nations. “While we still must investigate whether certain sub-groups are more vulnerable to exposures to these compounds than others, the message is very clear: Women who are pregnant should take special care to avoid contact with agricultural chemicals whenever possible.”

California is the top agricultural producing state in the nation, grossing $38 billion in revenue from farm crops in 2010. Statewide, approximately 200 million pounds of active pesticides are applied each year, most of it in the Central Valley, north to the Sacramento Valley and south to the Imperial Valley on the California-Mexico border. While pesticides are critical for the modern agriculture industry, certain commonly used pesticides are neurotoxic and may pose threats to brain development during gestation, potentially resulting in developmental delay or autism.

The study was conducted by examining commercial pesticide application using the California Pesticide Use Report and linking the data to the residential addresses of approximately 1,000 participants in the Northern California-based Childhood Risk of Autism from Genetics and the Environment (CHARGE) Study. The study includes families with children between the ages of 2 and 5 who have been diagnosed with autism or developmental delay, as well as children with typical development. It is led by principal investigator Irva Hertz-Picciotto, a MIND Institute researcher and professor and vice chair of the Department of Public Health Sciences at UC Davis. The majority of study participants live in the Sacramento Valley, Central Valley and the greater San Francisco Bay Area.

Twenty-one chemical compounds were identified in the organophosphate class, including chlorpyrifos, acephate and diazinon. The second most commonly applied class of pesticides was pyrethroids, one quarter of which was esfenvalerate, followed by lambda-cyhalothrin, permethrin, cypermethrin and tau-fluvalinate. Eighty percent of the carbamates were methomyl and carbaryl.

For the study, researchers used questionnaires to obtain study participants’ residential addresses during the pre-conception and pregnancy periods. The addresses then were overlaid on maps with the locations of agricultural chemical application sites based on the pesticide-use reports to determine residential proximity. The study also examined which participants were exposed to which agricultural chemicals.

“We mapped where our study participants lived during pregnancy and around the time of birth. In California, pesticide applicators must report what they’re applying, where they’re applying it, dates when the applications were made and how much was applied,” Hertz-Picciotto said. “What we saw were several classes of pesticides more commonly applied near residences of mothers whose children developed autism or had delayed cognitive or other skills.”

The researchers found that during the study period approximately one-third of CHARGE Study participants lived in close proximity – within 1.25 to 1.75 kilometers – to commercial pesticide application sites. Associations were stronger among mothers living closer to the application sites and weakened as distance from the sites increased, the researchers found.

Organophosphates applied over the course of pregnancy were associated with an elevated risk of autism spectrum disorder, particularly for chlorpyrifos applications in the second trimester. Pyrethroids were moderately associated with autism spectrum disorder immediately prior to conception and in the third trimester. Carbamates applied during pregnancy were associated with developmental delay.

Exposures to insecticides for those living near agricultural areas may be problematic, especially during gestation, because the developing fetal brain may be more vulnerable than it is in adults. Because these pesticides are neurotoxic, in utero exposures during early development may distort the complex processes of structural development and neuronal signaling, producing alterations to the excitation and inhibition mechanisms that govern mood, learning, social interactions and behavior.

“In that early developmental gestational period, the brain is developing synapses, the spaces between neurons, where electrical impulses are turned into neurotransmitting chemicals that leap from one neuron to another to pass messages along. The formation of these junctions is really important and may well be where these pesticides are operating and affecting neurotransmission,” Hertz-Picciotto said.

Research from the CHARGE Study has emphasized the importance of maternal nutrition during pregnancy, particularly the use of prenatal vitamins to reduce the risk of having a child with autism. While it’s impossible to entirely eliminate risks due to environmental exposures, Hertz-Picciotto said that finding ways to reduce exposures to chemical pesticides, particularly for the very young, is important.

“We need to open up a dialogue about how this can be done, at both a societal and individual level,” she said. “If it were my family, I wouldn’t want to live close to where heavy pesticides are being applied.”

(Source: ucdmc.ucdavis.edu)

Filed under autism ASD pregnancy pesticides health neurotransmission science

77 notes

Hormones affect voting behavior

Researchers from the University of Nebraska at Omaha (UNO), the University of Nebraska-Lincoln (UNL) and Rice University have released a study that shows hormone levels can affect voter turnout.

As witnessed by recent voter turnout in primary elections, participation in U.S. national elections is low relative to other Western democracies. In fact, voter turnout in biennial national elections ranges from just 40 to 60 percent of eligible voters.

The study, published June 22 in Physiology and Behavior, reports that while participation in electoral politics is affected by a host of social and demographic variables, biological factors may play a role as well. Specifically, the paper points to low levels of the stress hormone cortisol as a strong predictor of actual voting behavior, determined via voting records maintained by the Secretary of State.

"Politics and political participation is an inherently stressful activity," explained the paper’s lead author, Jeff French, Varner Professor of Psychology and Biology and director of UNO’s neuroscience program. "It would logically follow that those individuals with low thresholds for stress might avoid engaging in that activity and our study confirmed that hypothesis."

Additional authors on the paper are Adam Guck and Andrew K. Birnie from UNO’s Department of Psychology; Kevin B. Smith and John R. Hibbing from UNL’s Department of Political Science; and John R. Alford from the Department of Political Science at Rice University.

The study is part of a larger body of research exploring connections between biology and political orientation, led by Smith and Hibbing. Previous studies have involved twins, eye-tracking equipment and skin conductance in their efforts to identify physical and genetic links to political beliefs.

"It’s one more piece of solid evidence that there are biological markers for political attitudes and behavior," said Smith. "It’s long been known that cortisol levels are associated with your willingness to interact socially – that’s something fairly well established in the research literature. The big contribution here is that nobody really looked at politics and voting behaviors before."

"This research shows that cortisol is related to a willingness to participate in politics," he said.

To reach their conclusion, researchers collected saliva from more than 100 participants who identified themselves as highly conservative, highly liberal or uninterested in politics altogether, and analyzed their cortisol levels.

Cortisol was measured in saliva collected from the participants before and during activities designed to raise and lower stress. These data were then compared against the participants’ earlier responses regarding involvement in political activities (voting and nonvoting) and religious participation.

"Not only did the study show, expectedly, that high-stress activities led to higher levels of cortisol production, but that political participation was significantly correlated with low baseline levels of cortisol," French explained. "Participation in another group-oriented activity, specifically religious participation, was not as strongly associated with cortisol levels. Involvement in nonvoting political activities, such as volunteering for a campaign, financial political contributions, or correspondence with elected officials, was not predicted by levels of stress hormones."

According to the study, the only other factor that was predictive of voting behavior was age; older adults were likely to have voted more often than younger adults. Research from other groups has also pointed to education, income, and race as important predictors of voting behavior.

In explaining why elevated cortisol could be linked with lower rates of participation in elections, French cited previous experiments in which high levels of afternoon cortisol are linked to major depressive disorder, social withdrawal, separation anxiety and enhanced memory for fearful stimuli.

"High afternoon cortisol is reflective of a variety of social, cognitive, and emotional processes, and may also influence a trait as complex as voting behavior," French suggested.

"The key takeaway from this research, I believe, is that while social scientists have spent decades trying to predict voting behavior based on demographic information, there is much to be learned from looking at biological differences as well," he said. "Many factors influence the decision to participate in the most important political activity in our democracy, and our study demonstrates that stress physiology is an important biological factor in this decision. Our experiment helps to more fully explain why some people engage in electoral politics and others do not."

Filed under stress cortisol voting behavior psychology neuroscience science

221 notes

Schizophrenia and cannabis use may share common genes

Genes that increase the risk of developing schizophrenia may also increase the likelihood of using cannabis, according to a new study led by King’s College London, published today in Molecular Psychiatry.

Previous studies have identified a link between cannabis use and schizophrenia, but it has remained unclear whether this association is due to cannabis directly increasing the risk of the disorder.


The new results suggest that part of this association is due to common genes, but do not rule out a causal relationship between cannabis use and schizophrenia risk. 

The study is a collaboration between King’s and the Queensland Institute of Medical Research in Australia, partly funded by the UK Medical Research Council (MRC). 

Mr Robert Power, lead author from the MRC Social, Genetic and Developmental Psychiatry (SGDP) Centre at the Institute of Psychiatry at King’s, says: “Studies have consistently shown a link between cannabis use and schizophrenia. We wanted to explore whether this is because of a direct cause and effect, or whether there may be shared genes which predispose individuals to both cannabis use and schizophrenia.”

Cannabis is the most widely used illicit drug in the world, and its use is higher amongst people with schizophrenia than in the general population. Schizophrenia affects approximately 1 in 100 people and people who use cannabis are about twice as likely to develop the disorder. The most common symptoms of schizophrenia are delusions (false beliefs) and auditory hallucinations (hearing voices). Whilst the exact cause is unknown, a combination of physical, genetic, psychological and environmental factors can make people more likely to develop the disorder.

Previous studies have identified a number of genetic risk variants associated with schizophrenia, each of these slightly increasing an individual’s risk of developing the disorder.  

The new study included 2,082 healthy individuals of whom 1,011 had used cannabis. Each individual’s ‘genetic risk profile’ was measured – that is, the number of genes related to schizophrenia each individual carried. 

The researchers found that people genetically pre-disposed to schizophrenia were more likely to use cannabis, and use it in greater quantities than those who did not possess schizophrenia risk genes.

Power says: “We know that cannabis increases the risk of schizophrenia. Our study certainly does not rule this out, but it suggests that there is likely to be an association in the other direction as well – that a pre-disposition to schizophrenia also increases your likelihood of cannabis use.”

“Our study highlights the complex interactions between genes and environments when we talk about cannabis as a risk factor for schizophrenia. Certain environmental risks, such as cannabis use, may be more likely given an individual’s innate behaviour and personality, itself influenced by their genetic make-up. This is an important finding to consider when calculating the economic and health impact of cannabis.”

(Source: kcl.ac.uk)

Filed under schizophrenia cannabis genes genetics neuroscience science

227 notes

Study shows moving together builds bonds from the time we learn to walk

Whether they march in unison, row in the same boat or dance to the same song, people who move in time with one another are more likely to bond and work together afterward.

It’s a principle established by previous studies, but now researchers at McMaster have shown that moving in time with others even affects the social behaviour of babies who have barely learned to walk.

“Moving in sync with others is an important part of musical activities,” says Laura Cirelli, lead author of a paper now posted online and scheduled to appear in an upcoming issue of the journal Developmental Science. “These effects show that movement is a fundamental part of music that affects social behavior from a very young age.”

Cirelli and her colleagues in the Department of Psychology, Neuroscience & Behaviour showed that 14-month-old babies were much more likely to help another person after the experience of bouncing up and down in time to music with that person.

Cirelli and fellow doctoral student Kate Einarson worked under the supervision of Professor Laurel Trainor, a specialist in child development research.

They tested 68 babies in all, to see if bouncing to music with another person makes a baby more likely to assist that person by handing back “accidentally” dropped objects.

Working in pairs, one researcher held a baby in a forward-facing carrier and stood facing the second researcher. When the music started to play, both researchers would gently bounce up and down, one bouncing the baby with them. Some babies were bounced in sync with the researcher across from them, and others were bounced at a different tempo.

When the song was over, the researcher who had been facing the baby then performed several simple tasks, including drawing a picture with a marker. While drawing the picture, she would pretend to drop the marker to see whether the infant would pick it up and hand it back to her – a classic test of altruism in babies.

The babies who had been bounced in time with the researcher were much more likely to toddle over, pick up the object and pass it back to the researcher, compared to infants who had been bounced at a different tempo than the experimenter.

While babies who had been bounced out of sync with the researcher only picked up and handed back 30 per cent of the dropped objects, in-sync babies came to the researcher’s aid 50 per cent of the time. The in-sync babies also responded more quickly.

The findings suggest that when we sing, clap, bounce or dance in time to music with our babies, these shared experiences of synchronous movement help form social bonds between us and our babies.

It’s a significant finding, Cirelli believes, because it shows that moving together to music with others encourages the development of altruistic helping behaviour among those in a social group. It suggests that music is an important part of day care and kindergarten curriculums because it helps to build a co-operative social climate.

Cirelli is now researching whether the experience of synchronous movement with one person leads babies to extend their increased helpfulness to other people or whether infants reserve their altruistic behaviour for their dancing partners.

Filed under infants prosocial behavior motor synchrony child development psychology neuroscience science

156 notes

How Aging Can Intensify Damage of Spinal Cord Injury

In the complex environment of a spinal cord injury, researchers have found that immune cells in the central nervous system of elderly mice fail to activate an important signaling pathway, dramatically lowering chances for repair after injury.

These studies were the first to show that spinal cord injuries are more severe in elderly mice than in young adults, corroborating previous anecdotal findings from clinical settings. They also revealed a previously unknown player in the repair of spinal cord injuries in young adults.

A key messenger in that pathway is a receptor on the surface of microglia, immune system cells in the central nervous system that are called into action by the trauma of the spinal cord injury.

In young adult mice, microglia activate this receptor to recognize and make use of an inflammation-related signaling chemical found in the central nervous system after a spinal cord injury. The microglia in elderly mice, however, do not activate the receptor at all.

The study showed that the difference in receptor activation has consequences later in the recovery process. The kinds of cells recruited to the injury site in young adult mice appear to have more value in the repair process than do the cells that show up in elderly mice. A host of experiments traced those differing effects back to whether or not microglia activated the receptor.

“The microglia are regulated by several different cell types and different signals, and it appears a lot of those systems change with age,” said Jonathan Godbout, associate professor of neuroscience at The Ohio State University and senior author of the study.

“We’ve shown evidence that this more severe injury occurs in an aging animal, and that the difference in recovery is related to the ability to express the receptor. The consequence is we have a different profile of cells at the injury site, and in aging mice, that environment is less reparative.”

These differences at the cellular level were associated with vast differences in the characteristics of injury and recovery. The lesions on the injured spinal cord were 38 percent longer, on average, in elderly mice than in young adult mice. In addition, the older mice were unable to regain movement of their hind limbs by the time most younger mice had recovered that mobility.

The research is published in the Journal of Neuroscience.

The receptor in question is called the IL-4 alpha receptor, and its job is to “see” the infusion of interleukin-4, or IL-4, in the central nervous system after the spinal injury. IL-4 is a cytokine, a type of protein connected to immune system function. Many cytokines promote inflammation, but IL-4 is associated with curbing inflammation.

Godbout and colleagues observed that IL-4 in the central nervous systems of both young adult and aging mice sent signals to recruit additional repair cells to the injury site – cells called macrophages and monocytes. These are types of white blood cells that originate in the bone marrow and circulate in what is known as the “periphery,” via blood and outside the central nervous system. But only in young adult mice were these types of cells contributors to wound healing and clearing of debris, necessary inflammatory functions that help rather than harm.

“This was surprising to us because aging is typically associated with increased inflammation so we’d expect to see higher levels of inflammatory cytokines in the aged mice,” said first author Ashley Fenn, who just received her Ph.D. in neuroscience from Ohio State. “But in the aged mice with a spinal cord injury, we saw reduced levels of some inflammatory signals associated with a failure to reprogram the microglia with IL-4 toward a reparative profile. That’s how we figured out the IL-4 is unique in the spinal cord injury paradigm, that it induces an inflammatory response that appears to be beneficial.”

The IL-4 in the young adult mice also led to production of arginase, a protein that serves as a biomarker of the injury repair response. Significantly less arginase was detected in the injured elderly mice, another signal that the disabled receptor interfered with IL-4’s assistance in injury repair.

The communication among systems has long been a focus of Godbout’s research. He is an investigator in Ohio State’s Institute for Behavioral Medicine Research (IBMR) and Center for Brain and Spinal Cord Repair.

“There is some level of communication going on between the central nervous system microglia and the peripheral immune system’s macrophages. In our model, differences in that communication affected the ability to bring in cells to the site of the injury. Maybe the aging microenvironment brings in cells that are less beneficial,” he said.

About 200,000 people are currently living with a spinal cord injury in the United States, and an estimated 12,000 to 20,000 new injuries occur each year, according to the Centers for Disease Control and Prevention.

Though any therapy based on this research would take many years to develop, Godbout and Fenn said that finding a drug that could stimulate expression of the IL-4 alpha receptor in elderly spinal cord injury patients might have potential to improve their outcomes.

How Aging Can Intensify Damage of Spinal Cord Injury

In the complex environment of a spinal cord injury, researchers have found that immune cells in the central nervous system of elderly mice fail to activate an important signaling pathway, dramatically lowering chances for repair after injury.

These studies were the first to show that spinal cord injuries are more severe in elderly mice than in young adults, corroborating previous anecdotal findings from clinical settings. They also revealed a previously unknown player in the repair of spinal cord injuries in young adults.

A key messenger in that pathway is a receptor on the surface of microglia, immune system cells in the central nervous system that are called into action by the trauma of the spinal cord injury.

In young adult mice, microglia activate this receptor so they can recognize and make use of an inflammation-related signaling chemical found in the central nervous system after a spinal cord injury. The microglia in elderly mice, however, do not activate the receptor at all.

The study showed that the difference in receptor activation has consequences later in the recovery process. The kinds of cells recruited to the injury site in young adult mice appear to have more value in the repair process than do the cells that show up in elderly mice. A host of experiments traced those differing effects back to whether or not microglia activated the receptor.

“The microglia are regulated by several different cell types and different signals, and it appears a lot of those systems change with age,” said Jonathan Godbout, associate professor of neuroscience at The Ohio State University and senior author of the study.

“We’ve shown evidence that this more severe injury occurs in an aging animal, and that the difference in recovery is related to the ability to express the receptor. The consequence is we have a different profile of cells at the injury site, and in aging mice, that environment is less reparative.”

These differences at the cellular level were associated with vast differences in the characteristics of injury and recovery. The lesions on the injured spinal cord were 38 percent longer, on average, in elderly mice than in young adult mice. In addition, the older mice were unable to regain movement of their hind limbs by the time most younger mice had regained that mobility.

The research is published in the Journal of Neuroscience.

The receptor in question is called the IL-4 alpha receptor, and its job is to “see” the infusion of interleukin-4, or IL-4, in the central nervous system after the spinal injury. IL-4 is a cytokine, a type of protein connected to immune system function. Many cytokines promote inflammation, but IL-4 is associated with curbing inflammation.

Godbout and colleagues observed that IL-4 in the central nervous systems in both young adult and aging mice sent signals to recruit additional repair cells to the injury site – cells called macrophages and monocytes. These are types of white blood cells that originate in the bone marrow and circulate in what is known as the “periphery,” via blood and outside the central nervous system. But only in young adult mice were these types of cells contributors to wound healing and clearing of debris, necessary inflammatory functions that help rather than harm.

“This was surprising to us because aging is typically associated with increased inflammation so we’d expect to see higher levels of inflammatory cytokines in the aged mice,” said first author Ashley Fenn, who just received her Ph.D. in neuroscience from Ohio State. “But in the aged mice with a spinal cord injury, we saw reduced levels of some inflammatory signals associated with a failure to reprogram the microglia with IL-4 toward a reparative profile. That’s how we figured out the IL-4 is unique in the spinal cord injury paradigm, that it induces an inflammatory response that appears to be beneficial.”

The IL-4 in the young adult mice also led to production of arginase, a protein that serves as a biomarker of the injury repair response. Significantly less arginase was detected in the injured elderly mice, another signal that the disabled receptor interfered with IL-4’s assistance in injury repair.

The communication among systems has long been a focus of Godbout’s research. He is an investigator in Ohio State’s Institute for Behavioral Medicine Research (IBMR) and Center for Brain and Spinal Cord Repair.

“There is some level of communication going on between the central nervous system microglia and the peripheral immune system’s macrophages. In our model, differences in that communication affected the ability to bring in cells to the site of the injury. Maybe the aging microenvironment brings in cells that are less beneficial,” he said.

About 200,000 people are currently living with a spinal cord injury in the United States, and an estimated 12,000 to 20,000 new injuries occur each year, according to the Centers for Disease Control and Prevention.

Though any therapy based on this research would take many years to develop, Godbout and Fenn said that finding a drug that could stimulate expression of the IL-4 alpha receptor in elderly spinal cord injury patients might have potential to improve their outcomes.
