Posts tagged neuroscience

July 18, 2012
A new guideline released by the American Academy of Neurology recommends several treatments for people with Huntington’s disease who experience chorea—jerky, random, uncontrollable movements that can make everyday activities challenging. The guideline is published in the July 18, 2012, online issue of Neurology.
"Chorea can be disabling, worsen weight loss and increase the risk of falling," said guideline lead author Melissa Armstrong, MD, MSc, with the University of Maryland Department of Neurology and a member of the American Academy of Neurology.
Huntington’s disease is a complex disease with physical, cognitive and behavioral symptoms. The new guideline addresses only one aspect of the disease that may require treatment.
The guideline found that the drugs tetrabenazine (TBZ), riluzole and amantadine can be helpful for treating chorea, and that the drug nabilone may also be considered. Riluzole, amantadine and nabilone are not often prescribed for Huntington’s disease.
"People with Huntington’s disease who have chorea should discuss with their doctors whether treating chorea is a priority. Huntington’s disease is complex with a wide range of sometimes severe symptoms and treating other symptoms may be a higher priority than treating chorea," said Armstrong.
Armstrong adds that it is important for patients to understand that their doctors may try drugs not recommended in this guideline to treat chorea. More research is needed to know if drugs such as those used for psychosis are effective; however, doctors may prescribe them on the basis of past clinical experience.
Provided by American Academy of Neurology
Source: medicalxpress.com

Researchers say they’ve identified an indicator, or “biomarker,” in the blood that may help predict a person’s risk of developing Alzheimer’s disease.
For their study, the investigators tested the blood of 99 women, aged 70 to 79, for levels of a fatty compound called ceramides, which is associated with inflammation and cell death. The women were then followed for up to nine years and 27 of them developed dementia, including 18 who were diagnosed with probable Alzheimer’s disease.
Compared to women with the lowest levels of ceramides, those with the highest levels were 10 times more likely to develop Alzheimer’s and those with middle levels of the biomarker were nearly eight times more likely to develop the memory-robbing disease, according to the findings published in the July 18 online issue of the journal Neurology.
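The article does not report the underlying case counts per ceramide tertile, but the arithmetic behind such figures is straightforward. The sketch below, using entirely hypothetical counts, shows how a relative risk of this kind is computed:

```python
# Hypothetical illustration of a relative-risk calculation like the one
# reported in the ceramide study. The per-tertile case counts below are
# invented for demonstration; only the overall design (women grouped by
# ceramide level and followed for dementia) comes from the article.

def relative_risk(cases_exposed, n_exposed, cases_ref, n_ref):
    """Risk in the exposed group divided by risk in the reference group."""
    return (cases_exposed / n_exposed) / (cases_ref / n_ref)

# Hypothetical counts: highest and middle ceramide tertiles vs. lowest (33 women each)
rr_high = relative_risk(cases_exposed=10, n_exposed=33, cases_ref=1, n_ref=33)
rr_mid = relative_risk(cases_exposed=8, n_exposed=33, cases_ref=1, n_ref=33)

print(f"RR, high vs. low ceramides: {rr_high:.1f}")
print(f"RR, middle vs. low ceramides: {rr_mid:.1f}")
```

A formal analysis of a nine-year follow-up would use hazard ratios from a survival model rather than this simple ratio of incidence proportions, but the ratio captures the basic idea behind "10 times more likely."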
"Our study identifies this biomarker as a potential new target for treating or preventing Alzheimer’s disease," Michelle Mielke, an epidemiologist with the Mayo Clinic in Rochester, Minn., said in a news release from the American Academy of Neurology. She was with Johns Hopkins University at the time of the research.
Another expert stressed the importance of the study and the need for further research.
"These findings are important because identifying an accurate biomarker for early Alzheimer’s that requires little cost and inconvenience to a patient could help change our focus from treating the disease to preventing or delaying it," Valory Pavlik, of the Alzheimer’s Disease and Memory Disorders Center of Baylor College of Medicine in Houston, wrote in an accompanying editorial.
"While a larger, more diverse study is needed to confirm these findings, projections that the global prevalence of Alzheimer’s disease will double every 20 years for the foreseeable future have certainly increased the sense of urgency among researchers and health care agencies to identify more effective screening, prevention and treatment strategies," Pavlik noted.
Source: healthfinder.org
July 18, 2012
Sleep deprivation in the first few hours after exposure to a significantly stressful threat actually reduces the risk of Post-Traumatic Stress Disorder (PTSD), according to a study by researchers from Ben-Gurion University of the Negev (BGU) and Tel Aviv University.
The new study was published in the international scientific journal Neuropsychopharmacology. In a series of experiments, the researchers showed that sleep deprivation of approximately six hours immediately after exposure to a traumatic event reduces the development of post-trauma-like behavioral responses. As a result, sleep deprivation in the first hours after stress exposure might represent a simple yet effective intervention for PTSD.
The research was conducted by Prof. Hagit Cohen, director of the Anxiety and Stress Research Unit at BGU’s Faculty of Health Sciences, in collaboration with Prof. Joseph Zohar of Tel Aviv University.
Approximately 20 percent of people exposed to a severe traumatic event, such as a car or work accident, terrorist attack or war, cannot carry on with their normal lives. These people retain the memory of the event for many years; it causes considerable difficulties in daily functioning and, in extreme cases, may render the individual completely dysfunctional.
"Often those close to someone exposed to a traumatic event, including medical teams, seek to relieve the distress and assume that it would be best if the person could rest and 'sleep on it,'" says Prof. Cohen. "Since memory is a significant component in the development of post-traumatic symptoms, we decided to examine the various effects of sleep deprivation immediately after exposure to trauma."
In the experiments, rats that underwent sleep deprivation after exposure to trauma (predator scent stress) later did not exhibit behavior indicating memory of the event, while a control group of rats that was allowed to sleep after the stress exposure did remember, as shown by their post-trauma-like behavior.
"As is the case for human populations exposed to severe stress, 15 to 20 percent of the animals develop long-term disruptions in their behavior," says Cohen. "Our research method for this study is, we believe, a breakthrough in biomedical research."
A pilot study in humans is currently being planned. The studies were funded by a grant from the Israel Academy of Sciences and Humanities and by the Israel Ministry of Health.
Provided by American Associates, Ben-Gurion University of the Negev
Source: medicalxpress.com
In the insect brain, dopamine-releasing nerve cells are crucial to the formation of both punished and rewarded memories.
Hiromu Tanimoto and his colleagues at the Max Planck Institute of Neurobiology recently localised and identified the most important types of nerve cells involved in forming positive and negative memories in the fruit fly. All four nerve cell types they discovered use dopamine to communicate with other nerve cells. The dopamine signals released by these cells are received in the mushroom body, a prominent structure in insect brains. “It is really surprising that similar dopamine-releasing nerve cells can play such different roles,” says Tanimoto.
Read more: Dopamine – A substance with many messages
July 18, 2012
(Phys.org) — New research at the Hebrew University of Jerusalem sheds light on pluripotency—the ability of embryonic stem cells to renew themselves indefinitely and to differentiate into all types of mature cells. Solving this problem, which is a major challenge in modern biology, could expedite the use of embryonic stem cells in cell therapy and regenerative medicine. If scientists can replicate the mechanisms that make pluripotency possible, they could create cells in the laboratory which could be implanted in humans to cure diseases characterized by cell death, such as Alzheimer’s, Parkinson’s, diabetes and other degenerative diseases.
To shed light on these processes, researchers in the lab of Dr. Eran Meshorer, in the Department of Genetics at the Hebrew University’s Alexander Silberman Institute of Life Sciences, are combining molecular, microscopic and genomic approaches. Meshorer’s team is focusing on epigenetic pathways—which cause biological changes without a corresponding change in the DNA sequence—that are specific to embryonic stem cells.
The molecular basis for epigenetic mechanisms is chromatin, which is comprised of a cell’s DNA and structural and regulatory proteins. In groundbreaking research performed by Shai Melcer, a PhD student in the Meshorer lab, the mechanisms which support an “open” chromatin conformation in embryonic stem cells were examined. The researchers found that chromatin is less condensed in embryonic stem cells, allowing them the flexibility or “functional plasticity” to turn into any kind of cell.
A distinct pattern of chemical modifications of chromatin structural proteins (referred to as the acetylation and methylation of histones) enables a looser chromatin configuration in embryonic stem cells. During the early stages of differentiation, this pattern changes to facilitate chromatin compaction.
But even more interestingly, the authors found that a nuclear lamina protein, lamin A, is also a part of the secret. In all differentiated cell types, lamin A binds compacted domains of chromatin and anchors them to the cell’s nuclear envelope. Lamin A is absent from embryonic stem cells and this may enable the freer, more dynamic chromatin state in the cell nucleus. The authors believe that chromatin plasticity is tantamount to functional plasticity since chromatin is made up of DNA that includes all genes and codes for all proteins in any living cell. Understanding the mechanisms that regulate chromatin function will enable intelligent manipulations of embryonic stem cells in the future.
"If we can apply this new understanding about the mechanisms that give embryonic stem cells their plasticity, then we can increase or decrease the dynamics of the proteins that bind DNA and thereby increase or decrease the cells’ differentiation potential," concludes Dr. Meshorer. “This could expedite the use of embryonic stem cells in cell therapy and regenerative medicine, by enabling the creation of cells in the laboratory which could be implanted in humans to cure diseases characterized by cell death, such as Alzheimer’s, Parkinson’s, diabetes and other degenerative diseases.”
Source: PHYS.ORG
Scientists have developed a statistical method that uses evolutionary information to significantly improve the likelihood of identifying disease-associated alleles in the genome that are consistent across populations.
The group’s research appeared in the advance online issue of the journal Molecular Biology and Evolution. The new method is now available via the web, so that researchers worldwide can apply it as an aid to discovering disease-associated mutations that are more consistently reproducible and therefore usable as diagnostic markers. Kumar refers to this new approach, which combines standard comparative genomic studies with phylogenetic data, as phylomedicine, a rapidly developing field that promises to streamline genomic information and improve its diagnostic power.
Read more: Evolutionary information improves discovery of mutations associated with diseases
ScienceDaily (July 17, 2012) — The ability of infants to recognize speech is more sophisticated than previously known, researchers in New York University’s Department of Psychology have found. Their study, which appears in the journal Developmental Psychology, showed that infants, as early as nine months old, could make distinctions between speech and non-speech sounds in both humans and animals.

A new study shows that infants, as early as nine months old, could make distinctions between speech and non-speech sounds in both humans and animals. (Credit: © ChantalS / Fotolia)
"Our results show that infant speech perception is resilient and flexible," explained Athena Vouloumanos, an assistant professor at NYU and the study’s lead author. "This means that our recognition of speech is more refined at an earlier age than we’d thought."
It is well known that adults’ speech perception is fine-tuned: they can detect speech among a range of ambiguous sounds. Much less is known, however, about infants’ capacity to make similar assessments. Understanding when these abilities emerge would shed new light on how early in life we develop the ability to recognize speech.
In order to gauge the ability to perceive speech at an early age, the researchers examined the responses of infants, approximately nine months in age, to recorded human and parrot speech and non-speech sounds. Human speech sounds (an adult female voice) and parrot speech sounds included the words “truck,” “treat,” “dinner,” and “two.” The adult non-speech sounds were whistles and a clearing of the throat, while the parrot non-speech sounds were squawks and chirps. The recorded parrot speech sounds were those of Alex, an African grey parrot that had the ability to talk and reason and whose behaviors were studied by psychology researcher Irene Pepperberg.
Since infants cannot verbally communicate their recognition of speech, the researchers employed a commonly used method to measure this process: looking longer at what they find either interesting or unusual. Under this method, looking longer at a visual paired with a sound may be interpreted as a reflection of recognition. In this study, sounds were paired with a series of visuals: a checkerboard-like image, adult female faces, and a cup.
The results showed that infants listened longer to human speech than to human non-speech sounds regardless of the visual stimulus, revealing an ability to recognize human speech independent of context.
Their findings on non-human speech were more nuanced. When paired with human-face visuals or human artifacts like cups, the infants listened to parrot speech longer than they did non-speech, such that their preference for parrot speech was similar to their preference for human speech sounds. However, this did not occur in the presence of other visual stimuli. In other words, infants were able to distinguish animal speech from non-speech, but only in some contexts.
"Parrot speech is unlike human speech, so the results show infants have the ability to detect different types of speech, even if they need visual cues to assist in this process," explained Vouloumanos.
Source: Science Daily
ScienceDaily (July 17, 2012) — Johns Hopkins researchers say they have discovered a cause-and-effect relationship between two well-established biological risk factors for schizophrenia previously believed to be independent of one another.
The findings could eventually lead researchers to develop better drugs to treat the cognitive dysfunction associated with schizophrenia and possibly other mental illnesses.
Researchers have long studied the role played in the brain’s neurons by the Disrupted-in-Schizophrenia 1 (DISC1) gene, a mutation with one of the strongest links to an increased risk of developing the debilitating psychiatric illness.
In a study published in the journal Molecular Psychiatry, the laboratory of Mikhail V. Pletnikov, M.D., Ph.D., in collaboration with the laboratory of Solomon H. Snyder, M.D., D.Sc., instead looked at the role the DISC1 gene plays in glia cells known as astrocytes, a kind of support cell in the brain that helps neurons communicate with one another.
"Abnormalities in glia cells could be as important as abnormalities in neuronal cells themselves," says Pletnikov, an associate professor of psychiatry and behavioral sciences at the Johns Hopkins University School of Medicine, and the study’s leader. "Most gene work has been done with neurons. But we also need to understand a lot more about the role that genetic mutations in glia cells play because neuron-glia interaction appears crucial in ensuring the brain operates normally."
Besides the paranoia and hallucinations that characterize the disease, schizophrenics have cognitive deficits, leaving them unable to think clearly or organize their thoughts and behavior.
Previous studies found that one of the roles of astrocytes is to secrete the neurotransmitter D-serine, which helps promote the transmission of glutamate in the brain, believed to be a key to cognitive function. Schizophrenics have decreased glutamate transmission. It appears, Pletnikov says, that people with DISC1 mutations associated with the psychiatric illness are faster to metabolize D-serine, which leads to a decrease in the apparently crucial transmitter.
In clinical trials, other researchers are trying to boost D-serine levels in people with schizophrenia to see if they can boost cognitive function.
In the new study, the Johns Hopkins researchers found that DISC1 is directly involved in regulating the production of D-serine by the enzyme known as serine racemase.
The researchers found that DISC1 normally binds to serine racemase and stabilizes it. The mutant DISC1 in patients with schizophrenia cannot bind with serine racemase, and instead destabilizes and destroys it. The result is a deficiency of D-serine.
The Hopkins researchers bred mice with the mutant DISC1 protein expressed only in astrocytes and, as predicted, the animals had decreased levels of D-serine. These mice also showed abnormal behavior “consistent with schizophrenia,” Pletnikov says. For example, the rodents showed sensitivity to psycho-stimulants that target glutamate transmission. By treating the mice with D-serine, the scientists were able to ameliorate the schizophrenic-like symptoms. Mice without the DISC1 mutation in astrocytes had normal D-serine levels.
Pletnikov says that in the future, researchers hope that they can target the unstable junction between the abnormal DISC1 and serine racemase. If drugs, for example, can be found to increase glutamate transmission in humans, doctors may be able to improve cognitive function in schizophrenics. He says a DISC1 mutation may also be an important risk factor in other psychiatric disorders.
"Abnormal glutamate transmission is believed to be present in patients with bipolar disorder, major depression and possibly anxiety disorders, so our findings could apply to other psychiatric diseases," he says.
Source: Science Daily
ScienceDaily (July 17, 2012) — Scientists have discovered two genetic variants associated with the substantial, rapid weight gain occurring in nearly half the patients treated with antipsychotic medications, according to two studies involving the Centre for Addiction and Mental Health (CAMH).
These results could eventually be used to identify which patients have the variations, enabling clinicians to choose strategies to prevent this serious side-effect and offer more personalized treatment.
"Weight gain occurs in up to 40 per cent of patients taking medications called second-generation or atypical antipsychotics, which are used because they’re effective in controlling the major symptoms of schizophrenia," says CAMH Scientist Dr. James Kennedy, senior author on the most recent study published online in the Archives of General Psychiatry.
This weight gain can lead to obesity, type 2 diabetes, heart problems and a shortened life span. “Identifying genetic risks leading to these side-effects will help us prescribe more effectively,” says Dr. Kennedy, head of the new Tanenbaum Centre for Pharmacogenetics, which is part of CAMH’s Campbell Family Mental Health Research Institute. Currently, CAMH screens for two other genetic variations that affect patients’ responses to psychiatric medications.
Each study identified a different variation near the melanocortin-4 receptor (MC4R) gene, which is known to be linked to obesity.
In the Archives of General Psychiatry study, people carrying two copies of a variant gained about three times as much weight as those with one or no copies, after six to 12 weeks of treatment with atypical antipsychotics. (The difference was approximately 6 kg versus 2 kg.) The study had four patient groups: two from the U.S., one in Germany and one from a larger European study.
"The weight gain was associated with this genetic variation in all these groups, which included pediatric patients with severe behaviour or mood problems, and patients with schizophrenia experiencing a first episode or who did not respond to other antipsychotic treatments," says CAMH Scientist Dr. Daniel Müller. "The results from our genetic analysis combined with this diverse set of patients provide compelling evidence for the role of this MC4R variant. Our research group has discovered other gene variants associated with antipsychotic-induced weight gain in the past, but this one appears to be the most compelling finding thus far."
Three of the four groups had never previously taken atypical antipsychotics. Different groups were treated with drugs such as olanzapine, risperidone, aripiprazole or quetiapine, and compliance was monitored to ensure the treatment regimen was followed. Weight and other metabolic measures were taken at the start of and during treatment.
A genome-wide association study was conducted on pediatric patients by the study’s lead researcher, Dr. Anil Malhotra, at the Zucker Hillside Hospital in Glen Oaks, NY. In this type of study, variations are sought across a person’s entire set of genes to identify those associated with a particular trait. The result pointed to the MC4R gene.
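At its core, the per-SNP test in a genome-wide association study is a comparison of allele frequencies between cases and controls. The sketch below implements a generic 2x2 chi-square association test with made-up counts; it illustrates the statistical idea, not the specific pipeline used in this study:

```python
# Minimal sketch of the association test at the heart of a GWAS: for one
# SNP, compare allele counts between cases (e.g., patients who gained
# weight on antipsychotics) and controls. The counts are hypothetical.

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic (no continuity correction) for the 2x2 table
    [[a, b], [c, d]], e.g. rows = cases/controls, cols = risk/other allele."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical allele counts near a locus such as MC4R:
# cases carry the risk allele 60/100 times, controls 35/100 times.
stat = chi_square_2x2(60, 40, 35, 65)
print(round(stat, 2))  # 12.53
```

In a real GWAS this test is repeated across hundreds of thousands of SNPs, so the significance threshold must be corrected for multiple testing before a hit like MC4R is declared genome-wide significant.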
This gene’s role in antipsychotic-induced weight gain had been identified in a CAMH study published earlier this year in The Pharmacogenomics Journal, involving Drs. Müller and Kennedy, and conducted by PhD student Nabilah Chowdhury. They found a different variation on MC4R that was linked to the side-effect.
For both studies, CAMH researchers did genotyping experiments to identify the single changes to the sequence of the MC4R gene — known as single nucleotide polymorphisms (SNPs) — related to the drug-induced weight gain side-effect.
The MC4R gene encodes a receptor involved in the brain pathways regulating weight, appetite and satiety. “We don’t know exactly how the atypical antipsychotics disrupt this pathway, or how this variation affects the receptor,” says Dr. Müller. “We need further studies to validate this result and eventually turn this into a clinical application.”
Source: Science Daily
ScienceDaily (July 17, 2012) — Researchers at the University of Colorado School of Medicine have found a drug that boosts memory function in those with Down syndrome, a major milestone in the treatment of this genetic disorder that could significantly improve quality of life.
"Before now, there had never been any positive results in attempts to improve cognitive abilities in persons with Down syndrome through medication," said Alberto Costa, MD, Ph.D., who led the four-year study at the CU School of Medicine. "This is the first time we have been able to move the needle at all, and that means improvement is possible."
The study was published July 17 in the journal Translational Psychiatry.
Costa, an associate professor of medicine, and his colleagues studied 38 adolescents and young adults with Down syndrome. Half took the drug memantine, used to treat Alzheimer’s disease, and the others took a placebo.
Costa’s research team hypothesized that memantine, which improved memory in mice with Down syndrome, could increase test scores of young adults with the disorder in the area of spatial and episodic memory, functions associated with the hippocampus region of the brain.
Participants underwent a 16-week course of either memantine or a placebo while scientists compared the adaptive and cognitive function of the two groups.