Research led by the Vall d’Hebron Research Institute (VHIR), with participation from the University of Valencia, has shown that pathological forms of the α-synuclein protein taken from deceased Parkinson’s disease patients can initiate and spread, in mice and primates, the neurodegenerative process that typifies the disease. The discovery, published as the March cover article of Annals of Neurology, opens the door to new treatments that could halt the progression of Parkinson’s disease by blocking the expression, pathological conversion and transmission of this protein.

Recent studies have shown that synthetic forms of α-synuclein are toxic to neurons and can spread from one cell to another, both in vitro (in cell culture) and in vivo (in mice). However, until now it was not known whether this pathogenic capacity of the synthetic protein extended to the pathological human protein found in Parkinson’s patients and, therefore, whether it was relevant to the disease in humans.
The present study was led by Dr. Miquel Vila, of the Neurodegenerative Diseases group at VHIR and a member of CIBERNED, with the participation of two other CIBERNED groups (one led by Dr. Isabel Fariñas, University of Valencia, and another led by Dr. José Obeso, CIMA-University of Navarra), as well as a group from the University of Bordeaux in France (Dr. Erwan Bezard). The researchers extracted α-synuclein aggregates from the brains of patients who had died of Parkinson’s disease and injected them into the brains of rodents and primates.
Four months after injection in mice, and nine months after injection in monkeys, the animals began to show degeneration of dopaminergic neurons and intracellular accumulations of pathological α-synuclein in these cells, as occurs in Parkinson’s disease. Months later, the animals also showed accumulations of the protein in other, remote brain areas, with a pattern of spread similar to that observed in the brains of patients after years of disease progression.
According to Dr. Vila, these results indicate that “the pathological aggregates of this protein obtained from patients with Parkinson’s disease have the ability to initiate and extend, in mice and primates, the neurodegenerative process that typifies Parkinson’s disease”. A discovery that, he adds, “provides new insights into the possible mechanisms of initiation and progression of the disease and opens the door to new therapeutic opportunities”. The next step, therefore, is to find out how to stop the progression and spread of the disease by blocking the cell-to-cell transmission of α-synuclein, as well as by regulating its expression levels and halting its pathological conversion.
Parkinson’s disease
Parkinson’s disease is the second most common neurodegenerative disease after Alzheimer’s disease. It is characterized by the progressive loss of dopamine-producing neurons in a brain region (the substantia nigra of the ventral midbrain) and by the presence in these cells of pathological intracellular aggregates of the α-synuclein protein, called Lewy bodies. The loss of brain dopamine that follows neuronal death results in the typical motor manifestations of the disease, such as muscle stiffness, tremors and slowness of movement.
The most effective treatment for the disease is levodopa, a palliative drug that restores the missing dopamine. However, as the disease progresses, the pathological process of neurodegeneration and α-synuclein accumulation extends beyond the ventral midbrain to other brain areas. As a result, patients progressively worsen and develop non-motor clinical manifestations that do not respond to dopaminergic drugs. There is currently no treatment that prevents, delays or halts the progressive course of the neurodegenerative process.
Stress research in mice confirms that blood-brain comparisons are valid

Johns Hopkins researchers say they have confirmed suspicions that DNA modifications found in the blood of mice exposed to high levels of stress hormone — and showing signs of anxiety — are directly related to changes found in their brain tissues.
The proof-of-concept study, reported online ahead of print in the June issue of Psychoneuroendocrinology, offers what the research team calls the first evidence that epigenetic changes that alter the way genes function without changing their underlying DNA sequence — and are detectable in blood — mirror alterations in brain tissue linked to underlying psychiatric diseases.
The new study reports only on so-called epigenetic changes to a single stress response gene called FKBP5, which has been implicated in depression, bipolar disorder and post-traumatic stress disorder. But the researchers say they have discovered the same blood and brain matches in dozens more genes, which regulate many important processes in the brain.
“Many human studies rely on the assumption that disease-relevant epigenetic changes that occur in the brain — which is largely inaccessible and difficult to test — also occur in the blood, which is easily accessible,” says study leader Richard S. Lee, Ph.D., an instructor in the Department of Psychiatry and Behavioral Sciences at the Johns Hopkins University School of Medicine. “This research on mice suggests that the blood can legitimately tell us what is going on in the brain, which is something we were just assuming before, and could lead us to better detection and treatment of mental disorders and to a more empirical way to test whether medications are working.”
For the study, the Johns Hopkins team worked with mice with a rodent version of Cushing’s disease, which is marked by the overproduction and release of cortisol, the primary stress hormone also called glucocorticoid. For four weeks, the mice were given different doses of stress hormones in their drinking water to assess epigenetic changes to FKBP5. The researchers took blood samples weekly to measure the changes and then dissected the brains at the end of the month to study what changes were occurring in the hippocampus as a result of glucocorticoid exposure. The hippocampus, in both mice and humans, is vital to memory formation, information storage and organizational abilities.
The measurements showed that the more stress hormones the mice got, the greater the epigenetic changes in the blood and brain tissue, although the scientists say the brain changes occurred in a different part of the gene than expected. This was what made finding the blood-brain connection very challenging, Lee says.
Also, the more stress hormone, the more RNA from the FKBP5 gene was expressed in the blood and brain, and the greater the association with depression. However, it was the underlying epigenetic changes that proved to be more robust. This is important, because while RNA levels may return to normal after stress hormone levels decrease or change due to small fluctuations in hormone levels, epigenetic changes persist, reflect overall stress hormone exposure and predict how much RNA will be made when stress hormone levels increase.
The team of researchers used an epigenetic assay previously developed in their laboratory that requires just one drop of blood to accurately assess overall exposure to stress hormone over 30 days. Elevated levels of stress hormone exposure are considered a risk factor for mental illness in humans and other mammals.
Babies and young children make giant developmental leaps all of the time. Sometimes, it seems, even overnight they figure out how to recognize certain shapes or what the word “no” means no matter who says it. It turns out that making those leaps could be a nap away: New research finds that infants who nap are better able to apply lessons learned to new skills, while preschoolers are better able to retain learned knowledge after napping.

“Sleep plays a crucial role in learning from early in development,” says Rebecca Gómez of the University of Arizona. She will be presenting her new work, which looks specifically at how sleep enables babies and young children to learn language over time, at the Cognitive Neuroscience Society (CNS) annual meeting in Boston today, as part of a symposium on sleep and memory.
“We want to show that sleep is not just a necessary evil for the organism to stay functional,” says Susanne Diekelmann of the University of Tübingen in Germany who is chairing the symposium. “Sleep is an active state that is essential for the formation of lasting memories.”
A growing body of research shows how memories become reactivated during sleep, and new work is shedding light on exactly when and how memories get stored and reactivated. “Sleep is a highly selective state that preferentially strengthens memories that are relevant for our future behavior,” Diekelmann says. “Sleep can also abstract general rules from single experiences, which helps us to deal more efficiently with similar situations in the future.”
When you throw a wild pitch or sing a flat note, it could be that your basal ganglia made you do it. This area in the middle of the brain is involved in motor control and learning. And one reason for that errant toss or off-key note may be that your brain prompted you to vary your behavior to help you learn, from trial-and-error, to perform better.

But how does the brain do this, how does it cause you to vary your behavior?
Along with researchers from the University of California, San Francisco, Indian Institute of Science Education and Research and Duke University, Professor Sarah Woolley, Department of Biology, investigated this question in songbirds, which learn their songs during development in a manner similar to how humans learn to speak. In particular, songbirds memorize the song of their father or tutor, then practice that song until they can produce a similar song.
“As adults, they continue to produce this learned song, but what’s interesting is that they keep it just a little bit variable,” says Woolley. “The variability isn’t a default, it isn’t that they can’t produce a better version, they can — in particular when they sing to a female. So when they sing alone and their song is variable it’s because they are actively making it that way.”
The team used this change in song variability to examine how single cells in different parts of the brain altered their activity depending on the social environment.
“We found that the social modulation of variability emerged within the basal ganglia, a brain area known to be important for learning and producing movements not only in birds but also in mammals, including humans,” says Woolley. “This indicates that one way that the basal ganglia may be important in motor learning across species is through its involvement in generating variability.”
The researchers studied songbirds because they have a cortical-basal ganglia circuit that is dedicated to singing. In contrast, for most behaviors in other species, the cortical-basal ganglia cells and circuits important for a particular behavior, like learning to walk, may be situated right next to, or even intermingled with, cells and circuits important for other behaviors. “The evolution in songbirds of an identifiable circuit for a single complex behavior gives us a tremendous advantage as we try to parse out exactly what these parts of the brain do and how they do it,” says Woolley.
Useful for Parkinson’s disease
The basal ganglia are dramatically affected in illnesses such as Parkinson’s and Huntington’s disease. The team’s findings may eventually be relevant to understanding the changes in learning and movement flexibility that occur in those diseases.
“These are the kinds of questions that we are now starting to pursue in the lab: how variability is affected when you radically manipulate the system, akin to what happens during disease,” says Woolley.
Early study found they can be safely transplanted into the brain; 2 patients showed significant improvement

In an early test, researchers report they’ve safely injected stem cells into the brains of 18 patients who had suffered strokes. And two of the patients showed significant improvement.
All the patients saw some improvement in weakness or paralysis within six months of their procedures. Although three people developed complications related to the surgery, they all recovered. There were no adverse reactions to the transplanted stem cells themselves, the study authors said.
What’s more, the researchers said, two patients experienced dramatic recoveries almost immediately after the treatments.
Those patients, who were both women, started to regain the ability to talk and walk the morning after their operations. In both cases, they were more than two years past their strokes, a point where doctors wouldn’t have expected further recovery.
Procrastination and impulsivity are genetically linked, suggesting that the two traits stem from similar evolutionary origins, according to research published in Psychological Science, a journal of the Association for Psychological Science. The research indicates that the traits are related to our ability to successfully pursue and juggle goals.

“Everyone procrastinates at least sometimes, but we wanted to explore why some people procrastinate more than others and why procrastinators seem more likely to act rashly and without thinking,” explains psychological scientist and study author Daniel Gustavson of the University of Colorado Boulder. “Answering why that’s the case would give us some interesting insights into what procrastination is, why it occurs, and how to minimize it.”
From an evolutionary standpoint, impulsivity makes sense: Our ancestors should have been inclined to seek immediate rewards when the next day was uncertain.
Procrastination, on the other hand, may have emerged more recently in human history. In the modern world, we have many distinct goals far in the future that we need to prepare for – when we’re impulsive and easily distracted from those long-term goals, we often procrastinate.
Thinking about the two traits in that context, it seems logical that people who are perpetual procrastinators would also be highly impulsive. Many studies have observed this positive relationship, but it is unclear what cognitive, biological, and environmental influences are responsible for it.
The most effective way to understand why these traits are correlated is to study human twins. Identical twins — who share 100% of their genes — tend to show greater similarities in behavior than fraternal twins, who only share 50% of their genes (just like any other siblings). Researchers take advantage of this genetic discrepancy to figure out the relative importance of genetic and environmental influences on particular behaviors, like procrastination and impulsivity.
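The twin logic described above is often formalized with Falconer's formula, which estimates heritability from the gap between identical-twin and fraternal-twin trait correlations. A minimal sketch, using illustrative correlation values that are hypothetical rather than taken from this study:

```python
# Falconer's formula: h^2 = 2 * (r_mz - r_dz), where r_mz is the trait
# correlation among identical (monozygotic) twin pairs and r_dz the
# correlation among fraternal (dizygotic) pairs. Because identical twins
# share twice as many genes as fraternal twins, doubling the correlation
# gap attributes that excess similarity to genetic influence.
def falconer_heritability(r_mz: float, r_dz: float) -> float:
    return 2.0 * (r_mz - r_dz)

# Hypothetical correlations for a procrastination score:
h2 = falconer_heritability(r_mz=0.46, r_dz=0.23)
print(round(h2, 2))  # → 0.46, i.e. ~46% of variance attributed to genes
```

Modern twin studies fit fuller biometric models rather than this single formula, but the intuition is the same: the more identical twins outmatch fraternal twins in similarity, the stronger the inferred genetic contribution.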
Gustavson and colleagues had 181 identical-twin pairs and 166 fraternal-twin pairs complete several surveys intended to probe their tendencies toward impulsivity and procrastination, as well as their ability to set and maintain goals.
They found that procrastination is indeed heritable, just like impulsivity. Not only that, there seems to be a complete genetic overlap between procrastination and impulsivity — that is, there are no genetic influences that are unique to either trait alone.
That finding suggests that, genetically speaking, procrastination is an evolutionary byproduct of impulsivity — one that likely manifests itself more in the modern world than in the world of our ancestors.
In addition, the link between procrastination and impulsivity also overlapped genetically with the ability to manage goals, lending support to the idea that delaying, making rash decisions, and failing to achieve goals all stem from a shared genetic foundation.
Gustavson and colleagues are now investigating how procrastination and impulsivity are related to higher-level cognitive abilities, such as executive functions, and whether these same genetic influences are related to other aspects of self-regulation in our day-to-day lives.
“Learning more about the underpinnings of procrastination may help develop interventions to prevent it, and help us overcome our ingrained tendencies to get distracted and lose track of work,” Gustavson concludes.
Whether at the office, dorm, PTA meeting, or any other social setting, we all know intuitively who the popular people are – who is most liked – even if we can’t always put our finger on why. That information is often critical to professional or social success as you navigate your social networks. Yet until now, scientists have not understood how our brains recognize these popular people. In new work, researchers say that we track people’s popularity largely through the brain region involved in anticipating rewards.

“Being able to track other people’s status in your group is incredibly important in survival terms,” says Kevin Ochsner of Columbia University. “Knowing who is popular or likeable is critically important in times of need or distress, when you seek an alliance, or need help – whether physical or political – etc.” While sociologists, psychologists, and anthropologists have long studied these group dynamics, neuroscientists have only begun to scratch the surface of how we think about people’s social status.
That is all changing, though, Ochsner says, with many areas of work bringing together social psychology and sociology with cognitive neuroscience to better understand how individual brain processes connect to group membership. As will be presented today at the annual meeting of the Cognitive Neuroscience Society (CNS) in Boston, researchers are now studying at the neural level everything from social popularity to how ideas successfully spread in groups.
Improved thinking. Decreased appetite. Lowered blood pressure. The potential health benefits of dark chocolate keep piling up, and scientists are now homing in on what ingredients in chocolate might help prevent obesity, as well as type-2 diabetes. They found that one particular type of antioxidant in cocoa prevented laboratory mice from gaining excess weight and lowered their blood sugar levels. The report appears in ACS’ Journal of Agricultural & Food Chemistry.

Andrew P. Neilson and colleagues explain that cocoa, the basic ingredient of chocolate, is one of the most flavanol-rich foods around. That’s good for chocolate lovers because previous research has shown that flavanols in other foods such as grapes and tea can help fight weight gain and type-2 diabetes. But not all flavanols, which are a type of antioxidant, are created equal. Cocoa has several different kinds of these compounds, so Neilson’s team decided to tease them apart and test each individually for health benefits.
The scientists fed groups of mice different diets, including high-fat and low-fat diets, and high-fat diets supplemented with different kinds of flavanols. They found that adding one particular set of these compounds, known as oligomeric procyanidins (PCs), to the food made the biggest difference in keeping the mice’s weight down if they were on high-fat diets. The compounds also improved glucose tolerance, which could potentially help prevent type-2 diabetes. “Oligomeric PCs appear to possess the greatest antiobesity and antidiabetic bioactivities of the flavanols in cocoa, particularly at the low doses employed for the present study,” the researchers state.
Cigarette smoking among obese women appears to interfere with their ability to taste fats and sweets, a new study shows. Despite craving high-fat, sugary foods, these women were less likely than others to perceive these tastes, which may drive them to consume more calories.

M. Yanina Pepino, PhD, assistant professor of medicine at Washington University School of Medicine in St. Louis, and Julie Mennella, PhD, a biopsychologist at the Monell Center in Philadelphia, where the research was conducted, studied four groups of women ages 21 to 41: obese smokers, obese nonsmokers, smokers of normal weight and nonsmokers of normal weight. The women tasted several vanilla puddings containing varying amounts of fat and were asked to rate them for sweetness and creaminess, a measure of fat content.
“Compared with the other three groups, smokers who were obese perceived less creaminess and sweetness,” Pepino said. “They also derived less pleasure from tasting the puddings.”
The findings are published in the April issue of the journal Obesity.
Pepino cautioned that the study only identified associations between smoking and taste rather than definitive reasons why obese smokers were less likely to detect fat and sweetness. But the findings imply that the ability to perceive fat and sweetness — and to derive pleasure from food — is compromised in female smokers who are obese, which could contribute to the consumption of more calories.
“Obese people often crave high-fat foods,” she said. “Our findings suggest that having this intense craving but not perceiving fat and sweetness in food may lead these women to eat more. Since smoking and obesity are risk factors for cardiovascular and metabolic diseases, the additional burden of craving more fats and sugars, while not fully tasting them, could be detrimental to health.”
Interestingly, it was the combination of smoking and obesity that created something of a “double whammy”: smokers who were not overweight perceived fat and sweetness similarly to women who did not smoke.
Previous studies have linked smoking to increased food cravings and greater consumption of fat, regardless of whether a smoker is obese. Studies also have found that smokers tend to have increased waist-to-hip ratios. That is, they tend to be shaped more like apples than pears, another risk factor for heart disease and metabolic problems.
The findings contribute to a growing body of knowledge that challenges the lingering perception that smoking helps a person maintain a healthy weight.
“Women are much more likely than men to take up smoking as an aid to weight control,” Pepino said. “But there is no good evidence showing that it helps maintain a healthy weight over the long term. And in the case of obese women who smoke, it appears the smoking may make things even worse than previously thought.”
According to a new study by researchers at Ben-Gurion University of the Negev (BGU) and the University of Amsterdam, oxytocin caused participants to lie more to benefit their groups, and to do so more quickly and without expectation of reciprocal dishonesty from their group. Oxytocin is a hormone the body naturally produces to stimulate bonding.
The research was published this week in the Proceedings of the National Academy of Sciences (PNAS).
"Our results suggest people are willing to bend ethical rules to help the people close to us, like our team or family," says Dr. Shaul Shalvi of Ben-Gurion University of the Negev’s Department of Psychology and director of BGU’s Center for Decision Making and Economic Psychology. "This raises an interesting, although perhaps more philosophical, question: Are all lies immoral?"
Dr. Shalvi’s research focuses on ethical decision-making and the justifications people use to do wrong and still feel moral. Specifically, he looks at what determines how much people lie and which settings increase people’s honesty. Very little is known about the biological foundations of immoral behavior.
"Together, these findings fit a functional perspective on morality revealing dishonesty to be plastic and rooted in evolved neurobiological circuitries, and align with work showing that oxytocin shifts the decision-maker’s focus from self to group interests," Shalvi says.
"The results highlight the role of bonding and cooperation in shaping dishonesty, providing insight into when and why collaboration turns into corruption."
Oxytocin is a peptide of nine amino acids produced in the brain’s hypothalamus, functioning as both a hormone and neurotransmitter. Research has shown that in addition to its bonding effect in couples and between mothers and babies, it also stimulates one’s social approach.
Higher levels of oxytocin correlate with greater empathy, lower social anxiety and more pro-social choice in anonymous games; reduction in fear response; and greater trust in interpersonal exchange. It also stimulates defense-related aggression.
In the experiment designed by Shalvi and fellow researcher Carsten K. W. De Dreu of the University of Amsterdam’s Department of Psychology, 60 male participants received an intranasal dose of either oxytocin or placebo. They were then split into teams of three and asked to predict the results of 10 coin tosses.
Participants were asked to toss the coin, see the outcome and report whether their prediction was correct. They knew that for each correct prediction, they could lie and earn more money to split between their group members, who were engaging in the same task.
"The statistical probability of someone correctly guessing the results of nine or 10 coin tosses is about one percent," says Shalvi. "Yet, 53 percent of those who were given oxytocin claimed to have correctly predicted that many coin tosses, which is extremely unlikely."
Only 23 percent of the participants who received the placebo reported the same results, reflecting a high likelihood that they were also lying, but to a lesser extent compared to those receiving oxytocin.
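The “about one percent” figure quoted above can be checked directly from the binomial distribution: the chance of guessing at least 9 of 10 fair coin tosses is (C(10,9) + C(10,10)) / 2^10 = 11/1024 ≈ 1.07%. A quick sketch of the calculation:

```python
from math import comb

# Probability of correctly guessing at least k of n fair coin tosses:
# sum the binomial counts for k, k+1, ..., n successes and divide by 2^n.
def p_at_least(n: int, k: int) -> float:
    return sum(comb(n, i) for i in range(k, n + 1)) / 2**n

p = p_at_least(10, 9)
print(f"{p:.4f}")  # → 0.0107, i.e. about 1%
```

Against that baseline, the 53% rate reported in the oxytocin group (and even the 23% placebo rate) is wildly above what honest reporting would produce.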
The first UK study of the use of ketamine intravenous infusions in people with treatment-resistant depression has been carried out in an NHS clinic by researchers at Oxford Health NHS Foundation Trust and the University of Oxford.

'Ketamine is a promising new antidepressant which works in a different way to existing antidepressants. We wanted to see whether it would be safe if given repeatedly, and whether it would be practical in an NHS setting. We especially wanted to check that repeated infusions didn't cause cognitive problems,' explains principal investigator Dr Rupert McShane, a consultant psychiatrist at Oxford Health and a researcher in Oxford University's Department of Psychiatry.
The researchers confirmed that ketamine has a rapid antidepressant effect in some patients with severe depression who have not responded to other treatments. These are patients suffering from severe depression that may have lasted years despite multiple antidepressants and talking therapies. Although many patients relapsed within a day or two, 29% had a benefit that lasted at least three weeks and 15% took over two months to relapse.
Ketamine did not cause cognitive or bladder side effects when given on up to six occasions, although some people did experience other side effects such as anxiety during the infusion or being sick. The team have now given over 400 infusions to 45 patients and are exploring ways to maintain the effect. They report their findings in the Journal of Psychopharmacology. The study was funded by the National Institute for Health Research (NIHR) Research for Patient Benefit Programme.