Neuroscience

Articles and news from the latest research reports.

Posts tagged psychology

61 notes

What Your Neural Stem Cells Aren’t Telling You

In 2000, a team of neuroscientists put an unusual idea to the test. Stress and depression, they knew, made neurons wither and die – particularly in the hippocampus, a brain area crucial for memory. So the researchers put some stressed-out rats on an antidepressant regimen, hoping the mood boost might protect some of those hippocampal neurons. When they checked in a few weeks later, though, the team found that the rats’ hippocampi hadn’t just survived intact; they’d grown whole new neurons – bundles of them. But that’s only the beginning of our tale.

Neural stem cells (green) in the hippocampus huddle around a neuron (purple), listening for stray signals.

By the time 2009 rolled around, another team of researchers was suggesting that human brains might get a similar hippocampal boost from antidepressants. The press announced the discovery with headlines like, “Antidepressants Grow New Brain Cells” – although not everyone agreed with that conclusion. Still, whether the principle applied to humans or not, a far more basic question was begging to be answered: How, exactly, does a brain tell new cells to form?

“Well, through synapses, of course,” you might answer – and that’d be a very reasonable guess. After all, synapses are how most neurons talk to each other: electrochemical information is “squirted” from a tiny tendril of one neuron into the tip of a tendril on another; and cells throughout most of the brain share essentially this same mechanism for passing signals along: The signals coming out of Neuron A’s synapses keep bugging Neuron B by stimulating its synapses, until finally Neuron B caves under peer pressure and bugs Neuron C with the signal… and so on.

There are, however, two significant exceptions to this system.

The first exception was discovered a few years ago, as scientists got more and more curious about the role of neuroglia (also known as just “glia”), synapse-less cells that many had assumed were just there to serve as structural support for neurons. A 2008 study showed that glia help control cerebral blood flow, and research in 2010 demonstrated that some glia – cells known as astrocytes – actively listen for and respond to certain neurotransmitter messages. These so-called “quiet cells” are actually pretty loud talkers once you learn to tune in to their chatter.

The second exception to the synapse rule is even more mysterious – in large part because it’s a brand-new discovery: As the journal Nature reports, a team led by Hongjun Song at the Johns Hopkins University School of Medicine has found that neural stem cells “listen in” on the stray chemical signals that leak from synapses.

You can imagine neural stem cells as being sort of “neural embryos” – depending on the surrounding conditions, they can develop into neurons or into glia. And here’s what’s strange about the way these cells communicate: They respond not to any single synaptic signal, but to the overall chemical “vibe” of their environment – to chronic feelings of stress, for instance. By way of response, they may morph into neurons or glia – or even tell the brain to crank out some all-new cells.

Neural stem cells seem to be particularly interested in the chemical GABA (gamma-aminobutyric acid) – a neurotransmitter that’s known to be involved in inhibiting signals from other neurons. When scientists artificially block these stem cells’ GABA receptors from receiving messages, the cells “wake up” and start replicating – but when those GABA signals are allowed to reach the receptors, the stem cells stay dormant.
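The on/off logic Song’s team describes is simple enough to sketch as a toy model. The Python below is purely illustrative – a hypothetical two-state switch, not anything from the study – but it captures the reported behavior: a stem cell stays dormant while GABA reaches its receptors, and starts replicating when those receptors are blocked.

```python
# Illustrative toy model of the GABA "dormancy switch" described above.
# A hypothetical simplification -- not the study's actual model.

def stem_cell_state(gaba_present: bool, receptors_blocked: bool) -> str:
    """Return a stem cell's behavior given ambient GABA and receptor state."""
    if gaba_present and not receptors_blocked:
        return "dormant"      # GABA message received -> cell held in reserve
    return "replicating"      # no message gets through -> cell "wakes up"

print(stem_cell_state(gaba_present=True, receptors_blocked=False))  # dormant
print(stem_cell_state(gaba_present=True, receptors_blocked=True))   # replicating
```

The point of the sketch is only that the signal’s *absence* is what triggers replication – the reserve pool is spent only when the usual inhibitory chatter goes quiet.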

“In this case,” Song explains, “GABA communication keeps the brain stem cells in reserve, so if we don’t need them, we don’t use them up.”

In short, leaky synapses aren’t wasteful – as a matter of fact, they’re essential to the brain’s self-sculpting abilities. And this implies something pretty interesting: It isn’t just individual signals that convey neural information, but whole experiences. In that respect, a brain – whether it belongs to a rat or a human – is unlike any computer on earth.

August 15, 2012

Source: Scientific American

Filed under stem cells science neuroscience brain psychology

19 notes

Holding on to faulty protein delays brain degeneration

When something goes wrong in your brain, you’d think it would be a good idea to get rid of the problem. Turns out, sometimes it’s best to keep hold of it. By preventing faulty proteins from being destroyed, researchers have delayed the symptoms of a degenerative brain disorder.

SNAP25 is one of three proteins that together make up a complex called SNARE, which plays a vital role in allowing neurons to communicate with each other. In order to work properly, all three proteins must be folded in a specific way. CSP alpha is one of the key proteins that ensure SNAP25 is correctly folded.

Cells have a backup system to deal with any misfolded proteins – they are destroyed by a barrel-shaped enzyme complex called the proteasome, which pulls the proteins inside itself and breaks them down.

People with a genetic mutation that affects the CSP alpha protein – and its ability to correctly fold SNAP25 – can develop a rare brain disorder called neuronal ceroid lipofuscinosis (NCL). The disorder causes significant damage to neurons – people affected gradually lose their cognitive abilities and struggle to move normally.

To find out what role proteasomes might play in NCL, Manu Sharma and his colleagues at Stanford University in California blocked the enzyme complex in mice that were bred to lack CSP alpha. “We weren’t sure what would happen,” says Sharma. Either the misfolded SNAP25 would accumulate and harm the cells, or some of the misfolded protein might retain enough of its function to be useful.

Longer life

It appears it was the latter. Mice bred to lack CSP alpha develop physical and cognitive problems similar to those seen in humans with NCL, and tend to survive for about 65 to 80 days, rather than the normal 670 days. But mice injected with a drug that blocked the proteasome lived, on average, an extra 15 days. “Fifteen days might not sound like much, but as a percentage it’s quite significant,” says Sharma. What’s more, treated mice were able to stave off measurable movement and cognitive symptoms for an extra 10 days.
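Sharma’s “as a percentage” point is easy to check with back-of-envelope arithmetic on the figures above (taking the midpoint of the 65-to-80-day range is our own simplification):

```python
# Rough check of the lifespan extension, using figures from the article.
untreated_lifespan = (65 + 80) / 2   # midpoint of the knockout mice's ~65-80 days
extra_days = 15                      # average gain when the proteasome is blocked

extension_pct = 100 * extra_days / untreated_lifespan
print(f"~{extension_pct:.0f}% longer life")  # ~21% longer life
```

Roughly a fifth longer – substantial for a single intervention, even if the absolute number of days is small.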

The finding goes against the idea that neurodegenerative disorders should be treated by clearing away misfolded proteins rather than trying to rescue their function. “People normally think that the proteasome isn’t working hard enough,” says Nico Dantuma at the Karolinska Institute in Stockholm, Sweden, who was not involved in the study.

But whether such drugs are likely to work in other neurodegenerative disorders involving aggregates of misfolded proteins, such as Alzheimer’s and Parkinson’s disease, is up for debate. “I don’t think their results prove that clearing misfolded proteins is not a useful therapeutic,” says Ana Maria Cuervo at Albert Einstein College of Medicine in New York. Approaches that increase the degradation of misfolded proteins have been shown to improve symptoms in other neurodegenerative diseases, she says.

"There are two sides of the coin," says Dantuma. "You might rescue functioning proteins from being degraded… but it’s too early to extrapolate these results to Alzheimer’s and Parkinson’s disease."

In the meantime, drugs that block the proteasome are already used to treat cancer, so Sharma hopes they can soon be trialled in people with NCL.

Source: NewScientist

Filed under science neuroscience brain psychology degeneration protein disorders

36 notes


The 2007 study by Yale University researchers provided the first evidence that 6- and 10-month-old infants could assess individuals based on their behaviour towards others, showing a preference for those who helped rather than hindered another individual.

Based on a series of experiments, researchers in the Department of Psychology at Otago have shown that the earlier findings may simply be the result of infants’ preferences for interesting and attention grabbing events, rather than an ability to evaluate individuals based on their social interactions with others.

"The paper received a lot of attention when it was first published, including coverage in the New York Times. It has received well over 100 citations since 2007, a phenomenal number over such a short period. The paper was initially brought to our attention by one of the PhD students in our lab. The head of the lab, Professor Harlene Hayne, suggested that a group of us read the paper together and then meet to discuss it. Our original motivation for reading the paper was merely interest. Obviously, the idea that morality is innate is extremely interesting and, if true, would raise questions about which components of our moral system are innate and also have implications for the wider issue of the roles that nature and nurture play in development," says Dr Scarf.

The Otago study was recently published in PLoS ONE.

Filed under science neuroscience brain psychology research development

37 notes

Tripping the switches on brain growth to treat depression

Depression takes a substantial toll on brain health. Brain imaging and post-mortem studies provide evidence that the wealth of connections in the brain is reduced in individuals with depression, impairing functional connections between key brain centers involved in mood regulation. Glial cells – which support the growth and function of nerve cells and their connections – are one of the cell types that appear particularly reduced in post-mortem brain tissue from people who had depression.

Over the past several years, it has become increasingly recognized that antidepressants produce positive effects on brain structure that complement their effects on symptoms of depression. These structural effects of antidepressants appear to depend, in large part, on their ability to raise the levels of growth factors in the brain.

In a new study, Elsayed and colleagues from the Yale University School of Medicine report their findings on a relatively novel growth factor named fibroblast growth factor-2 or FGF2. They found that FGF2 can increase the number of glial cells and block the decrease caused by chronic stress exposure by promoting the generation of new glial cells.

Senior author Dr. Ronald Duman said, “Our study uncovers a new pathway that can be targeted for treating depression. Our research shows that we can increase the production and maintenance of glial cells that are important for supporting neurons, providing an enriched environment for proper neuronal function.”

To study whether FGF2 can treat depression, the researchers used rodent models where animals are subjected to various natural stressors, which can trigger behaviors that are similar to those expressed by depressed humans, such as despair and loss of pleasure. FGF2 infusions restored the deficit in glial cell number caused by chronic stress. An underlying molecular mechanism was also identified when the data showed that antidepressants increase glial generation and function via increasing FGF2 signaling.

"Although more research is warranted to explore the contribution of glial cells to the antidepressant effects of FGF2, the results of this study present a fundamental new mechanism that merits attention in the quest to find more efficacious and faster-acting antidepressant drugs," concluded Duman.

"The deeper that science digs into the biology underlying antidepressant action, the more complex it becomes. Yet understanding this complexity increases the power of the science, suggesting reasons for the limitations of antidepressant treatment and pointing to novel approaches to the treatment of depression," commented Dr. John Krystal, Editor of Biological Psychiatry and Chairman of the Department of Psychiatry at the Yale University School of Medicine.

Source: Bio-Medicine

Filed under science neuroscience brain psychology depression antidepressants

22 notes

Long-Term Methadone Treatment Can Affect Nerve Cells in Brain

ScienceDaily (Aug. 15, 2012) — Long-term methadone treatment can cause changes in the brain, according to recent studies from the Norwegian Institute of Public Health. The results show that treatment may affect the nerve cells in the brain. The studies follow on from earlier work in which methadone was seen to affect cognitive functioning, such as learning and memory.

Since it is difficult to perform controlled studies of methadone patients and unethical to attempt in healthy volunteers, rats were used in the studies. Previous research has shown that methadone can affect cognitive functioning in both humans and experimental animals.

Sharp decrease in key signaling molecule

Rats were given a daily dose of methadone for three weeks. Once treatment was completed, brain areas which are central for learning and memory were removed and examined for possible neurobiological changes or damage.

In one study, on the day after the last exposure to methadone, there was a significant reduction – around 70 per cent – in the level of a signaling molecule important for learning and memory, in both the hippocampus and the frontal area of the brain. This reduction supports findings from a previous study (Andersen et al., 2011), in which impaired attention was found in rats at the same time point. By this time, methadone is no longer present in the brain. This indicates that methadone can lead to cellular changes that affect cognitive functioning even after the drug has left the body, which may be cause for concern.

No effect on cell generation

The second study, a joint project with Southwestern University in Texas, investigated whether methadone affects the formation of nerve cells in the hippocampus. Previous research has shown that new nerve cells are generated in the hippocampus in both adult humans and rats, and that this formation is probably important for learning and memory. Furthermore, it has been shown that other opiates such as morphine and heroin can inhibit this formation. It was therefore reasonable to assume that methadone, which is also an opiate, would have the same effect.

However, the researchers did not find any change in the generation of new nerve cells after long-term methadone treatment. If the same holds true in humans, this is probably good news for methadone patients compared with continued heroin use. The researchers do not yet know, however, what effect methadone has on nerve cells that have previously been exposed to heroin.

Large gaps in knowledge

Since the mid-1960s, methadone has been used to treat heroin addiction. This is considered to be a successful treatment but, despite extensive and prolonged use, little is known about possible side effects. There are large knowledge gaps in this field.

Our studies show that prolonged methadone treatment can affect the nerve cells, and thus behaviour, but the results are not always as expected. Many more pre-clinical and clinical studies are needed to understand methadone’s effects on the brain, whether these result in altered cognitive function and, if so, how long the changes last. This knowledge is important – both for the individual methadone patient and for the outcome of treatment.

Source: Science Daily

Filed under brain learning memory methadone neuroscience science psychology

15 notes


Brain scans have revealed distinctive features in the brain structure of karate experts that are associated with how well they performed in a test of punching ability.

Researchers from Imperial College London and UCL looked for differences in brain structure between 12 karate practitioners with a black belt rank and an average of 13.8 years’ karate experience, and 12 people of similar age who exercised regularly but did not have any martial arts experience.

Dr Ed Roberts, from the Department of Medicine at Imperial College London, who led the study, explained: "The karate black belts were able to repeatedly coordinate their punching action with a level of coordination that novices can’t produce. We think that ability might be related to fine-tuning of neural connections in the cerebellum, allowing them to synchronise their arm and trunk movements very accurately."

The scans used in this study, called diffusion tensor imaging (DTI), detected structural differences in the white matter of parts of the brain called the cerebellum and the primary motor cortex, which are known to be involved in controlling movement. The differences measured by DTI in the cerebellum correlated with the synchronicity of the subjects’ wrist and shoulder movements when punching.

The DTI signal also correlated with the age at which karate experts began training and their total experience of the discipline. These findings suggest that the structural differences in the brain are related to the black belts’ punching ability.
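The study’s headline statistic is a correlation between a DTI-derived white-matter measure and punching synchrony. As a reminder of what that computation involves, here is a minimal Pearson correlation in plain Python, run on made-up illustrative numbers – not the study’s data:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical numbers, purely for illustration -- not the study's data:
dti_signal = [0.42, 0.45, 0.47, 0.50, 0.53, 0.55]       # e.g. a white-matter metric
punch_synchrony = [0.61, 0.66, 0.64, 0.72, 0.75, 0.79]  # arbitrary synchrony score

print(f"r = {pearson_r(dti_signal, punch_synchrony):.2f}")  # r = 0.97
```

A value of r near 1 would mean the white-matter measure rises in lockstep with synchrony; the study reports such an association, though its actual effect sizes are in the paper, not here.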

(Image credit: Adam J. Merton on Flickr)

Filed under diffusion tensor imaging neuroimaging brain psychology neuroscience martial arts science

51 notes

Scientists Discover Previously Unknown Cleansing System in Brain

A previously unrecognized system that drains waste from the brain at a rapid clip has been discovered by neuroscientists at the University of Rochester Medical Center. The findings were published online August 15 in Science Translational Medicine.

The highly organized system acts like a series of pipes that piggyback on the brain’s blood vessels, sort of a shadow plumbing system that seems to serve much the same function in the brain as the lymph system does in the rest of the body – to drain away waste products.

“Waste clearance is of central importance to every organ, and there have been long-standing questions about how the brain gets rid of its waste,” said Maiken Nedergaard, M.D., D.M.Sc., senior author of the paper and co-director of the University’s Center for Translational Neuromedicine. This work shows that the brain is cleansing itself in a more organized way and on a much larger scale than has been realized previously.

Filed under science neuroscience brain glymphatic system neurodegenerative diseases psychology

38 notes

Acute Stress Alters Control of Gene Activity: Researchers Examine DNA Methylation

ScienceDaily (Aug. 15, 2012) — Acute stress alters the methylation of the DNA and thus the activity of certain genes. This is reported by researchers at the Ruhr-Universität Bochum together with colleagues from Basel, Trier and London for the first time in the journal Translational Psychiatry. “The results provide evidence how stress could be related to a higher risk of mental or physical illness,” says Prof. Dr. Gunther Meinlschmidt from the Clinic of Psychosomatic Medicine and Psychotherapy at the LWL University Hospital of the RUB. The team looked at gene segments which are relevant to biological stress regulation.

In stressful social situations, the methylation patterns (bright spheres) of the DNA change. (Credit: Illustration: Christoph Unternährer and Christian Horisberger)

Epigenetics — the “second code” — regulates gene activity

Our genetic material, the DNA, provides the construction manual for the proteins that our bodies need. Which proteins a cell produces depends on the cell type and the environment. So-termed epigenetic information determines which genes are read, acting quasi as a biological switch. An example of such a switch is provided by methyl (CH3) groups that attach to specific sections of the DNA and can remain there for a long time — even when the cell divides. Previous studies have shown that stressful experiences and psychological trauma in early life are associated with long-term altered DNA methylation. Whether the DNA methylation also changes after acute psychosocial stress, was, however, previously unknown.

Two genes tested

To clarify this issue, the research group examined two genes in particular: the gene for the oxytocin receptor, i.e. the docking site for the neurotransmitter oxytocin, which has become known as the “trust hormone” or “anti-stress hormone”; and the gene for the nerve growth factor Brain-Derived Neurotrophic Factor (BDNF), which is mainly responsible for the development and cross-linking of brain cells. The researchers tested 76 people who had to participate in a fictitious job interview and solve arithmetic problems under observation — a proven means for inducing acute stress in an experiment. For the analysis of the DNA methylation, they took blood samples from the subjects before the test as well as ten and ninety minutes afterwards.

DNA methylation changes under acute psychosocial stress

Stress had no effect on the methylation of the BDNF gene. In a section of the oxytocin receptor gene, however, methylation increased within the first ten minutes of the stressful situation, suggesting that the cells were forming fewer oxytocin receptors. Ninety minutes after the stress test, methylation dropped below its original pre-test level, suggesting that receptor production was then being excessively stimulated.

Possible link between stress and disease

Stress increases the risk of physical or mental illness. The stress-related costs in Germany alone amount to many billions of Euros every year. In recent years, there have been indications that epigenetic processes are involved in the development of various chronic diseases such as cancer or depression. “Epigenetic changes may well be an important link between stress and chronic diseases” says Prof. Meinlschmidt, Head of the Research Department of Psychobiology, Psychosomatics and Psychotherapy at the LWL University Hospital. “We hope to identify more complex epigenetic stress patterns in future and thus to be able to determine the associated risk of disease. This could provide information on new approaches to treatment and prevention.” The work originated within the framework of an interdisciplinary research consortium with the University of Trier, the University of Basel and King’s College London. The German Research Foundation and the Swiss National Science Foundation supported the study.

Source: Science Daily

Filed under brain neuroscience psychology science stress disease DNA methylation DNA

72 notes


In a major breakthrough, an international team of scientists has proven that addiction to morphine and heroin can be blocked, while at the same time increasing pain relief.

The team from the University of Adelaide and University of Colorado has discovered the key mechanism in the body’s immune system that amplifies addiction to opioid drugs. Laboratory studies have shown that the drug (+)-naloxone will selectively block the immune-addiction response. The results - which could eventually lead to new co-formulated drugs that assist patients with severe pain, as well as helping heroin users to kick the habit - will be published in the Journal of Neuroscience.

"Our studies have shown conclusively that we can block addiction via the immune system of the brain, without targeting the brain’s wiring," says the lead author of the study, Dr Mark Hutchinson, ARC Research Fellow in the University of Adelaide’s School of Medical Sciences.

"Both the central nervous system and the immune system play important roles in creating addiction, but our studies have shown we only need to block the immune response in the brain to prevent cravings for opioid drugs."


Filed under science neuroscience brain psychology pain morphine heroin opioid drugs addiction

141 notes


Yale researchers studying epileptic seizures have shed new light on the neurological origins of consciousness.

When epileptics lose consciousness during a variety of types of seizures, the signals converge on the same brain structures, but through different pathways, says Dr. Hal Blumenfeld, professor of neurology, neurobiology, and neurosurgery.

“Understanding of these mechanisms could lead to improved treatment strategies to prevent impairment of consciousness and improve the quality of life of people with epilepsy,” he said.

Blumenfeld’s research is described in the current issue of the journal Lancet Neurology.

(Image: The fMRI images are different viewpoints of the brain of a child experiencing an epileptic seizure. Areas in yellow and orange represent increased brain activity compared to its normal state, while areas in blue show decreased activity. These are the areas of the brain needed for normal consciousness.)

Filed under consciousness epilepsy seizures science brain psychology neuroscience
