Posts tagged science

In 2000, a team of neuroscientists put an unusual idea to the test. Stress and depression, they knew, made neurons wither and die – particularly in the hippocampus, a brain area crucial for memory. So the researchers put some stressed-out rats on an antidepressant regimen, hoping the mood boost might protect some of those hippocampal neurons. When they checked in a few weeks later, though, the team found that the rats’ hippocampi hadn’t just survived intact; they’d grown whole new neurons – bundles of them. But that’s only the beginning of our tale.

Neural stem cells (green) in the hippocampus huddle around a neuron (purple), listening for stray signals.
By the time 2009 rolled around, another team of researchers was suggesting that human brains might get a similar hippocampal boost from antidepressants. The press announced the discovery with headlines like, “Antidepressants Grow New Brain Cells” – although not everyone agreed with that conclusion. Still, whether the principle applied to humans or not, a far more basic question was begging to be answered: How, exactly, does a brain tell new cells to form?
“Well, through synapses, of course,” you might answer – and that’d be a very reasonable guess. After all, synapses are how most neurons talk to each other: electrochemical information is “squirted” from a tiny tendril of one neuron into the tip of a tendril on another; and cells throughout most of the brain share essentially this same mechanism for passing signals along: The signals coming out of Neuron A’s synapses keep bugging Neuron B by stimulating its synapses, until finally Neuron B caves under peer pressure and bugs Neuron C with the signal… and so on.
There are, however, two significant exceptions to this system.
The first exception was discovered a few years ago, as scientists got more and more curious about the role of neuroglia (also known as just “glia”), synapse-less cells that many had assumed were just there to serve as structural support for neurons. A 2008 study showed that glia help control cerebral blood flow, and research in 2010 demonstrated that some glia – cells known as astrocytes – actively listen for and respond to certain neurotransmitter messages. These so-called “quiet cells” are actually pretty loud talkers once you learn to tune in to their chatter.
The second exception to the synapse rule is even more mysterious – in large part because it’s a brand-new discovery: As the journal Nature reports, a team led by Hongjun Song at the Johns Hopkins University School of Medicine has found that neural stem cells “listen in” on the stray chemical signals that leak from synapses.
You can imagine neural stem cells as being sort of “neural embryos” – depending on the surrounding conditions, they can develop into neurons or into glia. And here’s what’s strange about the way these cells communicate: They respond not to any single synaptic signal, but to the overall chemical “vibe” of their environment – to chronic feelings of stress, for instance. By way of response, they may morph into neurons or glia – or even tell the brain to crank out some all-new cells.
Neural stem cells seem to be particularly interested in the chemical GABA (gamma-aminobutyric acid) – a neurotransmitter that’s known to be involved in inhibiting signals from other neurons. When scientists artificially block these stem cells’ GABA receptors from receiving messages, the cells “wake up” and start replicating – but when those GABA signals are allowed to reach the receptors, the stem cells stay dormant.
“In this case,” Song explains, “GABA communication keeps the brain stem cells in reserve, so if we don’t need them, we don’t use them up.”
In short, leaky synapses aren’t wasteful – as a matter of fact, they’re essential to the brain’s self-sculpting abilities. And this implies something pretty interesting: It isn’t just individual signals that convey neural information, but whole experiences. In that respect, a brain – whether it belongs to a rat or a human – is unlike any computer on earth.
August 15, 2012
Source: Scientific American
When something goes wrong in your brain, you’d think it would be a good idea to get rid of the problem. Turns out, sometimes it’s best to keep hold of it. By preventing faulty proteins from being destroyed, researchers have delayed the symptoms of a degenerative brain disorder.
SNAP25 is one of three proteins that together make up a complex called SNARE, which plays a vital role in allowing neurons to communicate with each other. In order to work properly, all the proteins must be folded in a specific way. CSP alpha is one of the key proteins that ensures SNAP25 is correctly folded.
Cells have a backup system to deal with any misfolded proteins – they are destroyed by a barrel-shaped enzyme complex called the proteasome, which pulls the proteins inside itself and breaks them down.
People with a genetic mutation that affects the CSP alpha protein – and its ability to correctly fold SNAP25 – can develop a rare brain disorder called neuronal ceroid lipofuscinosis (NCL). The disorder causes significant damage to neurons – people affected gradually lose their cognitive abilities and struggle to move normally.
To find out what role proteasomes might play in NCL, Manu Sharma and his colleagues at Stanford University in California blocked the enzyme in mice that were bred to lack CSP alpha. “We weren’t sure what would happen,” says Sharma. Either the misfolded SNAP25 would accumulate and harm the cells, or some of the misfolded proteins might retain enough of their function to keep the cells working.
It appears it was the latter. Mice bred to lack CSP alpha suffer the same physical and cognitive problems as humans, and tend to survive for about 65 to 80 days, rather than the normal 670 days. But mice injected with a drug that blocked the proteasome lived, on average, an extra 15 days. “Fifteen days might not sound like much, but as a percentage it’s quite significant,” says Sharma. What’s more, treated mice were able to stave off measurable movement and cognitive symptoms for an extra 10 days.
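Sharma’s point about percentages can be checked with some quick back-of-envelope arithmetic, using only the survival figures quoted above:

```python
# Back-of-envelope check of the lifespan extension quoted above.
baseline_low, baseline_high = 65, 80   # days survived by untreated CSP-alpha-deficient mice
extension = 15                         # extra days after proteasome blockade

for baseline in (baseline_low, baseline_high):
    pct = 100 * extension / baseline
    print(f"{baseline}-day baseline -> {pct:.0f}% longer life")
# -> 65-day baseline: 23% longer life; 80-day baseline: 19% longer life
```

A roughly 20 per cent extension of lifespan is indeed substantial for a single intervention.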
The finding goes against the idea that neurodegenerative disorders should be treated by clearing away misfolded proteins; instead, it suggests trying to rescue their function. “People normally think that the proteasome isn’t working hard enough,” says Nico Dantuma at the Karolinska Institute in Stockholm, Sweden, who was not involved in the study.
But whether or not the drugs are likely to work in other neurodegenerative disorders involving aggregations of misfolded proteins, such as Alzheimer’s and Parkinson’s disease, is up for debate. “I don’t think their results prove that clearing misfolded proteins is not a useful therapeutic,” says Ana Maria Cuervo at Albert Einstein College of Medicine in New York. Other studies that increase the degrading of misfolded proteins have been shown to improve symptoms in other neurodegenerative diseases, she says.
"There are two sides of the coin," says Dantuma. "You might rescue functioning proteins from being degraded… but it’s too early to extrapolate these results to Alzheimer’s and Parkinson’s disease."
In the meantime, drugs that block the proteasome are already used to treat cancer, so Sharma hopes they can soon be trialled in people with NCL.
Source: NewScientist
The 2007 study by Yale University researchers provided the first evidence that 6- and 10-month-old infants could assess individuals based on their behaviour towards others, showing a preference for those who helped rather than hindered another individual.
Based on a series of experiments, researchers in the Department of Psychology at Otago have shown that the earlier findings may simply be the result of infants’ preferences for interesting and attention-grabbing events, rather than an ability to evaluate individuals based on their social interactions with others.
"The paper received a lot of attention when it was first published, including coverage in the New York Times. It has received well over 100 citations since 2007, a phenomenal number over such a short period. The paper was initially brought to our attention by one of the PhD students in our lab. The head of the lab, Professor Harlene Hayne, suggested that a group of us read the paper together and then meet to discuss it. Our original motivation for reading the paper was merely interest. Obviously, the idea that morality is innate is extremely interesting and, if true, would raise questions about which components of our moral system are innate and also have implications for the wider issue of the roles that nature and nurture play in development," says Dr Scarf.
The Otago study was recently published in PLoS ONE.
Depression takes a substantial toll on brain health. Brain imaging and post-mortem studies provide evidence that the wealth of connections in the brain is reduced in individuals with depression, resulting in impaired functional connections between key brain centers involved in mood regulation. Glial cells are one of the cell types that appear to be particularly reduced when analyzing post-mortem brain tissue from people who had depression. Glial cells support the growth and function of nerve cells and their connections.
Over the past several years, it has become increasingly recognized that antidepressants produce positive effects on brain structure that complement their effects on symptoms of depression. These structural effects of antidepressants appear to depend, in large part, on their ability to raise the levels of growth factors in the brain.
In a new study, Elsayed and colleagues from the Yale University School of Medicine report their findings on a relatively novel growth factor named fibroblast growth factor-2 or FGF2. They found that FGF2 can increase the number of glial cells and block the decrease caused by chronic stress exposure by promoting the generation of new glial cells.
Senior author Dr. Ronald Duman said, “Our study uncovers a new pathway that can be targeted for treating depression. Our research shows that we can increase the production and maintenance of glial cells that are important for supporting neurons, providing an enriched environment for proper neuronal function.”
To study whether FGF2 can treat depression, the researchers used rodent models where animals are subjected to various natural stressors, which can trigger behaviors that are similar to those expressed by depressed humans, such as despair and loss of pleasure. FGF2 infusions restored the deficit in glial cell number caused by chronic stress. An underlying molecular mechanism was also identified when the data showed that antidepressants increase glial generation and function via increasing FGF2 signaling.
"Although more research is warranted to explore the contribution of glial cells to the antidepressant effects of FGF2, the results of this study present a fundamental new mechanism that merits attention in the quest to find more efficacious and faster-acting antidepressant drugs," concluded Duman.
"The deeper that science digs into the biology underlying antidepressant action, the more complex it becomes. Yet understanding this complexity increases the power of the science, suggesting reasons for the limitations of antidepressant treatment and pointing to novel approaches to the treatment of depression," commented Dr. John Krystal, Editor of Biological Psychiatry and Chairman of the Department of Psychiatry at the Yale University School of Medicine.
Source: Bio-Medicine
New genetic data shows humans and great apes diverged earlier than thought
To calculate when two species diverged, researchers look at mutation rates and the average age at which members of each species give birth. The older that average age, the more time it takes for mutations to cause changes. Insects that produce offspring in a matter of months, for example, can adapt much more quickly to environmental changes than large animals that produce offspring many years after they themselves are born. To find such data for both chimps and gorillas, the research team worked with many groups in Africa whose studies of the animals covered a total of 105 gorillas and 226 chimps. They also looked at fossilized excrement that contained DNA data. In so doing they found that the average age of giving birth for female chimps was 25 years. They then divided the number of mutations found by the average age at giving birth to get the mutation rate, and found it to be slower than that of humans – which meant that earlier divergence-time estimates based on the human rate were likely off by as much as a million years.
The end result of the team’s research indicates that humans and chimps likely diverged some seven to eight million years ago, while gorillas diverged from the lineage that led to both humans and chimps approximately eight to nineteen million years ago. To put the numbers in perspective, humans and Neanderthals split just half to three-quarters of a million years ago.
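The arithmetic described above can be sketched in a few lines. Note that every numeric value below except the 25-year chimp generation time is an illustrative placeholder, not the study’s actual data; the placeholders were merely chosen so the result lands near the seven-to-eight-million-year figure reported:

```python
# Illustrative sketch of the divergence-time arithmetic described above.
# All values except gen_time are placeholders, not the study's measurements.

gen_mut_rate = 2.0e-8   # mutations per genome site per generation (placeholder)
gen_time = 25.0         # average maternal age at giving birth, years (chimp figure above)
divergence = 0.012      # fraction of sites differing between the two genomes (placeholder)

# A slower per-year mutation rate stretches the same divergence over more time.
rate_per_year = gen_mut_rate / gen_time           # 8.0e-10 per site per year
split_years = divergence / (2 * rate_per_year)    # factor of 2: both lineages accumulate mutations
print(f"estimated split: {split_years / 1e6:.1f} million years ago")
# -> estimated split: 7.5 million years ago
```

The key sensitivity is visible in the formula: halving the per-year mutation rate doubles the inferred divergence time, which is why a slower-than-assumed rate pushes the human–chimp split further into the past.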
Low-Power Chips to Model a Billion Neurons
It’s a little sobering, actually. The average human brain packs a hundred billion or so neurons—connected by a quadrillion (10¹⁵) constantly changing synapses—into a space the size of a cantaloupe. It consumes a paltry 20 watts, much less than a typical incandescent lightbulb. But simulating this mess of wetware with traditional digital circuits would require a supercomputer that’s a good 1000 times as powerful as the best ones we have available today. And we’d need the output of an entire nuclear power plant to run it.
Fortunately, we don’t have to rely on traditional, power-hungry computers to get us there. Scattered around the world are at least half a dozen projects dedicated to building brain models using specialized analog circuits. Unlike the digital circuits in traditional computers, which could take weeks or even months to model a single second of brain operation, these analog circuits can model brain activity as fast as or even faster than it really occurs, and they consume a fraction of the power. But analog chips do have one serious drawback—they aren’t very programmable. The equations used to model the brain in an analog circuit are physically hardwired in a way that affects every detail of the design, right down to the placement of every analog adder and multiplier. This makes it hard to overhaul the model, something we’d have to do again and again because we still don’t know what level of biological detail we’ll need in order to mimic the way brains behave.
To help things along, my colleagues and I are building something a bit different: the first low-power, large-scale digital model of the brain. Dubbed SpiNNaker, for Spiking Neural Network Architecture, our machine looks a lot like a conventional parallel computer, but it boasts some significant changes to the way chips communicate. We expect it will let us model brain activity with speeds matching those of biological systems but with all the flexibility of a supercomputer.
Another team, led by Dharmendra Modha at IBM Almaden Research Center, in San Jose, Calif., works on supercomputer models of the cortex, the outer, information-processing layer of the brain, using simpler neuron models. In 2009, team members at IBM and Lawrence Livermore National Laboratory showed they could simulate the activity of 900 million neurons connected by 9 trillion synapses, more than are in a cat’s cortex. But as has been the case for all such models, its simulations were quite slow. The computer needed many minutes to model a second’s worth of brain activity.
One way to speed things up is by using custom-made analog circuits that directly mimic the operation of the brain. Traditional analog circuits—like the chips being developed by the BrainScaleS project at the Kirchhoff Institute for Physics, in Heidelberg, Germany—can run 10,000 times as fast as the corresponding parts of the brain. They’re also fabulously energy efficient. A digital logic circuit may need thousands of transistors to perform a multiplication, but analog circuits need only a few. When you break it down to the level of modeling the transmission of a single neural signal, these circuits consume about 0.001 percent as much energy as a supercomputer would need to perform the same task. Considering you’d need to perform that operation 10 quadrillion times a second, that translates into some significant energy savings. While a whole brain model built using today’s digital technology could easily consume more than US $10 billion a year in electricity, the power bill for a similar-scale analog system would likely come to less than $1 million.
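The scale of those energy figures can be sanity-checked with rough arithmetic. The per-event energy below is an illustrative placeholder; the event rate and the analog/digital ratio come from the figures quoted above:

```python
# Rough scale check of the energy comparison above. The per-event energy is a
# placeholder assumption; the other two figures are quoted in the text.

synaptic_events_per_sec = 1e16     # ~10 quadrillion signal transmissions per second
digital_joules_per_event = 1e-7    # energy per modeled event on digital hardware (placeholder)
analog_fraction = 1e-5             # analog uses ~0.001% of the digital energy per event

digital_watts = synaptic_events_per_sec * digital_joules_per_event  # 1e9 W: a full power plant
analog_watts = digital_watts * analog_fraction                      # 1e4 W: a small office building
print(f"digital: {digital_watts / 1e6:.0f} MW  vs  analog: {analog_watts / 1e3:.0f} kW")
# -> digital: 1000 MW  vs  analog: 10 kW
```

Under these placeholder numbers the digital simulation sits at gigawatt scale—nuclear-plant territory, as the article says—while the analog version drops to the consumption of a modest building.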
ScienceDaily (Aug. 15, 2012) — Long-term methadone treatment can cause changes in the brain, according to recent studies from the Norwegian Institute of Public Health. The results show that treatment may affect the nerve cells in the brain. The studies follow on from previous studies where methadone was seen to affect cognitive functioning, such as learning and memory.
Since it is difficult to perform controlled studies of methadone patients and unethical to attempt in healthy volunteers, rats were used in the studies. Previous research has shown that methadone can affect cognitive functioning in both humans and experimental animals.
Sharp decrease in key signaling molecule
Rats were given a daily dose of methadone for three weeks. Once treatment was completed, brain areas which are central for learning and memory were removed and examined for possible neurobiological changes or damage.
In one study, on the day after the last exposure to methadone, there was a significant reduction (around 70 per cent) in the level of a signalling molecule important for learning and memory, in both the hippocampus and the frontal area of the brain. This reduction supports findings from a previous study (Andersen et al., 2011), in which impaired attention was found in rats at the same time point. By then, methadone is no longer present in the brain. This indicates that methadone can lead to cellular changes that affect cognitive functioning after the drug has left the body, which may be cause for concern.
No effect on cell generation
The second study, a joint project with Southwestern University in Texas, investigated whether methadone affects the formation of nerve cells in the hippocampus. Previous research has shown that new nerve cells are generated in the hippocampus in both adult humans and rats, and that this formation is probably important for learning and memory. Furthermore, it has been shown that other opiates such as morphine and heroin can inhibit this formation. It was therefore reasonable to assume that methadone, which is also an opiate, would have the same effect.
However, the researchers did not find any change in the generation of new nerve cells after long-term methadone treatment. If the same is true in humans, this suggests that methadone is probably a better option for patients than continued heroin use. However, the researchers do not know what effect methadone has on nerve cells that have previously been exposed to heroin.
Large gaps in knowledge
Since the mid-1960s, methadone has been used to treat heroin addiction. This is considered to be a successful treatment but, despite extensive and prolonged use, little is known about possible side effects. There are large knowledge gaps in this field.
Our studies show that prolonged methadone treatment can affect the nerve cells, and thus behaviour, but the results are not always as expected. Many more pre-clinical and clinical studies are needed to understand methadone’s effect on the brain, whether it results in altered cognitive function and, if so, how long these changes last. Knowledge of this is important — both for the individual methadone patient and for the outcome of treatment.
Source: Science Daily
New University of Otago research into two sex hormones released by the testes of male fetuses and boys may help solve the enduring mystery of why autism is much more common in boys than girls.
The researchers studied blood samples from 82 boys with autism spectrum disorder (ASD) and 16 control boys, all aged between 4.4 and 8.9 years. Measuring the levels of the two hormones, the researchers found that these were highly variable from boy to boy, but no different on average between the two groups.
Professor McLennan says the findings indicate that male hormones play a role in autism, but not because autistic boys have abnormal levels of them.
While it has been previously suggested that exposure in the womb to excessive levels of testosterone might be creating an ‘extreme male brain’, this does not explain why some females have autism, or why males with autism do not exhibit an extreme male physical form.
“Our data suggest that the still-elusive primary initiating cause of ASD is common to both males and females, with the condition being more frequent in males because normal levels of male hormones exacerbate the pathology,” he says.
The researchers say that their hypothesis now needs further testing through longitudinal studies of at-risk male babies, to determine whether levels of anti-Müllerian hormone (AMH) and inhibin B (InhB) early in development can predict the breadth of autistic traits later in life.
(Image credit: ©iStockphoto.com/ktaylorg)
Brain scans have revealed distinctive features in the brain structure of karate experts that are associated with how well they performed in a test of punching ability.
Researchers from Imperial College London and UCL looked for differences in brain structure between 12 karate practitioners with a black belt rank and an average of 13.8 years’ karate experience, and 12 people of similar age who exercised regularly but did not have any martial arts experience.
Dr Ed Roberts, from the Department of Medicine at Imperial College London, who led the study, explained: "The karate black belts were able to repeatedly coordinate their punching action with a level of coordination that novices can’t produce. We think that ability might be related to fine-tuning of neural connections in the cerebellum, allowing them to synchronise their arm and trunk movements very accurately."
The scans used in this study, called diffusion tensor imaging (DTI), detected structural differences in the white matter of parts of the brain called the cerebellum and the primary motor cortex, which are known to be involved in controlling movement. The differences measured by DTI in the cerebellum correlated with the synchronicity of the subjects’ wrist and shoulder movements when punching.
The DTI signal also correlated with the age at which karate experts began training and their total experience of the discipline. These findings suggest that the structural differences in the brain are related to the black belts’ punching ability.
(Image credit: Adam J. Merton on Flickr)
Scientists Discover Previously Unknown Cleansing System in Brain
A previously unrecognized system that drains waste from the brain at a rapid clip has been discovered by neuroscientists at the University of Rochester Medical Center. The findings were published online August 15 in Science Translational Medicine.
The highly organized system acts like a series of pipes that piggyback on the brain’s blood vessels, sort of a shadow plumbing system that seems to serve much the same function in the brain as the lymph system does in the rest of the body – to drain away waste products.
“Waste clearance is of central importance to every organ, and there have been long-standing questions about how the brain gets rid of its waste,” said Maiken Nedergaard, M.D., D.M.Sc., senior author of the paper and co-director of the University’s Center for Translational Neuromedicine. This work shows that the brain is cleansing itself in a more organized way and on a much larger scale than has been realized previously.