Posts tagged science

Piglets substitute for human babies in cognitive science maze test
A team from the Beckman Institute at the University of Illinois is using piglets instead of human babies to model the cognitive development of infants.
Human infants cannot be used as laboratory subjects. The idea to use piglets came when one of neuroscientist Rodney Johnson’s former students, who was working for an infant formula company, asked him about finding ways to monitor the differences in cognitive development between breast-fed and formula-fed children.
As a result he and his colleague Ryan Dilger became interested in using the neonatal piglet as a model for human brain development. The growth and development of the piglet brain is similar to that of the human brain — at birth the human brain is 25 percent of adult size. In the first two years of life, it reaches 85 to 90 percent of adult size. The piglet brain grows in a similar way in a shorter time.
They wanted to see whether they could develop tests to look at learning and memory development using these pigs. First of all they developed MRI techniques to quantify the size of the brain, taking measurements at regular intervals.
They then developed a test using a maze to assess piglets’ learning and memory. This turned out to be much more complicated than expected. Johnson said: “When we first started these studies, we used things like Skittles and apple slices as a reward because that’s what people using older pigs had done.”
However, the piglets were used to being fed on infant formula and so had no interest in solid food, nor were they motivated to perform tasks if the reward was the same as their regular food. The solution was to use Nesquik chocolate milk as a reward.
Tests took place in a plus-sign-shaped maze with one arm blocked off to leave a T shape. Piglets were trained to locate the milk reward using visual cues from outside the maze. Once they had learned to do this, the reward location was moved and the pigs were retested to assess learning and working memory.
Having established that the tests can be used to measure cognitive abilities, the team will examine how nutrient deficiencies (such as iron) and infections (such as pneumonia) affect the human brain during this time of early brain growth.
Johnson said: “There is a lot of interest in the concept of programming, the notion that things that occur early in life set that individual up for problems that occur many years later. Because the pig brain grows so much like a human brain, we thought this could be a very attractive model.”
In order to measure changes in the brain they look at neuroinflammation, neuron growth and change, as well as biochemical changes in the brain.
The team hopes to receive funding to study maternal viral infections: pregnant pigs will be infected with diseases to see how infection affects the brain development of their offspring.

Connectomics: Mapping the Neural Network Governing Male Roundworm Mating
In a study published today online in Science, researchers at Albert Einstein College of Medicine of Yeshiva University have determined the complete wiring diagram for the part of the nervous system controlling mating in the male roundworm Caenorhabditis elegans, an animal model intensively studied by scientists worldwide.
The study represents a major contribution to the new field of connectomics – the effort to map the myriad neural connections in a brain, brain region or nervous system to find the specific nerve connections responsible for particular behaviors. A long-term goal of connectomics is to map the human “connectome” – all the nerve connections within the human brain.
Because C. elegans is such a tiny animal – adults are one millimeter long and consist of just 959 cells – its simple nervous system of 302 neurons makes it one of the best animal models for understanding the millions-of-times-more-complex human brain.
The Einstein scientists solved the structure of the male worm’s neural mating circuits by developing software that they used to analyze serial electron micrographs that other scientists had taken of the region. They found that male mating requires 144 neurons – nearly half the worm’s total number – and their paper describes the connections between those 144 neurons and 64 muscles involving some 8,000 synapses. A synapse is the junction at which one neuron (nerve cell) passes an electrical or chemical signal to another neuron.
"Establishing the complete structure of the synaptic network governing mating behavior in the male roundworm has been highly revealing," said Scott Emmons, Ph.D., senior author of the paper and professor in the department of genetics and in the Dominick P. Purpura Department of Neuroscience at Einstein. "We can see that the structure of this network has spatial characteristics that help explain how it exerts neural control over the multi-step decision-making process involved in mating."
In addition to determining how the neurons and muscles are connected, Dr. Emmons and his colleagues for the first time accurately measured the weights of those connections, i.e., an estimate of the strength with which one neuron or muscle communicates with another.
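The wiring diagram described above is, in effect, a weighted directed graph: cells are nodes, and a connection's weight is the number of synapses from one cell onto another. A minimal sketch of that data structure is below; the cell names and synapse counts are made up for illustration and are not from the actual C. elegans dataset.

```python
# Minimal sketch of a connectome as a weighted directed graph. Cell names
# and counts here are hypothetical; the real male wiring diagram comprises
# 144 neurons, 64 muscles, and roughly 8,000 synapses.

from collections import defaultdict

class Connectome:
    def __init__(self):
        # adjacency: pre-synaptic cell -> {post-synaptic cell: synapse count}
        self.adj = defaultdict(dict)

    def add_synapses(self, pre, post, count=1):
        """Record `count` synapses from cell `pre` onto cell `post`."""
        self.adj[pre][post] = self.adj[pre].get(post, 0) + count

    def weight(self, pre, post):
        """Connection weight: total synapses from `pre` onto `post`."""
        return self.adj[pre].get(post, 0)

    def out_degree(self, cell):
        """Number of distinct downstream partners of `cell`."""
        return len(self.adj[cell])

# Toy example with made-up identifiers:
c = Connectome()
c.add_synapses("sensory_A", "inter_B", 3)
c.add_synapses("inter_B", "muscle_C", 5)
c.add_synapses("sensory_A", "inter_B", 2)  # repeated synapses accumulate

print(c.weight("sensory_A", "inter_B"))  # 5
print(c.out_degree("sensory_A"))         # 1
```

Counting synapses per connection, rather than recording a bare link, is what lets a weight serve as an estimate of how strongly one cell drives another.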
ScienceDaily (July 25, 2012) — A team of University of California, Berkeley, scientists in collaboration with researchers at the University of Munich and University of Washington, in Seattle, has discovered a chemical that temporarily restores some vision to blind mice, and is working on an improved compound that may someday allow people with degenerative blindness to see again.

Mice with a genetic disease that causes blindness regained some sight after injection with a chemical “photoswitch.” The eye of the untreated mouse on the left shows no response to light, while the pupil of the mouse on the right, which was injected with the chemical, contracts in light. (Credit: Image courtesy of University of California - Berkeley)
The approach could eventually help those with retinitis pigmentosa, a genetic disease that is the most common inherited form of blindness, as well as age-related macular degeneration, the most common cause of acquired blindness in the developed world. In both diseases, the light sensitive cells in the retina — the rods and cones — die, leaving the eye without functional photoreceptors.
The chemical, called AAQ, acts by making the remaining, normally “blind” cells in the retina sensitive to light, said lead researcher Richard Kramer, UC Berkeley professor of molecular and cell biology. AAQ is a photoswitch that binds to protein ion channels on the surface of retinal cells. When switched on by light, AAQ alters the flow of ions through the channels and activates these neurons much the way rods and cones are activated by light.
"This is similar to the way local anesthetics work: they embed themselves in ion channels and stick around for a long time, so that you stay numb for a long time," Kramer said. "Our molecule is different in that it’s light sensitive, so you can turn it on and off and turn on or off neural activity."
Because the chemical eventually wears off, it may offer a safer alternative to other experimental approaches for restoring sight, such as gene or stem cell therapies, which permanently change the retina. It is also less invasive than implanting light-sensitive electronic chips in the eye.
"The advantage of this approach is that it is a simple chemical, which means that you can change the dosage, you can use it in combination with other therapies, or you can discontinue the therapy if you don’t like the results. As improved chemicals become available, you could offer them to patients. You can’t do that when you surgically implant a chip or after you genetically modify somebody," Kramer said.
"This is a major advance in the field of vision restoration," said co-author Dr. Russell Van Gelder, an ophthalmologist and chair of the Department of Ophthalmology at the University of Washington, Seattle.
Kramer, Van Gelder, chemist Dirk Trauner and their colleagues at UC Berkeley, the University of Washington, Seattle, and the University of Munich will publish their findings on July 26, in the journal Neuron.
The blind mice in the experiment had genetic mutations that made their rods and cones die within months of birth and inactivated other photopigments in the eye. After injecting very small amounts of AAQ into the eyes of the blind mice, Kramer and his colleagues confirmed that they had restored light sensitivity: the mice’s pupils contracted in bright light, and the mice showed light avoidance, a typical rodent behavior that is impossible unless the animals can detect some light. Kramer hopes to conduct more sophisticated vision tests in rodents injected with the next generation of the compound.
"The photoswitch approach offers real hope to patients with retinal degeneration," Van Gelder said. "We still need to show that these compounds are safe and will work in people the way they work in mice, but these results demonstrate that this class of compound restores light sensitivity to retinas blind from genetic disease."
From optogenetics to implanted chips
The current technologies being evaluated for restoring sight to people whose rods and cones have died include injection of stem cells to regenerate the rods and cones; “optogenetics,” that is, gene therapy to insert a photoreceptor gene into blind neurons to make them sensitive to light; and installation of electronic prosthetic devices, such as a small light-sensitive retinal chip with electrodes that stimulate blind neurons. Several dozen people already have retinal implants and have had rudimentary, low vision restored, Kramer said.
Eight years ago, Kramer, Trauner, a former UC Berkeley chemist now at the University of Munich, and their colleagues developed an optogenetic technique to chemically alter potassium ion channels in blind neurons so that a photoswitch could latch on. Potassium channels normally open to turn a cell off, but with the attached photoswitch, they were opened when hit by ultraviolet light and closed when hit by green light, thereby activating and deactivating the neurons.
Subsequently, Trauner synthesized AAQ (acrylamide-azobenzene-quaternary ammonium), a photoswitch that attaches to potassium channels without the need to genetically modify the channel. Tests of this compound are reported in the current Neuron paper.
New versions of AAQ now being tested are better, Kramer said. They activate neurons for days rather than hours using blue-green light of moderate intensity, and these photoswitches naturally deactivate in darkness, so that a second color of light is not needed to switch them off.
"This is what we are really excited about," he said.
Source: Science Daily
ScienceDaily (July 25, 2012) — A new gene therapy approach can reverse hearing loss caused by a genetic defect in a mouse model of congenital deafness, according to a preclinical study published by Cell Press in the July 26 issue of the journal Neuron. The findings present a promising therapeutic avenue for potentially treating individuals who are born deaf.

(Credit: © Vasiliy Koval / Fotolia)
"This is the first time that an inherited, genetic hearing loss has been successfully treated in laboratory mice, and as such represents an important milestone for treating genetic deafness in humans," says senior study author Lawrence Lustig of the University of California, San Francisco.
Hearing loss is one of the most common human sensory deficits, and it results from damage to hair cells in the inner ear. About half of the cases of congenital hearing loss are caused by genetic defects. However, the current treatment options — hearing amplification devices and cochlear implants — do not restore hearing to normal levels. Correcting the underlying genetic defects has the potential to fully restore hearing, but previous attempts to reverse hearing loss caused by genetic mutations have not been successful.
Addressing this challenge in the new study, Lustig and his team used mice with hereditary deafness caused by a mutation in a gene coding for a protein called vesicular glutamate transporter-3 (VGLUT3). This protein is crucial for inner hair cells to send signals that enable hearing. Two weeks after the researchers delivered the VGLUT3 gene into the inner ear through an injection, hearing was restored in all of the mice. This improvement lasted between seven weeks and one and a half years when adult mice were treated, and at least nine months when newborn mice received the treatment.
The therapy did not damage the inner ear, and it even corrected some structural defects in the inner hair cells. Because the specific gene delivery method used is safe and effective in animals, the findings hold promise for future human studies. “For years, scientists have been hinting at the possibility of gene therapy as a potential cure for deafness,” Lustig says. “In this study, we now provide a very real and big step towards that goal.”
Source: Science Daily
July 25, 2012
Raising levels of the neurotransmitter dopamine in the frontal cortex of the brain significantly decreased impulsivity in healthy adults, in a study conducted by researchers at the Ernest Gallo Clinic and Research Center at the University of California, San Francisco.
"Impulsivity is a risk factor for addiction to many substances, and it has been suggested that people with lower dopamine levels in the frontal cortex tend to be more impulsive," said lead author Andrew Kayser, PhD, an investigator at Gallo and an assistant professor of neurology at UCSF. "We wanted to see if we could decrease impulsivity by raising dopamine, and it seems as if we can."
The study was published on July 4 in the Journal of Neuroscience.
In a double-blinded, placebo-controlled study, 23 adult research participants were given either tolcapone, a medication approved by the Food and Drug Administration (FDA) that inhibits a dopamine-degrading enzyme, or a placebo. The researchers then gave the participants a task that measured impulsivity, asking them to make a hypothetical choice between receiving a smaller amount of money immediately (“smaller sooner”) or a larger amount at a later time (“larger later”). Each participant was tested twice, once with tolcapone and once with placebo.
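Choices in this kind of "smaller sooner" versus "larger later" task are commonly modeled with hyperbolic discounting, where a delayed reward's present value falls as V = A / (1 + kD) and a larger k corresponds to a more impulsive chooser. The article does not specify the study's analysis; the sketch below is only a standard illustration of how such choices can be quantified, with arbitrary amounts and delays.

```python
# Hedged illustration of delay discounting, not the study's actual model:
# hyperbolic present value V = amount / (1 + k * delay), where a larger k
# means steeper discounting, i.e. a more impulsive chooser.

def discounted_value(amount, delay_days, k):
    """Hyperbolic present value of `amount` received after `delay_days`."""
    return amount / (1.0 + k * delay_days)

def choose(smaller_sooner, larger_later, k):
    """Pick the option with the higher discounted value.
    Each option is an (amount, delay_days) tuple."""
    v_ss = discounted_value(*smaller_sooner, k)
    v_ll = discounted_value(*larger_later, k)
    return "smaller sooner" if v_ss > v_ll else "larger later"

# An impulsive chooser (large k) takes the immediate reward;
# a patient one (small k) waits for the larger payoff.
print(choose((20, 0), (50, 30), k=0.2))   # smaller sooner
print(choose((20, 0), (50, 30), k=0.01))  # larger later
```

Fitting k from a participant's pattern of such choices is one common way to turn the task into a single impulsivity score that can be compared across drug and placebo sessions.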
Participants – especially those who were more impulsive at baseline – were more likely to choose the less impulsive “larger later” option after taking tolcapone than they were after taking the placebo.
Magnetic resonance imaging conducted while the participants were taking the test confirmed that regions of the frontal cortex associated with decision-making were more active in the presence of tolcapone than in the presence of placebo.
"To our knowledge, this is the first study to use tolcapone to look for an effect on impulsivity," said Kayser.
The study was not designed to investigate the reasons that reduced dopamine is linked with impulsivity. However, explained Kayser, scientists believe that impulsivity is associated with an imbalance in dopamine between the frontal cortex, which governs executive functions such as cognitive control and self-regulation, and the striatum, which is thought to be involved in the planning and modification of more habitual behaviors.
"Most, if not all, drugs of abuse, such as cocaine and amphetamine, directly or indirectly involve the dopamine system," said Kayser. "They tend to increase dopamine in the striatum, which in turn may reward impulsive behavior. In a very simplistic fashion, the striatum is saying ‘go,’ and the frontal cortex is saying ‘stop.’ If you take cocaine, you’re increasing the ‘go’ signal, and the ‘stop’ signal is not adequate to counteract it."
Kayser and his research team plan a follow-up study of the effects of tolcapone on drinking behavior. “Once we determine whether drinkers can safely tolerate this medication, we will see if it has any effect on how much they drink while they’re taking it,” said Kayser.
Tolcapone is approved as a medication for Parkinson’s disease, in which a chronic deficit of dopamine inhibits movement.
Provided by University of California, San Francisco
Source: medicalxpress.com

Ecstasy Harms Memory With One Year of Recreational Use
New research published online July 25 by the scientific journal Addiction gives some of the first information available on the actual risks of using ecstasy. It shows that even in recreational amounts over a relatively short period, ecstasy users risk specific memory impairments. Further, because the nature of the impairments may not be immediately obvious, users may not notice that they are being harmed by the drug until it is too late.
According to the study, new ecstasy users who took ten or more ecstasy pills over their first year of use showed decreased function of their immediate and short-term memory compared with their pre-ecstasy performance. These findings are associated with damage of the hippocampus, the area of the brain that oversees memory function and navigation. Interestingly, hippocampal damage is one of the first signs of Alzheimer’s disease, resulting in memory loss and disorientation.
July 25, 2012
Cognition psychologists at the Ruhr-Universität, together with colleagues from the University Hospital Bergmannsheil (Prof. Dr. Martin Tegenthoff), have discovered why stressed persons are more likely to lapse back into habits than to behave in a goal-directed way. The team of PD Dr. Lars Schwabe and Prof. Dr. Oliver Wolf from the Institute for Cognitive Neuroscience mimicked a stress situation in the body using drugs and then examined brain activity using functional MRI scanning. The researchers report in the Journal of Neuroscience that the interaction of the stress hormones hydrocortisone and noradrenaline shuts down the activity of brain regions for goal-directed behaviour, while the brain regions responsible for habitual behaviour remain unaffected.
In order to test the different stress hormones, the cognition psychologists used three substances: a placebo, the stress hormone hydrocortisone, and yohimbine, which keeps the stress hormone noradrenaline active longer. Some volunteers received hydrocortisone alone or just yohimbine, others received both substances, and a fourth group was administered a placebo. Altogether, the data of 69 volunteers were included in the study.
In the experiment, all participants – both male and female – learned that they would receive cocoa or orange juice as a reward if they chose certain symbols on the computer. After this learning phase, volunteers were allowed to eat as many oranges or as much chocolate pudding as they liked. “That weakens the value of the reward,” explained Schwabe. “Whoever eats chocolate pudding loses the appetite for cocoa. Whoever is satiated with oranges has less appetite for orange juice.” In this context, goal-directed behaviour means that whoever has just eaten chocolate pudding chooses the symbols leading to the cocoa reward less frequently, and whoever is satiated with oranges selects the symbols associated with orange juice less frequently. Based on previous results, the scientists assumed that only the combination of yohimbine and hydrocortisone attenuates goal-directed behaviour, and they have now confirmed this hypothesis.
As expected, volunteers who took yohimbine and hydrocortisone behaved habitually rather than in a goal-directed way: satiation with oranges or chocolate pudding had no effect on their choices. Persons who had taken a placebo or only one of the drugs, on the other hand, behaved in a goal-directed way and showed the satiation effect. The brain data revealed that the combination of yohimbine and hydrocortisone reduced activity in the forebrain – in the so-called orbitofrontal and medial prefrontal cortex – areas that have previously been associated with goal-directed behaviour. The brain regions important for habitual learning, on the other hand, were similarly active in all volunteers.
Provided by Ruhr-Universitaet-Bochum
Source: medicalxpress.com
July 25, 2012
(Medical Xpress) — New understanding of how the brain processes information from the inner ear offers hope for sufferers of vertigo.
If you have ever looked over the edge of a cliff and felt dizzy, you understand the challenges faced by people who suffer from symptoms of vestibular dysfunction such as vertigo and dizziness. There are over 70 million of them in North America. For people with vestibular loss, performing basic daily living activities that we take for granted (e.g. dressing, eating, getting in and out of bed, getting around inside as well as outside the home) becomes difficult since even small head movements are accompanied by dizziness and the risk of falling.
We’ve known for a while that a sensory system in the inner ear (the vestibular system) is responsible for helping us keep our balance by giving us a stable visual field as we move around. And while researchers have already developed a basic understanding of how the brain constructs our perceptions of ourselves in motion, until now no one has understood the crucial step by which the neurons in the brain select the information needed to keep us in balance.
The way that the brain takes in and decodes information sent by neurons in the inner ear is complex. The peripheral vestibular sensory neurons in the inner ear take in the time-varying acceleration and velocity stimuli caused by our movement in the outside world (such as those experienced while riding in a car that accelerates from a stationary position to 50 km per hour). These neurons transmit detailed information about these stimuli to the brain (i.e. information that allows one to reconstruct how the stimuli vary over time) in the form of nerve impulses.
Scientists had previously believed that the brain decoded this information linearly and therefore actually attempted to reconstruct the time course of velocity and acceleration stimuli. But by combining electrophysiological and computational approaches, Kathleen Cullen and Maurice Chacron, two professors in McGill University’s Department of Physiology, have been able to show for the first time that the neurons in the vestibular nuclei in the brain instead decode incoming information nonlinearly as they respond preferentially to unexpected, sudden changes in stimuli.
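The contrast between the two decoding schemes can be made concrete with a toy model. This is not the authors' actual model: it is only a sketch, with arbitrary parameters, of the qualitative difference between a linear decoder that tracks the stimulus time course and a nonlinear unit that responds preferentially to sudden changes.

```python
# Illustrative sketch (not the McGill study's model): a linear decoder
# reproduces the stimulus itself, while a simple nonlinear unit responds
# only when the stimulus jumps sharply between successive samples.

def linear_response(stimulus, gain=1.0):
    """Linear decoding: output is proportional to the stimulus."""
    return [gain * s for s in stimulus]

def change_detector(stimulus, threshold=5.0):
    """Nonlinear decoding: respond only when the stimulus changes by
    more than `threshold` from one sample to the next."""
    out = [0.0]
    for prev, cur in zip(stimulus, stimulus[1:]):
        delta = abs(cur - prev)
        out.append(delta if delta > threshold else 0.0)
    return out

# A slow ramp followed by a sudden step (e.g. stepping off an unseen curb):
stim = [0, 1, 2, 3, 4, 20, 20, 20]
print(linear_response(stim))   # tracks the whole time course
print(change_detector(stim))   # silent during the ramp, fires at the step
```

The change detector stays silent during the gradual ramp and responds only at the abrupt step, which is the kind of selective, nonlinear response to unexpected stimulus changes the study describes.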
It is known that representations of the outside world change at each stage in this sensory pathway. For example, in the visual system, neurons located closer to the periphery of the sensory system (e.g. ganglion cells in the retina) tend to respond to a wide range of sensory stimuli (a “dense” code), whereas central neurons (e.g. in the primary visual cortex at the back of the head) tend to respond much more selectively (a “sparse” code). Chacron and Cullen have discovered that the selective transmission of vestibular information they were able to document for the first time occurs as early as the first synapse in the brain. “We were able to show that the brain has developed this very sophisticated computational strategy to represent sudden changes in movement in order to generate quick accurate responses and maintain balance,” explained Prof. Cullen. “I keep describing it as elegant, because that’s really how it strikes me.”
This kind of selectivity in response is important for everyday life, since it enhances the brain’s perception of sudden changes in body posture. If you step off an unseen curb, within milliseconds your brain has both received the essential information and performed the sophisticated computation needed to help you readjust your position. This discovery is expected to apply to other sensory systems and eventually to contribute to better treatments for patients who suffer from vertigo, dizziness, and disorientation during their daily activities. It should also lead to treatments that help alleviate the symptoms that accompany motion and/or space sickness produced in more challenging environments.
Provided by McGill University
Source: medicalxpress.com

Sheep backpacks reveal flocking strategy
UK researchers have shown for the first time that instead of fleeing randomly when faced with danger, sheep head straight for the center of the flock.
Understanding this behavior in healthy animals may help researchers understand the breakdown in social behaviours caused by neurological disorders in sheep, as well as those in humans, such as Huntington’s disease.
The findings support a 40-year-old idea put forward by evolutionary biologist Bill Hamilton, who suggested that creatures as different as insects, fish and cattle all react to danger by moving towards the middle of their respective swarms, schools or herds.
“Scientists agree that flocking behavior has evolved in response to the risk of being attacked by predators. The idea is that being part of a tight-knit group not only increases the chances that you might spot a predator, but decreases the chance that you are the one the predator goes for when it attacks,” explains Dr. Andrew King from The Royal Veterinary College (RVC), lead author of the study, published in Current Biology today.
July 25, 2012
(HealthDay) — Shortened telomere length (TL) is associated with risks for dementia and mortality in a population of older adults, according to a study published online July 23 in the Archives of Neurology.

Lawrence S. Honig, M.D., Ph.D., from the Columbia University College of Physicians and Surgeons in New York City, and colleagues used real-time polymerase chain reaction analysis to determine TL in stored leukocyte DNA from 1,983 participants in a community-based study of aging. Participants were 65 years or older and blood was drawn at a mean age of 78.3 years. Participants were followed for a median of 9.3 years for mortality, and 9.6 percent developed incident dementia.
The researchers found that TL correlated inversely with age and was shorter in men than in women. TL was significantly shorter in persons dying during follow-up compared with survivors, even after adjusting for age, sex, education, and apolipoprotein E genotype. TL was also significantly shorter in the participants with incident and prevalent dementia, compared with those who remained dementia-free. Shorter TL correlated with earlier onset of dementia, but this association was significant only in women.
"Our results show an association between shortened TL and mortality, and more specifically an association of shortened TL with Alzheimer’s disease, and are consistent with but not indicative of the possibility that TL may be a factor indicative of biological age," the authors conclude.
Source: medicalxpress.com