Neuroscience

Articles and news from the latest research reports.

11 notes

Gene Therapy Holds Promise for Reversing Congenital Hearing Loss

ScienceDaily (July 25, 2012) — A new gene therapy approach can reverse hearing loss caused by a genetic defect in a mouse model of congenital deafness, according to a preclinical study published by Cell Press in the July 26 issue of the journal Neuron. The findings present a promising therapeutic avenue for potentially treating individuals who are born deaf.

(Credit: © Vasiliy Koval / Fotolia)

"This is the first time that an inherited, genetic hearing loss has been successfully treated in laboratory mice, and as such represents an important milestone for treating genetic deafness in humans," says senior study author Lawrence Lustig of the University of California, San Francisco.

Hearing loss is one of the most common human sensory deficits, and it results from damage to hair cells in the inner ear. About half of the cases of congenital hearing loss are caused by genetic defects. However, the current treatment options — hearing amplification devices and cochlear implants — do not restore hearing to normal levels. Correcting the underlying genetic defects has the potential to fully restore hearing, but previous attempts to reverse hearing loss caused by genetic mutations have not been successful.

Addressing this challenge in the new study, Lustig and his team used mice with hereditary deafness caused by a mutation in a gene coding for a protein called vesicular glutamate transporter-3 (VGLUT3). This protein is crucial for inner hair cells to send signals that enable hearing. Two weeks after the researchers delivered the VGLUT3 gene into the inner ear through an injection, hearing was restored in all of the mice. This improvement lasted between seven weeks and one and a half years when adult mice were treated, and at least nine months when newborn mice received the treatment.

The therapy did not damage the inner ear, and it even corrected some structural defects in the inner hair cells. Because the specific gene delivery method used is safe and effective in animals, the findings hold promise for future human studies. “For years, scientists have been hinting at the possibility of gene therapy as a potential cure for deafness,” Lustig says. “In this study, we now provide a very real and big step towards that goal.”

Source: Science Daily

Filed under science neuroscience psychology congenital deafness hearing loss genetics VGLUT3

26 notes

Increasing dopamine in brain’s frontal cortex decreases impulsive tendency: research

July 25, 2012

Raising levels of the neurotransmitter dopamine in the frontal cortex of the brain significantly decreased impulsivity in healthy adults, in a study conducted by researchers at the Ernest Gallo Clinic and Research Center at the University of California, San Francisco.

"Impulsivity is a risk factor for addiction to many substances, and it has been suggested that people with lower dopamine levels in the frontal cortex tend to be more impulsive," said lead author Andrew Kayser, PhD, an investigator at Gallo and an assistant professor of neurology at UCSF. "We wanted to see if we could decrease impulsivity by raising dopamine, and it seems as if we can."

The study was published on July 4 in the Journal of Neuroscience.

In a double-blind, placebo-controlled study, 23 adult research participants were given either tolcapone, a medication approved by the Food and Drug Administration (FDA) that inhibits a dopamine-degrading enzyme, or a placebo. The researchers then gave the participants a task that measured impulsivity, asking them to make a hypothetical choice between receiving a smaller amount of money immediately (“smaller sooner”) or a larger amount at a later time (“larger later”). Each participant was tested twice, once with tolcapone and once with placebo.

Participants – especially those who were more impulsive at baseline – were more likely to choose the less impulsive “larger later” option after taking tolcapone than they were after taking the placebo.
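The “smaller sooner” versus “larger later” trade-off described above is conventionally quantified with a hyperbolic discounting model. As a rough sketch of that model (the discount rate k, dollar amounts, and delays here are hypothetical illustrations, not values from the study):

```python
# Hyperbolic discounting sketch: a delayed reward's subjective value falls
# as delay grows; the discount rate k indexes impulsivity (higher k = more
# impulsive). All numbers below are invented for illustration.

def discounted_value(amount, delay_days, k):
    """Subjective value of a delayed reward under hyperbolic discounting."""
    return amount / (1.0 + k * delay_days)

def choose(sooner, later, k):
    """Pick the option with the higher subjective value.

    Each option is a (amount, delay_days) pair."""
    v_sooner = discounted_value(*sooner, k=k)
    v_later = discounted_value(*later, k=k)
    return "smaller sooner" if v_sooner >= v_later else "larger later"

# $20 now vs $40 in 30 days:
impulsive_pick = choose((20, 0), (40, 30), k=0.10)  # -> "smaller sooner"
patient_pick = choose((20, 0), (40, 30), k=0.01)    # -> "larger later"
```

A lower k corresponds to a less impulsive chooser, which matches the direction of the tolcapone effect reported above.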

Magnetic resonance imaging conducted while the participants were taking the test confirmed that regions of the frontal cortex associated with decision-making were more active in the presence of tolcapone than in the presence of placebo.

"To our knowledge, this is the first study to use tolcapone to look for an effect on impulsivity," said Kayser.

The study was not designed to investigate the reasons that reduced dopamine is linked with impulsivity. However, explained Kayser, scientists believe that impulsivity is associated with an imbalance in dopamine between the frontal cortex, which governs executive functions such as cognitive control and self-regulation, and the striatum, which is thought to be involved in the planning and modification of more habitual behaviors.

"Most, if not all, drugs of abuse, such as cocaine and amphetamine, directly or indirectly involve the dopamine system," said Kayser. "They tend to increase dopamine in the striatum, which in turn may reward impulsive behavior. In a very simplistic fashion, the striatum is saying ‘go,’ and the frontal cortex is saying ‘stop.’ If you take cocaine, you’re increasing the ‘go’ signal, and the ‘stop’ signal is not adequate to counteract it."

Kayser and his research team plan a follow-up study of the effects of tolcapone on drinking behavior. “Once we determine whether drinkers can safely tolerate this medication, we will see if it has any effect on how much they drink while they’re taking it,” said Kayser.

Tolcapone is approved as a medication for Parkinson’s disease, in which a chronic deficit of dopamine inhibits movement.

Provided by University of California, San Francisco

Source: medicalxpress.com

Filed under science neuroscience brain psychology dopamine neurotransmitter impulsive tendency

240 notes



Ecstasy Harms Memory With One Year of Recreational Use

New research published online July 25 in the scientific journal Addiction gives some of the first information available on the actual risk of using ecstasy. It shows that even in recreational amounts over a relatively short time period, ecstasy users risk specific memory impairments. Further, because the nature of the impairments may not be immediately obvious to the user, people may not notice the signs that drug use is damaging them until it is too late.

According to the study, new ecstasy users who took ten or more ecstasy pills over their first year of use showed decreased function of their immediate and short-term memory compared with their pre-ecstasy performance. These findings are associated with damage to the hippocampus, the area of the brain that oversees memory function and navigation. Interestingly, hippocampal damage is one of the first signs of Alzheimer’s disease, resulting in memory loss and disorientation.

Filed under addiction brain memory psychology science neuroscience ecstasy cognition

37 notes

Force of habit: Stress hormones switch off areas of the brain for goal-directed behaviour

July 25, 2012

Cognition psychologists at the Ruhr-Universität, together with colleagues from the University Hospital Bergmannsheil (Prof. Dr. Martin Tegenthoff), have discovered why stressed persons are more likely to lapse back into habits than to behave in a goal-directed way. The team of PD Dr. Lars Schwabe and Prof. Dr. Oliver Wolf from the Institute for Cognitive Neuroscience mimicked a stress situation in the body using drugs and then examined brain activity with functional MRI scanning. The researchers report in the Journal of Neuroscience that the interaction of the stress hormones hydrocortisone and noradrenaline shuts down the activity of brain regions for goal-directed behaviour, while the brain regions responsible for habitual behaviour remain unaffected.

To test the different stress hormones, the cognition psychologists used three substances: a placebo, the stress hormone hydrocortisone, and yohimbine, which keeps the stress hormone noradrenaline active longer. Some volunteers received hydrocortisone alone or yohimbine alone, others received both substances, and a fourth group was given the placebo. Altogether, data from 69 volunteers were included in the study.

In the experiment, all participants – both male and female – learned that they would receive cocoa or orange juice as a reward if they chose certain symbols on the computer. After this learning phase, volunteers were allowed to eat as many oranges or as much chocolate pudding as they liked. “That weakens the value of the reward,” explained Schwabe. “Whoever eats chocolate pudding loses the appetite for cocoa. Whoever is satiated with oranges has less appetite for orange juice.” In this context, goal-directed behaviour means that whoever has previously eaten the chocolate pudding chooses the symbols leading to the cocoa reward less frequently, and whoever is satiated with oranges selects the symbols associated with orange juice less frequently. Based on previous results, the scientists hypothesized that only the combination of yohimbine and hydrocortisone attenuates goal-directed behaviour. They have now confirmed this hypothesis.

As expected, volunteers who took yohimbine and hydrocortisone behaved habitually rather than in a goal-directed way: satiation with oranges or chocolate pudding had no effect on their choices. Persons who had taken a placebo or only one of the drugs, on the other hand, behaved in a goal-directed manner and showed the satiation effect. The brain data revealed that the combination of yohimbine and hydrocortisone reduced activity in the forebrain – in the so-called orbitofrontal and medial prefrontal cortex – areas that have previously been associated with goal-directed behaviour. The brain regions important for habitual learning, on the other hand, were similarly active in all volunteers.
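The devaluation logic behind the experiment can be sketched in a few lines: a goal-directed chooser consults the current value of each outcome, while a habitual chooser relies on values cached during training, so satiation leaves its choices unchanged. This is only an illustrative toy (the value numbers and symbol names are invented, not the study's data):

```python
# Toy model of goal-directed vs habitual choice under outcome devaluation.
# Values cached during the learning phase (both rewards equally attractive):
LEARNED_VALUE = {"cocoa_symbol": 1.0, "juice_symbol": 1.0}
OUTCOME_OF = {"cocoa_symbol": "cocoa", "juice_symbol": "orange juice"}

def habitual_choice(cached=LEARNED_VALUE):
    """Pick the action with the highest cached value.

    Ties are broken alphabetically; satiation never enters the decision."""
    return max(sorted(cached), key=lambda a: cached[a])

def goal_directed_choice(current_outcome_value):
    """Pick the action leading to the currently most valued outcome."""
    return max(sorted(OUTCOME_OF), key=lambda a: current_outcome_value[OUTCOME_OF[a]])

# After eating chocolate pudding, cocoa is devalued:
after_pudding = {"cocoa": 0.2, "orange juice": 1.0}
goal_directed_choice(after_pudding)  # avoids the devalued cocoa symbol
habitual_choice()                    # unaffected by satiation
```

In these terms, the stressed (yohimbine plus hydrocortisone) group behaved like `habitual_choice`, while the other groups behaved like `goal_directed_choice`.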

Provided by Ruhr-Universitaet-Bochum

Source: medicalxpress.com

Filed under science neuroscience brain psychology stress habits goal-directed behaviour hydrocortisone yohimbine

1,878 notes


ikenbot:

Study: Proof That We Sexually Objectify Women

We look at women the same way we look at houses and sandwiches: as composites of attractive parts.

Problem: Few would dispute that the objectification of women is a real thing — and a real problem — but as yet there’s been no cognitive explanation for it in a literal sense. Do we really look at women differently than we do men, and are they actually objectified in the eye — and brain — of the beholder?

Methodology: Images of average, fully clothed individuals were quickly flashed before the eyes of participants. After each one, the participants would then be shown two side-by-side images that zoomed in on one “sexual” aspect of the individual (for example, a woman’s midriff) and asked to identify the version that hadn’t been modified. The experiment was also reversed, so that participants first looked at a specific part and then had to identify it in the context of an entire body. The test was designed to clue researchers in on whether the participants were using global or local cognitive processing while looking at the images — in other words, whether they perceived the individuals as a whole or as an assemblage of their various parts.

Results: Regardless of gender, participants consistently recognized women’s sexual body parts more easily when presented in isolation. Men’s sexual body parts, on the other hand, were more memorable as part of their entire bodies.

Conclusion: The cognitive process behind our perception of objects is the same that we use when looking at women, and both genders are guilty of taking in the parts instead of the whole. When we look at men, we use global processing to see them more fully as people.

The full study, “Seeing women as objects: The sexual body part recognition bias,” is published in the European Journal of Social Psychology.

(Source: afro-dominicano)

22 notes

Decoding the secrets of balance

July 25, 2012

(Medical Xpress) — New understanding of how the brain processes information from the inner ear offers hope for sufferers of vertigo.

If you have ever looked over the edge of a cliff and felt dizzy, you understand the challenges faced by people who suffer from symptoms of vestibular dysfunction such as vertigo and dizziness. There are over 70 million of them in North America. For people with vestibular loss, performing basic daily living activities that we take for granted (e.g. dressing, eating, getting in and out of bed, getting around inside as well as outside the home) becomes difficult since even small head movements are accompanied by dizziness and the risk of falling.

We’ve known for a while that a sensory system in the inner ear (the vestibular system) is responsible for helping us keep our balance by giving us a stable visual field as we move around. And while researchers have already developed a basic understanding of how the brain constructs our perceptions of ourselves in motion, until now no one has understood the crucial step by which the neurons in the brain select the information needed to keep us in balance.

The way the brain takes in and decodes information sent by neurons in the inner ear is complex. The peripheral vestibular sensory neurons in the inner ear take in the time-varying acceleration and velocity stimuli caused by our movement through the outside world (such as those experienced while riding in a car that accelerates from a standstill to 50 km per hour). These neurons transmit detailed information about these stimuli to the brain (i.e. information that allows one to reconstruct how the stimuli vary over time) in the form of nerve impulses.

Scientists had previously believed that the brain decoded this information linearly and therefore actually attempted to reconstruct the time course of velocity and acceleration stimuli. But by combining electrophysiological and computational approaches, Kathleen Cullen and Maurice Chacron, two professors in McGill University’s Department of Physiology, have been able to show for the first time that the neurons in the vestibular nuclei in the brain instead decode incoming information nonlinearly as they respond preferentially to unexpected, sudden changes in stimuli.

It is known that representations of the outside world change at each stage in this sensory pathway. For example, in the visual system, neurons located closer to the periphery of the sensory system (e.g. ganglion cells in the retina) tend to respond to a wide range of sensory stimuli (a “dense” code), whereas central neurons (e.g. in the primary visual cortex at the back of the head) tend to respond much more selectively (a “sparse” code). Chacron and Cullen have discovered that the selective transmission of vestibular information they were able to document for the first time occurs as early as the first synapse in the brain. “We were able to show that the brain has developed this very sophisticated computational strategy to represent sudden changes in movement in order to generate quick accurate responses and maintain balance,” explained Prof. Cullen. “I keep describing it as elegant, because that’s really how it strikes me.”

This kind of selectivity in response is important for everyday life, since it enhances the brain’s perception of sudden changes in body posture. If you step off an unseen curb, within milliseconds your brain has both received the essential information and performed the sophisticated computation needed to help you readjust your position. This discovery is expected to apply to other sensory systems and eventually to lead to better treatments for patients who suffer from vertigo, dizziness, and disorientation during their daily activities. It should also lead to treatments that help alleviate the symptoms that accompany motion and/or space sickness in more challenging environments.
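The nonlinear decoding idea can be caricatured as a change detector: a unit that stays silent for slowly varying input and responds only to abrupt changes. A minimal sketch (the threshold and the sample values are arbitrary illustrations, not the authors' model):

```python
# Toy "change detector": like the vestibular nuclei neurons described above,
# it ignores the detailed time course of head velocity and responds only to
# sudden, unexpected changes between samples.

def change_detector(velocity_samples, threshold=2.0):
    """Return 1 where the sample-to-sample change exceeds the threshold, else 0."""
    spikes = [0]  # no change is defined for the first sample
    for prev, curr in zip(velocity_samples, velocity_samples[1:]):
        spikes.append(1 if abs(curr - prev) > threshold else 0)
    return spikes

# A slow ramp (car gently accelerating) produces no response,
# while an abrupt step (stepping off an unseen curb) produces one spike:
slow_ramp = change_detector([0, 1, 2, 3, 4, 5])  # -> all zeros
jolt = change_detector([0, 0, 0, 8, 8, 8])       # -> spike at the jump
```

A linear decoder would instead try to reconstruct both waveforms in full; thresholding the change is one simple way to make the response selective for sudden events.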

Provided by McGill University

Source: medicalxpress.com

Filed under neuroscience psychology brain science balance vertigo vestibular system ear motion neuron

13 notes



Sheep backpacks reveal flocking strategy

UK researchers have shown for the first time that instead of fleeing randomly when faced with danger, sheep head straight for the center of the flock.

Understanding this behavior in healthy animals may help researchers understand the breakdown in social behaviours caused by neurological disorders in sheep, as well as those in humans, such as Huntington’s disease.

The findings support a 40-year-old idea put forward by evolutionary biologist Bill Hamilton. He suggested that creatures as different as insects, fish and cattle all react to danger by moving towards the middle of their respective swarms, schools or herds. “Scientists agree that flocking behavior has evolved in response to the risk of being attacked by predators.

The idea is that being part of a tight-knit group not only increases the chances that you might spot a predator, but decreases the chance that you are the one the predator goes for when it attacks,” explains Dr. Andrew King from The Royal Veterinary College (RVC), lead author the study, published in Current Biology today.
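Hamilton's "selfish herd" rule described above can be sketched as a one-line update: each individual steps toward the flock's current centroid, which mechanically tightens the group. A toy illustration (the positions and step size are invented, not the study's GPS-backpack data):

```python
# Selfish-herd sketch: every sheep moves a fraction of the way toward the
# flock's centre of mass, shrinking the flock's spread around that centre.

def centroid(positions):
    xs = [p[0] for p in positions]
    ys = [p[1] for p in positions]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def step_toward_centre(positions, step=0.5):
    """Move every individual `step` of the way toward the centroid."""
    cx, cy = centroid(positions)
    return [(x + step * (cx - x), y + step * (cy - y)) for x, y in positions]

def spread(positions):
    """Mean distance from the centroid (a measure of flock tightness)."""
    cx, cy = centroid(positions)
    return sum(((x - cx) ** 2 + (y - cy) ** 2) ** 0.5
               for x, y in positions) / len(positions)

flock = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0), (4.0, 4.0)]
tighter = step_toward_centre(flock)
# With step=0.5, spread(tighter) is exactly half of spread(flock).
```

The centroid itself does not move under this rule; only the spread around it shrinks, which is why fleeing "toward the middle" tightens the flock rather than displacing it.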

Filed under animals behavior biology huntington's disease neuroscience psychology science neurological disorders

25 notes

Shortened telomere length tied to dementia, mortality risk

July 25, 2012

(HealthDay) — Shortened telomere length (TL) is associated with risks for dementia and mortality in a population of older adults, according to a study published online July 23 in the Archives of Neurology.

Lawrence S. Honig, M.D., Ph.D., from the Columbia University College of Physicians and Surgeons in New York City, and colleagues used real-time polymerase chain reaction analysis to determine TL in stored leukocyte DNA from 1,983 participants in a community-based study of aging. Participants were 65 years or older and blood was drawn at a mean age of 78.3 years. Participants were followed for a median of 9.3 years for mortality, and 9.6 percent developed incident dementia.

The researchers found that TL correlated inversely with age and was shorter in men than women. TL was significantly shorter in persons dying during follow-up compared with survivors, even after adjusting for age, sex, education, and apolipoprotein E genotype. TL was significantly shorter in the participants with incident and prevalent dementia, compared with those who remained dementia-free. Shorter TL correlated with earlier onset of dementia but this association was significant in women only.

"Our results show an association between shortened TL and mortality, and more specifically an association of shortened TL with Alzheimer’s disease, and are consistent with but not indicative of the possibility that TL may be a factor indicative of biological age," the authors conclude.

Source: medicalxpress.com

Filed under science neuroscience psychology brain telomere dementia mortality alzheimer alzheimer's disease research

21 notes



Aesop’s Fable Unlocks How Crows and Kids Think

Scientists have used an age-old fable to help illustrate how we think differently to other animals.

Lucy Cheke, a PhD student at the University of Cambridge’s Department of Experimental Psychology, expanded Aesop’s fable into three tasks of varying complexity and compared the performance of Eurasian jays with that of local schoolchildren.

The task that set the children apart from the jays involved a mechanism that was counter-intuitive because it was hidden under an opaque surface. Neither the birds nor the children were able to learn how the mechanism worked, but the children were able to learn how to get the reward, whereas the birds were not.

The results of the study illustrate that children learn about cause and effect in the physical world in a different way from birds. While the jays appear to take account of the mechanism involved in the task, the children are more driven by simple cause-effect relationships.

Filed under aesop's fable cause-effect relationships education neuroscience psychology science thinking animals
