July 25, 2012
A University study has shown how our minds unconsciously respond to threats.
Researchers studying how our minds develop fears in response to danger found that people can quickly learn to recognise a threat even when they are unaware of it.
However, they also found that this learning is swiftly forgotten. In contrast, when people are aware of the threat, they take longer to learn to be afraid of it, but retain the fear in the long term.
Scientists from the University of Edinburgh and New York University, who carried out the study, say the finding may be a key insight into the differences between conscious and nonconscious mental processes.
Researchers measured physiological fear responses - the amount of sweat on the fingertips - in groups of people who looked at pictures and were given mild electric shocks whenever one of these pictures was shown.
All the people who participated in the study saw the pictures with just one eye. But whereas some of them were allowed to see the pictures clearly, the researchers suppressed the pictures from other subjects’ awareness by showing colourful, dynamic images to the other eye.
The study found that subjects who were prevented from consciously seeing the pictures learned to be afraid of the image associated with a shock more quickly than those who were allowed to see them without suppression.
However, these subjects quickly forgot this association between the images and the electric shocks as the experiment continued.
In contrast, those subjects who were allowed to see the image clearly formed a stronger association over time.
How the brain reacts to threats is key to understanding how human beings function. This study shows that we are capable of learning very rapidly that something is a threat even when we don’t perceive it consciously. Such learning, however, is fleeting.
-David Carmel, Researcher, Department of Psychology
Source: The University of Edinburgh
By Makini Brice | July 26, 2012
Scientists were surprised, having expected these areas of women’s brains to age more slowly than those of men, or with a delay.

Photo: Microsoft
Even though the gap is now closing in many high-income countries, women on average tend to live longer than men. Despite – or perhaps because of – this longevity, women battle cognitive decline in much greater numbers than men do. In fact, women are more likely to suffer from various types of dementia, including Alzheimer’s disease. Now researchers think they have found a cause of this double-edged sword: stress. Specifically, stress ages women’s brains more quickly than it does men’s.
Scientists, and everyday observers, have noted that some body parts age at different rates than others. As people grow older, some genes become more active while others become less so. These changes in activity can be monitored through a “transcriptome,” which collects data on all the RNA – the transcripts that carry DNA’s instructions to cells. A multinational team from Australia, China, Germany, and the United States set out to analyze the transcriptomes of 55 men and women of various ages.
The researchers were fascinated by what they found. According to the abstract of their article published in Aging Cell, “In the superior frontal gyrus (SFG), a part of the prefrontal cortex, we observed manifest differences between the two sexes in the timing of age-related changes, i.e. sexual heterochrony. Intriguingly, age-related expression changes predominantly occurred earlier, or at a faster pace, in females compared to males. These changes included decreased energy production and neural function, and up-regulation of the immune response, all major features of brain aging.”
In other words, the researchers found that the brains of women aged more quickly than those of men, especially in the prefrontal cortex. Scientists were surprised, having expected these areas of women’s brains to age more slowly than those of men, or with a delay.
In the superior frontal gyrus, researchers found 667 genes that were expressed differently between men and women during the aging process. Of these, 98 percent were associated with faster aging in women.
Scientists were not convinced that the reason lay in biological differences. In fact, since only half of the women displayed accelerated aging, they were convinced that the difference was environmental. The researchers theorize that stress is the difference-maker, and that it affects women’s brains more severely than it does men’s. While a researcher unaffiliated with the study said that the difference could also be caused by inflammation, Mehmet Somel and his team have conducted similar research on monkeys that supports their stress theory.
Source: Medical Daily
July 27, 2012
(Medical Xpress) — Johns Hopkins scientists have discovered a “scaffolding” protein that holds together multiple elements in a complex system responsible for regulating pain, mental illnesses and other complex neurological problems.

Preso1 (green) and mGluR5 (red) appear in the same location inside a neuron.
The finding, published in the May 6 issue of Nature Neuroscience, could give researchers a new target for drugs to treat these often-intractable conditions.
The discovery, detailed in a study led by neuroscience professor Paul Worley, M.D., of the Johns Hopkins University School of Medicine, focuses on a family of proteins called group 1 metabotropic glutamate receptors (mGluRs) that lie on the surfaces of nerve cells. When these receptors lock in glutamate, a chemical that neurons use to communicate, it encourages neurons to fire.
Without a way to turn off these receptors, neurons would remain active indefinitely, keeping pain and other responses going long after they’re useful. Previous research suggested that these mGluRs need to bind to another protein called Homer to shut down, and that this binding is stronger after other molecules called protein kinases modify the receptors. However, Worley explains, thus far it’s been unclear exactly how all these different players come together.
Seeking the mechanism behind this phenomenon, Worley and his colleagues started with a series of experiments to see what other proteins the mGluRs and Homer were binding with in rat brains. Their search turned up a third protein, called Preso1, which bound to both mGluRs and Homer. A search of genetic databases showed that the gene responsible for making Preso1 is present in animals ranging from fruit flies to people, highlighting its importance across a wide variety of creatures.
To figure out what Preso1 does, the researchers performed another series of experiments to examine the behavior of neurons that produced both mGluRs and Homer. They found that when these neurons also expressed Preso1, the mGluRs bound Homer more efficiently, suggesting that Preso1 might somehow increase modification by protein kinases.
Worley’s team received another clue when they found that protein kinases also bind to Preso1.
When the researchers genetically modified mice so that they made no Preso1, they found that binding between mGluRs and Homer in these animals’ neurons was greatly reduced compared with normal mice.
Additionally, when the researchers injected the modified mice with a chemical that causes pain and inflammation, the animals had a significantly greater and longer-lasting response compared to regular mice. A final experiment showed that neurons taken from the modified animals were significantly more responsive to the neurotransmitter glutamate. When the researchers added Preso1 to the cell cultures, this increased activity disappeared, suggesting that Preso1 is pivotal for mGluRs to signal properly.
Taken together, Worley explains, the findings suggest that Preso1 appears to gather all the important elements in this system — Homer, protein kinases and mGluRs — bringing them all together to coordinate the activation and deactivation of the mGluRs.
With Preso1 so pivotal for regulating group 1 mGluR activity, it could prove a useful new target for drugs to treat a variety of health problems in which these receptors are thought to play a role, including chronic pain, schizophrenia, Alzheimer’s disease, and fragile X syndrome, Worley says.
"Because mGluRs play so many important roles in the brain for so many different mental and neurological health conditions, knowledge of their regulatory mechanisms is extremely important. But we really don’t know how they work in great detail," he says. "You need to know all the players before you can understand the system. Here, we’ve identified an important player that no one previously knew existed. Preso1 and Homer appear essential for desensitization of mGluR signaling, much like beta-adrenergic receptor kinase and arrestin are important for desensitization of adrenergic and opiate receptors."
Provided by Johns Hopkins University
Source: medicalxpress.com
July 27, 2012
Anyone who has ever had trouble sleeping can attest to the difficulties at work the following day. Experts recommend eight hours of sleep per night for ideal health and productivity, but what if five to six hours of sleep is your norm? Is your work still negatively affected? A team of researchers at Brigham and Women’s Hospital (BWH) has discovered that, regardless of how tired you perceive yourself to be, lack of sleep can influence the way you perform certain tasks.
This finding is published in the July 26, 2012 online edition of The Journal of Vision.
"Our team decided to look at how sleep might affect complex visual search tasks, because they are common in safety-sensitive activities, such as air-traffic control, baggage screening, and monitoring power plant operations," explained Jeanne F. Duffy, PhD, MBA, senior author on this study and associate neuroscientist at BWH. "These types of jobs involve processes that require repeated, quick memory encoding and retrieval of visual information, in combination with decision making about the information."
Researchers collected and analyzed data from visual search tasks from 12 participants over a one-month study. In the first week, all participants were scheduled to sleep 10-12 hours per night to make sure they were well rested. For the following three weeks, the participants were scheduled to sleep the equivalent of 5.6 hours per night, and also had their sleep times scheduled on a 28-hour cycle, mirroring chronic jet lag. The research team gave the participants computer tests that involved visual search tasks and recorded how quickly the participants could find important information, and also how accurate they were in identifying it. The researchers report that the longer the participants were awake, the more slowly they identified the important information in the test. Additionally, during the biological night time, 12 a.m. to 6 a.m., participants (who were unaware of the time throughout the study) also performed the tasks more slowly than they did during the daytime.
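The "equivalent of 5.6 hours per night" follows from pro-rating the sleep scheduled in each artificial 28-hour day back to a 24-hour day. A minimal sketch of that arithmetic, assuming roughly 6.5 hours of scheduled sleep per 28-hour cycle (an inferred figure, not stated in this article):

```python
# Convert sleep scheduled per artificial "day" to an equivalent per 24 hours.
# The 6.53-hour input is an assumption chosen to reproduce the reported
# 5.6-hour-per-night equivalent; the article gives only the 5.6-hour figure.

def sleep_per_24h(sleep_hours: float, cycle_hours: float) -> float:
    """Pro-rate sleep per scheduling cycle to an equivalent per 24-h day."""
    return sleep_hours / cycle_hours * 24.0

equivalent = sleep_per_24h(6.53, 28.0)
print(round(equivalent, 1))  # 5.6
```

On a 28-hour cycle, each "day" is longer than the circadian day, so the same block of scheduled sleep yields less sleep per real 24 hours, producing chronic restriction on top of the jet-lag-like desynchrony.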
"This research provides valuable information for workers, and their employers, who perform these types of visual search tasks during the night shift, because they will do it much more slowly than when they are working during the day," said Duffy. "The longer someone is awake, the more the ability to perform a task, in this case a visual search, is hindered, and this impact of being awake is even stronger at night."
While the accuracy of the participants stayed fairly constant, they were slower to identify the relevant information as the weeks went on. Their self-ratings of sleepiness grew only slightly worse during the second and third weeks on the study schedule, yet the data show that they performed the visual search tasks significantly more slowly than in the first week. This finding suggests that people’s perceptions of how tired they are do not always match their performance ability, explains Duffy.
Provided by Brigham and Women’s Hospital
Source: medicalxpress.com
July 26, 2012 By Mark Wheeler
UCLA researchers say blocking this molecule may improve and speed recovery
FINDINGS:
Researchers at UCLA have identified a novel molecule in the brain that, after stroke, blocks the formation of new connections between neurons. As a result, it limits the brain’s recovery. In a mouse model, the researchers showed that blocking this molecule—called ephrin-A5—induces axonal sprouting, that is, the growth of new connections between the brain’s neurons, and as a result promotes functional recovery.
IMPACT:
If duplicated in humans, the identification of this molecule could pave the way for a more rapid recovery from stroke and may allow a synergy with existing treatments, such as physical therapy.
UCLA AUTHOR:
Dr. S. Thomas Carmichael, professor of neurology, and colleagues
JOURNAL:
The research appears online this week in the journal PNAS.
MORE:
Stroke is the leading cause of adult disability because of the brain’s limited capacity for repair. An important process in recovery after stroke may be the formation of new connections, termed axonal sprouting. The adult brain inhibits axonal sprouting and the formation of these connections. In previous work the researchers found, paradoxically, that the brain sends mixed signals after a stroke—activating molecules that both stimulate and inhibit axonal sprouting. In the present work, the researchers identified the effect of one molecule that inhibits axonal sprouting and determined which new connections must form in the brain for recovery.
The researchers also developed a new tissue bioengineering approach for delivering drugs to the brain after stroke. This approach uses a biopolymer hydrogel, or a gel of naturally occurring brain proteins, to release neural repair molecules directly to the target region for recovery in stroke—the tissue adjacent to the center of the stroke.
Last, the paper also shows that the more behavioral activity after stroke, such as the amount an impaired limb is used, the more new connections are directly stimulated to form in the injured brain. This direct link between movement patterns, like those that occur in neurorehabilitation, and the formation of new brain connections, provides a biological mechanism for the effects of some forms of physical therapy after stroke.
Source: UCLA
July 26, 2012
Excitation of neurons depends on the selective influx of certain ions, namely sodium, calcium and potassium, through specific channels. Obviously, these channels were crucial for the evolution of nervous systems in animals. How such channels could have evolved their selectivity has been a puzzle until now. Yehu Moran and Ulrich Technau from the University of Vienna, together with scientists from Tel Aviv University and the Woods Hole Oceanographic Institution (USA), have now revealed that voltage-gated sodium channels, which are responsible for neuronal signaling in the nerves of animals, evolved twice, in higher and in lower animals. These results were published in Cell Reports.

Close-up of the nervous system of a transgenic polyp of the sea anemone Nematostella vectensis, in which a red fluorescent reporter gene (mCherry) is driven by the regulatory sequence of the neuronal ELAV gene. The picture shows the diffuse structure of the nervous system, but also reveals the accumulation of longitudinal axonal tracts along the eight gastric tissue folds (mesenteries). Credit: U. Technau
The opening and closing of ion channels enable the flow of ions that constitutes the electrical signaling in all nervous systems. Every thought we have and every move we make is the result of the highly accurate opening and closing of numerous ion channels. Whereas the channels of most lower animals and their unicellular relatives cannot discern between sodium and calcium ions, those of higher animals are highly specific for sodium, a characteristic that is important for fast and accurate signaling in complex nervous systems.
Surprising results in sea anemones and jellyfish
However, the researchers found that a group of basal animals with simple nerve nets including sea anemones and jellyfish also possess voltage-gated sodium channels, which differ from those found in higher animals, yet show the same selectivity for sodium. Since cnidarians separated from the rest of the animals more than 600 million years ago, these findings suggest that the channels of both cnidarians and higher animals originated independently twice, from ancient non-selective channels which also transmit calcium.
Since many other processes of internal cell signaling are highly dependent on calcium ions, the use of non-selective ion channels in neurons would accidentally trigger various signaling systems inside the cells and cause damage. The evolution of selectivity for sodium ions is therefore considered an important step in the evolution of nervous systems with fast transmission. This study shows that different parts of the channel changed in a convergent manner during the evolution of cnidarians and higher animals in order to perform the same task, namely selecting for sodium ions.
This demonstrates that important components of functional nervous systems evolved twice, in basal and in higher animals, which suggests that more complex nervous systems relying on such ion-selective channels could also have evolved twice independently.
Source: PHYS.ORG
ScienceDaily (July 26, 2012) — In one of the first studies to look at transcranial magnetic stimulation (TMS) in real-world clinical practice settings, researchers at Butler Hospital, along with colleagues across the U.S., confirmed that TMS is an effective treatment for patients with depression who are unable to find symptom relief through antidepressant medications. The study findings are published online in the June 11, 2012 edition of Depression and Anxiety in the Wiley Online Library.

(Credit: Butler Hospital)
The efficacy of TMS has previously been assessed in more than 30 published trials, which yielded generally consistent results supporting the use of TMS to treat depression when medications aren’t sufficient. “Those previous studies were key in laying the groundwork for the FDA to approve the first device for delivery of TMS as a treatment for depression in 2008,” said Linda Carpenter, MD, lead author of the report and chief of the Mood Disorders Program and the Neuromodulation Clinic at Butler Hospital. “Naturalistic studies like ours, which provide scrutiny of real-life patient outcomes when TMS therapy is given in actual clinical practice settings, are the next step in further understanding the effectiveness of TMS. They are also important for informing healthcare policy, particularly in an era when difficult decisions must be made about allocation of scarce resources.”
Carpenter explains that naturalistic studies differ from controlled clinical trials because they permit the inclusion of subjects with a wider range of symptomatology and comorbidity, whereas controlled clinical trials typically have more rigid criteria for inclusion. “As a multisite study collecting naturalistic outcomes from patients in clinics in various regions in the U.S., we were also able to capture effects that might arise from introducing a novel psychiatric treatment modality like TMS in non-research settings,” said Carpenter. In all, the study confirms how well TMS works in diverse settings where TMS is administered to a real-life population of patients with depression who have not found relief through many other available treatments.
The published report summarized data collected from 42 clinical TMS practice sites in the US, and included outcomes from 307 patients with Major Depressive Disorder (MDD) who had persistent symptoms despite the use of antidepressant medication. Change during TMS was assessed using both clinicians’ ratings of overall depression severity and scores on patient self-report depression scales, which require the patient to rate the severity of each symptom on the same standardized scale at the end of each two-week period. Rates of “response” and “remission” to TMS were calculated using the same cut-off scores and conventions used in other clinical trials of antidepressant treatments. A 58 percent response rate and a 37 percent remission rate were observed.
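The response and remission conventions referred to above are typically defined as a drop of at least 50 percent from baseline severity ("response") and a final score below a fixed cutoff ("remission"). A minimal sketch of that convention; the scores and the cutoff value here are illustrative assumptions, not figures from the study:

```python
# Standard antidepressant-trial conventions (assumed, not quoted from the study):
# response  = final severity score is <= half the baseline score
# remission = final severity score falls below a fixed cutoff

def classify(baseline: float, final: float, remission_cutoff: float = 5.0):
    """Return (response, remission) flags for one patient's scores."""
    response = final <= 0.5 * baseline
    remission = final < remission_cutoff
    return response, remission

print(classify(20, 8))  # (True, False): responded, but not in remission
print(classify(20, 4))  # (True, True):  responded and remitted
```

Every remitted patient on a nontrivial baseline is also a responder under these definitions, which is why remission rates (here 37 percent) are always lower than response rates (58 percent).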
"The patient outcomes we found in this study demonstrated a response rate similar to controlled clinical trial populations," said Dr. Carpenter, explaining that this new data validates TMS efficacy in treating depression for those who have failed to benefit from antidepressant medications. "Continued research and confirmation of the effectiveness of TMS is important for understanding its place in everyday psychiatric care and to support advocacy for insurance coverage of the treatment." Thanks in part to the advocacy efforts of Dr. Carpenter, TMS was recently approved for coverage by Medicare in New England, and it is also now covered by BCBSRI. "Next steps for TMS research involve enhancing our understanding of how to maintain positive response to TMS over time after the course of therapy ends and learning how to customize the treatment for patients using newer technologies, so TMS can help even more patients."
Source: Science Daily
ScienceDaily (July 26, 2012) — Researchers reporting online on July 26 in Current Biology, a Cell Press publication, have for the first time shown that they can control the behavior of monkeys by using pulses of blue light to very specifically activate particular brain cells. The findings represent a key advance for optogenetics, a state-of-the-art method for making causal connections between brain activity and behavior. Based on the discovery, the researchers say that similar light-based mind control could likely also be made to work in humans for therapeutic ends.

(Credit: © Eric Isselée / Fotolia)
"We are the first to show that optogenetics can alter the behavior of monkeys," says Wim Vanduffel of Massachusetts General Hospital and KU Leuven Medical School. "This opens the door to use of optogenetics at a large scale in primate research and to start developing optogenetic-based therapies for humans."
In optogenetics, neurons are made to respond to light through the insertion of light-sensitive genes derived from particular microbial organisms. Earlier studies had primarily validated this method for use in invertebrates and rodents, with only a few studies showing that optogenetics can alter activity in monkey brains on a fine scale.
In the new study, the researchers focused on neurons that control particular eye movements. Using optogenetics together with functional magnetic resonance imaging (fMRI), they showed that they could use light to activate these neurons, generating brain activity and subtle changes in eye-movement behavior.
The researchers also found that optogenetic stimulation of their focal brain region produced changes in the activity of specific neural networks located at some distance from the primary site of light activation.
The findings not only pave the way for a much more detailed understanding of how different parts of the brain control behavior, but they may also have important clinical applications in treating Parkinson’s disease, addiction, depression, obsessive-compulsive disorder, and other neurological conditions.
"Several neurological disorders can be attributed to the malfunctioning of specific cell types in very specific brain regions," Vanduffel says. "As already suggested by one of the leading researchers in optogenetics, Karl Deisseroth from Stanford University, it is important to identify the underlying neuronal circuits and the precise nature of the aberrations that lead to the neurological disorders and potentially to manipulate those malfunctioning circuits with high precision to restore them. The beauty of optogenetics is that, unlike any other method, one can affect the activity of very specific cell types, leaving others untouched."
Source: Science Daily
Can the simple act of recognizing a face as you walk down the street change the way we think? Or can taking the time to notice something new on our way to work change what we remember about that walk? In a new study published in the journal Science, New York University researchers show that remembering something old or noticing something new can bias how you process subsequent information.
This novel finding suggests that our memory system can adaptively bias its processing toward forming new memories or retrieving old ones based on recent experiences. For example, when you walk into a restaurant for the first time, your memory system can both encode the details of this new environment and allow you to remember a similar one where you recently dined with a friend. The results of this study suggest that what you did right before walking into the restaurant can determine which process is more likely to occur.
By contrast, in another experiment, the researchers demonstrated that the same manipulation can also influence how we form new memories. In this study, the researchers tested how well participants were able to form links between overlapping memories. They found that participants were more likely to construct these links when the overlapping memories were formed immediately after retrieving an unrelated old object as compared to identifying a new one. This suggests that after processing old objects, participants were more likely to retrieve the associated memories and link them to an ongoing experience.
Source: One act of remembering can influence future acts: study (via myserendipities)

ScienceDaily (July 25, 2012) — A team of University of California, Berkeley, scientists in collaboration with researchers at the University of Munich and the University of Washington, in Seattle, has discovered a chemical that temporarily restores some vision to blind mice, and is working on an improved compound that may someday allow people with degenerative blindness to see again.

Mice with a genetic disease that causes blindness regained some sight after injection with a chemical “photoswitch.” The eye of the untreated mouse on the left shows no response to light, while the pupil of the mouse on the right, which was injected with the chemical, contracts in light. (Credit: Image courtesy of University of California - Berkeley)
The approach could eventually help those with retinitis pigmentosa, a genetic disease that is the most common inherited form of blindness, as well as age-related macular degeneration, the most common cause of acquired blindness in the developed world. In both diseases, the light sensitive cells in the retina — the rods and cones — die, leaving the eye without functional photoreceptors.
The chemical, called AAQ, acts by making the remaining, normally “blind” cells in the retina sensitive to light, said lead researcher Richard Kramer, UC Berkeley professor of molecular and cell biology. AAQ is a photoswitch that binds to protein ion channels on the surface of retinal cells. When switched on by light, AAQ alters the flow of ions through the channels and activates these neurons much the way rods and cones are activated by light.
"This is similar to the way local anesthetics work: they embed themselves in ion channels and stick around for a long time, so that you stay numb for a long time," Kramer said. "Our molecule is different in that it’s light sensitive, so you can turn it on and off and turn on or off neural activity."
Because the chemical eventually wears off, it may offer a safer alternative to other experimental approaches for restoring sight, such as gene or stem cell therapies, which permanently change the retina. It is also less invasive than implanting light-sensitive electronic chips in the eye.
"The advantage of this approach is that it is a simple chemical, which means that you can change the dosage, you can use it in combination with other therapies, or you can discontinue the therapy if you don’t like the results. As improved chemicals become available, you could offer them to patients. You can’t do that when you surgically implant a chip or after you genetically modify somebody," Kramer said.
"This is a major advance in the field of vision restoration," said co-author Dr. Russell Van Gelder, an ophthalmologist and chair of the Department of Ophthalmology at the University of Washington, Seattle.
Kramer, Van Gelder, chemist Dirk Trauner and their colleagues at UC Berkeley, the University of Washington, Seattle, and the University of Munich will publish their findings on July 26, in the journal Neuron.
The blind mice in the experiment had genetic mutations that made their rods and cones die within months of birth and inactivated other photopigments in the eye. After injecting very small amounts of AAQ into the eyes of the blind mice, Kramer and his colleagues confirmed that they had restored light sensitivity: the mice’s pupils contracted in bright light, and the mice showed light avoidance, a typical rodent behavior that is impossible unless the animals can detect light. Kramer hopes to conduct more sophisticated vision tests in rodents injected with the next generation of the compound.
"The photoswitch approach offers real hope to patients with retinal degeneration," Van Gelder said. "We still need to show that these compounds are safe and will work in people the way they work in mice, but these results demonstrate that this class of compound restores light sensitivity to retinas blind from genetic disease."
From optogenetics to implanted chips
The current technologies being evaluated for restoring sight to people whose rods and cones have died include injection of stem cells to regenerate the rods and cones; “optogenetics,” that is, gene therapy to insert a photoreceptor gene into blind neurons to make them sensitive to light; and installation of electronic prosthetic devices, such as a small light-sensitive retinal chip with electrodes that stimulate blind neurons. Several dozen people already have retinal implants and have had rudimentary, low vision restored, Kramer said.
Eight years ago, Kramer, Trauner, a former UC Berkeley chemist now at the University of Munich, and their colleagues developed an optogenetic technique to chemically alter potassium ion channels in blind neurons so that a photoswitch could latch on. Potassium channels normally open to turn a cell off, but with the attached photoswitch, they were opened when hit by ultraviolet light and closed when hit by green light, thereby activating and deactivating the neurons.
Subsequently, Trauner synthesized AAQ (acrylamide-azobenzene-quaternary ammonium), a photoswitch that attaches to potassium channels without the need to genetically modify the channel. Tests of this compound are reported in the current Neuron paper.
New versions of AAQ now being tested are better, Kramer said. They activate neurons for days rather than hours using blue-green light of moderate intensity, and these photoswitches naturally deactivate in darkness, so that a second color of light is not needed to switch them off.
"This is what we are really excited about," he said.
Source: Science Daily
ScienceDaily (July 25, 2012) — A new gene therapy approach can reverse hearing loss caused by a genetic defect in a mouse model of congenital deafness, according to a preclinical study published by Cell Press in the July 26 issue of the journal Neuron. The findings present a promising therapeutic avenue for potentially treating individuals who are born deaf.

(Credit: © Vasiliy Koval / Fotolia)
"This is the first time that an inherited, genetic hearing loss has been successfully treated in laboratory mice, and as such represents an important milestone for treating genetic deafness in humans," says senior study author Lawrence Lustig of the University of California, San Francisco.
Hearing loss is one of the most common human sensory deficits, and it results from damage to hair cells in the inner ear. About half of the cases of congenital hearing loss are caused by genetic defects. However, the current treatment options — hearing amplification devices and cochlear implants — do not restore hearing to normal levels. Correcting the underlying genetic defects has the potential to fully restore hearing, but previous attempts to reverse hearing loss caused by genetic mutations have not been successful.
Addressing this challenge in the new study, Lustig and his team used mice with hereditary deafness caused by a mutation in a gene coding for a protein called vesicular glutamate transporter-3 (VGLUT3). This protein is crucial for inner hair cells to send signals that enable hearing. Two weeks after the researchers delivered the VGLUT3 gene into the inner ear through an injection, hearing was restored in all of the mice. This improvement lasted between seven weeks and one and a half years when adult mice were treated, and at least nine months when newborn mice received the treatment.
The therapy did not damage the inner ear, and it even corrected some structural defects in the inner hair cells. Because the specific gene delivery method used is safe and effective in animals, the findings hold promise for future human studies. “For years, scientists have been hinting at the possibility of gene therapy as a potential cure for deafness,” Lustig says. “In this study, we now provide a very real and big step towards that goal.”
Source: Science Daily
July 25, 2012
Raising levels of the neurotransmitter dopamine in the frontal cortex of the brain significantly decreased impulsivity in healthy adults, in a study conducted by researchers at the Ernest Gallo Clinic and Research Center at the University of California, San Francisco.
"Impulsivity is a risk factor for addiction to many substances, and it has been suggested that people with lower dopamine levels in the frontal cortex tend to be more impulsive," said lead author Andrew Kayser, PhD, an investigator at Gallo and an assistant professor of neurology at UCSF. "We wanted to see if we could decrease impulsivity by raising dopamine, and it seems as if we can."
The study was published on July 4 in the Journal of Neuroscience.
In a double-blind, placebo-controlled study, 23 adult research participants were given either tolcapone, a medication approved by the Food and Drug Administration (FDA) that inhibits a dopamine-degrading enzyme, or a placebo. The researchers then gave the participants a task that measured impulsivity, asking them to make a hypothetical choice between receiving a smaller amount of money immediately (“smaller sooner”) or a larger amount at a later time (“larger later”). Each participant was tested twice, once with tolcapone and once with placebo.
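The “smaller sooner” versus “larger later” task is a delay-discounting measure of impulsivity. Such choices are often modeled with a hyperbolic discount function, V = A / (1 + kD), where a larger discount rate k means steeper devaluation of delayed rewards and hence greater impulsivity. The article does not specify the model used, so the sketch below, including its amounts, delays, and k values, is purely illustrative.

```python
# Illustrative sketch of the "smaller sooner" vs "larger later" choice,
# modeled with hyperbolic discounting V = A / (1 + k*D). The discount
# rate k and the amounts/delays are made up, not taken from the study.

def discounted_value(amount, delay_days, k):
    """Present value of a delayed reward under hyperbolic discounting."""
    return amount / (1.0 + k * delay_days)

def choose(sooner, later, k):
    """Return the option with the higher discounted value.

    Each option is a tuple (amount, delay_days)."""
    v_soon = discounted_value(*sooner, k)
    v_late = discounted_value(*later, k)
    return "smaller sooner" if v_soon >= v_late else "larger later"

# A more impulsive chooser (high k) discounts delayed money steeply:
print(choose(sooner=(20, 0), later=(50, 30), k=0.10))  # smaller sooner
# A less impulsive chooser (low k) waits for the larger reward:
print(choose(sooner=(20, 0), later=(50, 30), k=0.01))  # larger later
```

On this account, tolcapone shifting choices toward “larger later” corresponds to lowering a participant’s effective k.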
Participants – especially those who were more impulsive at baseline – were more likely to choose the less impulsive “larger later” option after taking tolcapone than they were after taking the placebo.
Magnetic resonance imaging conducted while the participants were taking the test confirmed that regions of the frontal cortex associated with decision-making were more active in the presence of tolcapone than in the presence of placebo.
"To our knowledge, this is the first study to use tolcapone to look for an effect on impulsivity," said Kayser.
The study was not designed to investigate the reasons that reduced dopamine is linked with impulsivity. However, explained Kayser, scientists believe that impulsivity is associated with an imbalance in dopamine between the frontal cortex, which governs executive functions such as cognitive control and self-regulation, and the striatum, which is thought to be involved in the planning and modification of more habitual behaviors.
"Most, if not all, drugs of abuse, such as cocaine and amphetamine, directly or indirectly involve the dopamine system," said Kayser. "They tend to increase dopamine in the striatum, which in turn may reward impulsive behavior. In a very simplistic fashion, the striatum is saying ‘go,’ and the frontal cortex is saying ‘stop.’ If you take cocaine, you’re increasing the ‘go’ signal, and the ‘stop’ signal is not adequate to counteract it."
Kayser and his research team plan a follow-up study of the effects of tolcapone on drinking behavior. “Once we determine whether drinkers can safely tolerate this medication, we will see if it has any effect on how much they drink while they’re taking it,” said Kayser.
Tolcapone is approved as a medication for Parkinson’s disease, in which a chronic deficit of dopamine inhibits movement.
Provided by University of California, San Francisco
Source: medicalxpress.com
July 25, 2012
Cognition psychologists at the Ruhr-Universität, together with colleagues from the University Hospital Bergmannsheil (Prof. Dr. Martin Tegenthoff), have discovered why stressed people are more likely to lapse back into habits than to behave in a goal-directed way. The team of PD Dr. Lars Schwabe and Prof. Dr. Oliver Wolf from the Institute for Cognitive Neuroscience mimicked a stress situation in the body using drugs and then examined brain activity using functional MRI scanning. As the researchers report in the Journal of Neuroscience, the interaction of the stress hormones hydrocortisone and noradrenaline shuts down the activity of brain regions for goal-directed behaviour, while the brain regions responsible for habitual behaviour remain unaffected.
In order to tease apart the effects of the different stress hormones, the cognition psychologists used three substances: a placebo, the stress hormone hydrocortisone, and yohimbine, which keeps the stress hormone noradrenaline active longer. Some of the volunteers received hydrocortisone alone or yohimbine alone, others received both substances, and a fourth group was given a placebo. Altogether, data from 69 volunteers were included in the study.
In the experiment, all participants, both male and female, learned that they would receive cocoa or orange juice as a reward if they chose certain symbols on the computer. After this learning phase, the volunteers were allowed to eat as many oranges or as much chocolate pudding as they liked. “That weakens the value of the reward,” explained Schwabe. “Whoever eats chocolate pudding loses the appetite for cocoa. Whoever is satiated with oranges has less appetite for orange juice.” In this context, goal-directed behaviour means that whoever has just eaten chocolate pudding chooses the symbols leading to the cocoa reward less frequently, and whoever is satiated with oranges selects the symbols associated with orange juice less frequently. Based on previous results, the scientists hypothesised that only the combination of yohimbine and hydrocortisone attenuates goal-directed behaviour, a hypothesis they have now confirmed.
As expected, volunteers who took both yohimbine and hydrocortisone did not behave in a goal-directed way but according to habit: satiation with oranges or chocolate pudding had no effect on their choices. Those who had taken a placebo or only one of the drugs, in contrast, behaved in a goal-directed way and showed the satiation effect. The brain data told the same story: the combination of yohimbine and hydrocortisone reduced activity in the forebrain, in the so-called orbitofrontal and medial prefrontal cortex, areas that have previously been associated with goal-directed behaviour. The brain regions important for habitual learning, on the other hand, were similarly active in all volunteers.
Provided by Ruhr-Universitaet-Bochum
Source: medicalxpress.com
July 25, 2012
(Medical Xpress) — New understanding of how the brain processes information from the inner ear offers hope for sufferers of vertigo.
If you have ever looked over the edge of a cliff and felt dizzy, you understand the challenges faced by people who suffer from symptoms of vestibular dysfunction such as vertigo and dizziness. There are over 70 million of them in North America. For people with vestibular loss, performing basic daily living activities that we take for granted (e.g. dressing, eating, getting in and out of bed, getting around inside as well as outside the home) becomes difficult since even small head movements are accompanied by dizziness and the risk of falling.
We’ve known for a while that a sensory system in the inner ear (the vestibular system) is responsible for helping us keep our balance by giving us a stable visual field as we move around. And while researchers have already developed a basic understanding of how the brain constructs our perceptions of ourselves in motion, until now no one has understood the crucial step by which the neurons in the brain select the information needed to keep us in balance.
The way that the brain takes in and decodes information sent by neurons in the inner ear is complex. The peripheral vestibular sensory neurons in the inner ear take in the time-varying acceleration and velocity stimuli caused by our movement in the outside world (such as those experienced while riding in a car that moves from a stationary position to 50 km per hour). These neurons transmit detailed information about these stimuli to the brain (i.e. information that allows one to reconstruct how these stimuli vary over time) in the form of nerve impulses.
Scientists had previously believed that the brain decoded this information linearly and therefore actually attempted to reconstruct the time course of velocity and acceleration stimuli. But by combining electrophysiological and computational approaches, Kathleen Cullen and Maurice Chacron, two professors in McGill University’s Department of Physiology, have been able to show for the first time that the neurons in the vestibular nuclei in the brain instead decode incoming information nonlinearly as they respond preferentially to unexpected, sudden changes in stimuli.
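The contrast between the two decoding schemes can be illustrated with a toy sketch: a linear readout reproduces the stimulus time course, while a nonlinear readout, like the vestibular nuclei neurons described here, stays silent during slow changes and fires only at abrupt ones. The threshold and stimulus values below are made up for illustration and are not the authors' model.

```python
# Toy illustration only: a "linear" readout that tracks the stimulus
# itself versus a "nonlinear" readout that responds preferentially to
# sudden changes between successive samples.

def linear_readout(stimulus, gain=1.0):
    """Responds in proportion to the stimulus at every sample."""
    return [gain * s for s in stimulus]

def change_detector(stimulus, threshold=2):
    """Responds only when the stimulus jumps abruptly between samples."""
    response = [0.0]
    for prev, cur in zip(stimulus, stimulus[1:]):
        delta = abs(cur - prev)
        response.append(delta if delta > threshold else 0.0)
    return response

# Slow drift followed by a sudden jump (like stepping off an unseen curb):
stim = [0, 0, 1, 1, 5, 5]
print(linear_readout(stim))   # mirrors the whole time course
print(change_detector(stim))  # silent until the jump: only index 4 is nonzero
```

The second readout discards the slowly varying part of the signal entirely, which is what makes the selective transmission described in the study a “sparse” rather than a “dense” code.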
It is known that representations of the outside world change at each stage in this sensory pathway. For example, in the visual system, neurons located closer to the periphery of the sensory system (e.g. ganglion cells in the retina) tend to respond to a wide range of sensory stimuli (a “dense” code), whereas central neurons (e.g. in the primary visual cortex at the back of the head) tend to respond much more selectively (a “sparse” code). Chacron and Cullen have discovered that the selective transmission of vestibular information they were able to document for the first time occurs as early as the first synapse in the brain. “We were able to show that the brain has developed this very sophisticated computational strategy to represent sudden changes in movement in order to generate quick accurate responses and maintain balance,” explained Prof. Cullen. “I keep describing it as elegant, because that’s really how it strikes me.”
This kind of selectivity in response is important for everyday life, since it enhances the brain’s perception of sudden changes in body posture. If you step off an unseen curb, within milliseconds your brain has both received the essential information and performed the sophisticated computation needed to help you readjust your position. This discovery is expected to apply to other sensory systems and eventually to the development of better treatments for patients who suffer from vertigo, dizziness, and disorientation during their daily activities. It should also lead to treatments that will help alleviate the symptoms that accompany motion and/or space sickness produced in more challenging environments.
Provided by McGill University
Source: medicalxpress.com
July 25, 2012
(HealthDay) — Shortened telomere length (TL) is associated with risks for dementia and mortality in a population of older adults, according to a study published online July 23 in the Archives of Neurology.

Lawrence S. Honig, M.D., Ph.D., from the Columbia University College of Physicians and Surgeons in New York City, and colleagues used real-time polymerase chain reaction analysis to determine TL in stored leukocyte DNA from 1,983 participants in a community-based study of aging. Participants were 65 years or older and blood was drawn at a mean age of 78.3 years. Participants were followed for a median of 9.3 years for mortality, and 9.6 percent developed incident dementia.
The researchers found that TL correlated inversely with age and was shorter in men than women. TL was significantly shorter in persons dying during follow-up compared with survivors, even after adjusting for age, sex, education, and apolipoprotein E genotype. TL was significantly shorter in the participants with incident and prevalent dementia, compared with those who remained dementia-free. Shorter TL correlated with earlier onset of dementia but this association was significant in women only.
"Our results show an association between shortened TL and mortality, and more specifically an association of shortened TL with Alzheimer’s disease, and are consistent with but not indicative of the possibility that TL may be a factor indicative of biological age," the authors conclude.
Source: medicalxpress.com
July 25, 2012
(Medical Xpress) — Many people, whether they know it or not, are philosophical dualists. That is, they believe that the brain and the mind are two separate entities. Despite the fact that dualist beliefs are found in virtually all human cultures, surprisingly little is known about the impact of these beliefs on how we think and behave in everyday life.
But a new research article forthcoming in Psychological Science, a journal of the Association for Psychological Science, suggests that espousing a dualist philosophy can have important real-life consequences.
Across five related studies, researchers Matthias Forstmann, Pascal Burgmer, and Thomas Mussweiler of the University of Cologne, Germany, found that people primed with dualist beliefs had more reckless attitudes toward health and exercise, and also preferred (and ate) a less healthy diet than those who were primed with physicalist beliefs.
Furthermore, they found that the relationship also worked in the other direction. People who were primed with unhealthy behaviors – such as pictures of unhealthy food – reported a stronger dualistic belief than participants who were primed with healthy behaviors.
Overall, the findings from the five studies provide converging evidence demonstrating that mind-body dualism has a noticeable impact on people’s health-related attitudes and behaviors. Specifically, these findings suggest that dualistic beliefs decrease the likelihood of engaging in healthy behavior.
These findings support the researchers’ original hypothesis that the more people perceive their minds and bodies to be distinct entities, the less likely they will be to engage in behaviors that protect their bodies. Bodies are ultimately viewed as a disposable vessel that helps the mind interact with the physical world.
Evidence of a bidirectional relationship further suggests that metaphysical beliefs, such as beliefs in mind-body dualism, may serve as cognitive tools for coping with threatening or harmful situations.
The fact that the simple priming procedures used in the studies had an immediate impact on health-related attitudes and behavior suggests that these procedures may eventually have profound implications for real-life problems. Interventions that reduce dualistic beliefs through priming could be one way to help promote healthier – or less self-damaging – behaviors in at-risk populations.
Provided by Association for Psychological Science
Source: medicalxpress.com
New research suggests that patients whose mobility has been limited by stroke may one day use their imagination and a computer link to move their hands.
Working with stroke patients, scientists at Washington University School of Medicine in St. Louis have shown that they can detect the brain signals produced simply by thinking about moving a partially or completely paralyzed hand. The half of the brain that normally thinks such thoughts and moves the hand can no longer do so because of stroke damage. Instead, the signal comes from the undamaged half of the brain.
The new study suggests it may be possible to harness these signals to restore a fuller range of movement in the patient’s limbs.
“We’ve known for some time that the brain can reroute or otherwise adapt its circuits to cope with an injury,” says senior author Eric Leuthardt, MD, associate professor of neurosurgery, of biomedical engineering and of neurobiology. “Now we have proof-of-principle that we can use technology to aid that process.”
To demonstrate the potential to help restore movement, scientists connected brain signals detected by an electrode-studded cap to the movements of a cursor on a computer screen. In 30 minutes or less, patients learned to control the movement of the cursor with thoughts of moving their impaired hand. Researchers are now working on a motorized glove that will make the imagined movements a reality.
The results are available online in The Journal of Neural Engineering.
Leuthardt, who is director of Washington University’s Center for Innovation in Neuroscience and Technology, is a pioneer in the field of brain-computer interfaces, or devices that allow the brain to communicate directly with computers to restore abilities lost to injury or disease.
Much of Leuthardt’s research has focused on patients with epilepsy who are undergoing surgery to remove the part of the brain where their seizures originate. He uses the electrode grids temporarily implanted on the surface of the brain to pinpoint areas where the seizures begin. With the patients’ permission, Leuthardt also uses the implants to gather and analyze detailed information on brain activity for future use in brain-computer interfaces. This approach laid the foundations for the technique now being applied to the stroke population.
In the new research, first author David Bundy, a graduate student, worked with four patients who had suffered strokes that caused extensive damage on one side of the brain. All were experiencing paralysis or significant difficulty moving the hand on the opposite side of the body.
The brain signals that control movement are low-frequency signals, which makes them relatively easy to detect with electrodes on the outside of the skull. Researchers fitted patients with an electrode-studded cap connected to a computer, and asked them to perform a finger-tapping activity. Depending on a cue flashed on a screen in front of them, the patients either tapped the fingers of their unimpaired hand or imagined tapping the fingers of the impaired hand. Scientists used the cap to identify signals in the healthy part of the brain that accompanied the imagined movements.
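A minimal, dependency-free sketch of how such a low-frequency rhythm might be read out: motor imagery is commonly detected as a change in power in a narrow frequency band (here 10 Hz, in the mu range) over motor cortex, estimated with a single-bin discrete Fourier transform. The sampling rate, frequency, threshold, and detection rule below are illustrative assumptions, not the study's actual pipeline.

```python
import math

# Illustrative sketch (not the study's method): estimate the power of a
# low-frequency EEG rhythm at one chosen frequency via a single DFT bin,
# then flag motor imagery as a suppression of that rhythm. The 10 Hz
# band, the sampling rate, and the threshold are all made-up examples.

def band_power(samples, freq_hz, sample_rate):
    """Power of `samples` at `freq_hz`, computed from one DFT bin."""
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * freq_hz * i / sample_rate)
             for i, s in enumerate(samples))
    im = sum(-s * math.sin(2 * math.pi * freq_hz * i / sample_rate)
             for i, s in enumerate(samples))
    return (re * re + im * im) / n

def detects_imagined_movement(samples, sample_rate, threshold=1.0):
    """Flag a trial when 10 Hz power falls below the (arbitrary) threshold."""
    return band_power(samples, 10.0, sample_rate) < threshold

# Strong 10 Hz rhythm at rest vs. a suppressed rhythm during imagery:
rate = 100
rest = [math.sin(2 * math.pi * 10 * i / rate) for i in range(200)]
imagery = [0.05 * s for s in rest]
print(detects_imagined_movement(rest, rate))     # False
print(detects_imagined_movement(imagery, rate))  # True
```

In a real system such a per-trial power estimate, rather than a fixed threshold, would drive the cursor continuously, which is how the patients in the study could learn to steer it within half an hour.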
The researchers are now developing motorized braces that can be controlled by similar signals, with the goal of restoring full movement in weak or paralyzed limbs.
“This is an exciting development that opens up new opportunities to help even more patients overcome limitations imposed by brain damage or degeneration,” Leuthardt says.