Neuroscience

July 2012

Efficacy of Transcranial Magnetic Stimulation for Depression Confirmed in New Study

ScienceDaily (July 26, 2012) — In one of the first studies to look at transcranial magnetic stimulation (TMS) in real-world clinical practice settings, researchers at Butler Hospital, along with colleagues across the U.S., confirmed that TMS is an effective treatment for patients with depression who are unable to find symptom relief through antidepressant medications. The study findings are published online in the June 11, 2012 edition of Depression and Anxiety in the Wiley Online Library.

(Credit: Butler Hospital)

Previous analysis of the efficacy of TMS has been provided through more than 30 published trials, yielding generally consistent results supporting the use of TMS to treat depression when medications aren’t sufficient. “Those previous studies were key in laying the groundwork for the FDA to approve the first device for delivery of TMS as a treatment for depression in 2008,” said Linda Carpenter, MD, lead author of the report and chief of the Mood Disorders Program and the Neuromodulation Clinic at Butler Hospital. “Naturalistic studies like ours, which provide scrutiny of real-life patient outcomes when TMS therapy is given in actual clinical practice settings, are the next step in further understanding the effectiveness of TMS. They are also important for informing healthcare policy, particularly in an era when difficult decisions must be made about allocation of scarce resources.”

Carpenter explains that naturalistic studies differ from controlled clinical trials because they permit the inclusion of subjects with a wider range of symptomatology and comorbidity, whereas controlled clinical trials typically have more rigid criteria for inclusion. “As a multisite study collecting naturalistic outcomes from patients in clinics in various regions in the U.S., we were also able to capture effects that might arise from introducing a novel psychiatric treatment modality like TMS in non-research settings,” said Carpenter. In all, the study confirms how well TMS works in the diverse settings where it is administered to a real-life population of patients with depression who have not found relief through many other available treatments.

The published report summarized data collected from 42 clinical TMS practice sites in the US, and included outcomes from 307 patients with Major Depressive Disorder (MDD) who had persistent symptoms despite the use of antidepressant medication. Change during TMS was assessed using both clinicians’ ratings of overall depression severity and scores on patient self-report depression scales, which require the patient to rate the severity of each symptom on the same standardized scale at the end of each 2-week period. Rates for “response” and “remission” to TMS were calculated based on the same cut-off scores and conventions used for other clinical trials of antidepressant treatments. A 58 percent response rate and a 37 percent remission rate to TMS were observed.

"The patient outcomes we found in this study demonstrated a response rate similar to controlled clinical trial populations," said Dr. Carpenter, explaining that these new data validate TMS efficacy in treating depression for those who have failed to benefit from antidepressant medications. "Continued research and confirmation of the effectiveness of TMS is important for understanding its place in everyday psychiatric care and to support advocacy for insurance coverage of the treatment." Thanks in part to the advocacy efforts of Dr. Carpenter, TMS was recently approved for coverage by Medicare in New England, and it is also now covered by BCBSRI. "Next steps for TMS research involve enhancing our understanding of how to maintain positive response to TMS over time after the course of therapy ends and learning how to customize the treatment for patients using newer technologies, so TMS can help even more patients."

Source: Science Daily

Jul 27, 2012 · 21 notes
#science #neuroscience #brain #psychology #depression #TMS #antidepressant treatments
Controlling Monkey Brains and Behavior With Light

ScienceDaily (July 26, 2012) — Researchers reporting online on July 26 in Current Biology, a Cell Press publication, have for the first time shown that they can control the behavior of monkeys by using pulses of blue light to very specifically activate particular brain cells. The findings represent a key advance for optogenetics, a state-of-the-art method for making causal connections between brain activity and behavior. Based on the discovery, the researchers say that similar light-based mind control could likely also be made to work in humans for therapeutic ends.

(Credit: © Eric Isselée / Fotolia)

"We are the first to show that optogenetics can alter the behavior of monkeys," says Wim Vanduffel of Massachusetts General Hospital and KU Leuven Medical School. "This opens the door to use of optogenetics at a large scale in primate research and to start developing optogenetic-based therapies for humans."

In optogenetics, neurons are made to respond to light through the insertion of light-sensitive genes derived from particular microbial organisms. Earlier studies had primarily validated this method for use in invertebrates and rodents, with only a few studies showing that optogenetics can alter activity in monkey brains on a fine scale.

In the new study, the researchers focused on neurons that control particular eye movements. Using optogenetics together with functional magnetic resonance imaging (fMRI), they showed that they could use light to activate these neurons, generating brain activity and subtle changes in eye-movement behavior.

The researchers also found that optogenetic stimulation of their focal brain region produced changes in the activity of specific neural networks located at some distance from the primary site of light activation.

The findings not only pave the way for a much more detailed understanding of how different parts of the brain control behavior, but they may also have important clinical applications in treating Parkinson’s disease, addiction, depression, obsessive-compulsive disorder, and other neurological conditions.

"Several neurological disorders can be attributed to the malfunctioning of specific cell types in very specific brain regions," Vanduffel says. "As already suggested by one of the leading researchers in optogenetics, Karl Deisseroth from Stanford University, it is important to identify the underlying neuronal circuits and the precise nature of the aberrations that lead to the neurological disorders and potentially to manipulate those malfunctioning circuits with high precision to restore them. The beauty of optogenetics is that, unlike any other method, one can affect the activity of very specific cell types, leaving others untouched."

Source: Science Daily

Jul 27, 2012 · 14 notes
#science #neuroscience #brain #psychology #biology #behavior #optogenetics #neuron
“

Can the simple act of recognizing a face as you walk down the street change the way we think? Or can taking the time to notice something new on our way to work change what we remember about that walk? In a new study published in the journal Science, New York University researchers show that remembering something old or noticing something new can bias how you process subsequent information.

This novel finding suggests that our memory system can adaptively bias its processing towards forming new memories or retrieving old ones based on recent experiences. For example, when you walk into a restaurant for the first time, your memory system can both encode the details of this new environment as well as allow you to remember a similar one where you recently dined with a friend. The results of this study suggest that what you did right before walking into the restaurant can determine which process is more likely to occur.

By contrast, in another experiment, the researchers demonstrated that the same manipulation can also influence how we form new memories. In this study, the researchers tested how well participants were able to form links between overlapping memories. They found that participants were more likely to construct these links when the overlapping memories were formed immediately after retrieving an unrelated old object as compared to identifying a new one. This suggests that after processing old objects, participants were more likely to retrieve the associated memories and link them to an ongoing experience.

”
—One act of remembering can influence future acts: study (via myserendipities)
Jul 27, 2012 · 75 notes
Chemical Makes Blind Mice See; Compound Holds Promise for Treating Humans

ScienceDaily (July 25, 2012) — A team of University of California, Berkeley, scientists in collaboration with researchers at the University of Munich and University of Washington, in Seattle, has discovered a chemical that temporarily restores some vision to blind mice, and is working on an improved compound that may someday allow people with degenerative blindness to see again.

Mice with a genetic disease that causes blindness regained some sight after injection with a chemical “photoswitch.” The eye of the untreated mouse on the left shows no response to light, while the pupil of the mouse on the right, which was injected with the chemical, contracts in light. (Credit: Image courtesy of University of California - Berkeley)

The approach could eventually help those with retinitis pigmentosa, a genetic disease that is the most common inherited form of blindness, as well as age-related macular degeneration, the most common cause of acquired blindness in the developed world. In both diseases, the light sensitive cells in the retina — the rods and cones — die, leaving the eye without functional photoreceptors.

The chemical, called AAQ, acts by making the remaining, normally “blind” cells in the retina sensitive to light, said lead researcher Richard Kramer, UC Berkeley professor of molecular and cell biology. AAQ is a photoswitch that binds to protein ion channels on the surface of retinal cells. When switched on by light, AAQ alters the flow of ions through the channels and activates these neurons much the way rods and cones are activated by light.

"This is similar to the way local anesthetics work: they embed themselves in ion channels and stick around for a long time, so that you stay numb for a long time," Kramer said. "Our molecule is different in that it’s light sensitive, so you can turn it on and off and turn on or off neural activity."

Because the chemical eventually wears off, it may offer a safer alternative to other experimental approaches for restoring sight, such as gene or stem cell therapies, which permanently change the retina. It is also less invasive than implanting light-sensitive electronic chips in the eye.

"The advantage of this approach is that it is a simple chemical, which means that you can change the dosage, you can use it in combination with other therapies, or you can discontinue the therapy if you don’t like the results. As improved chemicals become available, you could offer them to patients. You can’t do that when you surgically implant a chip or after you genetically modify somebody," Kramer said.

"This is a major advance in the field of vision restoration," said co-author Dr. Russell Van Gelder, an ophthalmologist and chair of the Department of Ophthalmology at the University of Washington, Seattle.

Kramer, Van Gelder, chemist Dirk Trauner and their colleagues at UC Berkeley, the University of Washington, Seattle, and the University of Munich will publish their findings on July 26, in the journal Neuron.

The blind mice in the experiment had genetic mutations that made their rods and cones die within months of birth and inactivated other photopigments in the eye. After injecting very small amounts of AAQ into the eyes of the blind mice, Kramer and his colleagues confirmed that they had restored light sensitivity because the mice’s pupils contracted in bright light, and the mice showed light avoidance, a typical rodent behavior impossible without the animals being able to see some light. Kramer is hoping to conduct more sophisticated vision tests in rodents injected with the next generation of the compound.

"The photoswitch approach offers real hope to patients with retinal degeneration," Van Gelder said. "We still need to show that these compounds are safe and will work in people the way they work in mice, but these results demonstrate that this class of compound restores light sensitivity to retinas blind from genetic disease."

From optogenetics to implanted chips

The current technologies being evaluated for restoring sight to people whose rods and cones have died include injection of stem cells to regenerate the rods and cones; “optogenetics,” that is, gene therapy to insert a photoreceptor gene into blind neurons to make them sensitive to light; and installation of electronic prosthetic devices, such as a small light-sensitive retinal chip with electrodes that stimulate blind neurons. Several dozen people already have retinal implants and have had rudimentary, low vision restored, Kramer said.

Eight years ago, Kramer, Trauner, a former UC Berkeley chemist now at the University of Munich, and their colleagues developed an optogenetic technique to chemically alter potassium ion channels in blind neurons so that a photoswitch could latch on. Potassium channels normally open to turn a cell off, but with the attached photoswitch, they were opened when hit by ultraviolet light and closed when hit by green light, thereby activating and deactivating the neurons.

Subsequently, Trauner synthesized AAQ (acrylamide-azobenzene-quaternary ammonium), a photoswitch that attaches to potassium channels without the need to genetically modify the channel. Tests of this compound are reported in the current Neuron paper.

New versions of AAQ now being tested are better, Kramer said. They activate neurons for days rather than hours using blue-green light of moderate intensity, and these photoswitches naturally deactivate in darkness, so that a second color of light is not needed to switch them off.

"This is what we are really excited about," he said.

Source: Science Daily

Jul 27, 2012 · 30 notes
#blindness #brain #chemicals #disease #genetics #neuroscience #psychology #science #vision #AAQ
Gene Therapy Holds Promise for Reversing Congenital Hearing Loss

ScienceDaily (July 25, 2012) — A new gene therapy approach can reverse hearing loss caused by a genetic defect in a mouse model of congenital deafness, according to a preclinical study published by Cell Press in the July 26 issue of the journal Neuron. The findings present a promising therapeutic avenue for potentially treating individuals who are born deaf.

(Credit: © Vasiliy Koval / Fotolia)

"This is the first time that an inherited, genetic hearing loss has been successfully treated in laboratory mice, and as such represents an important milestone for treating genetic deafness in humans," says senior study author Lawrence Lustig of the University of California, San Francisco.

Hearing loss is one of the most common human sensory deficits, and it results from damage to hair cells in the inner ear. About half of the cases of congenital hearing loss are caused by genetic defects. However, the current treatment options — hearing amplification devices and cochlear implants — do not restore hearing to normal levels. Correcting the underlying genetic defects has the potential to fully restore hearing, but previous attempts to reverse hearing loss caused by genetic mutations have not been successful.

Addressing this challenge in the new study, Lustig and his team used mice with hereditary deafness caused by a mutation in a gene coding for a protein called vesicular glutamate transporter-3 (VGLUT3). This protein is crucial for inner hair cells to send signals that enable hearing. Two weeks after the researchers delivered the VGLUT3 gene into the inner ear through an injection, hearing was restored in all of the mice. This improvement lasted between seven weeks and one and a half years when adult mice were treated, and at least nine months when newborn mice received the treatment.

The therapy did not damage the inner ear, and it even corrected some structural defects in the inner hair cells. Because the specific gene delivery method used is safe and effective in animals, the findings hold promise for future human studies. “For years, scientists have been hinting at the possibility of gene therapy as a potential cure for deafness,” Lustig says. “In this study, we now provide a very real and big step towards that goal.”

Source: Science Daily

Jul 27, 2012 · 11 notes
#science #neuroscience #psychology #congenital deafness #hearing loss #genetics #VGLUT3
Increasing dopamine in brain's frontal cortex decreases impulsive tendency: research

July 25, 2012

Raising levels of the neurotransmitter dopamine in the frontal cortex of the brain significantly decreased impulsivity in healthy adults, in a study conducted by researchers at the Ernest Gallo Clinic and Research Center at the University of California, San Francisco.

"Impulsivity is a risk factor for addiction to many substances, and it has been suggested that people with lower dopamine levels in the frontal cortex tend to be more impulsive," said lead author Andrew Kayser, PhD, an investigator at Gallo and an assistant professor of neurology at UCSF. "We wanted to see if we could decrease impulsivity by raising dopamine, and it seems as if we can."

The study was published on July 4 in the Journal of Neuroscience.

In a double-blinded, placebo-controlled study, 23 adult research participants were given either tolcapone, a medication approved by the Food and Drug Administration (FDA) that inhibits a dopamine-degrading enzyme, or a placebo. The researchers then gave the participants a task that measured impulsivity, asking them to make a hypothetical choice between receiving a smaller amount of money immediately (“smaller sooner”) or a larger amount at a later time (“larger later”). Each participant was tested twice, once with tolcapone and once with placebo.

Participants – especially those who were more impulsive at baseline – were more likely to choose the less impulsive “larger later” option after taking tolcapone than they were after taking the placebo.

Magnetic resonance imaging conducted while the participants were taking the test confirmed that regions of the frontal cortex associated with decision-making were more active in the presence of tolcapone than in the presence of placebo.

"To our knowledge, this is the first study to use tolcapone to look for an effect on impulsivity," said Kayser.

The study was not designed to investigate the reasons that reduced dopamine is linked with impulsivity. However, explained Kayser, scientists believe that impulsivity is associated with an imbalance in dopamine between the frontal cortex, which governs executive functions such as cognitive control and self-regulation, and the striatum, which is thought to be involved in the planning and modification of more habitual behaviors.

"Most, if not all, drugs of abuse, such as cocaine and amphetamine, directly or indirectly involve the dopamine system," said Kayser. "They tend to increase dopamine in the striatum, which in turn may reward impulsive behavior. In a very simplistic fashion, the striatum is saying ‘go,’ and the frontal cortex is saying ‘stop.’ If you take cocaine, you’re increasing the ‘go’ signal, and the ‘stop’ signal is not adequate to counteract it."

Kayser and his research team plan a follow-up study of the effects of tolcapone on drinking behavior. “Once we determine whether drinkers can safely tolerate this medication, we will see if it has any effect on how much they drink while they’re taking it,” said Kayser.

Tolcapone is approved as a medication for Parkinson’s disease, in which a chronic deficit of dopamine inhibits movement.

Provided by University of California, San Francisco

Source: medicalxpress.com

Jul 27, 2012 · 26 notes
#science #neuroscience #brain #psychology #dopamine #neurotransmitter #impulsive tendency
Force of habit: Stress hormones switch off areas of the brain for goal-directed behaviour

July 25, 2012

Cognition psychologists at the Ruhr-Universität, together with colleagues from the University Hospital Bergmannsheil (Prof. Dr. Martin Tegenthoff), have discovered why stressed persons are more likely to lapse back into habits than to behave in a goal-directed manner. The team of PD Dr. Lars Schwabe and Prof. Dr. Oliver Wolf from the Institute for Cognitive Neuroscience mimicked a stress situation in the body using drugs and then examined brain activity using functional MRI scanning. The researchers have now reported in the Journal of Neuroscience that the interaction of the stress hormones hydrocortisone and noradrenaline shuts down the activity of brain regions for goal-directed behaviour, while the brain regions responsible for habitual behaviour remain unaffected.

In order to test the different stress hormones, the cognition psychologists used three substances: a placebo, the stress hormone hydrocortisone, and yohimbine, which ensures that the stress hormone noradrenaline stays active longer. Some of the volunteers received hydrocortisone alone or just yohimbine; others received both substances. A fourth group was administered a placebo. Altogether, the data of 69 volunteers was included in the study.

In the experiment, all participants - both male and female - learned that they would receive cocoa or orange juice as a reward if they chose certain symbols on the computer. After this learning phase, volunteers were allowed to eat as many oranges or as much chocolate pudding as they liked. “That weakens the value of the reward”, explained Schwabe. “Whoever eats chocolate pudding loses the attraction to cocoa. Whoever is satiated with oranges has less appetite for orange juice.” In this context, goal-directed behaviour means: whoever has previously eaten the chocolate pudding chooses the symbols leading to the cocoa reward less frequently, and whoever is satiated with oranges selects the symbols associated with orange juice less frequently. Based on previous results, the scientists assumed that only the combination of yohimbine and hydrocortisone attenuates goal-directed behaviour. They have now confirmed this hypothesis.

As expected, volunteers who took yohimbine and hydrocortisone did not behave in a goal-directed way but according to habit. In other words, satiation with oranges or chocolate pudding had no effect. Persons who had taken a placebo or only one medication, on the other hand, behaved in a goal-directed manner and showed the satiation effect. The brain data revealed that the combination of yohimbine and hydrocortisone reduced activity in the forebrain – in the so-called orbitofrontal and medial prefrontal cortex. These areas have previously been associated with goal-directed behaviour. The brain regions which are important for habitual learning, on the other hand, were similarly active in all volunteers.

Provided by Ruhr-Universitaet-Bochum

Source: medicalxpress.com

Jul 27, 2012 · 37 notes
#science #neuroscience #brain #psychology #stress #habits #goal-directed behaviour #hydrocortisone #yohimbine
Decoding the secrets of balance

July 25, 2012

(Medical Xpress) — New understanding of how the brain processes information from the inner ear offers hope for sufferers of vertigo.

If you have ever looked over the edge of a cliff and felt dizzy, you understand the challenges faced by people who suffer from symptoms of vestibular dysfunction such as vertigo and dizziness. There are over 70 million of them in North America. For people with vestibular loss, performing basic daily living activities that we take for granted (e.g. dressing, eating, getting in and out of bed, getting around inside as well as outside the home) becomes difficult since even small head movements are accompanied by dizziness and the risk of falling.

We’ve known for a while that a sensory system in the inner ear (the vestibular system) is responsible for helping us keep our balance by giving us a stable visual field as we move around. And while researchers have already developed a basic understanding of how the brain constructs our perceptions of ourselves in motion, until now no one has understood the crucial step by which the neurons in the brain select the information needed to keep us in balance.

The way that the brain takes in and decodes information sent by neurons in the inner ear is complex. The peripheral vestibular sensory neurons in the inner ear take in the time-varying acceleration and velocity stimuli caused by our movement in the outside world (such as those experienced while riding in a car that moves from a stationary position to 50 km per hour). These neurons transmit detailed information about these stimuli to the brain (i.e. information that allows one to reconstruct how these stimuli vary over time) in the form of nerve impulses.

Scientists had previously believed that the brain decoded this information linearly and therefore actually attempted to reconstruct the time course of velocity and acceleration stimuli. But by combining electrophysiological and computational approaches, Kathleen Cullen and Maurice Chacron, two professors in McGill University’s Department of Physiology, have been able to show for the first time that the neurons in the vestibular nuclei in the brain instead decode incoming information nonlinearly as they respond preferentially to unexpected, sudden changes in stimuli.

It is known that representations of the outside world change at each stage in this sensory pathway. For example, in the visual system, neurons located closer to the periphery of the sensory system (e.g. ganglion cells in the retina) tend to respond to a wide range of sensory stimuli (a “dense” code), whereas central neurons (e.g. in the primary visual cortex at the back of the head) tend to respond much more selectively (a “sparse” code). Chacron and Cullen have discovered that the selective transmission of vestibular information they were able to document for the first time occurs as early as the first synapse in the brain. “We were able to show that the brain has developed this very sophisticated computational strategy to represent sudden changes in movement in order to generate quick accurate responses and maintain balance,” explained Prof. Cullen. “I keep describing it as elegant, because that’s really how it strikes me.”

This kind of selectivity in response is important for everyday life, since it enhances the brain’s perception of sudden changes in body posture. So if you step off an unseen curb, within milliseconds your brain has both received the essential information and performed the sophisticated computation needed to help you readjust your position. This discovery is expected to apply to other sensory systems and eventually to the development of better treatments for patients who suffer from vertigo, dizziness, and disorientation during their daily activities. It should also lead to treatments that will help alleviate the symptoms that accompany motion and/or space sickness produced in more challenging environments.

Provided by McGill University

Source: medicalxpress.com

Jul 27, 2012 · 22 notes
#neuroscience #psychology #brain #science #balance #vertigo #vestibular system #ear #motion #neuron
Shortened telomere length tied to dementia, mortality risk

July 25, 2012

(HealthDay) — Shortened telomere length (TL) is associated with risks for dementia and mortality in a population of older adults, according to a study published online July 23 in the Archives of Neurology.

Lawrence S. Honig, M.D., Ph.D., from the Columbia University College of Physicians and Surgeons in New York City, and colleagues used real-time polymerase chain reaction analysis to determine TL in stored leukocyte DNA from 1,983 participants in a community-based study of aging. Participants were 65 years or older and blood was drawn at a mean age of 78.3 years. Participants were followed for a median of 9.3 years for mortality, and 9.6 percent developed incident dementia.

The researchers found that TL correlated inversely with age and was shorter in men than in women. TL was significantly shorter in persons dying during follow-up compared with survivors, even after adjusting for age, sex, education, and apolipoprotein E genotype. TL was significantly shorter in the participants with incident and prevalent dementia, compared with those who remained dementia-free. Shorter TL correlated with earlier onset of dementia, but this association was significant in women only.

"Our results show an association between shortened TL and mortality, and more specifically an association of shortened TL with Alzheimer’s disease, and are consistent with but not indicative of the possibility that TL may be a factor indicative of biological age," the authors conclude.

Source: medicalxpress.com

Jul 26, 2012 · 25 notes
#science #neuroscience #psychology #brain #telomere #dementia #mortality #alzheimer #alzheimer's disease #research
Mind vs. body? Dualist beliefs linked with less concern for healthy behaviors

July 25, 2012

(Medical Xpress) — Many people, whether they know it or not, are philosophical dualists. That is, they believe that the brain and the mind are two separate entities. Despite the fact that dualist beliefs are found in virtually all human cultures, surprisingly little is known about the impact of these beliefs on how we think and behave in everyday life.

But a new research article forthcoming in Psychological Science, a journal of the Association for Psychological Science, suggests that espousing a dualist philosophy can have important real-life consequences.

Across five related studies, researchers Matthias Forstmann, Pascal Burgmer, and Thomas Mussweiler of the University of Cologne, Germany, found that people primed with dualist beliefs had more reckless attitudes toward health and exercise, and also preferred (and ate) a less healthy diet than those who were primed with physicalist beliefs.

Furthermore, they found that the relationship also worked in the other direction. People who were primed with unhealthy behaviors – such as pictures of unhealthy food – reported a stronger dualistic belief than participants who were primed with healthy behaviors.

Overall, the findings from the five studies provide converging evidence demonstrating that mind-body dualism has a noticeable impact on people’s health-related attitudes and behaviors. Specifically, these findings suggest that dualistic beliefs decrease the likelihood of engaging in healthy behavior.

These findings support the researchers’ original hypothesis that the more people perceive their minds and bodies to be distinct entities, the less likely they will be to engage in behaviors that protect their bodies. On this view, the body is ultimately a disposable vessel that helps the mind interact with the physical world.

Evidence of a bidirectional relationship further suggests that metaphysical beliefs, such as beliefs in mind-body dualism, may serve as cognitive tools for coping with threatening or harmful situations.

The fact that the simple priming procedures used in the studies had an immediate impact on health-related attitudes and behavior suggests that these procedures may eventually have profound implications for real-life problems. Interventions that reduce dualistic beliefs through priming could be one way to help promote healthier – or less self-damaging – behaviors in at-risk populations.

Provided by Association for Psychological Science

Source: medicalxpress.com

Jul 26, 2012 · 43 notes
#brain #health #mind-body problem #neuroscience #psychology #science #dualism
Computers may help patients restore movement after stroke

New research suggests that patients whose mobility has been limited by stroke may one day use their imagination and a computer link to move their hands.

image

Leuthardt

Scientists at Washington University School of Medicine in St. Louis have shown that they can detect, in patients, the brain simply thinking about moving a partially or completely paralyzed hand. The half of the brain that normally thinks such thoughts and moves the hand can no longer do so because of stroke damage. Instead, the signal comes from the undamaged half of the brain.

The new study suggests it may be possible to harness these signals to restore a fuller range of movement in the patient’s limbs.

“We’ve known for some time that the brain can reroute or otherwise adapt its circuits to cope with an injury,” says senior author Eric Leuthardt, MD, associate professor of neurosurgery, of biomedical engineering and of neurobiology. “Now we have proof-of-principle that we can use technology to aid that process.”

To demonstrate the potential to help restore movement, scientists connected brain signals detected by an electrode-studded cap to the movements of a cursor on a computer screen. In 30 minutes or less, patients learned to control the movement of the cursor with thoughts of moving their impaired hand. Researchers are now working on a motorized glove that will make the imagined movements a reality.

The results are available online in The Journal of Neural Engineering.

Leuthardt, who is director of Washington University’s Center for Innovation in Neuroscience and Technology, is a pioneer in the field of brain-computer interfaces, or devices that allow the brain to communicate directly with computers to restore abilities lost to injury or disease.

Much of Leuthardt’s research has focused on patients with epilepsy who are undergoing surgery to remove the part of the brain where their seizures originate. He uses the electrode grids temporarily implanted on the surface of the brain to pinpoint areas where the seizures begin. With the patients’ permission, Leuthardt also uses the implants to gather and analyze detailed information on brain activity for future use in brain-computer interfaces. This approach laid the foundations for the technique now being applied to the stroke population.

In the new research, first author David Bundy, a graduate student, worked with four patients who had suffered strokes that caused extensive damage on one side of the brain. All were experiencing paralysis or significant difficulty moving the hand on the opposite side of the body.

The brain signals that control movement are low-frequency signals, which makes them relatively easy to detect with electrodes on the outside of the skull. Researchers fitted patients with an electrode-studded cap connected to a computer, and asked them to perform a finger-tapping activity. Depending on a cue flashed on a screen in front of them, the patients either tapped the fingers of their unimpaired hand or imagined tapping the fingers of the impaired hand. Scientists used the cap to identify signals in the healthy part of the brain that accompanied the imagined movements.

The researchers are now developing motorized braces that can be controlled by similar signals, with the goal of restoring full movement in weak or paralyzed limbs.

“This is an exciting development that opens up new opportunities to help even more patients overcome limitations imposed by brain damage or degeneration,” Leuthardt says.

Source: Washington University in St. Louis

Jul 26, 2012 · 11 notes
#science #neuroscience #brain #psychology #stroke #paralysis #movement #brain-computer studies #brain damage
New drug could treat Alzheimer's, multiple sclerosis and brain injury

July 24, 2012

A new class of drug developed at Northwestern University Feinberg School of Medicine shows early promise of being a one-size-fits-all therapy for Alzheimer’s disease, Parkinson’s disease, multiple sclerosis and traumatic brain injury by reducing inflammation in the brain.

Northwestern has recently been issued patents to cover this new drug class and has licensed the commercial development to a biotech company that has recently completed the first human Phase 1 clinical trial for the drug.

The drugs in this class target a particular type of brain inflammation, which is a common denominator in these neurological diseases and in traumatic brain injury and stroke. This brain inflammation, also called neuroinflammation, is increasingly believed to play a major role in the progressive damage characteristic of these chronic diseases and brain injuries.

By addressing brain inflammation, the new class of drugs — represented by MW151 and MW189 — offers an entirely different therapeutic approach to Alzheimer’s than current ones being tested to prevent the development of beta amyloid plaques in the brain. The plaques are an indicator of the disease but not a proven cause.

A new preclinical study published today in the Journal of Neuroscience, reports that when one of the new Northwestern drugs is given to a mouse genetically engineered to develop Alzheimer’s, it prevents the development of the full-blown disease. The study, from Northwestern’s Feinberg School and the University of Kentucky, identifies the optimal therapeutic time window for administering the drug, which is taken orally and easily crosses the blood-brain barrier.

"This could become part of a collection of drugs you could use to prevent the development of Alzheimer’s," said D. Martin Watterson, a professor of molecular pharmacology and biological chemistry at the Feinberg School, whose lab developed the drug. He is a coauthor of the study.

In previous animal studies, the same drug reduced the neurological damage caused by closed-head traumatic brain injury and inhibited the development of a multiple sclerosis-like disease. In these diseases as well as in Alzheimer’s, the studies show the therapy time window is critical.


Jul 25, 2012 · 29 notes
#MS #alzheimer #alzheimer's disease #brain #brain injury #drug #medication #neuroscience #parkinson #parkinson's disease #psychology #science #disease #neuroinflammation
How a Single Brain Trauma May Lead to Alzheimer's Disease

ScienceDaily (July 24, 2012) — A study, performed in mice and utilizing post-mortem samples of brains from patients with Alzheimer’s disease, found that a single event of a moderate-to-severe traumatic brain injury (TBI) can disrupt proteins that regulate an enzyme associated with Alzheimer’s. The paper, published in The Journal of Neuroscience, identifies the complex mechanisms that result in a rapid and robust post-injury elevation of the enzyme, BACE1, in the brain. These results may lead to the development of a drug treatment that targets this mechanism to slow the progression of Alzheimer’s disease.

"A moderate-to-severe TBI, or head trauma, is one of the strongest environmental risk factors for Alzheimer’s disease. A serious TBI can lead to a dysfunction in the regulation of the enzyme BACE1. Elevations of this enzyme cause elevated levels of amyloid-beta, the key component of brain plaques associated with senility and Alzheimer’s disease," said first author Kendall Walker, PhD, postdoctoral associate in the department of neuroscience at Tufts University School of Medicine (TUSM).

Building on her previous work, neuroscientist Giuseppina Tesco, MD, PhD, of Tufts University School of Medicine (TUSM), led a research team that first used an in vivo model to determine how a single episode of TBI could alter the brain. In the acute phase (first two days) following injury, levels of two intracellular trafficking proteins (GGA1 and GGA3) were reduced, and an elevation of BACE1 enzyme level was observed.

Next, in an analysis of post-mortem brain samples from patients with Alzheimer’s disease, the researchers found that GGA1 and GGA3 levels were reduced while BACE1 levels were elevated in the brains of Alzheimer’s disease patients compared to the brains of people without Alzheimer’s disease, suggesting a possible inverse association.

In an additional experiment using a mouse strain genetically modified to express the reduced level of GGA3 that was observed in the brains of Alzheimer’s disease patients, the team found that one week following traumatic brain injury, BACE1 and amyloid-beta levels remained elevated even when GGA1 levels had returned to normal. The research suggests that reduced levels of GGA3 were solely responsible for the increase in BACE 1 levels and therefore the sustained amyloid-beta production observed in the sub-acute phase, or seven days, after injury.

"When the proteins are at normal levels, they work as a clean-up crew for the brain by regulating the removal of BACE1 enzymes and facilitating their transport to lysosomes within brain cells, an area of the cell that breaks down and removes excess cellular material. BACE1 enzyme levels may be stabilized when levels of the two proteins are low, likely caused by an interruption in the natural disposal process of the enzyme," said Tesco, assistant professor of neuroscience at Tufts School of Medicine and member of the neuroscience program faculty at the Sackler School of Graduate Biomedical Sciences at Tufts.

"We found that GGA1 and GGA3 act synergistically to regulate BACE1 post-injury. The identification of this interaction may provide a drug target to therapeutically regulate the BACE1 enzyme and reduce the deposition of amyloid-beta in Alzheimer’s patients," she continued. "Our next steps are to confirm these findings in post-mortem brain samples from patients with moderate-to-severe traumatic brain injuries."

Moderate-to-severe TBIs are caused most often by traumas, such as severe falls or motor vehicle accidents, that result in a loss of consciousness. Not all traumas to the head result in a TBI. According to the Centers for Disease Control and Prevention, each year 1.7 million people sustain a TBI. Concussions, the mildest form of a TBI, account for about 75% of all TBIs. Studies have linked repeated head trauma to brain disease and some previous studies have linked single events of brain trauma to brain disease, such as Alzheimer’s. Alzheimer’s disease currently affects as many as 5.1 million Americans and is the most common cause of dementia in adults age 65 and over.

Source: Science Daily

Jul 25, 2012 · 11 notes
#science #neuroscience #brain #psychology #alzheimer #alzheimer's disease #TBI #trauma #protein
Chronic pain distorts sufferers’ sense of space and time

July 24, 2012

Einstein’s famous theory of relativity proposed that matter can distort space and time. Now a new study recently published in the journal Neurology suggests that chronic pain can have the same effect.

Neuroscientists from the University of South Australia, Neuroscience Research Australia and the University of Milano Bicocca in Italy studied people with chronic back pain, the most common chronic pain condition, which costs western countries billions of dollars in lost productivity every year.

They presented identical vibration stimuli to the painful area and a non-painful area and noted that the stimuli were processed more slowly by the brain if they came from the painful area.

The most striking finding, however, was that the same effect occurred if the stimuli were delivered to a healthy body part being held near the painful area.

Lead author of the study, Professor Lorimer Moseley from the University of South Australia, says it was not altogether surprising that, in people with chronic pain, there are changes in the way the brain processes information from and about the painful body part.

“But what is remarkable is that the problem affects the space around the body as well as the body itself,” Prof Moseley says.

Experiments showed that if a hand was held near the painful area of the back, the brain would almost ‘neglect’ that hand.

“The potential similarity between our findings and the time-space distortion predicted by the relativity theory is definitely intriguing,” Prof Moseley says.

“Obviously, here it is not external space that is distorted but the ability of the brain to represent that space within its neural circuitry.

“This finding opens up a whole new area of research into the way the brain allows us to interact with the world and how this can be disrupted in chronic pain.”

Provided by University of South Australia

Source: medicalxpress.com

Jul 25, 2012 · 44 notes
#brain #chronic pain #neuroscience #pain #psychology #science #sense of time
Better Understanding of Memory Retrieval Between Children and Adults

ScienceDaily (July 24, 2012) — Neuroscientists from Wayne State University and the Massachusetts Institute of Technology (MIT) are taking a deeper look into how the brain mechanisms for memory retrieval differ between adults and children. While the memory systems are the same in many ways, the researchers have learned that crucial functions with relevance to learning and education differ.

The team’s findings were published on July 17, 2012, in the Journal of Neuroscience.

According to lead author Noa Ofen, Ph.D., assistant professor in WSU’s Institute of Gerontology and Department of Pediatrics, cognitive ability, including the ability to learn and remember new information, changes dramatically between childhood and adulthood. These changes parallel equally dramatic changes in the structure and function of the brain over the same period.

In the study, “The Development of Brain Systems Associated with Successful Memory Retrieval of Scenes,” Ofen and her collaborative team tested the development of neural underpinnings of memory from childhood to young adulthood. The team of researchers exposed participants to pictures of scenes and then showed them the same scenes mixed with new ones and asked them to judge whether each picture was presented earlier. Participants made retrieval judgments while researchers collected images of their brains with magnetic resonance imaging (MRI).

Using this method, the researchers were able to see how the brain remembers. “Our results suggest that cortical regions related to attentional or strategic control show the greatest developmental changes for memory retrieval,” said Ofen.

The researchers said that older participants used the cortical regions more than younger participants when correctly retrieving past experiences.

"We were interested to see whether there are changes in the connectivity of regions in the brain that support memory retrieval," Ofen added. "We found changes in connectivity of memory-related regions. In particular, the developmental change in connectivity between regions was profound even without a developmental change in the recruitment of those regions, suggesting that functional brain connectivity is an important aspect of developmental changes in the brain."

This study marks the first time that the development of connectivity within memory systems in the brain has been tested, and the results suggest that the brain continues to rearrange connections to achieve adult-like performance during development.

Ofen and her research team plan to continue research in this area, focused on modeling brain network connectivity, and applying these methods to study abnormal brain development.

Source: Science Daily

Jul 25, 2012 · 16 notes
#science #neuroscience #brain #psychology #memory #memory retrieval #MRI
Mice have system to handle smell of fear

July 23, 2012

Mice appear to have a specialized system for detecting and at least initially processing instinctually important smells such as those that denote predators. The finding raises a question about whether their response to those smells is hardwired.

image

A separate subsystem for the smell of fear. Experiments in mice suggest neurons that detect odors associated with an instinctive response — like fleeing when an approaching predator is detected — are configured differently than other olfactory neurons. Further research could determine whether this system automatically triggers flight or other primal behaviors. Credit: Mike Cohea/Brown University

PROVIDENCE, R.I. [Brown University] — A new study finds that mice have a distinct neural subsystem that links the nose to the brain and is associated with instinctually important smells such as those emitted by predators. That insight, published online this week in Proceedings of the National Academy of Sciences, prompts the question of whether mice and other mammals have specially hardwired neural circuitry to trigger instinctive behavior in response to certain smells.

In the series of experiments and observations described in the paper, the authors found that nerve cells in the nose that express members of the gene family of trace amine-associated receptors (TAAR) have several key biological differences from the much more common and diverse neurons that express members of the olfactory receptor gene family. Those other nerve cells detect a much broader range of smells, said corresponding author Gilad Barnea, the Robert and Nancy Carney Assistant Professor of Neuroscience at Brown University.

The differences between TAAR neurons and olfactory receptor neurons led Barnea and his co-authors to conclude that they form an independent subsystem for certain smells.

“Our observations suggest that the TAAR-expressing sensory neurons constitute a distinct olfactory subsystem that extracts specific environmental cues that then elicit innate responses,” Barnea said.


Jul 25, 2012 · 37 notes
#science #neuroscience #brain #psychology #smell #fear #neuron #odor #olfactory system #protein #TAAR #genetics
Strobe Eyewear Training Improves Visual Memory

ScienceDaily (July 23, 2012) — Stroboscopic training, in which a physical activity is performed while wearing eyewear that simulates a strobe-like visual experience, has been found to increase visual short-term memory retention, an effect that lasted 24 hours.

image

(Credit: Image courtesy of Duke University)

Participants completed a memory test that required them to note the identity of eight letters of the alphabet that were briefly displayed on a computer screen. After a variable delay, participants were asked to recall one of the eight letters. On easy-level trials, the recall prompt came immediately after the letters disappeared, but on more difficult trials, the prompt came as late as 2.5 seconds following the display. Because participants did not know which letter they would be asked to recall, they had to retain all of the items in memory.

"Humans have a memory buffer in their brain that keeps information alive for a certain short-lived period," said Greg Appelbaum, assistant professor of psychiatry at Duke University and first author of the study. "Wearing the strobe eyewear during the physical training seemed to boost the ability to retain information in this buffer."

The strobe eyewear disrupts vision by only allowing the user to see glimpses of the world. The user must adjust their visual processing in order to perform normally, and this adjustment produces a lingering benefit; once participants removed the strobe eyewear, there was an observed boost in their visual memory retention, which was found to last 24 hours.

Earlier work by Appelbaum and the project’s senior researcher, Stephen Mitroff, had shown that stroboscopic training improves visual perception, including the ability to detect subtle motion cues and the processing of briefly presented visual information. Yet the earlier study had not determined how long the benefits might last.

"Our earlier work on stroboscopic training showed that it can improve perceptual abilities, but we don’t know exactly how," says Mitroff, associate professor of psychology & neuroscience and member of the Duke Institute for Brain Sciences. "This project takes a big step by showing that these improved perceptual abilities are driven, at least in part, by improvements in visual memory."

"Improving human cognition is an important goal with so many benefits," said Appelbaum, also a member of the Duke Institute for Brain Sciences. "Interestingly, our findings demonstrate one way in which visual experience has the capacity to improve cognition."

Source: Science Daily

Jul 25, 2012 · 8 notes
#science #neuroscience #brain #psychology #memory #vision #visual memory #cognition
Snacking and BMI Linked to Double Effect of Brain Activity and Self-Control

ScienceDaily (July 23, 2012) — Snack consumption and BMI are linked to both brain activity and self-control, new research has found.

image

Snack consumption and BMI are linked to both brain activity and self-control, new research has found. (Credit: © farbkombinat / Fotolia)

The research, carried out by academics from the Universities of Exeter, Cardiff, Bristol, and Bangor, discovered that an individual’s brain ‘reward centre’ response to pictures of food predicted how much they subsequently ate. This had a greater effect on the amount they ate than either their conscious feelings of hunger or how much they wanted the food.

A strong brain response was also associated with increased weight (BMI), but only in individuals reporting low levels of self-control on a questionnaire. For those reporting high levels of self-control a stronger brain response to food was actually related to a lower BMI.

This study, which is now published in the journal NeuroImage, adds to mounting evidence that overeating and increased weight are linked, in part, to a region of the brain associated with motivation and reward, called the nucleus accumbens. Responses in this brain region have been shown to predict weight gain in healthy weight and obese individuals, but only now have academics discovered that this is independent of conscious feelings of hunger, and that self-control also plays a key role.

Following these results, academics at the University of Exeter and Cardiff have begun testing ‘brain training’ techniques designed to reduce the influence of food cues on individuals who report low levels of self-control. Similar tests are being used to assist those with gambling or alcohol addiction.

Dr Natalia Lawrence of Psychology at the University of Exeter, lead researcher in both the original research and the new studies, said: “Our research suggests why some individuals are more likely to overeat and put on weight than others when confronted with frequent images of snacks and treats. Food images, such as those used in advertising, cause direct increases in activity in brain ‘reward areas’ in some individuals but not in others. If those sensitive individuals also struggle with self-control, which may be partly innate, they are more likely to be overweight. We are now developing computer programs that we hope will counteract the effects of this high sensitivity to food cues by training the brain to respond less positively to these cues.”

Twenty-five young, healthy females with BMIs ranging from 17 to 30 took part in the study. Female participants were chosen because research shows females typically exhibit stronger responses to food-related cues. Because hormonal changes during the menstrual cycle affect this reaction, all participants were taking the monophasic combined oral contraceptive pill. Participants had not eaten for at least six hours, to ensure they were hungry at the time of the scan, and were given a bowl containing 150 g (four and a half packets) of potato chips to eat at the end of the study; they were informed afterwards that their potato chip intake had been measured.

Researchers used MRI scanning to detect the participants’ brain activity while they were shown images of household objects, and food that varied in desirability and calorific content. After scanning, participants rated the food images for desirability and rated their levels of hunger and food craving. Results showed that participants’ brain responses to food (relative to objects) in the nucleus accumbens predicted how many potato chips they ate after the scan. However, participants’ own ratings of hunger and how much they liked and wanted the foods, including potato chips, were unrelated to their potato chip intake.

This study was funded by the Wales Institute of Cognitive Neuroscience.

What this study shows:

  • Brain responses to food images vary considerably between individuals.
  • Brain responses to food images but not conscious feelings of hunger or desire to eat predict subsequent potato chip consumption.
  • Individuals’ reported levels of self-control influence whether this brain response is associated with a higher or lower BMI.

What this study does NOT show:

  • Brain responses to food cues cause overeating.
  • The associations reported here are true in everyone — only healthy young women were included.
  • Whether our brain response and levels of self-control are learned or innate.

Source: Science Daily

Jul 25, 2012 · 20 notes
#science #neuroscience #brain #psychology #BMI #food #weight #eating #MRI
Powerful class of antioxidants may be potent Parkinson’s treatment

July 23, 2012

A new and powerful class of antioxidants could one day be a potent treatment for Parkinson’s disease, researchers report.

image

Dr. Bobby Thomas

A class of antioxidants called synthetic triterpenoids blocked development of Parkinson’s in an animal model that develops the disease in a handful of days, said Dr. Bobby Thomas, neuroscientist at the Medical College of Georgia at Georgia Health Sciences University and corresponding author of the study in the journal Antioxidants & Redox Signaling.

Thomas and his colleagues were able to block the death of dopamine-producing brain cells that occurs in Parkinson’s by using the drugs to bolster Nrf2, a natural antioxidant and inflammation fighter.

Stressors from head trauma to insecticide exposure to simple aging increase oxidative stress and the body responds with inflammation, part of its natural repair process. “This creates an environment in your brain that is not conducive for normal function,” Thomas said. “You can see the signs of oxidative damage in the brain long before the neurons actually degenerate in Parkinson’s.”

Nrf2, the master regulator of oxidative stress and inflammation, is – inexplicably – significantly decreased early in Parkinson’s. In fact, Nrf2 activity declines normally with age.

“In Parkinson’s patients you can clearly see a significant overload of oxidative stress, which is why we chose this target,” Thomas said. “We used drugs to selectively activate Nrf2.”

They screened a number of antioxidants already under study for a wide range of diseases, from kidney failure to heart disease and diabetes, and found the triterpenoids most effective at activating Nrf2. Co-author Dr. Michael Sporn, Professor of Pharmacology, Toxicology and Medicine at Dartmouth Medical School, chemically modified the agents so they could permeate the protective blood-brain barrier.

Both in human neuroblastoma and mouse brain cells they were able to document an increase in Nrf2 in response to the synthetic triterpenoids. Human dopaminergic cells are not available for research so the scientists used the human neuroblastoma cells, which are actually cancer cells that have some properties similar to neurons.

Their preliminary evidence indicates the synthetic triterpenoids also increase Nrf2 activity in astrocytes, a type of brain cell that nourishes neurons and hauls off some of their waste. The drugs didn’t protect brain cells in animals in which the Nrf2 gene was deleted, further evidence that Nrf2 is the drugs’ target.

The researchers used the powerful neurotoxin MPTP to mimic Parkinson’s-like brain cell damage in a matter of days. They are now looking at the impact of synthetic triterpenoids in an animal model genetically programmed to acquire the disease more slowly, as humans do. Collaborators at Johns Hopkins School of Medicine also will be providing induced pluripotent stem cells, adult stem cells that can be coaxed into forming dopaminergic neurons, for additional drug testing.

Other collaborators include scientists at Weill Medical College of Cornell University, Johns Hopkins School of Public Health, Moscow State University, Tohoku University and the University of Pittsburgh.

Source: EarthSky

Jul 24, 2012 · 10 notes
#science #neuroscience #brain #psychology #antioxidants #parkinson #parkinson's disease #treatment #synthetic triterpenoids
New epilepsy gene identified; possible new treatment option

ScienceDaily (July 23, 2012) — New research by neuroscientists from the Royal College of Surgeons in Ireland (RCSI), published in Nature Medicine, has identified a new gene involved in epilepsy and could potentially lead to a new treatment option for patients with epilepsy.

The research focussed on a new class of gene called a ‘microRNA’, which controls protein production inside cells. Looking in detail at one particular microRNA, ‘microRNA-134’, the researchers found that its levels are much higher in the part of the brain that causes seizures in patients with epilepsy.

By using a new type of drug-like molecule called an antagomir, which locks onto microRNA-134 and removes it from the brain cell, the researchers found they could prevent epileptic seizures from occurring.

Professor David Henshall, Department of Physiology & Medical Physics, RCSI, and senior author on the paper, said: “We have been looking to find what goes wrong inside brain cells to trigger epilepsy. Our research has discovered a completely new gene linked to epilepsy, and it shows how we can target this gene using drug-like molecules to reduce the brain’s susceptibility to seizures and the frequency with which they occur.”

Dr Eva Jimenez-Mateos, Department of Physiology & Medical Physics, RCSI and first author on the paper said “Our research found that the antagomir drug protects the brain cells from toxic effects of prolonged seizures and the effects of the treatment can last up to one month.”

Epilepsy affects 37,000 people in Ireland alone. For two out of three people with epilepsy, seizures are controlled by medication, but one in three patients continues to have seizures despite being prescribed medication. This study could potentially offer new treatment options for such patients.

The research was supported by a grant from Science Foundation Ireland (SFI). Researchers in the Department of Physiology & Medical Physics and Molecular & Cellular Therapeutics, RCSI, clinicians at Beaumont Hospital and experts in brain structure from the Cajal Institute in Madrid were involved in the study.

Source: Science Daily

Jul 24, 2012 · 17 notes
#science #neuroscience #brain #psychology #epilepsy #genes #treatment #medicine #microRNA
Children with trisomy 13 and 18 and their families are happy

JULY 23, 2012

Children with trisomy 13 or 18, who are for the most part severely disabled and have a very short life expectancy, and their families lead a life that is happy and rewarding overall, contrary to the usually gloomy predictions made by the medical community at the time of diagnosis, according to a study of parents who are members of support groups, published today in Pediatrics. The study was conducted by Dr. Annie Janvier of the Sainte-Justine University Hospital Center and the University of Montreal, with the special collaboration of Barbara Farlow, Eng, MSc, the mother of a child who died of trisomy 13, as second author.

image

Source: Wikimedia Commons

The study interviewed 332 parents who live or have lived with 272 children with trisomy 13 or 18. Their experience diverges substantially from what healthcare providers had predicted: that their child would be “incompatible with life” (87%), would be “a vegetable” (50%), would lead “a life of suffering” (57%) or would “ruin their family or their life as a couple” (23%).

It should be noted that trisomies 13 and 18 are rare chromosome disorders that are most often diagnosed before birth, and sometimes after. Children with these diagnoses generally do not survive beyond their first year of life; those who do have severe disabilities and shortened lives. When trisomy 13 or 18 is diagnosed before birth, many parents decide to terminate the pregnancy, whereas others choose to carry it to term; in such cases, miscarriages are common.

As children with trisomies 13 or 18 generally receive palliative care at birth, some parents who opt to continue the pregnancy or desire life-prolonging interventions for their child encounter the prejudices of the medical system. In this regard, the parents interviewed in the study consider that caregivers often view their child in terms of a diagnosis (“a T13”, “a lethal trisomy”) rather than a unique baby.

“Our study points out that physicians and parents can have different views of what constitutes quality of life,” states Dr. Annie Janvier, a neonatologist and co-founder of the Master’s program in Pediatric Clinical Ethics at the University of Montreal. In fact, over 97% of the parents interviewed considered that their child was happy and that his or her presence enriched their family life and their life as a couple, regardless of longevity. “In the medical literature on all handicaps, disabled patients – or their families – rated their quality of life as being higher than caregivers did,” adds Dr. Janvier.

Parents who receive a new diagnosis of trisomy 13 or 18 and join a parental support group often acquire a more positive image of these diagnoses than the predictions made by the medical profession would suggest. In fact, according to the parents interviewed, belonging to a support group helped them view their experience positively. “Our research reveals that some parents who chose a path to accept and to love a disabled child with a short life expectancy have experienced happiness and enrichment. My hope is that this knowledge improves the ability of physicians to understand, communicate and make decisions with these parents,” concludes Barbara Farlow.

Given the rarity of trisomy 13 or 18 cases (one case out of approximately every 10,500 births), the parents were recruited through online support groups that parents often join after receiving the physicians’ diagnosis. Dr. Annie Janvier and Barbara Farlow sometimes give joint talks on the subject of trisomies 13 and 18.

Source: Université de Montréal

Jul 24, 2012 · 9 notes
#science #neuroscience #brain #psychology #trisomy #diagnosis #disorder #chromosome #biology #quality of life
Multiple sclerosis drug disappoints on disability

July 23, 2012 By David Orenstein

(Medical Xpress) — This week the Journal of the American Medical Association published a study with unfortunate news for the millions of people who suffer from multiple sclerosis. In the large study, a therapy known as interferon beta failed to stave off the progression of the incurable disease. Albert Lo, associate professor of neurology and epidemiology, comments on what the study means for patients, why it was well designed, and how a new effort to support research on the disease in Rhode Island could help.

The results of this study with nearly 2,700 participants showed that treatment with interferon beta, which is a major class of disease-modifying therapy for multiple sclerosis, did not prevent progression of disability, which is very disappointing from a therapeutic perspective. Currently, there is no cure for MS, and as a lifelong disorder of the nervous system, MS is characterized by episodic relapses of neurological injury such as weakness or blindness. While in most cases, there is a varying degree of recovery after relapses, over time, disability accumulates. The accumulation of deficits and the loss of physical and mental function is a major concern for people with MS and their clinicians.

Currently, there is no medication on the market that is directed explicitly for neuroprotection and the prevention of disability. Many had hoped that the interferons, along with the other disease-modifying agents (which were developed to reduce relapse rates) would also have a significant effect on protecting patients from MS disability.

Although the results from this study were not what we would have hoped, the study itself reflects a marked improvement over prior studies, which suffered from known methodological flaws. The new results from the Tremlett group point to the importance of the research methodology used (prospectively collected longitudinal study data) and a well-controlled design in generating reliable results – approaches that we are using in our own research at Brown University.

A number of the early studies examining the effect of interferons on disability primarily used convenience samples recruited for post-marketing studies. These studies indicated that interferons were in fact preventing disability. However, convenience sampling inherently introduces a number of biases and problems. Dr. Tremlett’s results were generated from a more systematic longitudinal study in which such biases and shortcomings can be better addressed, so conclusions and clinical decisions drawn from the results are more reliable. These data will both help in making clinical decisions on treating MS patients during the later course of their disease, when there are virtually no relapses, and point more urgently toward the clinical need for an agent that prevents disability.

Provided by Brown University

Source: medicalxpress.com

Jul 24, 2012 · 2 notes
#MS #disease #drug #health #medication #neuroscience #psychology #science #research
Neural precursor cells induce cell death in certain brain tumors

July 23, 2012

Neural precursor cells (NPC) in the young brain suppress certain brain tumors such as high-grade gliomas, especially glioblastoma (GBM), which are among the most common and most aggressive tumors. Now researchers of the Max Delbrück Center for Molecular Medicine (MDC) Berlin-Buch and Charité – Universitätsmedizin Berlin have deciphered the underlying mechanism of action with which neural precursor cells protect the young brain against these tumors. They found that the NPC release substances that activate TRPV1 ion channels in the tumor cells and subsequently induce the tumor cells to undergo stress-induced cell death (Nature Medicine, http://dx.doi.org/10.1038/nm.2827).

Despite surgery, radiation or chemotherapy or even a combination of all three treatment options, there is currently no cure for glioblastoma. In an earlier study the research group led by Professor Helmut Kettenmann (MDC) showed that neural precursor cells migrate to the glioblastoma cells and attack them. The neural precursor cells release a protein belonging to the family of BMP proteins (bone morphogenetic protein) that directly attacks the tumor stem cells. The current consensus of researchers is that tumor stem cells are the actual cause for continuous tumor self-renewal.

Kristin Stock, Jitender Kumar, Professor Kettenmann (all MDC), Dr. Michael Synowitz (MDC and Charité), Professor Rainer Glass (Munich University Hospitals, formerly MDC) and Professor Vincenzo Di Marzo (Istituto di Chimica Biomolecolare Pozzuoli, Naples, Italy) now report a new mechanism of action of NPC in astrocytomas. Like glioblastomas, astrocytomas are brain tumors that belong to the family of gliomas. Gliomas are most common in older people and are almost invariably fatal.

As the MDC researchers showed, the NPC also migrate to the astrocytomas. There they do not secrete proteins, but rather release fatty-acid substances (endovanilloids) which are harmful to the cancer cells. However, in order to exert their lethal effect, the endovanilloids need the aid of a specific ion channel, the TRPV1 channel (transient receptor potential vanilloid type 1), also called the vanilloid receptor 1. TRPV1 is already known to researchers as a transducer of painful stimuli. It has, among other things, a binding site for capsaicin, the irritant of hot chili peppers, and is responsible for the hot sensation after eating them. Clinical trials are currently underway to develop new pain treatments by blocking or desensitizing this ion channel.

MDC researchers describe an additional role of the TRPV1 ion channel

In contrast to its use in pain management, this ion channel, which is located on the surface of glioblastoma cells and is much more abundant there than on normal glial cells, must be activated to trigger cell death in gliomas. The activated ion channel mediates stress-induced cell-death in tumor cells. If however TRPV1 is downregulated or blocked, the glioma cells are not destroyed. The MDC researchers are thus the first to identify neural precursor cells as the source of fatty acids that induce tumor cell death and to describe the role of the TRPV1 ion channel in the fight against gliomas.

However, the activity of neural precursor cells in the brain – and thus of the body’s own protective mechanism against gliomas – diminishes with increasing age. This could explain why these tumors usually develop in older adults and not in children and young people. How can the natural protection of neural precursor cells be harnessed for older brains? According to the researchers, neural precursor cell therapy is not a solution: the benefit it evidently brings in young people can have the opposite effect in older adults and may trigger brain tumors.

One possible treatment would be to use drugs to activate the TRPV1 channels. In mice, the group showed that a synthetic substance (arvanil), which is similar to capsaicin, reduced tumor growth. However, this substance has not yet been approved as a drug because the adverse side effects for humans are too severe. It is only used in basic research on mice, which tolerate the substance well. “In principle, however,” the researchers suggest, “synthetic vanilloid compounds may have clinical potential for brain tumor treatment.”

Source: Science Codex

Jul 24, 2012 · 10 notes
#science #neuroscience #brain #psychology #neural precursor cell #cell death #tumours
New Compounds Inhibit Prion Infection

ScienceDaily (July 23, 2012) — A team of University of Alberta researchers has identified a new class of compounds that inhibit the spread of prions, misfolded proteins in the brain that trigger lethal neurodegenerative diseases in humans and animals.

U of A chemistry researcher Frederick West and his team have developed compounds that clear prions from infected cells derived from the brain.

"When these designer molecules were put into infected cells in our lab experiments, the numbers of misfolded proteins diminished — and in some cases we couldn’t detect any remaining misfolded prions," said West.

West and his collaborators at the U of A’s Centre for Prions and Protein Folding Diseases say this research is not yet a cure, but does open a doorway for developing treatments.

"We’re not ready to inject these compounds in prion-infected cattle," said David Westaway, director of the prion centre. "These initial compounds weren’t created for that end-run scenario but they have passed initial tests in a most promising manner."

West notes that the most promising experimental compounds at this stage are simply too big to be used therapeutically in humans or animals.

Human exposure to prion-triggered brain disorders is limited to rare conditions such as Creutzfeldt-Jakob disease. The researchers say the human form of mad cow disease shows up in one in a million people in industrialized nations, but investigating the disease is nonetheless well worth the time and expense.

"There is a strong likelihood that prion diseases operate in a similar way to neurodegenerative diseases such as Alzheimer’s, which are distressingly common around the world," said West.

Source: Science Daily

Jul 24, 2012 · 14 notes
#biology #brain #neurodegenerative diseases #neuroscience #prions #protein #psychology #science #infection
Where you look predicts what you're going to say

23 July 2012 by Will Heaven

Watch where you look – it can be used to predict what you’ll say. A new study shows that it is possible to guess what sentences people will use to describe a scene by tracking their eye movements.

Moreno Coco and Frank Keller at the University of Edinburgh, UK, presented 24 volunteers with a series of photo-realistic images depicting indoor scenes such as a hotel reception. They then tracked the sequence of objects that each volunteer looked at after being asked to describe what they saw.

Other than being prompted with a keyword, such as “man” or “suitcase”, participants were free to describe the scene however they liked. Some typical sentences included “the man is standing in the reception of a hotel” or “the suitcase is on the floor”.

The order in which a participant’s gaze settled on objects in each scene tended to mirror the order of nouns in the sentence used to describe it. “We were surprised there was such a close correlation,” says Keller. Given that multiple cognitive processes are involved in sentence formation, Coco says “it is remarkable to find evidence of similarity between speech and visual attention”.

Word prediction

The team used the discovery to see whether they could predict which sentence would be used to describe a scene from eye movements alone. They developed an algorithm that used the eye-gaze sequences recorded in the earlier experiment to pick the correct sentence from a choice of 576 descriptions.
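The article does not give implementation details, but the flavor of such a prediction step can be sketched as follows: score each candidate description by how well its noun order mirrors the recorded gaze order, then pick the best-scoring candidate. This is a minimal illustrative sketch; the function names and toy data are hypothetical, not taken from the study, and the real algorithm was certainly more sophisticated.

```python
# Hypothetical sketch: rank candidate sentence descriptions by how well
# their noun order matches the order in which the viewer's gaze settled
# on objects in the scene.

def gaze_sentence_score(gaze_sequence, sentence_nouns):
    """Length of the longest common subsequence between the gaze order
    and the sentence's noun order, normalized by sentence length."""
    m, n = len(gaze_sequence), len(sentence_nouns)
    lcs = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if gaze_sequence[i - 1] == sentence_nouns[j - 1]:
                lcs[i][j] = lcs[i - 1][j - 1] + 1
            else:
                lcs[i][j] = max(lcs[i - 1][j], lcs[i][j - 1])
    return lcs[m][n] / max(n, 1)

def predict_sentence(gaze_sequence, candidates):
    """Pick the candidate whose noun order best mirrors the gaze order."""
    return max(candidates, key=lambda nouns: gaze_sentence_score(gaze_sequence, nouns))

# Toy example: gaze settled on the man, then the reception, then a suitcase.
gaze = ["man", "reception", "suitcase"]
candidates = [
    ["suitcase", "floor"],
    ["man", "reception", "hotel"],
    ["plant", "desk", "man"],
]
print(predict_sentence(gaze, candidates))  # ['man', 'reception', 'hotel']
```

In practice the published model would also have weighted fixation durations and handled objects never mentioned in speech, but the core intuition, that gaze order predicts noun order, is what the correlation finding supports.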

Changsong Liu of Michigan State University’s Language and Interaction Research lab, in East Lansing, who was not involved in the study, suggests these results could motivate novel designs for human-machine interfaces that take advantage of visual cues to improve speech recognition software.

Gaze information is already used to help with disambiguation. For example, if a speech recognition system can tell that you are looking at a tree, it is less likely to guess that you just said “three”. Sentence prediction, perhaps in combination with augmented reality headsets that track eye movement, for example, is one possible application.

Coco and Keller are now looking into the role of coordinated visual and linguistic processes in conversations between two people. “People engaged in a dialogue use similar syntactic forms, expressions and eye-movements,” says Coco. One hypothesis is that such “coordinative mimicry” might be important for joint decision-making.

Source: NewScientist

Jul 24, 2012 · 33 notes
#science #neuroscience #brain #psychology #eye movements #language production #speech #scene understanding
Why does vivid memory 'feel so real?'

Scientists find evidence that real perceptual experience and mental replay share similar brain activation patterns

Toronto, Canada – Neuroscientists have found strong evidence that vivid memory and directly experiencing the real moment can trigger similar brain activation patterns.

The study, led by Baycrest’s Rotman Research Institute (RRI), in collaboration with the University of Texas at Dallas, is one of the most ambitious and complex yet for elucidating the brain’s ability to evoke a memory by reactivating the parts of the brain that were engaged during the original perceptual experience. Researchers found that vivid memory and real perceptual experience share “striking” similarities at the neural level, although they are not “pixel-perfect” brain pattern replications.

The study appears online this month in the Journal of Cognitive Neuroscience, ahead of print publication.

"When we mentally replay an episode we’ve experienced, it can feel like we are transported back in time and re-living that moment again," said Dr. Brad Buchsbaum, lead investigator and scientist with Baycrest’s RRI. "Our study has confirmed that complex, multi-featured memory involves a partial reinstatement of the whole pattern of brain activity that is evoked during initial perception of the experience. This helps to explain why vivid memory can feel so real."

But vivid memory rarely fools us into believing we are in the real, external world – and that in itself offers a very powerful clue that the two cognitive operations don’t work exactly the same way in the brain, he explained.

In the study, Dr. Buchsbaum’s team used functional magnetic resonance imaging (fMRI), a powerful brain scanning technology that constructs computerized images of brain areas that are active when a person is performing a specific cognitive task. A group of 20 healthy adults (aged 18 to 36) were scanned while they watched 12 video clips, each nine seconds long, sourced from YouTube.com and Vimeo.com. The clips contained a diversity of content – such as music, faces, human emotion, animals, and outdoor scenery. Participants were instructed to pay close attention to each of the videos (which were repeated 27 times) and informed they would be tested on the content of the videos after the scan.

A subset of nine participants from the original group were then selected to complete intensive and structured memory training over several weeks that required practicing over and over again the mental replaying of videos they had watched from the first session. After the training, this group was scanned again as they mentally replayed each video clip. To trigger their memory for a particular clip, they were trained to associate a particular symbolic cue with each one. Following each mental replay, participants would push a button indicating on a scale of 1 to 4 (1 = poor memory, 4 = excellent memory) how well they thought they had recalled a particular clip.

Dr. Buchsbaum’s team found “clear evidence” that patterns of distributed brain activation during vivid memory mimicked the patterns evoked during sensory perception when the videos were viewed – by a correspondence of 91% after a principal components analysis of all the fMRI imaging data.
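The general shape of this kind of analysis can be sketched in a few lines: reduce the voxel-wise activation patterns with PCA, then correlate each clip's perception pattern with its recall pattern. The sketch below uses synthetic data and assumed parameter choices (500 voxels, 10 components, a noisy "partial reinstatement" model of recall); it does not reproduce the study's actual pipeline or its 91% figure.

```python
# Illustrative sketch of a pattern-similarity analysis on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
n_clips, n_voxels = 12, 500

# Simulated voxel activation pattern per clip during viewing.
perception = rng.standard_normal((n_clips, n_voxels))
# Simulate recall as a noisy partial reinstatement of the perception pattern.
recall = 0.8 * perception + 0.6 * rng.standard_normal((n_clips, n_voxels))

# PCA via SVD on the stacked, mean-centered data from both conditions.
stacked = np.vstack([perception, recall])
mu = stacked.mean(axis=0)
_, _, vt = np.linalg.svd(stacked - mu, full_matrices=False)
components = vt[:10]  # keep the top 10 principal components

# Project both conditions into the shared low-dimensional space.
p_scores = (perception - mu) @ components.T
r_scores = (recall - mu) @ components.T

# Per-clip Pearson correlation between perception and recall patterns.
sims = [float(np.corrcoef(p, r)[0, 1]) for p, r in zip(p_scores, r_scores)]
print(f"mean pattern similarity: {np.mean(sims):.2f}")
```

With the reinstatement model above, the per-clip correlations come out well above zero, which is the qualitative signature the study reports: recall patterns resemble, but do not perfectly replicate, perception patterns.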

The so-called “hot spots” – the areas of greatest pattern similarity – occurred in sensory and motor association areas of the cerebral cortex, a region that plays a key role in memory, attention, perceptual awareness, thought, language and consciousness.

Dr. Buchsbaum suggested the imaging analysis used in his study could potentially add to the current battery of memory assessment tools available to clinicians. Brain activation patterns from fMRI data could offer an objective way of quantifying whether a patient’s self-report of their memory as “being good or vivid” is accurate or not.

Source: EurekAlert!

Jul 24, 2012 · 41 notes
#science #neuroscience #brain #brain activation #psychology #memory #perceptual experience #perception
Study offers new clue on how brain processes visual information, provides insight into neural mechanisms of attention

July 23, 2012

Ever wonder how the human brain, which is constantly bombarded with millions of pieces of visual information, can filter out what’s unimportant and focus on what’s most useful?

image

The process is known as selective attention and scientists have long debated how it works. But now, researchers at Wake Forest Baptist Medical Center have discovered an important clue. Evidence from an animal study, published in the July 22 online edition of the journal Nature Neuroscience, shows that the prefrontal cortex is involved in a previously unknown way.

Two types of attention are utilized in the selective attention process – bottom-up and top-down. Bottom-up attention is automatically guided to images that stand out from a background by virtue of color, shape or motion, such as a billboard on a highway. Top-down attention occurs when one’s focus is consciously shifted to look for a known target in a visual scene, as when searching for a relative in a crowd.

Traditionally, scientists have believed that separate areas of the brain controlled these two processes, with bottom-up attention occurring in the posterior parietal cortex and top-down attention occurring in the prefrontal cortex.

"Our findings provide insights on the neural mechanisms behind the guidance of attention," said Christos Constantinidis, Ph.D., associate professor of neurobiology and anatomy at Wake Forest Baptist and senior author of the study. "This has implications for conditions such as attention deficit hyperactivity disorder (ADHD), which affects millions of people worldwide. People with ADHD have difficulty filtering information and focusing attention. Our findings suggest that both the ability to focus attention intentionally and shifting attention to eye-catching but sometimes unimportant stimuli depend on the prefrontal cortex."

In the Wake Forest Baptist study, two monkeys were trained to detect images on a computer screen while activity in both areas of the brain was recorded. The visual display was designed to let one image “pop out” due to its color difference from the background, such as a red circle surrounded by green. To trigger bottom-up attention, neither the identity nor the location of the pop-out image could be predicted before it appeared. The monkeys indicated that they detected the pop-out image by pushing a lever.

The neural activity associated with identifying the pop-out images occurred in the prefrontal cortex at the same time as in the posterior parietal cortex. This unexpected finding indicates early involvement of the prefrontal cortex in bottom-up attention, in addition to its known role in top-down attention, and provides new insights into the neural mechanisms of attention.

"We hope that our findings will guide future work targeting attention deficits," Constantinidis said.

Provided by Wake Forest University Baptist Medical Center

Source: medicalxpress.com

Jul 24, 2012 · 46 notes
#science #neuroscience #brain #psychology #vision #attention #selective attention #ADHD #disorder
Are these the brain cells that give us consciousness?

23 July 2012 by Caroline Williams

The brainiest creatures share a secret – an odd kind of brain cell involved in emotions and empathy that may have accidentally made us conscious

image

The consciousness connection (Image: Jonathon Burton)

The origin of consciousness has to be one of the biggest mysteries of all time, occupying philosophers and scientists for generations. So it is strange to think that a little-known neuroscientist called Constantin von Economo might have unearthed an important clue nearly 90 years ago.

When he peered down the lens of his microscope in 1926, von Economo saw a handful of brain cells that were long, spindly and much larger than those around them. In fact, they looked so out of place that at first he thought they were a sign of some kind of disease. But the more brains he looked at, the more of these peculiar cells he found - and always in the same two small areas that evolved to process smells and flavours.

Von Economo briefly pondered what these “rod and corkscrew cells”, as he called them, might be doing, but without the technology to delve much deeper he soon moved on to more promising lines of enquiry.

Little more was said about these neurons until nearly 80 years later, when Esther Nimchinsky and Patrick Hof at Mount Sinai University in New York also stumbled across clusters of these strange-looking neurons. Now, after more than a decade of functional imaging and post-mortem studies, we are beginning to piece together their story. Several lines of evidence hint that they may help build the rich inner life we call consciousness, including emotions, our sense of self, empathy and our ability to navigate social relationships.

Many other big-brained, social animals also seem to share these cells, in the same spots as the human brain. A greater understanding of the way these paths converged could therefore tell us much about the evolution of the mind.

Admittedly, to the untrained eye these giant brain cells, now known as von Economo neurons (VENs), don’t look particularly exciting. But to a neuroscientist they stand out like a sore thumb. For one thing, VENs are at least 50 per cent, and sometimes up to 200 per cent, larger than typical human neurons. And while most neurons have a pyramid-shaped body with a finely branched tree of connections called dendrites at each end of the cell, VENs have a longer, spindly cell body with a single, sparsely branched projection at each end.

image

Perhaps they escaped attention for so long because they are so rare, making up just 1 per cent of the neurons in the two small areas of the human brain: the anterior cingulate cortex (ACC) and the fronto-insular (FI) cortex.

Their location in those regions suggests that VENs may be a central part of our mental machinery, since the ACC and FI are heavily involved in many of the more advanced aspects of our inner lives. Both areas kick into action when we see socially relevant cues, be it a frowning face, a grimace of pain or simply the voice of someone we love. When a mother hears a baby crying, both regions respond strongly. They also light up when we experience emotions such as love, lust, anger and grief. For John Allman, a neuroanatomist at the California Institute of Technology in Pasadena, this adds up to a kind of “social monitoring network” that keeps track of social cues and allows us to alter our behaviour accordingly (Annals of the New York Academy of Sciences, vol 1225, p 59).

The two brain areas also seem to play a key role in the “salience” network, which keeps a subconscious tally of what is going on around us and directs our attention to the most pressing events, as well as monitoring sensations from the body to detect any changes (Brain Structure and Function, DOI: 10.1007/s00429-012-0382-9).

What’s more, both regions are active when a person recognises their reflection in the mirror, suggesting that these parts of the brain underlie our sense of self - a key component of consciousness. “It is the sense of self at every possible level - so the sense of identity, this is me, and the sense of identity of others and how you understand others. That goes to the concept of empathy and theory of mind,” says Hof.

To Bud Craig, a neuroanatomist at Barrow Neurological Institute in Phoenix, Arizona, it all amounts to a continually updated sense of “how I feel now”: the ACC and FI take inputs from the body and tie them together with social cues, thoughts and emotions to quickly and efficiently alter our behaviour (Nature Reviews Neuroscience, vol 10, p 59).

This constantly shifting picture of how we feel may contribute to the way we perceive the passage of time. When something emotionally important is happening, Craig proposes, there is more to process, and because of this time seems to speed up. Conversely, when less is going on we update our view of the world less frequently, so time seems to pass more slowly.

VENs are probably important in all this, though we can only infer their role through circumstantial evidence. That’s because locating these cells and then measuring their activity in a living brain hasn’t yet been possible. But their unusual appearance is a signal that they probably aren’t just sitting there doing nothing. “They stand out anatomically,” says Allman. “And a general proposition is that anything that’s so distinctive looking must have a distinct function.”

Fast thinking

In the brain, big usually means fast, so Allman suggests that VENs could be acting as a fast relay system - a kind of social superhighway - which allows the gist of the situation to move quickly through the brain, enabling us to react intuitively on the hop, a crucial survival skill in a social species like ours. “That’s what all of civilisation is based on: our ability to communicate socially, efficiently,” adds Craig.

A particularly distressing form of dementia that can strike people as early as their 30s supports this idea. People who develop fronto-temporal dementia lose large numbers of VENs in the ACC and FI early in the disease, when the main symptom is a complete loss of social awareness, empathy and self-control. “They don’t have normal empathic responses to situations that would normally make you disgusted or sad,” says Hof. “You can show them horrible pictures of an accident and they just don’t blink. They will say ‘oh, yes, it’s an accident’.”

Post-mortem examinations of the brains of people with autism also bolster the idea that VENs lie at the heart of our emotions and empathy. According to one recent study, people with autism may fall into two groups: some have too few VENs, perhaps meaning that they don’t have the necessary wiring to process social cues, while others have far too many (Acta Neuropathologica, vol 118, p 673). The latter group would seem to fit with one recent theory of autism, which proposes that the symptoms may arise from an over-wiring of the brain. Perhaps having too many VENs makes emotional systems fire too intensely, causing people with autism to feel overwhelmed, as many say they do.

Another recent study found that people with schizophrenia who committed suicide had significantly more VENs in their ACC than schizophrenics who died of other causes. The researchers suggest that the over-abundance of VENs might create an overactive emotional system that leaves them prone to negative self-assessment and feelings of guilt and hopelessness (PLoS One, vol 6, p e20936).

VENs in other animals provide some clues, too. When these neurons were first identified, there was the glimmer of hope that we might have found one of the key evolutionary changes, unique to humankind, that could explain our social intelligence. But the earliest studies put paid to that kind of thinking, when VENs turned up in chimpanzees and gorillas. In recent years, they have also been found in elephants and some whales and dolphins.

Like us, many of these species live in big social groups and show signs of the same kind of advanced behaviour associated with VENs in people. Elephants, for instance, display something that looks a lot like empathy: they work together to help injured, lost or trapped elephants, for example. They even seem to show signs of grief at elephant “graveyards” (Biology Letters, vol 2, p 26). What’s more, many of these species can recognise themselves in the mirror, which is usually taken as a rudimentary measure of consciousness. When researchers daub paint on an elephant’s face, for instance, it will notice the mark in the mirror and try to feel the spot with its trunk. This has led Allman and others to speculate that von Economo neurons might be a vital adaptation in large brains for keeping track of social situations - and that the sense of self may be a consequence of this ability.

Yet VENs also crop up in manatees, hippos and giraffes - not renowned for their busy social lives. The cells have also been spotted in macaques, which don’t reliably pass the mirror test, although they are social animals. Although this seems to put a major spanner in the works for those who claim that the cells are crucial for advanced cognition, it could also be that these creatures are showing the precursors of the finely tuned cells found in highly social species. “I think that there are homologues of VENs in all mammals,” says Allman. “That’s not to say they’re shaped the same way but they are located in an analogous bit of cortex and they are expressing the same genes.”

It would make sense, after all, that whales and primates might both have recycled, and refined, older machinery present in a common ancestor rather than independently evolving the same mechanism. Much more research is needed, however, to work out the anatomical differences and the functions of these cells in the different animals.

That work might even help us understand how these neurons evolved in the first place. Allman already has some ideas about where they came from. Our VENs reside in a region of the brain that evolved to integrate taste and smell, so he suggests that many of the traits now associated with the FI evolved from the simple act of deciding whether food is good to eat or likely to make you ill. When reaching that decision, he says, the quicker the “gut” reaction kicks in the better. And if you can detect this process in others, so much the better.

“One of the important functions that seems to reside in the FI has to do with empathy,” he says. “My take on this is that empathy arose in the context of shared food - it’s very important to observe if members of your social group are becoming ill as a result of eating something.” The basic feeding circuitry, including the rudimentary VENs, may then have been co-opted by some species to work in other situations that involve a decision, like working out whether a person is trustworthy or to be avoided. “So when we have a feeling, whether it be about a foodstuff or situation or another person, I think that engages the circuitry in the fronto-insular cortex and the VENs are one of the outputs of that circuitry,” says Allman.

Allman’s genetics work suggests he may be on to something. His team found that VENs in one part of the FI are expressing the genes for hormones that regulate appetite. There are also a lot of studies showing links between smell and taste and the feelings of strong emotions. Our physical reaction to something we find morally disgusting, for example, is more or less identical to our reaction to a bitter taste, suggesting they may share common brain wiring (Science, vol 323, p 1222). Other work has shown that judging a morally questionable act, such as theft, while smelling something disgusting leads to harsher moral judgements (Personality and Social Psychology Bulletin, vol 34, p 1096). What’s more, Allman points out that our language is loaded with analogies - we might find an experience “delicious”, say, or a person “nauseating”. This is no accident, he says.

Red herring

However, it is only in highly social animals that VENs live exclusively in the scent and taste regions. In the others, like giraffes and hippos, VENs seem to be sprinkled all over the brain. Allman, however, points out that these findings may be a red herring, since without understanding the genes they express, or their function, we can’t even be sure how closely these cells relate to human VENs. They may even be a different kind of cell that just looks similar.

Based on the evidence so far, however, Hof thinks that the ancestral VENs would have been more widespread, as seen in the hippo brain, and that over the course of evolution they then migrated to the ACC and FI in some animals, but not others - though he admits to having no idea why that might be. He suspects the pressures that shaped the primate brain may have been very different to those that drove the evolution of whales and dolphins.

Craig has hit upon one possibility that would seem to fit all of these big-brained animals. He points out that the bigger the brain, the more energy it takes to run, so it is crucial that it operates as efficiently as possible. A system that continually monitors the environment and the people or animals in it would therefore be an asset, allowing you to adapt quickly to a situation to save as much energy as possible. “Evolution produced an energy calculation system that incorporated not just the sensory inputs from the body but the sensory inputs from the brain,” Craig says. And the fact that we are constantly updating this picture of “how I feel now” has an interesting and very useful by-product: we have a concept that there is an “I” to do the feeling. “Evolution produced a very efficient moment-by-moment calculation of energy utilisation and that had an epiphenomenon, a by-product that provided a subjective representation of my feelings.”

If he’s right - and there is a long way to go before we can be sure - it raises a very humbling possibility: that far from being the pinnacle of brain evolution, consciousness might have been a big, and very successful, accident.

Source: NewScientist

Jul 24, 2012 · 114 notes
#science #neuroscience #brain #psychology #consciousness #brain cells