July 25, 2012
(Medical Xpress) — New understanding of how the brain processes information from the inner ear offers hope for sufferers of vertigo.
If you have ever looked over the edge of a cliff and felt dizzy, you understand the challenges faced by people who suffer from symptoms of vestibular dysfunction such as vertigo and dizziness. More than 70 million people in North America suffer from such symptoms. For people with vestibular loss, performing basic daily living activities that we take for granted (e.g. dressing, eating, getting in and out of bed, getting around inside as well as outside the home) becomes difficult, since even small head movements are accompanied by dizziness and the risk of falling.
We’ve known for a while that a sensory system in the inner ear (the vestibular system) is responsible for helping us keep our balance by giving us a stable visual field as we move around. And while researchers have already developed a basic understanding of how the brain constructs our perceptions of ourselves in motion, until now no one has understood the crucial step by which the neurons in the brain select the information needed to keep us in balance.
The way that the brain takes in and decodes information sent by neurons in the inner ear is complex. The peripheral vestibular sensory neurons in the inner ear take in the time-varying acceleration and velocity stimuli caused by our movement through the outside world (such as those experienced while riding in a car that accelerates from a standstill to 50 km per hour). These neurons transmit detailed information about these stimuli to the brain (i.e. information that allows one to reconstruct how the stimuli vary over time) in the form of nerve impulses.
Scientists had previously believed that the brain decoded this information linearly and therefore actually attempted to reconstruct the time course of velocity and acceleration stimuli. But by combining electrophysiological and computational approaches, Kathleen Cullen and Maurice Chacron, two professors in McGill University’s Department of Physiology, have been able to show for the first time that the neurons in the vestibular nuclei in the brain instead decode incoming information nonlinearly as they respond preferentially to unexpected, sudden changes in stimuli.
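The contrast between the two decoding schemes can be sketched with a toy simulation. This is an illustration only, not the authors' model: the time step, threshold, and response scaling below are arbitrary assumptions chosen to show how a linear decoder tracks the full stimulus waveform while a nonlinear one responds only to sudden changes.

```python
import numpy as np

dt = 0.001                                 # 1 ms time step (assumed)
t = np.arange(0, 2, dt)
velocity = np.where(t < 1.0, 0.0, 50.0)    # sudden step: 0 -> 50 (e.g. km/h)

# Linear decoding: output proportional to the stimulus itself,
# i.e. an attempt to reconstruct its full time course.
linear_response = 0.1 * velocity

# Nonlinear decoding: respond only when the rate of change exceeds
# a threshold, so steady motion is largely ignored.
rate_of_change = np.abs(np.diff(velocity, prepend=velocity[0])) / dt
threshold = 100.0                          # arbitrary units (assumed)
nonlinear_response = np.where(rate_of_change > threshold, 1.0, 0.0)

# The linear response stays elevated for the entire period of constant
# motion; the nonlinear response fires only at the moment of change.
print("linear decoder active for", int(linear_response.nonzero()[0].size), "samples")
print("nonlinear decoder active for", int(nonlinear_response.nonzero()[0].size), "samples")
```

In this sketch the linear decoder remains active for the full second of steady motion, while the nonlinear decoder signals only the single sample where the velocity jumps, mirroring the preference for unexpected, sudden changes described above.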
It is known that representations of the outside world change at each stage in this sensory pathway. For example, in the visual system, neurons located closer to the periphery of the sensory system (e.g. ganglion cells in the retina) tend to respond to a wide range of sensory stimuli (a “dense” code), whereas central neurons (e.g. in the primary visual cortex at the back of the head) tend to respond much more selectively (a “sparse” code). Chacron and Cullen have discovered that the selective transmission of vestibular information they were able to document for the first time occurs as early as the first synapse in the brain. “We were able to show that the brain has developed this very sophisticated computational strategy to represent sudden changes in movement in order to generate quick accurate responses and maintain balance,” explained Prof. Cullen. “I keep describing it as elegant, because that’s really how it strikes me.”
This kind of selectivity in response is important for everyday life, since it enhances the brain’s perception of sudden changes in body posture. If you step off an unseen curb, within milliseconds your brain has both received the essential information and performed the sophisticated computation needed to help you readjust your position. This discovery is expected to apply to other sensory systems and eventually to the development of better treatments for patients who suffer from vertigo, dizziness, and disorientation during their daily activities. It should also lead to treatments that will help alleviate the symptoms that accompany motion and/or space sickness produced in more challenging environments.
Provided by McGill University
Source: medicalxpress.com
July 25, 2012
(HealthDay) — Shortened telomere length (TL) is associated with risks for dementia and mortality in a population of older adults, according to a study published online July 23 in the Archives of Neurology.

Lawrence S. Honig, M.D., Ph.D., from the Columbia University College of Physicians and Surgeons in New York City, and colleagues used real-time polymerase chain reaction analysis to determine TL in stored leukocyte DNA from 1,983 participants in a community-based study of aging. Participants were 65 years or older and blood was drawn at a mean age of 78.3 years. Participants were followed for a median of 9.3 years for mortality, and 9.6 percent developed incident dementia.
The researchers found that TL correlated inversely with age and was shorter in men than in women. TL was significantly shorter in persons dying during follow-up than in survivors, even after adjusting for age, sex, education, and apolipoprotein E genotype. TL was also significantly shorter in participants with incident and prevalent dementia than in those who remained dementia-free. Shorter TL correlated with earlier onset of dementia, but this association was significant only in women.
"Our results show an association between shortened TL and mortality, and more specifically an association of shortened TL with Alzheimer’s disease, and are consistent with but not indicative of the possibility that TL may be a factor indicative of biological age," the authors conclude.
Source: medicalxpress.com
July 25, 2012
(Medical Xpress) — Many people, whether they know it or not, are philosophical dualists. That is, they believe that the brain and the mind are two separate entities. Despite the fact that dualist beliefs are found in virtually all human cultures, surprisingly little is known about the impact of these beliefs on how we think and behave in everyday life.
But a new research article forthcoming in Psychological Science, a journal of the Association for Psychological Science, suggests that espousing a dualist philosophy can have important real-life consequences.
Across five related studies, researchers Matthias Forstmann, Pascal Burgmer, and Thomas Mussweiler of the University of Cologne, Germany, found that people primed with dualist beliefs had more reckless attitudes toward health and exercise, and also preferred (and ate) a less healthy diet than those who were primed with physicalist beliefs.
Furthermore, they found that the relationship also worked in the other direction. People who were primed with unhealthy behaviors – such as pictures of unhealthy food – reported a stronger dualistic belief than participants who were primed with healthy behaviors.
Overall, the findings from the five studies provide converging evidence demonstrating that mind-body dualism has a noticeable impact on people’s health-related attitudes and behaviors. Specifically, these findings suggest that dualistic beliefs decrease the likelihood of engaging in healthy behavior.
These findings support the researchers’ original hypothesis that the more people perceive their minds and bodies to be distinct entities, the less likely they will be to engage in behaviors that protect their bodies. Bodies are ultimately viewed as a disposable vessel that helps the mind interact with the physical world.
Evidence of a bidirectional relationship further suggests that metaphysical beliefs, such as beliefs in mind-body dualism, may serve as cognitive tools for coping with threatening or harmful situations.
The fact that the simple priming procedures used in the studies had an immediate impact on health-related attitudes and behavior suggests that these procedures may eventually have profound implications for real-life problems. Interventions that reduce dualistic beliefs through priming could be one way to help promote healthier – or less self-damaging – behaviors in at-risk populations.
Provided by Association for Psychological Science
Source: medicalxpress.com
New research suggests that patients whose mobility has been limited by stroke may one day use their imagination and a computer link to move their hands.

In patients, scientists at Washington University School of Medicine in St. Louis have shown that they can detect brain signals produced simply by thinking about moving a partially or completely paralyzed hand. The half of the brain that normally generates such thoughts and moves the hand can no longer do so because of stroke damage. Instead, the signal comes from the undamaged half of the brain.
The new study suggests it may be possible to harness these signals to restore a fuller range of movement in the patient’s limbs.
“We’ve known for some time that the brain can reroute or otherwise adapt its circuits to cope with an injury,” says senior author Eric Leuthardt, MD, associate professor of neurosurgery, of biomedical engineering and of neurobiology. “Now we have proof-of-principle that we can use technology to aid that process.”
To demonstrate the potential to help restore movement, scientists connected brain signals detected by an electrode-studded cap to the movements of a cursor on a computer screen. In 30 minutes or less, patients learned to control the movement of the cursor with thoughts of moving their impaired hand. Researchers are now working on a motorized glove that will make the imagined movements a reality.
The results are available online in The Journal of Neural Engineering.
Leuthardt, who is director of Washington University’s Center for Innovation in Neuroscience and Technology, is a pioneer in the field of brain-computer interfaces, or devices that allow the brain to communicate directly with computers to restore abilities lost to injury or disease.
Much of Leuthardt’s research has focused on patients with epilepsy who are undergoing surgery to remove the part of the brain where their seizures originate. He uses the electrode grids temporarily implanted on the surface of the brain to pinpoint areas where the seizures begin. With the patients’ permission, Leuthardt also uses the implants to gather and analyze detailed information on brain activity for future use in brain-computer interfaces. This approach laid the foundations for the technique now being applied to the stroke population.
In the new research, first author David Bundy, a graduate student, worked with four patients who had suffered strokes that caused extensive damage on one side of the brain. All were experiencing paralysis or significant difficulty moving the hand on the opposite side of the body.
The brain signals that control movement are low-frequency signals, which makes them relatively easy to detect with electrodes on the outside of the skull. Researchers fitted patients with an electrode-studded cap connected to a computer, and asked them to perform a finger-tapping activity. Depending on a cue flashed on a screen in front of them, the patients either tapped the fingers of their unimpaired hand or imagined tapping the fingers of the impaired hand. Scientists used the cap to identify signals in the healthy part of the brain that accompanied the imagined movements.
The researchers are now developing motorized braces that can be controlled by similar signals, with the goal of restoring full movement in weak or paralyzed limbs.
“This is an exciting development that opens up new opportunities to help even more patients overcome limitations imposed by brain damage or degeneration,” Leuthardt says.
July 24, 2012
A new class of drug developed at Northwestern University Feinberg School of Medicine shows early promise of being a one-size-fits-all therapy for Alzheimer’s disease, Parkinson’s disease, multiple sclerosis and traumatic brain injury by reducing inflammation in the brain.
Northwestern has recently been issued patents to cover this new drug class and has licensed the commercial development to a biotech company that has recently completed the first human Phase 1 clinical trial for the drug.
The drugs in this class target a particular type of brain inflammation, which is a common denominator in these neurological diseases and in traumatic brain injury and stroke. This brain inflammation, also called neuroinflammation, is increasingly believed to play a major role in the progressive damage characteristic of these chronic diseases and brain injuries.
By addressing brain inflammation, the new class of drugs — represented by MW151 and MW189 — offers an entirely different therapeutic approach to Alzheimer’s than current ones being tested to prevent the development of beta amyloid plaques in the brain. The plaques are an indicator of the disease but not a proven cause.
A new preclinical study, published today in the Journal of Neuroscience, reports that when one of the new Northwestern drugs is given to a mouse genetically engineered to develop Alzheimer’s, it prevents the development of the full-blown disease. The study, from Northwestern’s Feinberg School and the University of Kentucky, identifies the optimal therapeutic time window for administering the drug, which is taken orally and easily crosses the blood-brain barrier.
"This could become part of a collection of drugs you could use to prevent the development of Alzheimer’s," said D. Martin Watterson, a professor of molecular pharmacology and biological chemistry at the Feinberg School, whose lab developed the drug. He is a coauthor of the study.
In previous animal studies, the same drug reduced the neurological damage caused by closed-head traumatic brain injury and inhibited the development of a multiple sclerosis-like disease. In these diseases as well as in Alzheimer’s, the studies show the therapy time window is critical.
ScienceDaily (July 24, 2012) — A study, performed in mice and utilizing post-mortem samples of brains from patients with Alzheimer’s disease, found that a single event of a moderate-to-severe traumatic brain injury (TBI) can disrupt proteins that regulate an enzyme associated with Alzheimer’s. The paper, published in The Journal of Neuroscience, identifies the complex mechanisms that result in a rapid and robust post-injury elevation of the enzyme, BACE1, in the brain. These results may lead to the development of a drug treatment that targets this mechanism to slow the progression of Alzheimer’s disease.
"A moderate-to-severe TBI, or head trauma, is one of the strongest environmental risk factors for Alzheimer’s disease. A serious TBI can lead to a dysfunction in the regulation of the enzyme BACE1. Elevations of this enzyme cause elevated levels of amyloid-beta, the key component of brain plaques associated with senility and Alzheimer’s disease," said first author Kendall Walker, PhD, postdoctoral associate in the department of neuroscience at Tufts University School of Medicine (TUSM).
Building on her previous work, neuroscientist Giuseppina Tesco, MD, PhD, of Tufts University School of Medicine (TUSM), led a research team that first used an in vivo model to determine how a single episode of TBI could alter the brain. In the acute phase (first two days) following injury, levels of two intracellular trafficking proteins (GGA1 and GGA3) were reduced, and an elevation of BACE1 enzyme level was observed.
Next, in an analysis of post-mortem brain samples from patients with Alzheimer’s disease, the researchers found that GGA1 and GGA3 levels were reduced while BACE1 levels were elevated in the brains of Alzheimer’s disease patients compared to the brains of people without Alzheimer’s disease, suggesting a possible inverse association.
In an additional experiment using a mouse strain genetically modified to express the reduced level of GGA3 that was observed in the brains of Alzheimer’s disease patients, the team found that one week following traumatic brain injury, BACE1 and amyloid-beta levels remained elevated even when GGA1 levels had returned to normal. The research suggests that reduced levels of GGA3 were solely responsible for the increase in BACE1 levels and therefore the sustained amyloid-beta production observed in the sub-acute phase, or seven days, after injury.
"When the proteins are at normal levels, they work as a clean-up crew for the brain by regulating the removal of BACE1 enzymes and facilitating their transport to lysosomes within brain cells, an area of the cell that breaks down and removes excess cellular material. When levels of the two proteins are low, BACE1 enzyme levels may be stabilized, likely because the natural disposal process of the enzyme is interrupted," said Tesco, assistant professor of neuroscience at Tufts School of Medicine and member of the neuroscience program faculty at the Sackler School of Graduate Biomedical Sciences at Tufts.
"We found that GGA1 and GGA3 act synergistically to regulate BACE1 post-injury. The identification of this interaction may provide a drug target to therapeutically regulate the BACE1 enzyme and reduce the deposition of amyloid-beta in Alzheimer’s patients," she continued. "Our next steps are to confirm these findings in post-mortem brain samples from patients with moderate-to-severe traumatic brain injuries."
Moderate-to-severe TBIs are caused most often by traumas, such as severe falls or motor vehicle accidents, that result in a loss of consciousness. Not all traumas to the head result in a TBI. According to the Centers for Disease Control and Prevention, each year 1.7 million people sustain a TBI. Concussions, the mildest form of a TBI, account for about 75% of all TBIs. Studies have linked repeated head trauma to brain disease and some previous studies have linked single events of brain trauma to brain disease, such as Alzheimer’s. Alzheimer’s disease currently affects as many as 5.1 million Americans and is the most common cause of dementia in adults age 65 and over.
Source: Science Daily
July 24, 2012
Einstein’s famous theory of relativity proposed that matter can distort space and time. Now a new study recently published in the journal Neurology suggests that chronic pain can have the same effect.
Neuroscientists from the University of South Australia, Neuroscience Research Australia, and the University of Milano-Bicocca in Italy studied people with chronic back pain, the most common chronic pain condition, which costs Western countries billions of dollars in lost productivity every year.
They presented identical vibration stimuli to the painful area and a non-painful area and noted that the stimuli were processed more slowly by the brain if they came from the painful area.
The most striking finding, however, was that the same effect occurred if the stimuli were delivered to a healthy body part being held near the painful area.
Lead author of the study, Professor Lorimer Moseley from the University of South Australia, says it was not altogether surprising that, in people with chronic pain, there are changes in the way the brain processes information from and about the painful body part.
“But what is remarkable is that the problem affects the space around the body as well as the body itself,” Prof Moseley says.
Experiments showed that if a hand was held near the painful area of the back, the brain would almost ‘neglect’ that hand.
“The potential similarity between our findings and the time-space distortion predicted by the relativity theory is definitely intriguing,” Prof Moseley says.
“Obviously, here it is not external space that is distorted but the ability of the brain to represent that space within its neural circuitry.
“This finding opens up a whole new area of research into the way the brain allows us to interact with the world and how this can be disrupted in chronic pain.”
Provided by University of South Australia
Source: medicalxpress.com
ScienceDaily (July 24, 2012) — Neuroscientists from Wayne State University and the Massachusetts Institute of Technology (MIT) are taking a deeper look into how the brain mechanisms for memory retrieval differ between adults and children. While the memory systems are the same in many ways, the researchers have learned that crucial functions with relevance to learning and education differ.
The team’s findings were published on July 17, 2012, in the Journal of Neuroscience.
According to lead author Noa Ofen, Ph.D., assistant professor in WSU’s Institute of Gerontology and Department of Pediatrics, cognitive ability, including the ability to learn and remember new information, changes dramatically between childhood and adulthood. These changes parallel dramatic changes that occur in the structure and function of the brain during the same period.
In the study, “The Development of Brain Systems Associated with Successful Memory Retrieval of Scenes,” Ofen and her collaborative team tested the development of neural underpinnings of memory from childhood to young adulthood. The team of researchers exposed participants to pictures of scenes and then showed them the same scenes mixed with new ones and asked them to judge whether each picture was presented earlier. Participants made retrieval judgments while researchers collected images of their brains with magnetic resonance imaging (MRI).
Using this method, the researchers were able to see how the brain remembers. “Our results suggest that cortical regions related to attentional or strategic control show the greatest developmental changes for memory retrieval,” said Ofen.
The researchers said that older participants used the cortical regions more than younger participants when correctly retrieving past experiences.
"We were interested to see whether there are changes in the connectivity of regions in the brain that support memory retrieval," Ofen added. "We found changes in connectivity of memory-related regions. In particular, the developmental change in connectivity between regions was profound even without a developmental change in the recruitment of those regions, suggesting that functional brain connectivity is an important aspect of developmental changes in the brain."
This study marks the first time that the development of connectivity within memory systems in the brain has been tested, and the results suggest that the brain continues to rearrange connections to achieve adult-like performance during development.
Ofen and her research team plan to continue research in this area, focused on modeling brain network connectivity, and applying these methods to study abnormal brain development.
Source: Science Daily
July 23, 2012
Mice appear to have a specialized system for detecting and at least initially processing instinctually important smells such as those that denote predators. The finding raises a question about whether their response to those smells is hardwired.

A separate subsystem for the smell of fear. Experiments in mice suggest neurons that detect odors associated with an instinctive response — like fleeing when an approaching predator is detected — are configured differently than other olfactory neurons. Further research could determine whether this system automatically triggers flight or other primal behaviors. Credit: Mike Cohea/Brown University
PROVIDENCE, R.I. [Brown University] — A new study finds that mice have a distinct neural subsystem that links the nose to the brain and is associated with instinctually important smells such as those emitted by predators. That insight, published online this week in Proceedings of the National Academy of Sciences, prompts the question of whether mice and other mammals have specially hardwired neural circuitry to trigger instinctive behavior in response to certain smells.
In the series of experiments and observations described in the paper, the authors found that nerve cells in the nose that express members of the gene family of trace amine-associated receptors (TAAR) have several key biological differences from the much more common and diverse neurons that express members of the olfactory receptor gene family. Those other nerve cells detect a much broader range of smells, said corresponding author Gilad Barnea, the Robert and Nancy Carney Assistant Professor of Neuroscience at Brown University.
The differences between TAAR neurons and olfactory receptor neurons led Barnea and his co-authors to conclude that they form an independent subsystem for certain smells.
“Our observations suggest that the TAAR-expressing sensory neurons constitute a distinct olfactory subsystem that extracts specific environmental cues that then elicit innate responses,” Barnea said.
ScienceDaily (July 23, 2012) — Stroboscopic training, performing a physical activity while using eyewear that simulates a strobe-like experience, has been found to increase visual short-term memory retention, and the effects lasted 24 hours.

Participants completed a memory test that required them to note the identity of eight letters of the alphabet that were briefly displayed on a computer screen. After a variable delay, participants were asked to recall one of the eight letters. On easy-level trials, the recall prompt came immediately after the letters disappeared, but on more difficult trials, the prompt came as late as 2.5 seconds following the display. Because participants did not know which letter they would be asked to recall, they had to retain all of the items in memory.
"Humans have a memory buffer in their brain that keeps information alive for a certain short-lived period," said Greg Appelbaum, assistant professor of psychiatry at Duke University and first author of the study. "Wearing the strobe eyewear during the physical training seemed to boost the ability to retain information in this buffer."
The strobe eyewear disrupts vision by only allowing the user to see glimpses of the world. The user must adjust their visual processing in order to perform normally, and this adjustment produces a lingering benefit; once participants removed the strobe eyewear, there was an observed boost in their visual memory retention, which was found to last 24 hours.
Earlier work by Appelbaum and the project’s senior researcher, Stephen Mitroff, had shown that stroboscopic training improves visual perception, including the ability to detect subtle motion cues and the processing of briefly presented visual information. Yet the earlier study had not determined how long the benefits might last.
"Our earlier work on stroboscopic training showed that it can improve perceptual abilities, but we don’t know exactly how," says Mitroff, associate professor of psychology & neuroscience and member of the Duke Institute for Brain Sciences. "This project takes a big step by showing that these improved perceptual abilities are driven, at least in part, by improvements in visual memory."
"Improving human cognition is an important goal with so many benefits," said Appelbaum, also a member of the Duke Institute for Brain Sciences. "Interestingly, our findings demonstrate one way in which visual experience has the capacity to improve cognition."
Source: Science Daily
ScienceDaily (July 23, 2012) — Snack consumption and BMI are linked to both brain activity and self-control, new research has found.

The research, carried out by academics from the Universities of Exeter, Cardiff, Bristol, and Bangor, discovered that an individual’s brain ‘reward centre’ response to pictures of food predicted how much they subsequently ate. This had a greater effect on the amount they ate than their conscious feelings of hunger or how much they wanted the food.
A strong brain response was also associated with increased weight (BMI), but only in individuals reporting low levels of self-control on a questionnaire. For those reporting high levels of self-control, a stronger brain response to food was actually related to a lower BMI.
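The pattern described here is a crossover interaction: the slope relating brain response to BMI flips sign depending on self-control. The sketch below illustrates the shape of that finding with entirely made-up numbers; the coefficients, sample size, and noise level are assumptions for illustration, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
brain_response = rng.normal(0, 1, n)     # hypothetical reward-centre response (z-scored)
self_control = rng.integers(0, 2, n)     # 0 = low, 1 = high self-control (assumed groups)

# Hypothetical generative model with a crossover interaction:
# low self-control:  stronger response -> higher BMI (slope +1.5)
# high self-control: stronger response -> lower BMI  (slope -1.0)
slope = np.where(self_control == 0, 1.5, -1.0)
bmi = 23 + slope * brain_response + rng.normal(0, 0.5, n)

# Recover the group-wise slopes with a simple least-squares line fit.
for group, label in [(0, "low self-control"), (1, "high self-control")]:
    mask = self_control == group
    fitted_slope = np.polyfit(brain_response[mask], bmi[mask], 1)[0]
    print(f"{label}: BMI change per unit brain response = {fitted_slope:+.2f}")
```

Fitting each group separately recovers a positive slope for the low self-control group and a negative one for the high self-control group, which is the qualitative pattern the study reports.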
This study, which is now published in the journal NeuroImage, adds to mounting evidence that overeating and increased weight are linked, in part, to a region of the brain associated with motivation and reward, called the nucleus accumbens. Responses in this brain region have been shown to predict weight gain in healthy weight and obese individuals, but only now have academics discovered that this is independent of conscious feelings of hunger, and that self-control also plays a key role.
Following these results, academics at the University of Exeter and Cardiff have begun testing ‘brain training’ techniques designed to reduce the influence of food cues on individuals who report low levels of self-control. Similar tests are being used to assist those with gambling or alcohol addiction.
Dr Natalia Lawrence of Psychology at the University of Exeter, lead researcher in both the original research and the new studies, said: “Our research suggests why some individuals are more likely to overeat and put on weight than others when confronted with frequent images of snacks and treats. Food images, such as those used in advertising, cause direct increases in activity in brain ‘reward areas’ in some individuals but not in others. If those sensitive individuals also struggle with self-control, which may be partly innate, they are more likely to be overweight. We are now developing computer programs that we hope will counteract the effects of this high sensitivity to food cues by training the brain to respond less positively to these cues.”
Twenty-five young, healthy females with BMIs ranging from 17 to 30 took part in the study. Female participants were chosen because research shows that females typically exhibit stronger responses to food-related cues. Because hormonal changes during the menstrual cycle affect this reaction, all participants were taking the monophasic combined oral contraceptive pill. Participants had not eaten for at least six hours, to ensure they were hungry at the time of the scan, and were given a bowl containing 150 g (four and a half packets) of potato chips to eat at the end of the study; their potato chip intake was measured afterwards.
Researchers used MRI scanning to detect the participants’ brain activity while they were shown images of household objects, and food that varied in desirability and calorific content. After scanning, participants rated the food images for desirability and rated their levels of hunger and food craving. Results showed that participants’ brain responses to food (relative to objects) in the nucleus accumbens predicted how many potato chips they ate after the scan. However, participants’ own ratings of hunger and how much they liked and wanted the foods, including potato chips, were unrelated to their potato chip intake.
This study was funded by the Wales Institute of Cognitive Neuroscience.
Source: Science Daily
July 23, 2012
A new and powerful class of antioxidants could one day be a potent treatment for Parkinson’s disease, researchers report.

A class of antioxidants called synthetic triterpenoids blocked development of Parkinson’s in an animal model that develops the disease in a handful of days, said Dr. Bobby Thomas, neuroscientist at the Medical College of Georgia at Georgia Health Sciences University and corresponding author of the study in the journal Antioxidants & Redox Signaling.
Thomas and his colleagues were able to block the death of dopamine-producing brain cells that occurs in Parkinson’s by using the drugs to bolster Nrf2, a natural antioxidant and inflammation fighter.
Stressors from head trauma to insecticide exposure to simple aging increase oxidative stress and the body responds with inflammation, part of its natural repair process. “This creates an environment in your brain that is not conducive for normal function,” Thomas said. “You can see the signs of oxidative damage in the brain long before the neurons actually degenerate in Parkinson’s.”
Nrf2, the master regulator of oxidative stress and inflammation, is – inexplicably – significantly decreased early in Parkinson’s. In fact, Nrf2 activity normally declines with age.
“In Parkinson’s patients you can clearly see a significant overload of oxidative stress, which is why we chose this target,” Thomas said. “We used drugs to selectively activate Nrf2.”
They parsed a number of antioxidants already under study for a wide range of diseases from kidney failure to heart disease and diabetes, and found triterpenoids the most effective on Nrf2. Co-author Dr. Michael Sporn, Professor of Pharmacology, Toxicology and Medicine at Dartmouth Medical School, chemically modified the agents so they could permeate the protective blood-brain barrier.
In both human neuroblastoma cells and mouse brain cells, they were able to document an increase in Nrf2 in response to the synthetic triterpenoids. Human dopaminergic cells are not available for research, so the scientists used the human neuroblastoma cells, which are cancer cells that have some properties similar to neurons.
Their preliminary evidence indicates the synthetic triterpenoids also increase Nrf2 activity in astrocytes, a brain cell type that nourishes neurons and hauls off some of their garbage. The drugs didn’t protect brain cells in an animal in which the Nrf2 gene was deleted, further proof that Nrf2 is the drugs’ target.
The researchers used the powerful neurotoxin MPTP to mimic Parkinson’s-like brain cell damage in a matter of days. They are now looking at the impact of synthetic triterpenoids in an animal model genetically programmed to acquire the disease more slowly, as humans do. Collaborators at Johns Hopkins School of Medicine also will be providing induced pluripotent stem cells, adult stem cells that can be coaxed into forming dopaminergic neurons, for additional drug testing.
Other collaborators include scientists at Weill Medical College of Cornell University, Johns Hopkins School of Public Health, Moscow State University, Tohoku University and the University of Pittsburgh.
Source: EarthSky
ScienceDaily (July 23, 2012) — New research conducted by neuroscientists from the Royal College of Surgeons in Ireland (RCSI), published in Nature Medicine, has identified a new gene involved in epilepsy and could potentially provide a new treatment option for patients with epilepsy.
The research focussed on a new class of gene called a ‘microRNA’ which controls protein production inside cells. The research looked in detail at one particular microRNA called ‘microRNA-134’ and found that levels of microRNA-134 are much higher in the part of the brain that causes seizures in patients with epilepsy.
By using a new type of drug-like molecule called an antagomir which locks onto the ‘microRNA-134’ and removes it from the brain cell, the researchers found they could prevent epileptic seizures from occurring.
Professor David Henshall, Department of Physiology & Medical Physics, RCSI and senior author on the paper said: “We have been looking to find what goes wrong inside brain cells to trigger epilepsy. Our research has discovered a completely new gene linked to epilepsy and it shows how we can target this gene using drug-like molecules to reduce the brain’s susceptibility to seizures and the frequency with which they occur.”
Dr Eva Jimenez-Mateos, Department of Physiology & Medical Physics, RCSI and first author on the paper said “Our research found that the antagomir drug protects the brain cells from toxic effects of prolonged seizures and the effects of the treatment can last up to one month.”
Epilepsy affects 37,000 people in Ireland alone. For two out of three people with epilepsy, seizures are controlled by medication, but one in three patients continues to have seizures despite being prescribed medication. This study could potentially offer new treatment methods for these patients.
The research was supported by a grant from Science Foundation Ireland (SFI). Researchers in the Department of Physiology & Medical Physics and Molecular & Cellular Therapeutics, RCSI, clinicians at Beaumont Hospital and experts in brain structure from the Cajal Institute in Madrid were involved in the study.
Source: Science Daily
23 JULY 2012
Children with trisomy 13 or 18, who are for the most part severely disabled and have a very short life expectancy, and their families lead a life that is happy and rewarding overall, contrary to the usually gloomy predictions made by the medical community at the time of diagnosis, according to a study of parents who are members of support groups published today in Pediatrics. The study was conducted by Dr. Annie Janvier of the Sainte-Justine University Hospital Center and the University of Montreal, with the special collaboration of Barbara Farlow, Eng., MSc, the mother of a child who died of trisomy 13, as second author.

Source: Wikimedia Commons
The study interviewed 332 parents who live or have lived with 272 children with trisomy 13 or 18. It turns out that their experience diverges substantially from what healthcare providers had predicted: that their child would be “incompatible with life” (87%), would be “a vegetable” (50%), would lead “a life of suffering” (57%) or would “ruin their family or life as a couple” (23%).
It should be noted that trisomies 13 and 18 are rare chromosome disorders that are most often diagnosed before birth and sometimes after. Children who receive these diagnoses generally do not survive beyond their first year of life, and the few who do survive have severe disabilities and shortened lives. When trisomy 13 or 18 is diagnosed before birth, many parents decide to interrupt the pregnancy, whereas others choose to carry it to term; in such cases, miscarriages are common.
As children with trisomies 13 or 18 generally receive palliative care at birth, some parents who opt to continue the pregnancy or desire life-prolonging interventions for their child encounter the prejudices of the medical system. In this regard, the parents interviewed in the study consider that caregivers often view their child in terms of a diagnosis (“a T13”, “a lethal trisomy”) rather than a unique baby.
“Our study points out that physicians and parents can have different views of what constitutes quality of life,” states Dr. Annie Janvier, a neonatologist and co-founder of the Master’s program in Pediatric Clinical Ethics at the University of Montreal. In fact, over 97% of the parents interviewed considered that their child was happy and that the child’s presence enriched the life of their family and their life as a couple, regardless of longevity. “In the medical literature on all handicaps, disabled patients – or their families – rated their quality of life as being higher than caregivers did,” adds Dr. Annie Janvier.
Parents who receive a new diagnosis of trisomy 13 and 18 and join a parental support group often acquire a more positive image of these diagnoses than the predictions made by the medical profession. In fact, according to the parents interviewed, belonging to a support group helped them view their experience positively. “Our research reveals that some parents who chose a path to accept and to love a disabled child with a short life expectancy have experienced happiness and enrichment. My hope is that this knowledge improves the ability of physicians to understand, communicate and make decisions with these parents,” concludes Barbara Farlow.
Given the rarity of trisomy 13 or 18 cases (one case out of approximately every 10,500 births), the parents were recruited through online support groups that parents often join after receiving the physicians’ diagnosis. Dr. Annie Janvier and Barbara Farlow sometimes give joint talks on the subject of trisomies 13 and 18.
Source: Université de Montréal
July 23, 2012 By David Orenstein
(Medical Xpress) — This week the Journal of the American Medical Association published a study with unfortunate news for the millions of people who suffer from multiple sclerosis. In the large study, a therapy known as interferon beta failed to stave off the progression of the incurable disease. Albert Lo, associate professor of neurology and epidemiology, comments on what the study means for patients, why it was well-designed, and how a new effort to support research on the disease in Rhode Island could help.
The results of this study with nearly 2,700 participants showed that treatment with interferon beta, which is a major class of disease-modifying therapy for multiple sclerosis, did not prevent progression of disability, which is very disappointing from a therapeutic perspective. Currently, there is no cure for MS, and as a lifelong disorder of the nervous system, MS is characterized by episodic relapses of neurological injury such as weakness or blindness. While in most cases, there is a varying degree of recovery after relapses, over time, disability accumulates. The accumulation of deficits and the loss of physical and mental function is a major concern for people with MS and their clinicians.
Currently, there is no medication on the market that is directed explicitly for neuroprotection and the prevention of disability. Many had hoped that the interferons, along with the other disease-modifying agents (which were developed to reduce relapse rates) would also have a significant effect on protecting patients from MS disability.
Although the results from this study were not as we would have hoped, they reflect a marked improvement over prior studies, which suffered from known methodologic flaws. The new results from the Tremlett group point to the importance of the research methodology used (prospectively collected longitudinal study data) and a well-controlled design to generate the results – approaches that we are using in our own research at Brown University.
A number of the early studies examining the effect of interferons on disability primarily used patient sample groups of convenience for post-marketing studies. They indicated that interferons were in fact preventing disability. However, using samples of convenience inherently introduces a number of biases and problems. Dr. Tremlett’s results were generated from a more systematic longitudinal study in which biases and shortcomings can be better addressed, so drawing conclusions and making clinical decisions from the results is more reliable. These data will both help in making clinical decisions on treating MS patients during the later course of their disease, when there are virtually no relapses, and point more urgently toward the clinical need for an agent to prevent disability.
Provided by Brown University
Source: medicalxpress.com
July 23, 2012
Neural precursor cells (NPCs) in the young brain suppress certain brain tumors such as high-grade gliomas, especially glioblastoma (GBM), which are among the most common and most aggressive tumors. Now researchers of the Max Delbrück Center for Molecular Medicine (MDC) Berlin-Buch and Charité – Universitätsmedizin Berlin have deciphered the underlying mechanism of action by which neural precursor cells protect the young brain against these tumors. They found that the NPCs release substances that activate TRPV1 ion channels in the tumor cells and subsequently induce the tumor cells to undergo stress-induced cell death. (Nature Medicine http://dx.doi.org/10.1038/nm.2827)
Despite surgery, radiation or chemotherapy or even a combination of all three treatment options, there is currently no cure for glioblastoma. In an earlier study the research group led by Professor Helmut Kettenmann (MDC) showed that neural precursor cells migrate to the glioblastoma cells and attack them. The neural precursor cells release a protein belonging to the family of BMP proteins (bone morphogenetic protein) that directly attacks the tumor stem cells. The current consensus of researchers is that tumor stem cells are the actual cause for continuous tumor self-renewal.
Kristin Stock, Jitender Kumar, Professor Kettenmann (all MDC), Dr. Michael Synowitz (MDC and Charité), Professor Rainer Glass (Munich University Hospitals, formerly MDC) and Professor Vincenzo Di Marzo (Istituto di Chimica Biomolecolare Pozzuoli, Naples, Italy) now report a new mechanism of action of NPC in astrocytomas. Like glioblastomas, astrocytomas are brain tumors that belong to the family of gliomas. Gliomas are most common in older people and are almost invariably fatal.
As the MDC researchers showed, the NPC also migrate to the astrocytomas. There they do not secrete proteins, but rather release fatty-acid substances (endovanilloids) which are harmful to the cancer cells. However, in order to exert their lethal effect, the endovanilloids need the aid of a specific ion channel, the TRPV1 channel (transient receptor potential vanilloid type 1), also called the vanilloid receptor 1. TRPV1 is already known to researchers as a transducer of painful stimuli. It has, among other things, a binding site for capsaicin, the irritant of hot chili peppers, and is responsible for the hot sensation after eating them. Clinical trials are currently underway to develop new pain treatments by blocking or desensitizing this ion channel.
MDC researchers describe an additional role of the TRPV1 ion channel
In contrast to its use in pain management, this ion channel, which is located on the surface of glioblastoma cells and is much more abundant there than on normal glial cells, must be activated to trigger cell death in gliomas. The activated ion channel mediates stress-induced cell-death in tumor cells. If however TRPV1 is downregulated or blocked, the glioma cells are not destroyed. The MDC researchers are thus the first to identify neural precursor cells as the source of fatty acids that induce tumor cell death and to describe the role of the TRPV1 ion channel in the fight against gliomas.
However, the activity of neural precursor cells in the brain and thus of the body’s own protective mechanism against gliomas diminishes with increasing age. This could explain why these tumors usually develop in older adults and not in children and young people. How can the natural protection of neural precursor cells be harnessed for older brains? According to the researchers, neural precursor cell therapy is not a solution. The benefit this obviously brings in the treatment of young people can have the opposite effect in older adults and may trigger brain tumors.
One possible treatment would be to use drugs to activate the TRPV1 channels. In mice, the group showed that a synthetic substance (arvanil), which is similar to capsaicin, reduced tumor growth. However, this substance has not yet been approved as a drug because the adverse side effects for humans are too severe. It is only used in basic research on mice, which tolerate the substance well. “In principle, however,” the researchers suggest, “synthetic vanilloid compounds may have clinical potential for brain tumor treatment.”
Source: Science Codex
ScienceDaily (July 23, 2012) — A team of University of Alberta researchers has identified a new class of compounds that inhibit the spread of prions, misfolded proteins in the brain that trigger lethal neurodegenerative diseases in humans and animals.
U of A chemistry researcher Frederick West and his team have developed compounds that clear prions from infected cells derived from the brain.
"When these designer molecules were put into infected cells in our lab experiments, the numbers of misfolded proteins diminished — and in some cases we couldn’t detect any remaining misfolded prions," said West.
West and his collaborators at the U of A’s Centre for Prions and Protein Folding Diseases say this research is not yet a cure, but does open a doorway for developing treatments.
"We’re not ready to inject these compounds in prion-infected cattle," said David Westaway, director of the prion centre. "These initial compounds weren’t created for that end-run scenario but they have passed initial tests in a most promising manner."
West notes that the most promising experimental compounds at this stage are simply too big to be used therapeutically in humans or animals.
Human prion-triggered brain disorders are limited to rare cases such as Creutzfeldt-Jakob disease, whose variant form is linked to mad cow disease. The researchers say this human form shows up in about one in a million people in industrialized nations, but investigating the disease is nonetheless well worth the time and expense.
"There is a strong likelihood that prion diseases operate in a similar way to neurodegenerative diseases such as Alzheimer’s, which are distressingly common around the world," said West.
Source: Science Daily
23 July 2012 by Will Heaven
Watch where you look – it can be used to predict what you’ll say. A new study shows that it is possible to guess what sentences people will use to describe a scene by tracking their eye movements.
Moreno Coco and Frank Keller at the University of Edinburgh, UK, presented 24 volunteers with a series of photo-realistic images depicting indoor scenes such as a hotel reception. They then tracked the sequence of objects that each volunteer looked at after being asked to describe what they saw.
Other than being prompted with a keyword, such as “man” or “suitcase”, participants were free to describe the scene however they liked. Some typical sentences included “the man is standing in the reception of a hotel” or “the suitcase is on the floor”.
The order in which a participant’s gaze settled on objects in each scene tended to mirror the order of nouns in the sentence used to describe it. “We were surprised there was such a close correlation,” says Keller. Given that multiple cognitive processes are involved in sentence formation, Coco says “it is remarkable to find evidence of similarity between speech and visual attention”.
Word prediction
The team used the discovery to see if they could predict what sentences would be used to describe a scene based on eye movement alone. They developed an algorithm that was able to use the eye gazes recorded from the previous experiment to predict the correct sentence from a choice of 576 descriptions.
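The idea of matching a recorded gaze sequence against candidate sentences can be sketched in a few lines. The object labels, candidate sentences, and scoring rule below are illustrative assumptions, not the authors’ actual algorithm: each candidate is scored by how closely its noun order agrees with the order in which the viewer’s gaze settled on the corresponding objects.

```python
# Hypothetical sketch: rank candidate sentences by how well each one's
# noun order matches the recorded gaze order over objects in the scene.

def kendall_tau(order_a, order_b):
    """Rank agreement between two orderings of shared items (-1..1)."""
    common = [x for x in order_a if x in order_b]
    n = len(common)
    if n < 2:
        return 0.0
    pos = {item: i for i, item in enumerate(order_b)}
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            if pos[common[i]] < pos[common[j]]:
                concordant += 1
            else:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

def predict_sentence(gaze_sequence, candidates):
    """Pick the candidate whose noun order best matches the gaze order."""
    return max(candidates, key=lambda nouns: kendall_tau(gaze_sequence, nouns))

# Gaze settled on the man, then the reception desk, then the suitcase.
gaze = ["man", "reception", "suitcase"]
candidates = [
    ["suitcase", "floor"],             # "the suitcase is on the floor"
    ["man", "reception", "hotel"],     # "the man is standing in the reception..."
    ["reception", "man"],
]
best = predict_sentence(gaze, candidates)  # noun order matching the gaze wins
```

In the actual study the choice was among 576 full descriptions, and the scoring presumably incorporated richer timing and fixation features than a simple rank correlation.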
Changsong Liu of Michigan State University’s Language and Interaction Research lab, in East Lansing, who was not involved in the study, suggests these results could motivate novel designs for human-machine interfaces that take advantage of visual cues to improve speech recognition software.
Gaze information is already used to help with disambiguation. For example, if a speech recognition system can tell that you are looking at a tree, it is less likely to guess that you just said “three”. Sentence prediction, perhaps in combination with augmented-reality headsets that track eye movement, is one possible application.
Coco and Keller are now looking into the role of coordinated visual and linguistic processes in conversations between two people. “People engaged in a dialogue use similar syntactic forms, expressions and eye-movements,” says Coco. One hypothesis is that such “coordinative mimicry” might be important for joint decision-making.
Source: NewScientist
Toronto, Canada – Neuroscientists have found strong evidence that vivid memory and directly experiencing the real moment can trigger similar brain activation patterns.
The study, led by Baycrest’s Rotman Research Institute (RRI), in collaboration with the University of Texas at Dallas, is one of the most ambitious and complex yet for elucidating the brain’s ability to evoke a memory by reactivating the parts of the brain that were engaged during the original perceptual experience. Researchers found that vivid memory and real perceptual experience share “striking” similarities at the neural level, although they are not “pixel-perfect” brain pattern replications.
The study appears online this month in the Journal of Cognitive Neuroscience, ahead of print publication.
"When we mentally replay an episode we’ve experienced, it can feel like we are transported back in time and re-living that moment again," said Dr. Brad Buchsbaum, lead investigator and scientist with Baycrest’s RRI. "Our study has confirmed that complex, multi-featured memory involves a partial reinstatement of the whole pattern of brain activity that is evoked during initial perception of the experience. This helps to explain why vivid memory can feel so real."
But vivid memory rarely fools us into believing we are in the real, external world – and that in itself offers a very powerful clue that the two cognitive operations don’t work exactly the same way in the brain, he explained.
In the study, Dr. Buchsbaum’s team used functional magnetic resonance imaging (fMRI), a powerful brain scanning technology that constructs computerized images of brain areas that are active when a person is performing a specific cognitive task. A group of 20 healthy adults (aged 18 to 36) were scanned while they watched 12 video clips, each nine seconds long, sourced from YouTube.com and Vimeo.com. The clips contained a diversity of content – such as music, faces, human emotion, animals, and outdoor scenery. Participants were instructed to pay close attention to each of the videos (which were repeated 27 times) and informed they would be tested on the content of the videos after the scan.
A subset of nine participants from the original group were then selected to complete intensive and structured memory training over several weeks that required practicing over and over again the mental replaying of videos they had watched from the first session. After the training, this group was scanned again as they mentally replayed each video clip. To trigger their memory for a particular clip, they were trained to associate a particular symbolic cue with each one. Following each mental replay, participants would push a button indicating on a scale of 1 to 4 (1 = poor memory, 4 = excellent memory) how well they thought they had recalled a particular clip.
Dr. Buchsbaum’s team found “clear evidence” that patterns of distributed brain activation during vivid memory mimicked the patterns evoked during sensory perception when the videos were viewed – by a correspondence of 91% after a principal components analysis of all the fMRI imaging data.
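The core of such an analysis is quantifying how closely a recall-evoked activation pattern reinstates the perception-evoked one. A minimal sketch, using made-up voxel values and a plain Pearson correlation as the similarity measure (the study’s actual pipeline involved principal components analysis over the full fMRI data):

```python
# Illustrative sketch, not the study's pipeline: score pattern reinstatement
# as the correlation between a "perception" and a "recall" voxel vector.
import math

def pearson(x, y):
    """Pearson correlation between two equal-length vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Fabricated voxel activations for one brain region.
perception = [0.8, 1.2, -0.3, 0.5, 2.0, -1.1]   # while watching the clip
recall     = [0.7, 1.0, -0.2, 0.6, 1.8, -0.9]   # noisier echo during replay
similarity = pearson(perception, recall)         # high, but not pixel-perfect
```

A value near but below 1.0 captures the article’s point: vivid recall partially reinstates the perceptual pattern without replicating it exactly.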
The so-called “hot spots”, or largest pattern similarity, occurred in sensory and motor association areas of the cerebral cortex, regions that play a key role in memory, attention, perceptual awareness, thought, language and consciousness.
Dr. Buchsbaum suggested the imaging analysis used in his study could potentially add to the current battery of memory assessment tools available to clinicians. Brain activation patterns from fMRI data could offer an objective way of quantifying whether a patient’s self-report of their memory as “being good or vivid” is accurate or not.
Source: EurekAlert!
July 23, 2012
Ever wonder how the human brain, which is constantly bombarded with millions of pieces of visual information, can filter out what’s unimportant and focus on what’s most useful?

The process is known as selective attention and scientists have long debated how it works. But now, researchers at Wake Forest Baptist Medical Center have discovered an important clue. Evidence from an animal study, published in the July 22 online edition of the journal Nature Neuroscience, shows that the prefrontal cortex is involved in a previously unknown way.
Two types of attention are utilized in the selective attention process – bottom up and top down. Bottom-up attention is automatically guided to images that stand out from a background by virtue of color, shape or motion, such as a billboard on a highway. Top-down attention occurs when one’s focus is consciously shifted to look for a known target in a visual scene, as when searching for a relative in a crowd.
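Bottom-up pop-out can be illustrated with a toy model: an item is salient in proportion to how different its feature value is from everything else in the display. The feature encoding and distance measure below are illustrative assumptions, not the study’s stimulus code:

```python
# Toy sketch of bottom-up "pop-out" saliency: score each item by its mean
# feature distance from all other items in the display.

def saliency(colors):
    """Per-item saliency: mean absolute feature distance to other items."""
    n = len(colors)
    return [sum(abs(c - other) for other in colors) / (n - 1) for c in colors]

# Display: mostly green items (coded 0) with one red item (coded 1) at index 3,
# like a red circle surrounded by green.
display = [0, 0, 0, 1, 0, 0]
scores = saliency(display)
pop_out_index = scores.index(max(scores))  # the odd item out wins attention
```

Top-down attention, by contrast, would bias these scores toward a sought-for target feature rather than letting the display statistics alone decide.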
Traditionally, scientists have believed that separate areas of the brain controlled these two processes, with bottom-up attention occurring in the posterior parietal cortex and top-down attention occurring in the prefrontal cortex.
"Our findings provide insights on the neural mechanisms behind the guidance of attention," said Christos Constantinidis, Ph.D., associate professor of neurobiology and anatomy at Wake Forest Baptist and senior author of the study. "This has implications for conditions such as attention deficit hyperactivity disorder (ADHD), which affects millions of people worldwide. People with ADHD have difficulty filtering information and focusing attention. Our findings suggest that both the ability to focus attention intentionally and shifting attention to eye-catching but sometimes unimportant stimuli depend on the prefrontal cortex."
In the Wake Forest Baptist study, two monkeys were trained to detect images on a computer screen while activity in both areas of the brain was recorded. The visual display was designed to let one image “pop out” due to its color difference from the background, such as a red circle surrounded by green. To trigger bottom-up attention, neither the identity nor the location of the pop-out image could be predicted before it appeared. The monkeys indicated that they detected the pop-out image by pushing a lever.
The neural activity associated with identifying the pop-out images occurred in the prefrontal cortex at the same time as in the posterior parietal cortex. This unexpected finding indicates early involvement of the prefrontal cortex in bottom-up attention, in addition to its known role in top-down attention, and provides new insights into the neural mechanisms of attention.
"We hope that our findings will guide future work targeting attention deficits," Constantinidis said.
Provided by Wake Forest University Baptist Medical Center
Source: medicalxpress.com
23 July 2012 by Caroline Williams
The brainiest creatures share a secret – an odd kind of brain cell involved in emotions and empathy that may have accidentally made us conscious

The consciousness connection (Image: Jonathon Burton)
THE origin of consciousness has to be one of the biggest mysteries of all time, occupying philosophers and scientists for generations. So it is strange to think that a little-known neuroscientist called Constantin von Economo might have unearthed an important clue nearly 90 years ago.
When he peered down the lens of his microscope in 1926, von Economo saw a handful of brain cells that were long, spindly and much larger than those around them. In fact, they looked so out of place that at first he thought they were a sign of some kind of disease. But the more brains he looked at, the more of these peculiar cells he found - and always in the same two small areas that evolved to process smells and flavours.
Von Economo briefly pondered what these “rod and corkscrew cells”, as he called them, might be doing, but without the technology to delve much deeper he soon moved on to more promising lines of enquiry.
Little more was said about these neurons until nearly 80 years later, when Esther Nimchinsky and Patrick Hof at Mount Sinai University in New York also stumbled across clusters of these strange-looking neurons. Now, after more than a decade of functional imaging and post-mortem studies, we are beginning to piece together their story. Certain lines of evidence hint that they may help build the rich inner life we call consciousness, including emotions, our sense of self, empathy and our ability to navigate social relationships.
Many other big-brained, social animals also seem to share these cells, in the same spots as in the human brain. A greater understanding of the way these paths converged could therefore tell us much about the evolution of the mind.
Admittedly, to the untrained eye these giant brain cells, now known as von Economo neurons (VENs), don’t look particularly exciting. But to a neuroscientist they stand out like a sore thumb. For one thing, VENs are at least 50 per cent, and sometimes up to 200 per cent, larger than typical human neurons. And while most neurons have a pyramid-shaped body with a finely branched tree of connections called dendrites at each end of the cell, VENs have a longer, spindly cell body with a single projection at each end with very few branches.

Perhaps they escaped attention for so long because they are so rare, making up just 1 per cent of the neurons in the two small areas of the human brain: the anterior cingulate cortex (ACC) and the fronto-insular (FI) cortex.
Their location in those regions suggests that VENs may be a central part of our mental machinery, since the ACC and FI are heavily involved in many of the more advanced aspects of our inner lives. Both areas kick into action when we see socially relevant cues, be it a frowning face, a grimace of pain or simply the voice of someone we love. When a mother hears a baby crying, both regions respond strongly. They also light up when we experience emotions such as love, lust, anger and grief. For John Allman, a neuroanatomist at the California Institute of Technology in Pasadena, this adds up to a kind of “social monitoring network” that keeps track of social cues and allows us to alter our behaviour accordingly (Annals of the New York Academy of Sciences, vol 1225, p 59).
The two brain areas also seem to play a key role in the “salience” network, which keeps a subconscious tally of what is going on around us and directs our attention to the most pressing events, as well as monitoring sensations from the body to detect any changes (Brain Structure and Function, DOI: 10.1007/s00429-012-0382-9).
What’s more, both regions are active when a person recognises their reflection in the mirror, suggesting that these parts of the brain underlie our sense of self - a key component of consciousness. “It is the sense of self at every possible level - so the sense of identity, this is me, and the sense of identity of others and how you understand others. That goes to the concept of empathy and theory of mind,” says Hof.
To Bud Craig, a neuroanatomist at Barrow Neurological Institute in Phoenix, Arizona, it all amounts to a continually updated sense of “how I feel now”: the ACC and FI take inputs from the body and tie them together with social cues, thoughts and emotions to quickly and efficiently alter our behaviour (Nature Reviews Neuroscience, vol 10, p 59).
This constantly shifting picture of how we feel may contribute to the way we perceive the passage of time. When something emotionally important is happening, Craig proposes, there is more to process, and because of this time seems to speed up. Conversely, when less is going on we update our view of the world less frequently, so time seems to pass more slowly.
VENs are probably important in all this, though we can only infer their role through circumstantial evidence. That’s because locating these cells, and then measuring their activity in a living brain hasn’t yet been possible. But their unusual appearance is a signal that they probably aren’t just sitting there doing nothing. “They stand out anatomically,” says Allman, “And a general proposition is that anything that’s so distinctive looking must have a distinct function.”
Fast thinking
In the brain, big usually means fast, so Allman suggests that VENs could be acting as a fast relay system - a kind of social superhighway - which allows the gist of the situation to move quickly through the brain, enabling us to react intuitively on the hop, a crucial survival skill in a social species like ours. “That’s what all of civilisation is based on: our ability to communicate socially, efficiently,” adds Craig.
A particularly distressing form of dementia that can strike people as early as their 30s supports this idea. People who develop fronto-temporal dementia lose large numbers of VENs in the ACC and FI early in the disease, when the main symptom is a complete loss of social awareness, empathy and self-control. “They don’t have normal empathic responses to situations that would normally make you disgusted or sad,” says Hof. “You can show them horrible pictures of an accident and they just don’t blink. They will say ‘oh, yes, it’s an accident’.”
Post-mortem examinations of the brains of people with autism also bolster the idea that VENs lie at the heart of our emotions and empathy. According to one recent study, people with autism may fall into two groups: some have too few VENs, perhaps meaning that they don’t have the necessary wiring to process social cues, while others have far too many (Acta Neuropathologica, vol 118, p 673). The latter group would seem to fit with one recent theory of autism, which proposes that the symptoms may arise from an over-wiring of the brain. Perhaps having too many VENs makes emotional systems fire too intensely, causing people with autism to feel overwhelmed, as many say they do.
Another recent study found that people with schizophrenia who committed suicide had significantly more VENs in their ACC than schizophrenics who died of other causes. The researchers suggest that the over-abundance of VENs might create an overactive emotional system that leaves them prone to negative self-assessment and feelings of guilt and hopelessness (PLoS One, vol 6, p e20936).
VENs in other animals provide some clues, too. When these neurons were first identified, there was the glimmer of hope that we might have found one of the key evolutionary changes, unique to humankind, that could explain our social intelligence. But the earliest studies put paid to that kind of thinking, when VENs turned up in chimpanzees and gorillas. In recent years, they have also been found in elephants and some whales and dolphins.
Like us, many of these species live in big social groups and show signs of the same kind of advanced behaviour associated with VENs in people. Elephants, for instance, display something that looks a lot like empathy: they work together to help injured, lost or trapped elephants, for example. They even seem to show signs of grief at elephant “graveyards” (Biology Letters, vol 2, p 26). What’s more, many of these species can recognise themselves in the mirror, which is usually taken as a rudimentary measure of consciousness. When researchers daub paint on an elephant’s face, for instance, it will notice the mark in the mirror and try to feel the spot with its trunk. This has led Allman and others to speculate that von Economo neurons might be a vital adaptation in large brains for keeping track of social situations - and that the sense of self may be a consequence of this ability.
Yet VENs also crop up in manatees, hippos and giraffes - not renowned for their busy social lives. The cells have also been spotted in macaques, which don’t reliably pass the mirror test, although they are social animals. Although this seems to put a major spanner in the works for those who claim that the cells are crucial for advanced cognition, it could also be that these creatures are showing the precursors of the finely tuned cells found in highly social species. “I think that there are homologues of VENs in all mammals,” says Allman. “That’s not to say they’re shaped the same way but they are located in an analogous bit of cortex and they are expressing the same genes.”
It would make sense, after all, that whales and primates might both have recycled, and refined, older machinery present in a common ancestor rather than independently evolving the same mechanism. Much more research is needed, however, to work out the anatomical differences and the functions of these cells in the different animals.
That work might even help us understand how these neurons evolved in the first place. Allman already has some ideas about where they came from. Our VENs reside in a region of the brain that evolved to integrate taste and smell, so he suggests that many of the traits now associated with the FI evolved from the simple act of deciding whether food is good to eat or likely to make you ill. When reaching that decision, he says, the quicker the “gut” reaction kicks in the better. And if you can detect this process in others, so much the better.
"One of the important functions that seems to reside in the FI has to do with empathy," he says. "My take on this is that empathy arose in the context of shared food - it’s very important to observe if members of your social group are becoming ill as a result of eating something." The basic feeding circuitry, including the rudimentary VENs, may then have been co-opted by some species to work in other situations that involve a decision, like working out if a person is trustworthy or to be avoided. "So when we have a feeling, whether it be about a foodstuff or situation or another person, I think that engages the circuitry in the fronto-insular cortex and the VENs are one of the outputs of that circuitry," says Allman.
Allman’s genetics work suggests he may be on to something. His team found that VENs in one part of the FI are expressing the genes for hormones that regulate appetite. There are also a lot of studies showing links between smell and taste and the feelings of strong emotions. Our physical reaction to something we find morally disgusting, for example, is more or less identical to our reaction to a bitter taste, suggesting they may share common brain wiring (Science, vol 323, p 1222). Other work has shown that judging a morally questionable act, such as theft, while smelling something disgusting leads to harsher moral judgements (Personality and Social Psychology Bulletin, vol 34, p 1096). What’s more, Allman points out that our language is loaded with analogies - we might find an experience “delicious”, say, or a person “nauseating”. This is no accident, he says.
Red herring
However, it is only in highly social animals that VENs live exclusively in the scent and taste regions. In others, like giraffes and hippos, VENs seem to be sprinkled all over the brain. Allman, however, points out that these findings may be a red herring, since without understanding the genes they express, or their function, we can’t even be sure how closely these cells relate to human VENs. They may even be a different kind of cell that just looks similar.
Based on the evidence so far, however, Hof thinks that the ancestral VENs would have been more widespread, as seen in the hippo brain, and that over the course of evolution they then migrated to the ACC and FI in some animals, but not others - though he admits to having no idea why that might be. He suspects the pressures that shaped the primate brain may have been very different to those that drove the evolution of whales and dolphins.
Craig has hit upon one possibility that would seem to fit all of these big-brained animals. He points out that the bigger the brain, the more energy it takes to run, so it is crucial that it operates as efficiently as possible. A system that continually monitors the environment and the people or animals in it would therefore be an asset, allowing you to adapt quickly to a situation to save as much energy as possible. “Evolution produced an energy calculation system that incorporated not just the sensory inputs from the body but the sensory inputs from the brain,” Craig says. And the fact that we are constantly updating this picture of “how I feel now” has an interesting and very useful by-product: we have a concept that there is an “I” to do the feeling. “Evolution produced a very efficient moment-by-moment calculation of energy utilisation and that had an epiphenomenon, a by-product that provided a subjective representation of my feelings.”
If he’s right - and there is a long way to go before we can be sure - it raises a very humbling possibility: that far from being the pinnacle of brain evolution, consciousness might have been a big, and very successful accident.
Source: NewScientist
19 July 2012 by Nicola Guttridge
Whether a tree branch or a computer mouse is the target, reaching for objects is fundamental primate behaviour. Neurons in the brain prepare for such movements, and this neural activity can now be deciphered, allowing researchers to predict what movements will occur. This discovery could help us develop prosthetic limbs that can be controlled by thought alone.

What happens next? (Image: Gallo Images/Rex Features)
To find out what goes on in the brain when we reach for things, biomedical engineers Daniel Moran and Thomas Pearce at Washington University in St Louis, Missouri, trained two rhesus macaques to participate in a series of exercises. When the monkeys reached for items, electrodes measured the activity of neurons in their dorsal premotor cortex, a region of the brain that is involved in the planning of movement.
The monkeys were trained to reach for a virtual object on a screen to receive a reward. In some tasks the monkeys had to reach directly for an object, in others they had to reach around an obstacle to get to the target.
Impulsive grab
Moran and Pearce managed to identify the neural activity corresponding to several aspects of the planned movement, such as the angle of reach, hand position and the final target location.
The findings could one day allow the design of prosthetic limbs that can be controlled with thought alone, which is “one of the reasons we did the study”, says Moran.
"The two subjects actually used different strategies to perform the task, and we were able to see this in their neural activity," Moran says. One monkey waited to receive all the information before reaching, but the other reached immediately, even though there was a good chance that an obstacle might appear and the reaching action would need to be rethought.
"If the decoding strategy is a robust finding, then this has wider consequences concerning mind-reading – particularly if we can get equivalent results for more complex strategic differences at higher cognitive levels," says Richard Cooper, a cognitive researcher at Birkbeck, University of London. "However, this is all very speculative."
Source: NewScientist
ScienceDaily (July 20, 2012) — Conditions such as Parkinson’s disease are a result of pathogenic changes to proteins. In the neurodegenerative condition of Parkinson’s disease, which is currently incurable, the alpha-synuclein protein changes and becomes pathological. Until now, there have not been any antibodies that could help to demonstrate the change in alpha-synuclein associated with the disease. An international team of experts led by Gabor G. Kovacs from the Clinical Institute of Neurology at the MedUni Vienna has now discovered a new antibody that actually possesses this ability.
"It opens up new possibilities for the development of a diagnostic test for Parkinsonism," says Kovacs, highlighting the importance of this discovery. "This new antibody will enable us to find the pathological conformation in bodily fluids such as blood or CSF." A clinical study involving around 200 patients is already underway, and the first definitive results are expected at the end of 2012. The tests being carried out in collaboration with the University Department of Neurology, led by Walter Pirker, are designed to determine the extent to which the new antibody can be used as an early diagnostic tool in order to understand the condition better and be able to treat it more effectively.
A step towards a blood test for Parkinson’s
With Parkinsonism, the diseased form of alpha-synuclein, which has the same primary structure as the healthy form, undergoes an “abnormal fold.” Says Kovacs: “Until now, however, it was not possible to distinguish between the two.” The previous immunodiagnostic techniques only allowed the general presence of alpha-synuclein to be confirmed. The new, monoclonal antibody, however, which the researchers at the MedUni Vienna have developed in collaboration with the German biotech firm Roboscreen, is now able to detect a strategic part of the protein responsible for the structural changes. The results of the study have now been published in the journal Acta Neuropathologica.
Says Kovacs: “It is still not possible to say whether or not we will be able to diagnose Parkinson’s from a blood test, but this discovery certainly represents a major step in that direction.” Theoretically, it should be possible to diagnose Parkinson’s disease five to eight years before it develops.
In Austria, there are between 15,000 and 16,000 people living with Parkinson’s syndrome, and its frequency increases with age. As the population ages, Parkinson’s disease, a degenerative condition of the brain, will become an increasingly widespread problem.
Source: Science Daily
ScienceDaily (July 20, 2012) — Scientists at the University of Manchester have uncovered how the internal mechanisms in nerve cells wire the brain. The findings open up new avenues in the investigation of neurodegenerative diseases by analysing the cellular processes underlying these conditions.

Illustration of spectraplakins in axonal growth organising microtubules. (Credit: Image courtesy of University of Manchester)
Dr Andreas Prokop and his team at the Faculty of Life Sciences have been studying the growth of axons, the thin cable-like extensions of nerve cells that wire the brain. If axons don’t develop properly this can lead to birth disorders, mental and physical impairments and the gradual decay of brain capacity during aging.
Axon growth is directed by the hand-shaped growth cone, which sits at the tip of the axon. It is well documented how growth cones perceive signals from the outside to follow pathways to specific targets, but very little is known about the internal machinery that dictates their behaviour.
Dr Prokop has been studying the key driver of growth cone movements, the cytoskeleton. The cytoskeleton helps to maintain a cell’s shape and is made up of the protein filaments, actin and microtubules. Microtubules are the key driving force of axon growth whilst actin helps to regulate the direction the axon grows.
Dr Prokop and his team used fruit flies to analyse how actin and microtubule proteins combine in the cytoskeleton to coordinate axon growth. They focussed on the multifunctional proteins called spectraplakins which are essential for axonal growth and have known roles in neurodegeneration and wound healing of the skin.
What the team demonstrate in this recent paper is that spectraplakins link microtubules to actin, helping the microtubules extend in the direction the axon is growing. If this link is missing, microtubule networks show disorganised, criss-crossed arrangements instead of parallel bundles, and axon growth is hampered.
By understanding the molecular detail of these interactions the team made a second important finding. Spectraplakins collect not only at the tip of microtubules but also along the shaft, which helps to stabilise them and ensure they act as a stable structure within the axon.
This additional function of spectraplakins relates them to a class of microtubule-binding proteins that includes Tau. Tau is an important player in neurodegenerative diseases, such as Alzheimer’s, that are still little understood. In support of the authors’ findings, another publication has just shown that the human spectraplakin Dystonin causes neurodegeneration when its linkage to microtubules is affected.
Talking about his research, Dr Prokop said: “Understanding cytoskeletal machinery at the cell level is a holy grail of current cell research that will have powerful clinical applications. The cytoskeleton is crucially involved in virtually all aspects of a cell’s life, including cell shape changes, cell division, cell movement, contacts and signalling between cells, and dynamic transport events within cells. Accordingly, the cytoskeleton lies at the root of many brain disorders. Therefore, deciphering the principles of cytoskeletal machinery during the fundamental process of axon growth will essentially help research into the causes of a broad spectrum of diseases. Spectraplakins lie at the heart of this machinery and our research opens up new avenues for its investigation.”
What Dr Prokop’s paper in the Journal of Neuroscience also demonstrates is the success of the research technique using the fruit fly Drosophila. The team was able to replicate its findings regarding axon growth in mice, which in turn means the findings can be translated to humans.
Dr Prokop points out that fruit flies provide an ideal means to make sense of these findings and will essentially help to unravel the many mysteries of neurodegeneration.
Dr Prokop continues: “Understanding how spectraplakins perform their cellular functions has important implications for basic as well as biomedical research. Thus, besides their roles during axon growth, spectraplakins of mice and humans are clinically important for a number of conditions and processes including skin blistering, neuro-degeneration, wound healing, synapse formation and neuron migration during brain development. Understanding spectraplakins in one biological process will instruct research on the other clinically relevant roles of these proteins.”
Source: Science Daily
ScienceDaily (July 19, 2012) — While clinical trial results are being released regarding drugs intended to decrease amyloid production — thought to contribute to decline in Alzheimer’s disease — clinical trials of drugs targeting other disease proteins, such as tau, are in their initial phases.
Penn Medicine research presented July 19 at the 2012 Alzheimer’s Association International Conference (AAIC) shows that an anti-tau treatment called epothilone D (EpoD) was effective in preventing and slowing the progression of Alzheimer’s disease in animal models, improving neuron function and cognition as well as decreasing tau pathology.
By targeting tau, the drug aims to stabilize microtubules, which provide structural support and transport essential nutrients and information within cells. When tau malfunctions, microtubules break down and tau accumulates into tangles.
"This drug effectively hits a tau target by correcting tau loss of function, thereby stabilizing microtubules and offsetting the loss of tau due to its formation into neurofibrillary tangles in animal models, which suggests that this could be an important option to mediate tau function in Alzheimer’s and other tau-based neurodegenerative diseases," said John Trojanowski, MD, PhD, professor of Pathology and Laboratory Medicine in the Perelman School of Medicine at the University of Pennsylvania. "In addition to drugs targeting amyloid, which may not work in advanced Alzheimer’s disease, our hope is that this and other anti-tau drugs can be tested in people with Alzheimer’s disease to determine whether stabilizing microtubules damaged by malfunctioning tau protein may improve clinical and pathological outcomes."
The drug, identified through Penn’s Center for Neurodegenerative Disease Research (CNDR) Drug Discovery Program, was previously shown to prevent further neurological damage and improve cognitive performance in animal models. The Penn research team includes senior investigator Bin Zhang, MD, and Kurt Brunden, PhD, director of Drug Discovery at CNDR.
Bristol-Myers Squibb, which developed and owns the rights to the drug, has started enrolling patients in a phase I clinical trial in people with mild Alzheimer’s disease.
Source: Science Daily
July 19, 2012
Korean scientists have used tiny stars, squares and triangles as a toolkit to create live neural circuits in a dish.
They hope the shapes can be used to create a reproducible neural circuit model that could be used for learning and memory studies as well as drug screening applications; the shapes could also be integrated into the latest neural tissue scaffolds to aid the regeneration of neurons at injured sites in the body, such as the spinal cord.
Published today in the Journal of Neural Engineering, the study, by researchers at the Korea Advanced Institute of Science and Technology (KAIST), found that triangles were the most effective shape for helping to facilitate the growth of axons and guide them onto specific paths to form a complete circuit.
Co-author of the study, Professor Yoonkey Nam, said: “Eventually, we want to know if we can design a neural tissue model that biologically mimics some neural circuits in our brain.”
A neuron is an electrically excitable cell that processes and transmits information around the body. The neuron is composed of three main parts: a cell body, or soma, dendrites and an axon, which extends from the soma and links to other cells, creating a network.
When axons grow they are usually guided by proteins. Many researchers have been trying to re-create this key process in a dish by manipulating nerve cells from rat brains.
As nerve cells are usually just a few tens of micrometres in size, the challenge associated with creating a live neural network is firstly positioning cells in desired locations and, secondly, making connections between these cells by guiding the axons in designated directions.
The researchers investigated whether two star shapes, five regular shapes (square, circle, triangle, pentagon and hexagon) and three different sizes of isosceles triangles could guide axons in designated directions. Each shape was the size of a single cell and was replicated to form an array which was printed onto a glass surface.
Each of the arrays had an overall size of 1cm-by-1cm with a gap of 10 micrometres between each shape. Hippocampal neurons were taken from rats and plated onto the patterned surfaces. The neurons were fluorescently labelled with dyes so that images could be taken of their growth.
The researchers found that triangles were the most efficient shape for encouraging the growth and guidance of an axon. The key to this was the angle at the vertices, the points where two of the triangle’s sides meet: the smaller the vertex angle, the higher the chance the triangle had of inducing growth.
"Based on our results, we are suggesting a new design principle for guiding axons in a dish. We can control the axonal growth in a certain direction by putting a sharp triangle pointing to a certain direction. Then, a neuron that adhered to the triangle will have an axon in the sharp vertex direction.
"Overall, we integrated microtechnology with neurobiology to find a new engineering solution," continued Professor Nam.
Provided by Institute of Physics
Source: medicalxpress.com
ScienceDaily (July 19, 2012) — A joint study carried out by The University of Nottingham and the multinational food company Unilever has found for the first time that fat in food can reduce activity in several areas of the brain which are responsible for processing taste, aroma and reward.
The research, now available in the Springer journal Chemosensory Perception, provides the food industry with better understanding of how in the future it might be able to make healthier, less fatty food products without negatively affecting their overall taste and enjoyment. Unveiled in 2010, Unilever’s Sustainable Living Plan sets out its ambition to help hundreds of millions of people improve their diet around the world within a decade.
This fascinating three-year study investigated how the brains of a group of participants in their 20s would respond to changes in the fat content of four different fruit emulsions they tasted while under an MRI scanner. All four samples were of the same thickness and sweetness, but one contained flavour with no fat, while the other three contained fat with different flavour release properties.
The research found that the areas of the participants’ brains which are responsible for the perception of flavour — such as the somatosensory cortices and the anterior, mid and posterior insula — were significantly more activated when the non-fatty sample was tasted than when the fatty emulsions were, even though participants perceived the flavours as the same. It is important to note that increased activation in these brain areas does not necessarily result in increased perception of flavour or reward.
Dr Joanne Hort, Associate Professor in Sensory Science at The University of Nottingham said: “This is the first brain study to assess the effect of fat on the processing of flavour perception and it raises questions as to why fat emulsions suppress the cortical response in brain areas linked to the processing of flavour and reward. It also remains to be determined what the implications of this suppressive effect are on feelings of hunger, satiety and reward.”
Unilever food scientist Johanneke Busch, based at the company’s Research & Development laboratories in Vlaardingen, Netherlands added: “There is more to people’s enjoyment of food than the product’s flavour — like its mouthfeel, its texture and whether it satisfies hunger, so this is a very important building block for us to better understand how to innovate and manufacture healthier food products which people want to buy.”
Source: Science Daily
July 19, 2012 By Emily Martinez
(Medical Xpress) — UT Dallas researchers recently demonstrated how nerve stimulation paired with specific experiences, such as movements or sounds, can reorganize the brain. This technology could lead to new treatments for stroke, tinnitus, autism and other disorders.

Dr. Michael Kilgard helped lead a team that paired vagus nerve stimulation with physical movement to improve brain function.
In a related paper, UT Dallas neuroscientists showed that they could alter the speed at which the brain works in laboratory animals by pairing stimulation of the vagus nerve with fast or slow sounds.
A team led by Dr. Robert Rennaker and Dr. Michael Kilgard looked at whether repeatedly pairing vagus nerve stimulation with a specific movement would change neural activity within the laboratory rats’ primary motor cortex. To test the hypothesis, they paired the vagus nerve stimulation with movements of the forelimb in two groups of rats. The results were published in a recent issue of Cerebral Cortex.
After five days of stimulation and movement pairing, the researchers examined the brain activity in response to the stimulation. The rats who received the training along with the stimulation displayed large changes in the organization of the brain’s movement control system. The animals receiving identical motor training without stimulation pairing did not exhibit any brain changes, or plasticity.
People who suffer strokes or brain trauma often undergo rehabilitation that includes repeated movement of the affected limb in an effort to regain motor skills. It is believed that repeated use of the affected limb causes reorganization of the brain essential to recovery. The recent study suggests that pairing vagus nerve stimulation with standard therapy may result in more rapid and extensive reorganization of the brain, offering the potential for speeding and improving recovery following stroke, said Rennaker, associate professor in The University of Texas at Dallas’ School of Behavioral and Brain Sciences.
“Our goal is to use the brain’s natural neuromodulatory systems to enhance the effectiveness of standard therapies,” Rennaker said. “Our studies in sensory and motor cortex suggest that the technique has the potential to enhance treatments for neurological conditions ranging from chronic pain to motor disorders. Future studies will investigate its effectiveness in treating cognitive impairments.”
July 19, 2012
(Medical Xpress) — When learning to master complex movements such as those required in surgery, is being physically guided by an expert more effective than learning through trial and error?

Dr. George Van Doorn and a participant in the fMRI
New research by Monash University’s Departments of Psychological Studies and Physiology challenges earlier claims that externally guided (or passive) movement is a superior learning method to self-generated (or active) movement.
In the first study of its kind, researchers discovered that different brain regions become active depending on the type of movement used. Lead researcher Dr. George Van Doorn, head of Psychological Studies, said the findings did not support the view that passive movement was a more effective way to learn.
“There has been much debate over the last 30 years about which form of movement is better,” Dr. Van Doorn said. “We found that active movements result in greater activation in brain areas implicated in higher-order processes such as monitoring and controlling goal-directed behaviour, attention, execution of movements, and error detection.
“Passive movements, in contrast, produced greater activity in areas associated with touch perception, length discrimination, tactile object recognition, and the attenuation of sensory inputs.”
People were tested while making movements themselves, and while being guided.
“Whilst inside a functional Magnetic Resonance Imaging (fMRI) machine, we had people either freely move their index finger around a two-dimensional, raised-line pattern to measure self-generated touch, or we had an experimenter guide the person’s finger around the pattern to measure externally generated touch. Using the fMRI, we found that different brain regions become active depending on the type of movement used,” Dr. Van Doorn said.
Dr. Van Doorn said touch was becoming a popular area of investigation, with more scientists contributing to understanding about this important, though under-acknowledged, sensory system.
All researchers involved in this study are located at Monash University’s Gippsland campus. The study findings were presented at EuroHaptics 2012, a major international conference and the primary European meeting for researchers in the field of human haptic sensing and touch-enabled computer applications.
Provided by Monash University
Source: medicalxpress.com