ScienceDaily (July 24, 2012) — Neuroscientists from Wayne State University and the Massachusetts Institute of Technology (MIT) are taking a deeper look into how the brain mechanisms for memory retrieval differ between adults and children. While the memory systems are the same in many ways, the researchers have learned that crucial functions with relevance to learning and education differ.
The team’s findings were published on July 17, 2012, in the Journal of Neuroscience.
According to lead author Noa Ofen, Ph.D., assistant professor in WSU's Institute of Gerontology and Department of Pediatrics, cognitive ability, including the ability to learn and remember new information, changes dramatically between childhood and adulthood. These changes parallel the dramatic changes that occur in the structure and function of the brain over the same period.
In the study, “The Development of Brain Systems Associated with Successful Memory Retrieval of Scenes,” Ofen and her collaborative team tested the development of neural underpinnings of memory from childhood to young adulthood. The team of researchers exposed participants to pictures of scenes and then showed them the same scenes mixed with new ones and asked them to judge whether each picture was presented earlier. Participants made retrieval judgments while researchers collected images of their brains with magnetic resonance imaging (MRI).
Using this method, the researchers were able to see how the brain remembers. “Our results suggest that cortical regions related to attentional or strategic control show the greatest developmental changes for memory retrieval,” said Ofen.
The researchers said that older participants used the cortical regions more than younger participants when correctly retrieving past experiences.
"We were interested to see whether there are changes in the connectivity of regions in the brain that support memory retrieval," Ofen added. "We found changes in connectivity of memory-related regions. In particular, the developmental change in connectivity between regions was profound even without a developmental change in the recruitment of those regions, suggesting that functional brain connectivity is an important aspect of developmental changes in the brain."
This study marks the first time that the development of connectivity within memory systems in the brain has been tested, and the results suggest that the brain continues to rearrange connections to achieve adult-like performance during development.
Ofen and her research team plan to continue research in this area, focused on modeling brain network connectivity, and applying these methods to study abnormal brain development.
Source: Science Daily
July 23, 2012
Mice appear to have a specialized system for detecting and at least initially processing instinctually important smells such as those that denote predators. The finding raises a question about whether their response to those smells is hardwired.

A separate subsystem for the smell of fear. Experiments in mice suggest neurons that detect odors associated with an instinctive response — like fleeing when an approaching predator is detected — are configured differently than other olfactory neurons. Further research could determine whether this system automatically triggers flight or other primal behaviors. Credit: Mike Cohea/Brown University
PROVIDENCE, R.I. [Brown University] — A new study finds that mice have a distinct neural subsystem that links the nose to the brain and is associated with instinctually important smells such as those emitted by predators. That insight, published online this week in Proceedings of the National Academy of Sciences, prompts the question whether mice and other mammals have specially hardwired neural circuitry to trigger instinctive behavior in response to certain smells.
In the series of experiments and observations described in the paper, the authors found that nerve cells in the nose that express members of the gene family of trace amine-associated receptors (TAAR) have several key biological differences from the much more common and diverse neurons that express members of the olfactory receptor gene family. Those other nerve cells detect a much broader range of smells, said corresponding author Gilad Barnea, the Robert and Nancy Carney Assistant Professor of Neuroscience at Brown University.
The differences between TAAR neurons and olfactory receptor neurons led Barnea and his co-authors to conclude that they form an independent subsystem for certain smells.
“Our observations suggest that the TAAR-expressing sensory neurons constitute a distinct olfactory subsystem that extracts specific environmental cues that then elicit innate responses,” Barnea said.
ScienceDaily (July 23, 2012) — Stroboscopic training, performing a physical activity while using eyewear that simulates a strobe-like experience, has been found to increase visual short-term memory retention, and the effects lasted 24 hours.

(Credit: Image courtesy of Duke University)
Participants completed a memory test that required them to note the identity of eight letters of the alphabet that were briefly displayed on a computer screen. After a variable delay, participants were asked to recall one of the eight letters. On easy-level trials, the recall prompt came immediately after the letters disappeared, but on more difficult trials, the prompt came as late as 2.5 seconds following the display. Because participants did not know which letter they would be asked to recall, they had to retain all of the items in memory.
"Humans have a memory buffer in their brain that keeps information alive for a certain short-lived period," said Greg Appelbaum, assistant professor of psychiatry at Duke University and first author of the study. "Wearing the strobe eyewear during the physical training seemed to boost the ability to retain information in this buffer."
The strobe eyewear disrupts vision by only allowing the user to see glimpses of the world. The user must adjust their visual processing in order to perform normally, and this adjustment produces a lingering benefit; once participants removed the strobe eyewear, there was an observed boost in their visual memory retention, which was found to last 24 hours.
Earlier work by Appelbaum and the project’s senior researcher, Stephen Mitroff, had shown that stroboscopic training improves visual perception, including the ability to detect subtle motion cues and the processing of briefly presented visual information. Yet the earlier study had not determined how long the benefits might last.
"Our earlier work on stroboscopic training showed that it can improve perceptual abilities, but we don’t know exactly how," says Mitroff, associate professor of psychology & neuroscience and member of the Duke Institute for Brain Sciences. "This project takes a big step by showing that these improved perceptual abilities are driven, at least in part, by improvements in visual memory."
"Improving human cognition is an important goal with so many benefits," said Appelbaum, also a member of the Duke Institute for Brain Sciences. "Interestingly, our findings demonstrate one way in which visual experience has the capacity to improve cognition."
Source: Science Daily
ScienceDaily (July 23, 2012) — Snack consumption and BMI are linked to both brain activity and self-control, new research has found.

Snack consumption and BMI are linked to both brain activity and self-control, new research has found. (Credit: © farbkombinat / Fotolia)
The research, carried out by academics from the Universities of Exeter, Cardiff, Bristol, and Bangor, discovered that an individual's brain 'reward centre' response to pictures of food predicted how much they subsequently ate. This response had a greater effect on the amount they ate than their conscious feelings of hunger or how much they wanted the food.
A strong brain response was also associated with increased weight (BMI), but only in individuals reporting low levels of self-control on a questionnaire. For those reporting high levels of self-control a stronger brain response to food was actually related to a lower BMI.
This study, which is now published in the journal NeuroImage, adds to mounting evidence that overeating and increased weight are linked, in part, to a region of the brain associated with motivation and reward, called the nucleus accumbens. Responses in this brain region have been shown to predict weight gain in healthy weight and obese individuals, but only now have academics discovered that this is independent of conscious feelings of hunger, and that self-control also plays a key role.
Following these results, academics at the University of Exeter and Cardiff have begun testing ‘brain training’ techniques designed to reduce the influence of food cues on individuals who report low levels of self-control. Similar tests are being used to assist those with gambling or alcohol addiction.
Dr Natalia Lawrence of Psychology at the University of Exeter, lead researcher in both the original research and the new studies, said: “Our research suggests why some individuals are more likely to overeat and put on weight than others when confronted with frequent images of snacks and treats. Food images, such as those used in advertising, cause direct increases in activity in brain ‘reward areas’ in some individuals but not in others. If those sensitive individuals also struggle with self-control, which may be partly innate, they are more likely to be overweight. We are now developing computer programs that we hope will counteract the effects of this high sensitivity to food cues by training the brain to respond less positively to these cues.”
Twenty-five young, healthy females with BMIs ranging from 17 to 30 took part in the study. Female participants were chosen because research shows females typically exhibit stronger responses to food-related cues. Because hormonal changes during the menstrual cycle affect this response, all participants were taking a monophasic combined oral contraceptive pill. Participants had not eaten for at least six hours, to ensure they were hungry at the time of the scan, and were given a bowl containing 150 g (four and a half packets) of potato chips to eat at the end of the study; they were informed afterwards that their potato chip intake had been measured.
Researchers used MRI scanning to detect the participants’ brain activity while they were shown images of household objects, and food that varied in desirability and calorific content. After scanning, participants rated the food images for desirability and rated their levels of hunger and food craving. Results showed that participants’ brain responses to food (relative to objects) in the nucleus accumbens predicted how many potato chips they ate after the scan. However, participants’ own ratings of hunger and how much they liked and wanted the foods, including potato chips, were unrelated to their potato chip intake.
This study was funded by the Wales Institute of Cognitive Neuroscience.
Source: Science Daily
JUL 23, 2012
A new and powerful class of antioxidants could one day be a potent treatment for Parkinson’s disease, researchers report.

Dr. Bobby Thomas
A class of antioxidants called synthetic triterpenoids blocked development of Parkinson’s in an animal model that develops the disease in a handful of days, said Dr. Bobby Thomas, neuroscientist at the Medical College of Georgia at Georgia Health Sciences University and corresponding author of the study in the journal Antioxidants & Redox Signaling.
Thomas and his colleagues were able to block the death of dopamine-producing brain cells that occurs in Parkinson’s by using the drugs to bolster Nrf2, a natural antioxidant and inflammation fighter.
Stressors from head trauma to insecticide exposure to simple aging increase oxidative stress and the body responds with inflammation, part of its natural repair process. “This creates an environment in your brain that is not conducive for normal function,” Thomas said. “You can see the signs of oxidative damage in the brain long before the neurons actually degenerate in Parkinson’s.”
Nrf2, the master regulator of oxidative stress and inflammation, is – inexplicably – significantly decreased early in Parkinson’s. In fact, Nrf2 activity declines normally with age.
“In Parkinson’s patients you can clearly see a significant overload of oxidative stress, which is why we chose this target,” Thomas said. “We used drugs to selectively activate Nrf2.”
They parsed a number of antioxidants already under study for a wide range of diseases from kidney failure to heart disease and diabetes, and found triterpenoids the most effective on Nrf2. Co-author Dr. Michael Sporn, Professor of Pharmacology, Toxicology and Medicine at Dartmouth Medical School, chemically modified the agents so they could permeate the protective blood-brain barrier.
In both human neuroblastoma cells and mouse brain cells, the researchers documented an increase in Nrf2 in response to the synthetic triterpenoids. Human dopaminergic cells are not available for research, so the scientists used human neuroblastoma cells, which are cancer cells with some properties similar to neurons.
Their preliminary evidence indicates the synthetic triterpenoids also increase Nrf2 activity in astrocytes, a brain cell type that nourishes neurons and hauls off some of their garbage. The drugs didn't protect brain cells in an animal in which the Nrf2 gene was deleted, further evidence that Nrf2 is the drugs' target.
The researchers used the powerful neurotoxin MPTP to mimic Parkinson’s-like brain cell damage in a matter of days. They are now looking at the impact of synthetic triterpenoids in an animal model genetically programmed to acquire the disease more slowly, as humans do. Collaborators at Johns Hopkins School of Medicine also will be providing induced pluripotent stem cells, adult stem cells that can be coaxed into forming dopaminergic neurons, for additional drug testing.
Other collaborators include scientists at Weill Medical College of Cornell University, Johns Hopkins School of Public Health, Moscow State University, Tohoku University and the University of Pittsburgh.
Source: EarthSky
ScienceDaily (July 23, 2012) — New research by neuroscientists from the Royal College of Surgeons in Ireland (RCSI), published in Nature Medicine, has identified a new gene involved in epilepsy, a finding that could potentially provide a new treatment option for patients with the condition.
The research focussed on a new class of gene called a ‘microRNA’ which controls protein production inside cells. The research looked in detail at one particular microRNA called ‘microRNA-134’ and found that levels of microRNA-134 are much higher in the part of the brain that causes seizures in patients with epilepsy.
By using a new type of drug-like molecule called an antagomir which locks onto the ‘microRNA-134’ and removes it from the brain cell, the researchers found they could prevent epileptic seizures from occurring.
Professor David Henshall, Department of Physiology & Medical Physics, RCSI and senior author on the paper, said: "We have been looking to find what goes wrong inside brain cells to trigger epilepsy. Our research has discovered a completely new gene linked to epilepsy, and it shows how we can target this gene using drug-like molecules to reduce the brain's susceptibility to seizures and the frequency with which they occur."
Dr Eva Jimenez-Mateos, Department of Physiology & Medical Physics, RCSI and first author on the paper said “Our research found that the antagomir drug protects the brain cells from toxic effects of prolonged seizures and the effects of the treatment can last up to one month.”
Epilepsy affects 37,000 people in Ireland alone. Medication controls seizures in two out of three people with epilepsy, but the remaining one in three continues to have seizures despite being prescribed medication. This study could potentially offer new treatment options for those patients.
The research was supported by a grant from Science Foundation Ireland (SFI). Researchers in the Department of Physiology & Medical Physics and Molecular & Cellular Therapeutics, RCSI, clinicians at Beaumont Hospital and experts in brain structure from the Cajal Institute in Madrid were involved in the study.
Source: Science Daily
JULY 23, 2012
Children with trisomy 13 or 18 are for the most part severely disabled and have a very short life expectancy, yet they and their families lead lives that are happy and rewarding overall, contrary to the usually gloomy predictions made by the medical community at the time of diagnosis, according to a study of parents who are members of support groups, published today in Pediatrics. The study was conducted by Dr. Annie Janvier of the Sainte-Justine University Hospital Center and the University of Montreal, with the special collaboration of Barbara Farlow, Eng, MSc, the mother of a child who died of trisomy 13, as second author.

Source: Wikimedia Commons
The study interviewed 332 parents who live or have lived with 272 children with trisomy 13 or 18. Their experience diverges substantially from what healthcare providers predicted: that their child would be "incompatible with life" (87%), would be "a vegetable" (50%), would lead "a life of suffering" (57%), or would "ruin their family or their life as a couple" (23%).
It should be noted that trisomies 13 and 18 are rare chromosomal disorders, most often diagnosed before birth and sometimes after. Children with these diagnoses generally do not survive beyond their first year of life, and those who do have severe disabilities and short lives. When trisomy 13 or 18 is diagnosed before birth, many parents decide to terminate the pregnancy, while others choose to carry it to term; in such cases miscarriages are common.
As children with trisomies 13 or 18 generally receive palliative care at birth, some parents who opt to continue the pregnancy or desire life-prolonging interventions for their child encounter the prejudices of the medical system. In this regard, the parents interviewed in the study consider that caregivers often view their child in terms of a diagnosis (“a T13”, “a lethal trisomy”) rather than a unique baby.
“Our study points out that physicians and parents can have different views of what constitutes quality of life,” states Dr. Annie Janvier, a neonatologist and co-founder of the Master’s program in Pediatric Clinical Ethics at the University of Montreal. In fact, over 97% of the parents interviewed considered that their child was happy and that his or her presence enriched the life of their family and their life as a couple, regardless of longevity. “In the medical literature on all handicaps, disabled patients – or their families – rated their quality of life as being higher than caregivers did,” adds Dr. Janvier.
Parents who receive a new diagnosis of trisomy 13 and 18 and join a parental support group often acquire a more positive image of these diagnoses than the predictions made by the medical profession. In fact, according to the parents interviewed, belonging to a support group helped them view their experience positively. “Our research reveals that some parents who chose a path to accept and to love a disabled child with a short life expectancy have experienced happiness and enrichment. My hope is that this knowledge improves the ability of physicians to understand, communicate and make decisions with these parents,” concludes Barbara Farlow.
Given the rarity of trisomy 13 or 18 cases (one case out of approximately every 10,500 births), the parents were recruited through online support groups that parents often join after receiving the physicians’ diagnosis. Dr. Annie Janvier and Barbara Farlow sometimes give joint talks on the subject of trisomies 13 and 18.
Source: Université de Montréal
July 23, 2012 By David Orenstein
(Medical Xpress) — This week the Journal of the American Medical Association published a study with unfortunate news for the millions of people who suffer from multiple sclerosis. In the large study, a therapy known as interferon beta failed to stave off the progression of the incurable disease. Albert Lo, associate professor of neurology and epidemiology, comments on what the study means for patients, why it was well-designed, and how a new effort to support research on the disease in Rhode Island could help.
The results of this study with nearly 2,700 participants showed that treatment with interferon beta, which is a major class of disease-modifying therapy for multiple sclerosis, did not prevent progression of disability, which is very disappointing from a therapeutic perspective. Currently, there is no cure for MS, and as a lifelong disorder of the nervous system, MS is characterized by episodic relapses of neurological injury such as weakness or blindness. While in most cases, there is a varying degree of recovery after relapses, over time, disability accumulates. The accumulation of deficits and the loss of physical and mental function is a major concern for people with MS and their clinicians.
Currently, there is no medication on the market that is directed explicitly for neuroprotection and the prevention of disability. Many had hoped that the interferons, along with the other disease-modifying agents (which were developed to reduce relapse rates) would also have a significant effect on protecting patients from MS disability.
Although the results from this study were not what we would have hoped for, they reflect a marked improvement over prior studies, which suffered from known methodological flaws. The new results from the Tremlett group point to the importance of the research methodology used (prospectively collected longitudinal study data) and a well-controlled design – approaches that we are using in our own research at Brown University.
A number of the early studies examining the effect of interferons on disability relied primarily on convenience samples of patients from post-marketing studies. They indicated that interferons were in fact preventing disability. However, convenience samples inherently introduce a number of biases and problems. Dr. Tremlett's results were generated from a more systematic longitudinal study in which such biases and shortcomings can be better addressed, so conclusions and clinical decisions drawn from the results are more reliable. These data will both help in making clinical decisions on treating MS patients during the later course of their disease, when there are virtually no relapses, and help to point more urgently toward the clinical need for an agent to prevent disability.
Provided by Brown University
Source: medicalxpress.com
July 23, 2012
Neural precursor cells (NPC) in the young brain suppress certain brain tumors such as high-grade gliomas, especially glioblastoma (GBM), which are among the most common and most aggressive tumors. Now researchers of the Max Delbrück Center for Molecular Medicine (MDC) Berlin-Buch and Charité – Universitätsmedizin Berlin have deciphered the underlying mechanism of action by which neural precursor cells protect the young brain against these tumors. They found that the NPC release substances that activate TRPV1 ion channels in the tumor cells and subsequently induce the tumor cells to undergo stress-induced cell death (Nature Medicine, doi: 10.1038/nm.2827).
Despite surgery, radiation or chemotherapy or even a combination of all three treatment options, there is currently no cure for glioblastoma. In an earlier study the research group led by Professor Helmut Kettenmann (MDC) showed that neural precursor cells migrate to the glioblastoma cells and attack them. The neural precursor cells release a protein belonging to the family of BMP proteins (bone morphogenetic protein) that directly attacks the tumor stem cells. The current consensus of researchers is that tumor stem cells are the actual cause for continuous tumor self-renewal.
Kristin Stock, Jitender Kumar, Professor Kettenmann (all MDC), Dr. Michael Synowitz (MDC and Charité), Professor Rainer Glass (Munich University Hospitals, formerly MDC) and Professor Vincenzo Di Marzo (Istituto di Chimica Biomolecolare Pozzuoli, Naples, Italy) now report a new mechanism of action of NPC in astrocytomas. Like glioblastomas, astrocytomas are brain tumors that belong to the family of gliomas. Gliomas are most common in older people and are almost invariably fatal.
As the MDC researchers showed, the NPC also migrate to the astrocytomas. There they do not secrete proteins, but rather release fatty-acid substances (endovanilloids) which are harmful to the cancer cells. However, in order to exert their lethal effect, the endovanilloids need the aid of a specific ion channel, the TRPV1 channel (transient receptor potential vanilloid type 1), also called the vanilloid receptor 1. TRPV1 is already known to researchers as a transducer of painful stimuli. It has, among other things, a binding site for capsaicin, the irritant of hot chili peppers, and is responsible for the hot sensation after eating them. Clinical trials are currently underway to develop new pain treatments by blocking or desensitizing this ion channel.
MDC researchers describe an additional role of the TRPV1 ion channel
In contrast to its use in pain management, this ion channel, which is located on the surface of glioblastoma cells and is much more abundant there than on normal glial cells, must be activated to trigger cell death in gliomas. The activated ion channel mediates stress-induced cell-death in tumor cells. If however TRPV1 is downregulated or blocked, the glioma cells are not destroyed. The MDC researchers are thus the first to identify neural precursor cells as the source of fatty acids that induce tumor cell death and to describe the role of the TRPV1 ion channel in the fight against gliomas.
However, the activity of neural precursor cells in the brain, and thus of the body's own protective mechanism against gliomas, diminishes with increasing age. This could explain why these tumors usually develop in older adults rather than in children and young people. How can the natural protection of neural precursor cells be harnessed for older brains? According to the researchers, neural precursor cell therapy is not a solution: the benefit it evidently brings in the treatment of young people can have the opposite effect in older adults and may trigger brain tumors.
One possible treatment would be to use drugs to activate the TRPV1 channels. In mice, the group showed that a synthetic substance (arvanil), which is similar to capsaicin, reduced tumor growth. However, this substance has not yet been approved as a drug because the adverse side effects for humans are too severe. It is only used in basic research on mice, which tolerate the substance well. “In principle, however,” the researchers suggest, “synthetic vanilloid compounds may have clinical potential for brain tumor treatment.”
Source: Science Codex
ScienceDaily (July 23, 2012) — A team of University of Alberta researchers has identified a new class of compounds that inhibit the spread of prions, misfolded proteins in the brain that trigger lethal neurodegenerative diseases in humans and animals.
U of A chemistry researcher Frederick West and his team have developed compounds that clear prions from infected cells derived from the brain.
"When these designer molecules were put into infected cells in our lab experiments, the numbers of misfolded proteins diminished — and in some cases we couldn’t detect any remaining misfolded prions," said West.
West and his collaborators at the U of A’s Centre for Prions and Protein Folding Diseases say this research is not yet a cure, but does open a doorway for developing treatments.
"We’re not ready to inject these compounds in prion-infected cattle," said David Westaway, director of the prion centre. "These initial compounds weren’t created for that end-run scenario but they have passed initial tests in a most promising manner."
West notes that the most promising experimental compounds at this stage are simply too big to be used therapeutically in humans or animals.
Human exposure to prion-triggered brain disorders is limited to rare conditions such as Creutzfeldt-Jakob disease, the human counterpart of mad cow disease. The researchers say it shows up in one in a million people in industrialized nations, but investigating the disease is nonetheless well worth the time and expense.
"There is a strong likelihood that prion diseases operate in a similar way to neurodegenerative diseases such as Alzheimer’s, which are distressingly common around the world," said West.
Source: Science Daily
23 July 2012 by Will Heaven
Watch where you look – it can be used to predict what you’ll say. A new study shows that it is possible to guess what sentences people will use to describe a scene by tracking their eye movements.
Moreno Coco and Frank Keller at the University of Edinburgh, UK, presented 24 volunteers with a series of photo-realistic images depicting indoor scenes such as a hotel reception. They then tracked the sequence of objects that each volunteer looked at after being asked to describe what they saw.
Other than being prompted with a keyword, such as “man” or “suitcase”, participants were free to describe the scene however they liked. Some typical sentences included “the man is standing in the reception of a hotel” or “the suitcase is on the floor”.
The order in which a participant’s gaze settled on objects in each scene tended to mirror the order of nouns in the sentence used to describe it. “We were surprised there was such a close correlation,” says Keller. Given that multiple cognitive processes are involved in sentence formation, Coco says “it is remarkable to find evidence of similarity between speech and visual attention”.
Word prediction

The team used the discovery to see if they could predict which sentences would be used to describe a scene based on eye movements alone. They developed an algorithm that used the eye gazes recorded in the previous experiment to predict the correct sentence from a choice of 576 descriptions.
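The article does not describe the algorithm's internals, but the core idea — that gaze order mirrors noun order — can be illustrated with a toy sketch. The functions, names, and scoring rule below are invented for this example and are not the authors' published method: each candidate description is scored by how many noun pairs appear in the same relative order as the corresponding fixated objects.

```python
# Toy illustration (NOT the published algorithm): rank candidate
# descriptions by how well their noun order matches the gaze order.

def order_agreement(gaze_seq, nouns):
    """Fraction of noun pairs whose order matches the order in which
    the corresponding objects were fixated."""
    positions = {obj: i for i, obj in enumerate(gaze_seq)}
    ranked = [n for n in nouns if n in positions]  # nouns actually fixated
    agree, total = 0, 0
    for i in range(len(ranked)):
        for j in range(i + 1, len(ranked)):
            total += 1
            if positions[ranked[i]] < positions[ranked[j]]:
                agree += 1
    return agree / total if total else 0.0

def predict_sentence(gaze_seq, candidates):
    """candidates: list of (sentence, nouns) pairs; return best match."""
    return max(candidates, key=lambda c: order_agreement(gaze_seq, c[1]))[0]

# Hypothetical gaze sequence and candidate pool for a hotel-reception scene
gaze = ["man", "reception", "suitcase"]
candidates = [
    ("the man is standing in the reception of a hotel", ["man", "reception"]),
    ("the suitcase is on the floor", ["suitcase", "floor"]),
]
best = predict_sentence(gaze, candidates)
```

In the actual study the pool contained 576 descriptions and the model worked from richer gaze features, but the principle of matching fixation order to word order is the same.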
Changsong Liu of Michigan State University’s Language and Interaction Research lab, in East Lansing, who was not involved in the study, suggests these results could motivate novel designs for human-machine interfaces that take advantage of visual cues to improve speech recognition software.
Gaze information is already used to help with disambiguation. For example, if a speech recognition system can tell that you are looking at a tree, it is less likely to guess that you just said “three”. Sentence prediction, perhaps in combination with augmented reality headsets that track eye movement, for example, is one possible application.
Coco and Keller are now looking into the role of coordinated visual and linguistic processes in conversations between two people. “People engaged in a dialogue use similar syntactic forms, expressions and eye-movements,” says Coco. One hypothesis is that such “coordinative mimicry” might be important for joint decision-making.
Source: NewScientist
Toronto, Canada – Neuroscientists have found strong evidence that vivid memory and directly experiencing the real moment can trigger similar brain activation patterns.
The study, led by Baycrest’s Rotman Research Institute (RRI), in collaboration with the University of Texas at Dallas, is one of the most ambitious and complex yet for elucidating the brain’s ability to evoke a memory by reactivating the parts of the brain that were engaged during the original perceptual experience. Researchers found that vivid memory and real perceptual experience share “striking” similarities at the neural level, although they are not “pixel-perfect” brain pattern replications.
The study appears online this month in the Journal of Cognitive Neuroscience, ahead of print publication.
"When we mentally replay an episode we’ve experienced, it can feel like we are transported back in time and re-living that moment again," said Dr. Brad Buchsbaum, lead investigator and scientist with Baycrest’s RRI. "Our study has confirmed that complex, multi-featured memory involves a partial reinstatement of the whole pattern of brain activity that is evoked during initial perception of the experience. This helps to explain why vivid memory can feel so real."
But vivid memory rarely fools us into believing we are in the real, external world – and that in itself offers a very powerful clue that the two cognitive operations don’t work exactly the same way in the brain, he explained.
In the study, Dr. Buchsbaum’s team used functional magnetic resonance imaging (fMRI), a powerful brain scanning technology that constructs computerized images of brain areas that are active when a person is performing a specific cognitive task. A group of 20 healthy adults (aged 18 to 36) were scanned while they watched 12 video clips, each nine seconds long, sourced from YouTube.com and Vimeo.com. The clips contained a diversity of content – such as music, faces, human emotion, animals, and outdoor scenery. Participants were instructed to pay close attention to each of the videos (which were repeated 27 times) and informed they would be tested on the content of the videos after the scan.
A subset of nine participants from the original group were then selected to complete intensive and structured memory training over several weeks that required practicing over and over again the mental replaying of videos they had watched from the first session. After the training, this group was scanned again as they mentally replayed each video clip. To trigger their memory for a particular clip, they were trained to associate a particular symbolic cue with each one. Following each mental replay, participants would push a button indicating on a scale of 1 to 4 (1 = poor memory, 4 = excellent memory) how well they thought they had recalled a particular clip.
Dr. Buchsbaum’s team found “clear evidence” that patterns of distributed brain activation during vivid memory mimicked the patterns evoked during sensory perception when the videos were viewed – by a correspondence of 91% after a principal components analysis of all the fMRI imaging data.
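The article gives only the headline figure, not the pipeline, but this style of pattern-similarity analysis — reduce the voxel patterns with PCA, then correlate each clip's perception pattern with its recall pattern — can be sketched as follows. The simulated data, component count and similarity measure are assumptions for illustration, not the study's actual parameters:

```python
# Illustrative sketch of fMRI pattern-similarity analysis (assumed details):
# PCA-reduce voxel patterns, then correlate perception vs. recall per clip.
import numpy as np

rng = np.random.default_rng(0)
n_clips, n_voxels, n_components = 12, 500, 10

# Simulated data: each recall pattern is a noisy copy of its perception pattern.
perception = rng.standard_normal((n_clips, n_voxels))
recall = perception + 0.5 * rng.standard_normal((n_clips, n_voxels))

# PCA via SVD on the pooled, mean-centred patterns.
pooled = np.vstack([perception, recall])
mean = pooled.mean(axis=0)
_, _, vt = np.linalg.svd(pooled - mean, full_matrices=False)
components = vt[:n_components]                  # top principal axes

p_scores = (perception - mean) @ components.T   # project each pattern
r_scores = (recall - mean) @ components.T

# Pearson correlation between matched perception and recall patterns.
similarity = [np.corrcoef(p_scores[i], r_scores[i])[0, 1] for i in range(n_clips)]
print("mean perception-recall similarity:", np.mean(similarity))
```

Because the simulated recall patterns are noisy copies of the perception patterns, the matched correlations come out high, mirroring the "partial reinstatement" the study describes.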
The so-called “hot spots”, or largest pattern similarity, occurred in sensory and motor association areas of the cerebral cortex – a region that plays a key role in memory, attention, perceptual awareness, thought, language and consciousness.
Dr. Buchsbaum suggested the imaging analysis used in his study could potentially add to the current battery of memory assessment tools available to clinicians. Brain activation patterns from fMRI data could offer an objective way of quantifying whether a patient’s self-report of their memory as “being good or vivid” is accurate or not.
Source: EurekAlert!
July 23, 2012
Ever wonder how the human brain, which is constantly bombarded with millions of pieces of visual information, can filter out what’s unimportant and focus on what’s most useful?

The process is known as selective attention and scientists have long debated how it works. But now, researchers at Wake Forest Baptist Medical Center have discovered an important clue. Evidence from an animal study, published in the July 22 online edition of the journal Nature Neuroscience, shows that the prefrontal cortex is involved in a previously unknown way.
Two types of attention are utilized in the selective attention process – bottom up and top down. Bottom-up attention is automatically guided to images that stand out from a background by virtue of color, shape or motion, such as a billboard on a highway. Top-down attention occurs when one’s focus is consciously shifted to look for a known target in a visual scene, as when searching for a relative in a crowd.
Traditionally, scientists have believed that separate areas of the brain controlled these two processes, with bottom-up attention occurring in the posterior parietal cortex and top-down attention occurring in the prefrontal cortex.
"Our findings provide insights on the neural mechanisms behind the guidance of attention," said Christos Constantinidis, Ph.D., associate professor of neurobiology and anatomy at Wake Forest Baptist and senior author of the study. "This has implications for conditions such as attention deficit hyperactivity disorder (ADHD), which affects millions of people worldwide. People with ADHD have difficulty filtering information and focusing attention. Our findings suggest that both the ability to focus attention intentionally and shifting attention to eye-catching but sometimes unimportant stimuli depend on the prefrontal cortex."
In the Wake Forest Baptist study, two monkeys were trained to detect images on a computer screen while activity in both areas of the brain was recorded. The visual display was designed to let one image “pop out” due to its color difference from the background, such as a red circle surrounded by green. To trigger bottom-up attention, neither the identity nor the location of the pop-out image could be predicted before it appeared. The monkeys indicated that they detected the pop-out image by pushing a lever.
The neural activity associated with identifying the pop-out images occurred in the prefrontal cortex at the same time as in the posterior parietal cortex. This unexpected finding indicates early involvement of the prefrontal cortex in bottom-up attention, in addition to its known role in top-down attention, and provides new insights into the neural mechanisms of attention.
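A common way to compare response timing across two recorded regions is to estimate each region's onset latency as the first post-stimulus time at which its firing rate exceeds a baseline-derived threshold. The sketch below uses synthetic firing rates and an assumed threshold rule, purely to illustrate the comparison; it is not the study's analysis:

```python
# Illustrative sketch (synthetic data): estimate stimulus-response onset
# latency in two regions as the first post-stimulus bin whose firing rate
# exceeds the pre-stimulus baseline mean by n_sd standard deviations.
import numpy as np

def onset_latency(rate, times, n_sd=5.0):
    """First post-stimulus time where rate exceeds baseline mean + n_sd * SD."""
    baseline = rate[times < 0.0]                 # pre-stimulus bins
    threshold = baseline.mean() + n_sd * baseline.std()
    above = np.where((times >= 0.0) & (rate > threshold))[0]
    return times[above[0]] if above.size else None

rng = np.random.default_rng(1)
times = np.arange(-0.2, 0.4, 0.01)               # seconds relative to stimulus
# Synthetic firing rates (spikes/s): both regions respond ~100 ms after onset.
pfc = 5.0 + 0.5 * rng.standard_normal(times.size) + 20.0 * (times >= 0.1)
ppc = 5.0 + 0.5 * rng.standard_normal(times.size) + 20.0 * (times >= 0.1)

print("PFC onset (s):", onset_latency(pfc, times))
print("PPC onset (s):", onset_latency(ppc, times))
```

Simultaneous onsets in the two regions, as in this toy example, correspond to the study's finding that prefrontal activity did not lag parietal activity.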
"We hope that our findings will guide future work targeting attention deficits," Constantinidis said.
Provided by Wake Forest University Baptist Medical Center
Source: medicalxpress.com
23 July 2012 by Caroline Williams
The brainiest creatures share a secret – an odd kind of brain cell involved in emotions and empathy that may have accidentally made us conscious

The consciousness connection (Image: Jonathon Burton)
THE origin of consciousness has to be one of the biggest mysteries of all time, occupying philosophers and scientists for generations. So it is strange to think that a little-known neuroscientist called Constantin von Economo might have unearthed an important clue nearly 90 years ago.
When he peered down the lens of his microscope in 1926, von Economo saw a handful of brain cells that were long, spindly and much larger than those around them. In fact, they looked so out of place that at first he thought they were a sign of some kind of disease. But the more brains he looked at, the more of these peculiar cells he found - and always in the same two small areas that evolved to process smells and flavours.
Von Economo briefly pondered what these “rod and corkscrew cells”, as he called them, might be doing, but without the technology to delve much deeper he soon moved on to more promising lines of enquiry.
Little more was said about these neurons until nearly 80 years later, when Esther Nimchinsky and Patrick Hof at Mount Sinai University in New York also stumbled across clusters of these strange-looking neurons. Now, after more than a decade of functional imaging and post-mortem studies, we are beginning to piece together their story. Several lines of evidence hint that they may help build the rich inner life we call consciousness, including emotions, our sense of self, empathy and our ability to navigate social relationships.
Many other big-brained, social animals also seem to share these cells, in the same spots as in the human brain. A greater understanding of how these evolutionary paths converged could therefore tell us much about the evolution of the mind.
Admittedly, to the untrained eye these giant brain cells, now known as von Economo neurons (VENs), don’t look particularly exciting. But to a neuroscientist they stand out like a sore thumb. For one thing, VENs are at least 50 per cent, and sometimes up to 200 per cent, larger than typical human neurons. And while most neurons have a pyramid-shaped body with a finely branched tree of connections called dendrites at each end of the cell, VENs have a longer, spindly cell body with a single projection at each end with very few branches.

Perhaps they escaped attention for so long because they are so rare, making up just 1 per cent of the neurons in the two small areas of the human brain: the anterior cingulate cortex (ACC) and the fronto-insular (FI) cortex.
Their location in those regions suggests that VENs may be a central part of our mental machinery, since the ACC and FI are heavily involved in many of the more advanced aspects of our inner lives. Both areas kick into action when we see socially relevant cues, be it a frowning face, a grimace of pain or simply the voice of someone we love. When a mother hears a baby crying, both regions respond strongly. They also light up when we experience emotions such as love, lust, anger and grief. For John Allman, a neuroanatomist at the California Institute of Technology in Pasadena, this adds up to a kind of “social monitoring network” that keeps track of social cues and allows us to alter our behaviour accordingly (Annals of the New York Academy of Sciences, vol 1225, p 59).
The two brain areas also seem to play a key role in the “salience” network, which keeps a subconscious tally of what is going on around us and directs our attention to the most pressing events, as well as monitoring sensations from the body to detect any changes (Brain Structure and Function, DOI: 10.1007/s00429-012-0382-9).
What’s more, both regions are active when a person recognises their reflection in the mirror, suggesting that these parts of the brain underlie our sense of self - a key component of consciousness. “It is the sense of self at every possible level - so the sense of identity, this is me, and the sense of identity of others and how you understand others. That goes to the concept of empathy and theory of mind,” says Hof.
To Bud Craig, a neuroanatomist at Barrow Neurological Institute in Phoenix, Arizona, it all amounts to a continually updated sense of “how I feel now”: the ACC and FI take inputs from the body and tie them together with social cues, thoughts and emotions to quickly and efficiently alter our behaviour (Nature Reviews Neuroscience, vol 10, p 59).
This constantly shifting picture of how we feel may contribute to the way we perceive the passage of time. When something emotionally important is happening, Craig proposes, there is more to process, and because of this time seems to speed up. Conversely, when less is going on we update our view of the world less frequently, so time seems to pass more slowly.
VENs are probably important in all this, though we can only infer their role through circumstantial evidence. That’s because locating these cells, and then measuring their activity in a living brain hasn’t yet been possible. But their unusual appearance is a signal that they probably aren’t just sitting there doing nothing. “They stand out anatomically,” says Allman, “And a general proposition is that anything that’s so distinctive looking must have a distinct function.”
Fast thinking
In the brain, big usually means fast, so Allman suggests that VENs could be acting as a fast relay system - a kind of social superhighway - which allows the gist of the situation to move quickly through the brain, enabling us to react intuitively on the hop, a crucial survival skill in a social species like ours. “That’s what all of civilisation is based on: our ability to communicate socially, efficiently,” adds Craig.
A particularly distressing form of dementia that can strike people as early as their 30s supports this idea. People who develop fronto-temporal dementia lose large numbers of VENs in the ACC and FI early in the disease, when the main symptom is a complete loss of social awareness, empathy and self-control. “They don’t have normal empathic responses to situations that would normally make you disgusted or sad,” says Hof. “You can show them horrible pictures of an accident and they just don’t blink. They will say ‘oh, yes, it’s an accident’.”
Post-mortem examinations of the brains of people with autism also bolster the idea that VENs lie at the heart of our emotions and empathy. According to one recent study, people with autism may fall into two groups: some have too few VENs, perhaps meaning that they don’t have the necessary wiring to process social cues, while others have far too many (Acta Neuropathologica, vol 118, p 673). The latter group would seem to fit with one recent theory of autism, which proposes that the symptoms may arise from an over-wiring of the brain. Perhaps having too many VENs makes emotional systems fire too intensely, causing people with autism to feel overwhelmed, as many say they do.
Another recent study found that people with schizophrenia who committed suicide had significantly more VENs in their ACC than schizophrenics who died of other causes. The researchers suggest that the over-abundance of VENs might create an overactive emotional system that leaves them prone to negative self-assessment and feelings of guilt and hopelessness (PLoS One, vol 6, p e20936).
VENs in other animals provide some clues, too. When these neurons were first identified, there was the glimmer of hope that we might have found one of the key evolutionary changes, unique to humankind, that could explain our social intelligence. But the earliest studies put paid to that kind of thinking, when VENs turned up in chimpanzees and gorillas. In recent years, they have also been found in elephants and some whales and dolphins.
Like us, many of these species live in big social groups and show signs of the same kind of advanced behaviour associated with VENs in people. Elephants, for instance, display something that looks a lot like empathy: they work together to help injured, lost or trapped elephants, for example. They even seem to show signs of grief at elephant “graveyards” (Biology Letters, vol 2, p 26). What’s more, many of these species can recognise themselves in the mirror, which is usually taken as a rudimentary measure of consciousness. When researchers daub paint on an elephant’s face, for instance, it will notice the mark in the mirror and try to feel the spot with its trunk. This has led Allman and others to speculate that von Economo neurons might be a vital adaptation in large brains for keeping track of social situations - and that the sense of self may be a consequence of this ability.
Yet VENs also crop up in manatees, hippos and giraffes - not renowned for their busy social lives. The cells have also been spotted in macaques, which don’t reliably pass the mirror test, although they are social animals. Although this seems to put a major spanner in the works for those who claim that the cells are crucial for advanced cognition, it could also be that these creatures are showing the precursors of the finely tuned cells found in highly social species. “I think that there are homologues of VENs in all mammals,” says Allman. “That’s not to say they’re shaped the same way but they are located in an analogous bit of cortex and they are expressing the same genes.”
It would make sense, after all, that whales and primates might both have recycled, and refined, older machinery present in a common ancestor rather than independently evolving the same mechanism. Much more research is needed, however, to work out the anatomical differences and the functions of these cells in the different animals.
That work might even help us understand how these neurons evolved in the first place. Allman already has some ideas about where they came from. Our VENs reside in a region of the brain that evolved to integrate taste and smell, so he suggests that many of the traits now associated with the FI evolved from the simple act of deciding whether food is good to eat or likely to make you ill. When reaching that decision, he says, the quicker the “gut” reaction kicks in the better. And if you can detect this process in others, so much the better.
"One of the important functions that seems to reside in the FI has to do with empathy," he says. "My take on this is that empathy arose in the context of shared food - it’s very important to observe if members of your social group are becoming ill as a result of eating something." The basic feeding circuitry, including the rudimentary VENs, may then have been co-opted by some species to work in other situations that involve a decision, like working out if a person is trustworthy or to be avoided. "So when we have a feeling, whether it be about a foodstuff or situation or another person, I think that engages the circuitry in the fronto-insular cortex and the VENs are one of the outputs of that circuitry," says Allman.
Allman’s genetics work suggests he may be on to something. His team found that VENs in one part of the FI are expressing the genes for hormones that regulate appetite. There are also a lot of studies showing links between smell and taste and the feelings of strong emotions. Our physical reaction to something we find morally disgusting, for example, is more or less identical to our reaction to a bitter taste, suggesting they may share common brain wiring (Science, vol 323, p 1222). Other work has shown that judging a morally questionable act, such as theft, while smelling something disgusting leads to harsher moral judgements (Personality and Social Psychology Bulletin, vol 34, p 1096). What’s more, Allman points out that our language is loaded with analogies - we might find an experience “delicious”, say, or a person “nauseating”. This is no accident, he says.
Red herring
However, it is only in highly social animals that VENs live exclusively in the scent and taste regions. In the others, like giraffes and hippos, VENs seem to be sprinkled all over the brain. Allman, however, points out that these findings may be a red herring, since without understanding the genes they express, or their function, we can’t even be sure how closely these cells relate to human VENs. They may even be a different kind of cell that just looks similar.
Based on the evidence so far, however, Hof thinks that the ancestral VENs would have been more widespread, as seen in the hippo brain, and that over the course of evolution they then migrated to the ACC and FI in some animals, but not others - though he admits to having no idea why that might be. He suspects the pressures that shaped the primate brain may have been very different to those that drove the evolution of whales and dolphins.
Craig has hit upon one possibility that would seem to fit all of these big-brained animals. He points out that the bigger the brain, the more energy it takes to run, so it is crucial that it operates as efficiently as possible. A system that continually monitors the environment and the people or animals in it would therefore be an asset, allowing you to adapt quickly to a situation to save as much energy as possible. “Evolution produced an energy calculation system that incorporated not just the sensory inputs from the body but the sensory inputs from the brain,” Craig says. And the fact that we are constantly updating this picture of “how I feel now” has an interesting and very useful by-product: we have a concept that there is an “I” to do the feeling. “Evolution produced a very efficient moment-by-moment calculation of energy utilisation and that had an epiphenomenon, a by-product that provided a subjective representation of my feelings.”
If he’s right - and there is a long way to go before we can be sure - it raises a very humbling possibility: that far from being the pinnacle of brain evolution, consciousness might have been a big, and very successful accident.
Source: NewScientist
19 July 2012 by Nicola Guttridge
Whether a tree branch or a computer mouse is the target, reaching for objects is fundamental primate behaviour. Neurons in the brain prepare for such movements, and this neural activity can now be deciphered, allowing researchers to predict what movements will occur. This discovery could help us develop prosthetic limbs that can be controlled by thought alone.

What happens next? (Image: Gallo Images/Rex Features)
To find out what goes on in the brain when we reach for things, biomedical engineers Daniel Moran and Thomas Pearce at Washington University in St Louis, Missouri, trained two rhesus macaques to participate in a series of exercises. When the monkeys reached for items, electrodes measured the activity of neurons in their dorsal premotor cortex, a region of the brain that is involved in the perception of movement.
The monkeys were trained to reach for a virtual object on a screen to receive a reward. In some tasks the monkeys had to reach directly for an object, in others they had to reach around an obstacle to get to the target.
Impulsive grab
Moran and Pearce managed to identify the neural activity corresponding with several aspects of the planned movement, such as angle of reach, hand position and the final target location.
The findings could one day allow the design of prosthetic limbs that can be controlled with thought alone, which is “one of the reasons we did the study”, says Moran.
"The two subjects actually used different strategies to perform the task, and we were able to see this in their neural activity," Moran says. One monkey waited to receive all the information before reaching, but the other reached immediately, even though there was a good chance that an obstacle might appear and the reaching action would need to be rethought.
"If the decoding strategy is a robust finding, then this has wider consequences concerning mind-reading – particularly if we can get equivalent results for more complex strategic differences at higher cognitive levels," says Richard Cooper, a cognitive researcher at Birkbeck, University of London. "However, this is all very speculative."
Source: NewScientist
ScienceDaily (July 20, 2012) — Conditions such as Parkinson’s disease are a result of pathogenic changes to proteins. In the neurodegenerative condition of Parkinson’s disease, which is currently incurable, the alpha-synuclein protein changes and becomes pathological. Until now, there have not been any antibodies that could help to demonstrate the change in alpha-synuclein associated with the disease. An international team of experts led by Gabor G. Kovacs from the Clinical Institute of Neurology at the MedUni Vienna has now discovered a new antibody that actually possesses this ability.
"It opens up new possibilities for the development of a diagnostic test for Parkinsonism," says Kovacs, highlighting the importance of this discovery. "This new antibody will enable us to find the pathological conformation in bodily fluids such as blood or CSF." A clinical study involving around 200 patients is already underway, and the first definitive results are expected at the end of 2012. The tests being carried out in collaboration with the University Department of Neurology, led by Walter Pirker, are designed to determine the extent to which the new antibody can be used as an early diagnostic tool in order to understand the condition better and be able to treat it more effectively.
A step towards a blood test for Parkinson’s
With Parkinsonism, the diseased form of alpha-synuclein, which has the same primary structure as the healthy form, undergoes an “abnormal fold.” Says Kovacs: “Until now, however, it was not possible to distinguish between the two.” The previous immunodiagnostic techniques only allowed the general presence of alpha-synuclein to be confirmed. The new, monoclonal antibody, however, which the researchers at the MedUni Vienna have developed in collaboration with the German biotech firm Roboscreen, is now able to detect a strategic part of the protein responsible for the structural changes. The results of the study have now been published in the journal Acta Neuropathologica.
Says Kovacs: “It is still not possible to say whether or not we will be able to diagnose Parkinson’s from a blood test, but this discovery certainly represents a major step in that direction.” Theoretically, it should be possible to diagnose Parkinson’s disease five to eight years before it develops.
In Austria, there are between 15,000 and 16,000 people living with Parkinson’s syndrome. Its frequency increases with age. As society becomes older, Parkinson’s disease, a degenerative condition of the brain, will become an increasingly widespread problem.
Source: Science Daily
ScienceDaily (July 20, 2012) — Scientists at the University of Manchester have uncovered how the internal mechanisms in nerve cells wire the brain. The findings open up new avenues in the investigation of neurodegenerative diseases by analysing the cellular processes underlying these conditions.

Illustration of spectraplakins in axonal growth organising microtubules. (Credit: Image courtesy of University of Manchester)
Dr Andreas Prokop and his team at the Faculty of Life Sciences have been studying the growth of axons, the thin cable-like extensions of nerve cells that wire the brain. If axons don’t develop properly this can lead to birth disorders, mental and physical impairments and the gradual decay of brain capacity during aging.
Axon growth is directed by the hand-shaped growth cone, which sits at the tip of the axon. It is well documented how growth cones perceive signals from the outside to follow pathways to specific targets, but very little is known about the internal machinery that dictates their behaviour.
Dr Prokop has been studying the key driver of growth cone movements, the cytoskeleton. The cytoskeleton helps to maintain a cell’s shape and is made up of the protein filaments, actin and microtubules. Microtubules are the key driving force of axon growth whilst actin helps to regulate the direction the axon grows.
Dr Prokop and his team used fruit flies to analyse how actin and microtubule proteins combine in the cytoskeleton to coordinate axon growth. They focussed on the multifunctional proteins called spectraplakins which are essential for axonal growth and have known roles in neurodegeneration and wound healing of the skin.
What the team demonstrates in this recent paper is that spectraplakins link microtubules to actin to help them extend in the direction the axon is growing. If this link is missing, microtubule networks show disorganised, criss-crossed arrangements instead of parallel bundles, and axon growth is hampered.
By understanding the molecular detail of these interactions the team made a second important finding. Spectraplakins collect not only at the tip of microtubules but also along the shaft, which helps to stabilise them and ensure they act as a stable structure within the axon.
This additional function of spectraplakins relates them to a class of microtubule-binding proteins including Tau. Tau is an important player in neurodegenerative diseases, such as Alzheimer’s, which is still little understood. In support of the authors’ findings, another publication has just shown that the human spectraplakin Dystonin causes neurodegeneration when its linkage to microtubules is disrupted.
Talking about his research, Dr Prokop said: “Understanding cytoskeletal machinery at the cell level is a holy grail of current cell research that will have powerful clinical applications. The cytoskeleton is crucially involved in virtually all aspects of a cell’s life, including cell shape changes, cell division, cell movement, contacts and signalling between cells, and dynamic transport events within cells. Accordingly, the cytoskeleton lies at the root of many brain disorders. Therefore, deciphering the principles of cytoskeletal machinery during the fundamental process of axon growth will essentially help research into the causes of a broad spectrum of diseases. Spectraplakins lie at the heart of this machinery, and our research opens up new avenues for its investigation.”
Dr Prokop’s paper in the Journal of Neuroscience also demonstrates the success of the fruit fly Drosophila as a research model. The team was able to replicate its findings on axon growth in mice, which in turn suggests the findings can be translated to humans.
Dr Prokop points out that fruit flies provide an ideal means to make sense of these findings and will essentially help to unravel the many mysteries of neurodegeneration.
Dr Prokop continues: “Understanding how spectraplakins perform their cellular functions has important implications for basic as well as biomedical research. Thus, besides their roles during axon growth, spectraplakins of mice and humans are clinically important for a number of conditions and processes including skin blistering, neuro-degeneration, wound healing, synapse formation and neuron migration during brain development. Understanding spectraplakins in one biological process will instruct research on the other clinically relevant roles of these proteins.”
Source: Science Daily
ScienceDaily (July 19, 2012) — While clinical trial results are being released regarding drugs intended to decrease amyloid production — thought to contribute to decline in Alzheimer’s disease — clinical trials of drugs targeting other disease proteins, such as tau, are in their initial phases.
Penn Medicine research presented July 19 at the 2012 Alzheimer’s Association International Conference (AAIC) shows that an anti-tau treatment called epothilone D (EpoD) was effective in preventing and slowing the progression of Alzheimer’s disease in animal models, improving neuron function and cognition as well as decreasing tau pathology.
By targeting tau, the drug aims to stabilize microtubules, which support cell structure and transport essential nutrients and information within nerve cells. When tau malfunctions, microtubules break down and tau accumulates into tangles.
"This drug effectively hits a tau target by correcting tau loss of function, thereby stabilizing microtubules and offsetting the loss of tau due to its formation into neurofibrillary tangles in animal models, which suggests that this could be an important option to mediate tau function in Alzheimer’s and other tau-based neurodegenerative diseases," said John Trojanowski, MD, PhD, professor of Pathology and Laboratory Medicine in the Perelman School of Medicine at the University of Pennsylvania. "In addition to drugs targeting amyloid, which may not work in advanced Alzheimer’s disease, our hope is that this and other anti-tau drugs can be tested in people with Alzheimer’s disease to determine whether stabilizing microtubules damaged by malfunctioning tau protein may improve clinical and pathological outcomes."
The drug, identified through Penn’s Center for Neurodegenerative Disease Research (CNDR) Drug Discovery Program, was previously shown to prevent further neurological damage and improve cognitive performance in animal models. The Penn research team includes senior investigator Bin Zhang, MD, and Kurt Brunden, PhD, director of Drug Discovery at CNDR.
Bristol-Myers Squibb, who developed and owns the rights to the drug, has started enrolling patients into a phase I clinical trial in people with mild Alzheimer’s disease.
Source: Science Daily
July 19, 2012
Korean scientists have used tiny stars, squares and triangles as a toolkit to create live neural circuits in a dish.
They hope the shapes can be used to create a reproducible neural circuit model that could be used for learning and memory studies as well as drug screening applications; the shapes could also be integrated into the latest neural tissue scaffolds to aid the regeneration of neurons at injured sites in the body, such as the spinal cord.
Published today in the Journal of Neural Engineering, the study, by researchers at the Korea Advanced Institute of Science and Technology (KAIST), found that triangles were the most effective shape for helping to facilitate the growth of axons and guide them onto specific paths to form a complete circuit.
Co-author of the study, Professor Yoonkey Nam, said: “Eventually, we want to know if we can design a neural tissue model that biologically mimics some neural circuits in our brain.”
A neuron is an electrically excitable cell that processes and transmits information around the body. The neuron is composed of three main parts: a cell body, or soma, dendrites and an axon, which extends from the soma and links to other cells, creating a network.
When axons grow they are usually guided by proteins. Many researchers have been trying to re-create this key process in a dish by manipulating nerve cells from rat brains.
As nerve cells are usually just a few tens of micrometres in size, the challenge associated with creating a live neural network is firstly positioning cells in desired locations and, secondly, making connections between these cells by guiding the axons in designated directions.
The researchers investigated whether two star shapes, five regular shapes (square, circle, triangle, pentagon and hexagon) and three different sizes of isosceles triangles could guide axons in designated directions. Each shape was the size of a single cell and was replicated to form an array which was printed onto a glass surface.
Each of the arrays had an overall size of 1cm-by-1cm with a gap of 10 micrometres between each shape. Hippocampal neurons were taken from rats and plated onto the patterned surfaces. The neurons were fluorescently labelled with dyes so that images could be taken of their growth.
The researchers found that triangles were the most efficient shape for encouraging the growth and guidance of an axon. The key was the angle at the points where two of the triangle’s sides meet, known as the vertices: the smaller the vertex angle, the higher the chance the triangle had of inducing growth.
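As a toy illustration only (not the authors’ model; the rule and numbers are assumptions for the sketch), the reported relationship, that a smaller vertex angle gives a higher chance of inducing growth, amounts to picking the sharpest interior angle of the triangle:

```python
def likely_growth_vertex(angles_deg):
    """Toy rule: return the index of the sharpest (smallest-angle) vertex,
    treated here as the most likely axon initiation/guidance site."""
    if abs(sum(angles_deg) - 180.0) > 1e-6:
        raise ValueError("interior angles of a triangle must sum to 180")
    return min(range(len(angles_deg)), key=lambda i: angles_deg[i])

# A sharp isosceles triangle: the 20-degree apex (index 0) is the
# predicted growth site under this toy rule.
print(likely_growth_vertex([20.0, 80.0, 80.0]))  # -> 0
```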
"Based on our results, we are suggesting a new design principle for guiding axons in a dish. We can control the axonal growth in a certain direction by putting a sharp triangle pointing to a certain direction. Then, a neuron that adhered to the triangle will have an axon in the sharp vertex direction.
"Overall, we integrated microtechnology with neurobiology to find a new engineering solution," continued Professor Nam.
Provided by Institute of Physics
Source: medicalxpress.com
ScienceDaily (July 19, 2012) — A joint study carried out by The University of Nottingham and the multinational food company Unilever has found for the first time that fat in food can reduce activity in several areas of the brain which are responsible for processing taste, aroma and reward.
The research, now available in the Springer journal Chemosensory Perception, provides the food industry with better understanding of how in the future it might be able to make healthier, less fatty food products without negatively affecting their overall taste and enjoyment. Unveiled in 2010, Unilever’s Sustainable Living Plan sets out its ambition to help hundreds of millions of people improve their diet around the world within a decade.
This three-year study investigated how the brains of a group of participants in their 20s would respond to changes in the fat content of four different fruit emulsions they tasted while in an MRI scanner. All four samples were of the same thickness and sweetness, but one contained flavour with no fat, while the other three contained fat with different flavour release properties.
The research found that the areas of the participants’ brains responsible for the perception of flavour — such as the somatosensory cortices and the anterior, mid and posterior insula — were significantly more activated by the non-fatty sample than by the fatty emulsions, even though the samples were perceived as having the same flavour. It is important to note that increased activation in these brain areas does not necessarily result in increased perception of flavour or reward.
Dr Joanne Hort, Associate Professor in Sensory Science at The University of Nottingham said: “This is the first brain study to assess the effect of fat on the processing of flavour perception and it raises questions as to why fat emulsions suppress the cortical response in brain areas linked to the processing of flavour and reward. It also remains to be determined what the implications of this suppressive effect are on feelings of hunger, satiety and reward.”
Unilever food scientist Johanneke Busch, based at the company’s Research & Development laboratories in Vlaardingen, Netherlands added: “There is more to people’s enjoyment of food than the product’s flavour — like its mouthfeel, its texture and whether it satisfies hunger, so this is a very important building block for us to better understand how to innovate and manufacture healthier food products which people want to buy.”
Source: Science Daily
July 19, 2012 By Emily Martinez
(Medical Xpress) — UT Dallas researchers recently demonstrated how nerve stimulation paired with specific experiences, such as movements or sounds, can reorganize the brain. This technology could lead to new treatments for stroke, tinnitus, autism and other disorders.

Dr. Michael Kilgard helped lead a team that paired vagus nerve stimulation with physical movement to improve brain function.
In a related paper, UT Dallas neuroscientists showed that they could alter the speed at which the brain works in laboratory animals by pairing stimulation of the vagus nerve with fast or slow sounds.
A team led by Dr. Robert Rennaker and Dr. Michael Kilgard looked at whether repeatedly pairing vagus nerve stimulation with a specific movement would change neural activity within the laboratory rats’ primary motor cortex. To test the hypothesis, they paired the vagus nerve stimulation with movements of the forelimb in two groups of rats. The results were published in a recent issue of Cerebral Cortex.
After five days of stimulation and movement pairing, the researchers examined the brain activity in response to the stimulation. The rats that received the training along with the stimulation displayed large changes in the organization of the brain’s movement control system. The animals receiving identical motor training without stimulation pairing did not exhibit any brain changes, or plasticity.
People who suffer strokes or brain trauma often undergo rehabilitation that includes repeated movement of the affected limb in an effort to regain motor skills. It is believed that repeated use of the affected limb causes reorganization of the brain essential to recovery. The recent study suggests that pairing vagus nerve stimulation with standard therapy may result in more rapid and extensive reorganization of the brain, offering the potential for speeding and improving recovery following stroke, said Rennaker, associate professor in The University of Texas at Dallas’ School of Behavioral and Brain Sciences.
“Our goal is to use the brain’s natural neuromodulatory systems to enhance the effectiveness of standard therapies,” Rennaker said. “Our studies in sensory and motor cortex suggest that the technique has the potential to enhance treatments for neurological conditions ranging from chronic pain to motor disorders. Future studies will investigate its effectiveness in treating cognitive impairments.”
July 19, 2012
(Medical Xpress) — When learning to master complex movements such as those required in surgery, is being physically guided by an expert more effective than learning through trial and error?

Dr. George Van Doorn and a participant in the fMRI
New research by Monash University’s Departments of Psychological Studies and Physiology challenges earlier claims that externally guided (or passive) movement is a superior learning method to self-generated (or active) movement.
In the first study of its kind, researchers discovered that different brain regions become active depending on the type of movement used. Lead researcher Dr. George Van Doorn, head of Psychological Studies, said the findings did not support the view that passive movement was a more effective way to learn.
“There has been much debate over the last 30 years about which form of movement is better,” Dr. Van Doorn said. “We found that active movements result in greater activation in brain areas implicated in higher-order processes such as monitoring and controlling goal-directed behaviour, attention, execution of movements, and error detection.
“Passive movements, in contrast, produced greater activity in areas associated with touch perception, length discrimination, tactile object recognition, and the attenuation of sensory inputs.”
People were tested while making movements themselves, and while being guided.
“Whilst inside a functional Magnetic Resonance Imaging (fMRI) machine, we had people either freely move their index finger around a two-dimensional, raised-line pattern, to measure self-generated touch, or had an experimenter guide the person’s finger around the pattern, to measure externally generated touch. Using the fMRI, we found that different brain regions become active depending on the type of movement used,” Dr. Van Doorn said.
Dr. Van Doorn said touch was becoming a popular area of investigation, with more scientists contributing to understanding about this important, though under-acknowledged, sensory system.
All researchers involved in this study are located at Monash University’s Gippsland campus. The study findings were presented at EuroHaptics 2012, a major international conference and the primary European meeting for researchers in the field of human haptic sensing and touch-enabled computer applications.
Provided by Monash University
Source: medicalxpress.com
ScienceDaily (July 19, 2012) — By decoding brain activity, scientists were able to “see” that two monkeys were planning to approach the same reaching task differently — even before they moved a muscle.

The obstacle-avoidance task is a variation on the center-out reaching task in which an obstacle sometimes prevents the monkey from moving directly to the target. The monkey must first place a cursor (yellow) on the central target (purple). This was the starting position. After the first hold, a second target appeared (green). After the second hold an obstacle appeared (red box). After the third hold, the center target disappeared, indicating a “go” for the monkey, which then moved the cursor out and around the obstacle to the target. (Credit: Moran/Pearce)
Anyone who has looked at the jagged recording of the electrical activity of a single neuron in the brain must have wondered how any useful information could be extracted from such a frazzled signal.
But over the past 30 years, researchers have discovered that clear information can be obtained by decoding the activity of large populations of neurons.
Now, scientists at Washington University in St. Louis, who were decoding brain activity while monkeys reached around an obstacle to touch a target, have come up with two remarkable results.
Their first result was one they had designed their experiment to achieve: they demonstrated that multiple parameters can be embedded in the firing rate of a single neuron and that certain types of parameters are encoded only if they are needed to solve the task at hand.
Their second result, however, was a complete surprise. They discovered that the population vectors could reveal different planning strategies, allowing the scientists, in effect, to read the monkeys’ minds.
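The population-vector idea behind this kind of decoding is conceptually simple: each neuron “votes” for its preferred movement direction, weighted by its firing rate, and the weighted votes are summed. The sketch below is illustrative only, with made-up firing rates and preferred directions rather than the study’s data or methods:

```python
import math

def population_vector(firing_rates, preferred_dirs_deg):
    """Decode a movement direction by summing each neuron's
    preferred-direction unit vector, weighted by its firing rate,
    then taking the angle of the resulting vector."""
    x = sum(r * math.cos(math.radians(d))
            for r, d in zip(firing_rates, preferred_dirs_deg))
    y = sum(r * math.sin(math.radians(d))
            for r, d in zip(firing_rates, preferred_dirs_deg))
    return math.degrees(math.atan2(y, x)) % 360.0

# Three neurons tuned to 0, 90 and 180 degrees; the 90-degree neuron
# fires hardest, so the decoded direction is pulled to 90 degrees.
print(round(population_vector([5.0, 20.0, 5.0], [0.0, 90.0, 180.0])))  # -> 90
```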
ScienceDaily (July 18, 2012) — Researchers at Oregon Health & Science University School of Dentistry have discovered that TDP-43, a protein strongly linked to ALS (amyotrophic lateral sclerosis) and other neurodegenerative diseases, appears to activate a variety of different molecular pathways when genetically manipulated. The findings have implications for understanding and possibly treating ALS and neurodegenerative diseases such as Alzheimer’s and Parkinson’s.
ALS affects two in 100,000 adults in the United States annually, and the prognosis for patients is grim. The new discovery is published online in G3: Genes, Genomes, Genetics (and in the July 2012 print issue of G3).
Using a fruit fly model, the OHSU team genetically increased or eliminated TDP-43 to study its effect on the central nervous system. By using massively parallel sequencing methods to profile the expression of genes in the central nervous system, the team found that the loss of TDP-43 results in widespread gene activation and altered splicing, much of which is reversed by rescue of TDP-43 expression. Although previous studies have implicated both the absence and the overexpression of TDP-43 in ALS, the OHSU study showed little overlap in gene expression between these two manipulations, suggesting that the bulk of the genes affected are different.
"Our data suggest that TDP-43 plays a role in synaptic transmission, synaptic release and endocytosis," said Dennis Hazelett, Ph.D., lead author of the study. "We also uncovered a potential novel regulation of several pathways, many targets of which appear to be conserved."
Source: Science Daily
ScienceDaily (July 18, 2012) — Researchers from the University of Medicine and Dentistry of New Jersey (UMDNJ), collaborating with scientists from Northwestern University in Illinois, have provided direct experimental evidence that diabetes is linked to the onset of Alzheimer’s disease. The study, published online this week in the Journal of Alzheimer’s Disease, used an experimental model that shows potential as an important new tool for investigations of Alzheimer’s disease and of drugs being developed to treat Alzheimer’s.
UMDNJ researchers Peter Frederikse, PhD, and Chinnaswamy Kasinathan, PhD, collaborated with William Klein, PhD, at Northwestern University, to build on prior studies from the Klein lab and others that indicated close links between Alzheimer’s disease and diabetes. Working with Claudine Bitel and Rajesh Kaswala, students at UMDNJ, the researchers tested whether untreated diabetes would provide a physiological model of Alzheimer neuropathology.
"The results were striking," Frederikse said. "Because we used diabetes as an instigator of the disease, our study shows — for the first time directly — the link between Alzheimer’s and diabetes."
The researchers found substantial increases in amyloid beta peptide pathology — a hallmark of Alzheimer’s disease — in the brain cortex and hippocampus concurrent with diabetes, along with significant amyloid beta pathology in the retina. By contrast, when diabetes was not present, no observable pathology was detected in either the brain or the retina.
"Second, our study examined the retina, which is considered an extension of the brain, and is more accessible for diagnostic exams," Frederikse added. "Our findings indicate that scientists may be able to follow the onset and progression of Alzheimer’s disease through retinal examination, which could provide a long sought after early-warning sign of the disease."
This experimental model replicated the spontaneous formation of amyloid beta “oligomer” assemblies in brain and retina, which may help to explain one of the most widely recognized symptoms of Alzheimer’s. “This is exciting,” Klein said. “Oligomers are the neurotoxins now regarded as causing Alzheimer’s disease memory loss. What could cause them to appear and build up in late-onset Alzheimer’s disease has been a mystery, so these new findings with diabetes represent an important step.”
Previous research indicated that insulin plays an important role in the formation of memories. Once attached to neurons, oligomers cause insulin receptors to be eliminated from the surface membranes, contributing to insulin resistance in the brain. This launches a vicious cycle in which diabetes induces oligomer accumulation which makes neurons even more insulin resistant.
"In light of the near epidemic increases in Alzheimer’s disease and diabetes today, developing a physiological model of Alzheimer neuropathology has been an important goal," Kasinathan added. "It allows us to identify a potential biomarker for Alzheimer’s disease and may also make important contributions to Alzheimer drug testing and development."
Source: Science Daily
7/18/2012
Metabolic syndrome, a term used to describe a combination of risk factors that often lead to heart disease and type 2 diabetes, seems to be linked to lower blood flow to the brain, according to research by the University of Wisconsin School of Medicine and Public Health.
Dr. Barbara Bendlin, a researcher at the Wisconsin Alzheimer’s Disease Research Center and an assistant professor of medicine (geriatrics) at the UW School of Medicine and Public Health, said brain scans measuring cerebral blood flow showed that study participants with multiple risk factors connected to metabolic syndrome, including abdominal obesity, high blood pressure, high blood sugar and high cholesterol, averaged 15 percent less blood flow to the brain than those in a control group.
"We thought the cerebral blood flow measurements of the metabolic syndrome group would be lower, but it was striking how much lower it was," said Bendlin.
Although lower blood flow could result in an eventual reduction in memory skills, Bendlin said it is not known if people with metabolic syndrome will get Alzheimer’s disease.
"Having metabolic syndrome at middle age does have an effect on the brain, and there is some suggestion that if you have lower blood flow, certain types of memory functions are reduced," she said. "The key will be to follow these people over time, because we want to know if lower blood flow will lead to a gradual loss of memory and cognitive skills. But it’s too early to say if these people will develop Alzheimer’s."
The study, presented today at the Alzheimer’s Association International Conference in Vancouver, British Columbia, involved 71 middle-aged people recruited from the Wisconsin Registry for Alzheimer’s Prevention (WRAP). Of this group, 29 met the criteria for metabolic syndrome and 42 did not.
Bendlin said the next steps will be to conduct additional brain scans on people with metabolic syndrome to get more specifics on why they have reduced cerebral blood flow.
"By comparing people with metabolic syndrome with those who don’t, we don’t know which of the risk factors are worst," she said. "Is having a high blood-glucose level worse than having high blood pressure or is it different than having abdominal obesity? All of these risk factors have been linked to increased risk for dementia, but they are clustered together. If we knew which ones were the worst, those would be the ones to target with specific treatments."
Source: Bio-Medicine
July 18, 2012
Drugs used to treat Attention Deficit Hyperactivity Disorder (ADHD) do not appear to have long-term effects on the brain, according to new animal research from Wake Forest Baptist Medical Center.
As many as five to seven percent of elementary school children are diagnosed with ADHD, a behavioral disorder that causes problems with inattentiveness, over-activity, impulsivity, or a combination of these traits. Many of these children are treated with psychostimulant drugs, and while doctors and scientists know a lot about how these drugs work and their effectiveness, little is known about their long-term effects.
Linda Porrino, Ph.D., professor and chair of the Department of Physiology and Pharmacology, along with fellow professor Michael A. Nader, Ph.D., both of Wake Forest Baptist, and colleagues conducted an animal study to determine what the long-lasting effects may be. Their findings were surprising, said Porrino. “We know that the drugs used to treat ADHD are very effective, but there have always been concerns about the long-lasting effects of these drugs,” Porrino said.
"We didn’t know whether taking these drugs over a long period could harm brain development in some way or possibly lead to abuse of drugs later in adolescence."
Findings from the Wake Forest Baptist research are published online this month in the journal Neuropsychopharmacology.
The researchers studied 16 juvenile non-human primates whose ages were equivalent to 6- to 10-year-old humans. Eight animals were in a control group that did not receive any drug treatment; the other eight were treated for more than a year with a therapeutic-level dose of an extended-release form of Ritalin, or methylphenidate (MPH), which is equivalent to about four years of treatment in children. Imaging of the animals’ brains, both before and after the study, was conducted on both groups to measure brain chemistry and structure. The researchers also looked at developmental milestones to address concerns that ADHD drugs adversely affect physical growth.
Once the MPH treatment and imaging studies were concluded, the animals were given the opportunity to self-administer cocaine over several months. Nader measured their propensity to acquire the drug, and how rapidly and in what amounts they took it, to provide an index of vulnerability to substance abuse in adolescence. As reported in the research paper, they found no differences between groups: monkeys treated with Ritalin during adolescence were no more vulnerable to later drug use than the control animals.
"After one year of drug therapy, we found no long-lasting effects on the neurochemistry of the brain, no changes in the structure of the developing brain. There was also no increase in the susceptibility for drug abuse later in adolescence," Porrino said. "We were very careful to give the drugs in the same doses that would be given to children. That's one of the great advantages of our study: it's directly translatable to children."
Porrino said non-human primates provide exceptional models for developmental research because they undergo relatively long childhood and adolescent periods marked by hormonal and physiological maturation much like humans.
"Our study showed that long-term therapeutic use of drugs to treat ADHD does not cause long-term negative effects on the developing brain, and importantly, it doesn’t put children at risk for substance abuse later in adolescence," she said.
One of the exciting things about this research, Porrino said, is that a “sister” study was conducted simultaneously at Johns Hopkins with slightly older animals and different drugs, and the findings were similar. “We feel very confident of the results because we have replicated each other’s studies within the same time frame and gotten similar results,” she said. “We think that’s pretty powerful and reassuring.”
Provided by Wake Forest University Baptist Medical Center
Source: medicalxpress.com
July 18, 2012
A new guideline released by the American Academy of Neurology recommends several treatments for people with Huntington’s disease who experience chorea—jerky, random, uncontrollable movements that can make everyday activities challenging. The guideline is published in the July 18, 2012, online issue of Neurology.
"Chorea can be disabling, worsen weight loss and increase the risk of falling," said guideline lead author Melissa Armstrong, MD, MSc, with the University of Maryland Department of Neurology and a member of the American Academy of Neurology.
Huntington’s disease is a complex disease with physical, cognitive and behavioral symptoms. The new guideline addresses only one aspect of the disease that may require treatment.
The guideline found that the drugs tetrabenazine (TBZ), riluzole and amantadine can be helpful and the drug nabilone may also be considered to treat chorea. The medications riluzole, amantadine and nabilone are not often prescribed for Huntington’s disease.
"People with Huntington’s disease who have chorea should discuss with their doctors whether treating chorea is a priority. Huntington’s disease is complex with a wide range of sometimes severe symptoms and treating other symptoms may be a higher priority than treating chorea," said Armstrong.
Armstrong adds that it is important for patients to understand that their doctors may try drugs not recommended in this guideline to treat chorea. More research is needed to know if drugs such as those used for psychosis are effective; however, doctors may prescribe them on the basis of past clinical experience.
Provided by American Academy of Neurology
Source: medicalxpress.com
July 18, 2012
Sleep deprivation in the first few hours after exposure to a significantly stressful threat actually reduces the risk of Post-Traumatic Stress Disorder (PTSD), according to a study by researchers from Ben-Gurion University of the Negev (BGU) and Tel Aviv University.
The new study was published in the international scientific journal Neuropsychopharmacology. In a series of experiments, it revealed that sleep deprivation of approximately six hours immediately after exposure to a traumatic event reduces the development of post-trauma-like behavioral responses. As a result, sleep deprivation in the first hours after stress exposure might represent a simple yet effective intervention for PTSD.
The research was conducted by Prof. Hagit Cohen, director of the Anxiety and Stress Research Unit at BGU’s Faculty of Health Sciences, in collaboration with Prof. Joseph Zohar of Tel Aviv University.
Approximately 20 percent of people exposed to a severe traumatic event, such as a car or work accident, terrorist attack or war, cannot carry on with their lives normally. These people retain the memory of the event for many years, and it causes considerable difficulties in daily functioning; in extreme cases, it may render the individual completely dysfunctional.
"Often those close to someone exposed to a traumatic event, including medical teams, seek to relieve the distress and assume that it would be best if the person could rest and 'sleep on it,'" says Prof. Cohen. "Since memory is a significant component in the development of post-traumatic symptoms, we decided to examine the various effects of sleep deprivation immediately after exposure to trauma."
In the experiments, rats that underwent sleep deprivation after exposure to trauma (predator scent stress exposure) later did not exhibit behavior indicating memory of the event, while a control group of rats that was allowed to sleep after the stress exposure did remember, as shown by their post-trauma-like behavior.
"As is the case for human populations exposed to severe stress, 15 to 20 percent of the animals develop long-term disruptions in their behavior," says Cohen. "Our research method for this study is, we believe, a breakthrough in biomedical research."
A pilot study in humans is currently being planned. The studies were funded by a grant from the Israel Academy of Sciences and Humanities and by the Israel Ministry of Health.
Provided by American Associates, Ben-Gurion University of the Negev
Source: medicalxpress.com
July 18, 2012
(Phys.org) — New research at the Hebrew University of Jerusalem sheds light on pluripotency—the ability of embryonic stem cells to renew themselves indefinitely and to differentiate into all types of mature cells. Understanding pluripotency, a major challenge in modern biology, could expedite the use of embryonic stem cells in cell therapy and regenerative medicine. If scientists can replicate the mechanisms that make pluripotency possible, they could create cells in the laboratory that could be implanted in humans to cure diseases characterized by cell death, such as Alzheimer’s, Parkinson’s, diabetes and other degenerative diseases.
To shed light on these processes, researchers in the lab of Dr. Eran Meshorer, in the Department of Genetics at the Hebrew University’s Alexander Silberman Institute of Life Sciences, are combining molecular, microscopic and genomic approaches. Meshorer’s team is focusing on epigenetic pathways—which cause biological changes without a corresponding change in the DNA sequence—that are specific to embryonic stem cells.
The molecular basis for epigenetic mechanisms is chromatin, which is composed of a cell’s DNA and structural and regulatory proteins. In groundbreaking research performed by Shai Melcer, a PhD student in the Meshorer lab, the mechanisms that support an “open” chromatin conformation in embryonic stem cells were examined. The researchers found that chromatin is less condensed in embryonic stem cells, allowing them the flexibility or “functional plasticity” to turn into any kind of cell.
A distinct pattern of chemical modifications of chromatin structural proteins (referred to as the acetylation and methylation of histones) enables a looser chromatin configuration in embryonic stem cells. During the early stages of differentiation, this pattern changes to facilitate chromatin compaction.
But even more interestingly, the authors found that a nuclear lamina protein, lamin A, is also a part of the secret. In all differentiated cell types, lamin A binds compacted domains of chromatin and anchors them to the cell’s nuclear envelope. Lamin A is absent from embryonic stem cells and this may enable the freer, more dynamic chromatin state in the cell nucleus. The authors believe that chromatin plasticity is tantamount to functional plasticity since chromatin is made up of DNA that includes all genes and codes for all proteins in any living cell. Understanding the mechanisms that regulate chromatin function will enable intelligent manipulations of embryonic stem cells in the future.
"If we can apply this new understanding about the mechanisms that give embryonic stem cells their plasticity, then we can increase or decrease the dynamics of the proteins that bind DNA and thereby increase or decrease the cells’ differentiation potential," concludes Dr. Meshorer. “This could expedite the use of embryonic stem cells in cell therapy and regenerative medicine, by enabling the creation of cells in the laboratory which could be implanted in humans to cure diseases characterized by cell death, such as Alzheimer’s, Parkinson’s, diabetes and other degenerative diseases.”
Source: PHYS.ORG
ScienceDaily (July 17, 2012) — The ability of infants to recognize speech is more sophisticated than previously known, researchers in New York University’s Department of Psychology have found. Their study, which appears in the journal Developmental Psychology, showed that infants, as early as nine months old, could make distinctions between speech and non-speech sounds in both humans and animals.

A new study shows that infants, as early as nine months old, could make distinctions between speech and non-speech sounds in both humans and animals. (Credit: © ChantalS / Fotolia)
"Our results show that infant speech perception is resilient and flexible," explained Athena Vouloumanos, an assistant professor at NYU and the study’s lead author. "This means that our recognition of speech is more refined at an earlier age than we’d thought."
It is well-known that adults’ speech perception is fine-tuned — they can detect speech among a range of ambiguous sounds. But much less is known about infants’ capability to make similar assessments. Understanding when these abilities emerge would shed new light on how early in life we develop the ability to recognize speech.
In order to gauge the aptitude to perceive speech at an early age, the researchers examined the responses of infants, approximately nine months in age, to recorded human and parrot speech and non-speech sounds. Human (an adult female voice) and parrot speech sounds included the words “truck,” “treat,” “dinner,” and “two.” The adult non-speech sounds were whistles and a clearing of the throat while the parrot non-speech sounds were squawks and chirps. The recorded parrot speech sounds were those of Alex, an African Gray parrot that had the ability to talk and reason and whose behaviors were studied by psychology researcher Irene Pepperberg.
Since infants cannot verbally communicate their recognition of speech, the researchers employed a commonly used method to measure this process: looking longer at what they find either interesting or unusual. Under this method, looking longer at a visual paired with a sound may be interpreted as a reflection of recognition. In this study, sounds were paired with a series of visuals: a checkerboard-like image, adult female faces, and a cup.
The results showed that infants listened longer to human speech compared to human non-speech sounds regardless of the visual stimulus, revealing the ability to recognize human speech independent of the context.
Their findings on non-human speech were more nuanced. When paired with human-face visuals or human artifacts like cups, the infants listened to parrot speech longer than they did non-speech, such that their preference for parrot speech was similar to their preference for human speech sounds. However, this did not occur in the presence of other visual stimuli. In other words, infants were able to distinguish animal speech from non-speech, but only in some contexts.
"Parrot speech is unlike human speech, so the results show infants have the ability to detect different types of speech, even if they need visual cues to assist in this process," explained Vouloumanos.
Source: Science Daily
ScienceDaily (July 17, 2012) — Johns Hopkins researchers say they have discovered a cause-and-effect relationship between two well-established biological risk factors for schizophrenia previously believed to be independent of one another.
The findings could eventually lead researchers to develop better drugs to treat the cognitive dysfunction associated with schizophrenia and possibly other mental illnesses.
Researchers have long studied the role played in the brain’s neurons by the Disrupted-in-Schizophrenia 1 (DISC1) gene, mutations in which carry one of the strongest known links to an increased risk of developing the debilitating psychiatric illness.
In a study published in the journal Molecular Psychiatry, the laboratory of Mikhail V. Pletnikov, M.D., Ph.D., in collaboration with the laboratory of Solomon H. Snyder, M.D., D.Sc., instead looked at the role the DISC1 gene plays in glial cells known as astrocytes, a kind of support cell in the brain that helps neurons communicate with one another.
"Abnormalities in glia cells could be as important as abnormalities in neuronal cells themselves," says Pletnikov, an associate professor of psychiatry and behavioral sciences at the Johns Hopkins University School of Medicine, and the study’s leader. "Most gene work has been done with neurons. But we also need to understand a lot more about the role that genetic mutations in glia cells play because neuron-glia interaction appears crucial in ensuring the brain operates normally."
Besides the paranoia and hallucinations that characterize the disease, schizophrenics have cognitive deficits, leaving them unable to think clearly or organize their thoughts and behavior.
Previous studies found that one of the roles of astrocytes is to secrete the neurotransmitter D-serine, which helps promote the transmission of glutamate in the brain, believed to be a key to cognitive function. Schizophrenics have decreased glutamate transmission. It appears, Pletnikov says, that people with DISC1 mutations associated with the psychiatric illness are faster to metabolize D-serine, which leads to a decrease in the apparently crucial transmitter.
In clinical trials, other researchers are trying to boost D-serine levels in people with schizophrenia to see if they can boost cognitive function.
In the new study, the Johns Hopkins researchers found that DISC1 is directly involved in regulating the production of D-serine by the enzyme known as serine racemase.
The researchers found that DISC1 normally binds to serine racemase and stabilizes it. The mutant DISC1 in patients with schizophrenia cannot bind with serine racemase, and instead destabilizes and destroys it. The result is a deficiency of D-serine.
The Hopkins researchers bred mice with the mutant DISC1 protein expressed only in astrocytes and, as predicted, the animals had decreased levels of D-serine. These mice also showed abnormal behavior “consistent with schizophrenia,” Pletnikov says. For example, the rodents showed sensitivity to psycho-stimulants that target glutamate transmission. By treating the mice with D-serine, the scientists were able to ameliorate the schizophrenic-like symptoms. Mice without the DISC1 mutation in astrocytes had normal D-serine levels.
Pletnikov says that in the future, researchers hope that they can target the unstable junction between the abnormal DISC1 and serine racemase. If drugs, for example, can be found to increase glutamate transmission in humans, doctors may be able to improve cognitive function in schizophrenics. He says a DISC1 mutation may also be an important risk factor in other psychiatric disorders.
"Abnormal glutamate transmission is believed to be present in patients with bipolar disorder, major depression and possibly anxiety disorders, so our findings could apply to other psychiatric diseases," he says.
Source: Science Daily
ScienceDaily (July 17, 2012) — Scientists have discovered two genetic variants associated with the substantial, rapid weight gain occurring in nearly half the patients treated with antipsychotic medications, according to two studies involving the Centre for Addiction and Mental Health (CAMH).
These results could eventually be used to identify which patients have the variations, enabling clinicians to choose strategies to prevent this serious side-effect and offer more personalized treatment.
"Weight gain occurs in up to 40 per cent of patients taking medications called second-generation or atypical antipsychotics, which are used because they’re effective in controlling the major symptoms of schizophrenia," says CAMH Scientist Dr. James Kennedy, senior author on the most recent study published online in the Archives of General Psychiatry.
This weight gain can lead to obesity, type 2 diabetes, heart problems and a shortened life span. “Identifying genetic risks leading to these side-effects will help us prescribe more effectively,” says Dr. Kennedy, head of the new Tanenbaum Centre for Pharmacogenetics, which is part of CAMH’s Campbell Family Mental Health Research Institute. Currently, CAMH screens for two other genetic variations that affect patients’ responses to psychiatric medications.
Each study identified a different variation near the melanocortin-4 receptor (MC4R) gene, which is known to be linked to obesity.
In the Archives of General Psychiatry study, people carrying two copies of a variant gained about three times as much weight as those with one or no copies, after six to 12 weeks of treatment with atypical antipsychotics. (The difference was approximately 6 kg versus 2 kg.) The study had four patient groups: two from the U.S., one in Germany and one from a larger European study.
"The weight gain was associated with this genetic variation in all these groups, which included pediatric patients with severe behaviour or mood problems, and patients with schizophrenia experiencing a first episode or who did not respond to other antipsychotic treatments," says CAMH Scientist Dr. Daniel Müller. "The results from our genetic analysis combined with this diverse set of patients provide compelling evidence for the role of this MC4R variant. Our research group has discovered other gene variants associated with antipsychotic-induced weight gain in the past, but this one appears to be the most compelling finding thus far."
Three of the four groups had never previously taken atypical antipsychotics. Different groups were treated with drugs such as olanzapine, risperidone, aripiprazole or quetiapine, and compliance was monitored to ensure the treatment regimen was followed. Weight and other metabolic-related measures were taken at the start of and during treatment.
A genome-wide association study was conducted on pediatric patients by the study’s lead researcher, Dr. Anil Malhotra, at the Zucker Hillside Hospital in Glen Oaks, NY. In this type of study, variations are sought across a person’s entire set of genes to identify those associated with a particular trait. The result pointed to the MC4R gene.
This gene’s role in antipsychotic-induced weight gain had been identified in a CAMH study published earlier this year in The Pharmacogenomics Journal, involving Drs. Müller and Kennedy, and conducted by PhD student Nabilah Chowdhury. They found a different variation on MC4R that was linked to the side-effect.
For both studies, CAMH researchers did genotyping experiments to identify the single changes to the sequence of the MC4R gene — known as single nucleotide polymorphisms (SNPs) — related to the drug-induced weight gain side-effect.
The MC4R gene encodes a receptor involved in the brain pathways regulating weight, appetite and satiety. “We don’t know exactly how the atypical antipsychotics disrupt this pathway, or how this variation affects the receptor,” says Dr. Müller. “We need further studies to validate this result and eventually turn this into a clinical application.”
Source: Science Daily