ScienceDaily (Aug. 23, 2012) — Scientists at the University of Houston (UH) have discovered what may be a key ingredient in the fight against Parkinson’s disease.
Affecting more than 500,000 people in the U.S., Parkinson’s disease is a degenerative disorder of the central nervous system marked by a loss of certain nerve cells in the brain, causing a lack of dopamine. These dopamine-producing neurons are in a section of the midbrain that regulates body control and movement. In a study recently published in the Proceedings of the National Academy of Sciences (PNAS), researchers from the UH Center for Nuclear Receptors and Cell Signaling (CNRCS) demonstrated that the nuclear receptor liver X receptor beta (LXRbeta) may play a role in the prevention and treatment of this progressive neurodegenerative disease.
"LXRbeta performs an important function in the development of the central nervous system, and our work indicates that the presence of LXRbeta promotes the survival of dopaminergic neurons, which are the main source of dopamine in the central nervous system," said CNRCS director and professor Jan-Åke Gustafsson, whose lab discovered LXRbeta in 1995. "The receptor continues to show promise as a potential therapeutic target for this disease, as well as other neurological disorders."
To better understand the relationship between LXRbeta and Parkinson’s disease, the team worked with a potent neurotoxin called MPTP, a contaminant of some street drugs that caused Parkinson’s in people who consumed them. In lab settings, MPTP is used in murine models to simulate the disease and to study its pathology and possible treatments.
The researchers found that the absence of LXRbeta increased the harmful effects of MPTP on dopamine-producing neurons. Additionally, they found that using a drug that activates LXRbeta receptors prevented the destructive effects of MPTP and, therefore, may offer protection against the neurodegeneration of the midbrain.
"LXRbeta is not expressed in the dopamine-producing neurons, but instead in the microglia surrounding the neurons," Gustafsson said. "Microglia are the police of the brain, keeping things in order. In Parkinson’s disease the microglia are overactive and begin to destroy the healthy neurons in the neighborhood of those neurons damaged by MPTP. LXRbeta calms down the microglia and prevents collateral damage. Thus, we have discovered a novel therapeutic target for treatment of Parkinson’s disease."
Source: Science Daily
Better diagnosis and treatment of a crippling inherited nerve disorder may be just around the corner, thanks to an international team spanning Asia, Europe and the United States. The team had been hunting through DNA for the cause of the inherited nerve disorder known as spinocerebellar ataxia, or SCA, which causes progressive loss of balance, muscle control and the ability to walk. Thanks to their diligence and detective work, they have discovered the disease gene in a region of chromosome 1 where a group from the Netherlands had previously shown linkage with a form of SCA called SCA19, and where the Taiwanese group on the new paper had shown similar linkage in a family with a form of the disease then called SCA22. The international team, from France, Japan, Taiwan and the USA, has published its discovery in the Annals of Neurology. The Dutch group has also published results in the same issue of the journal.

Their paper reveals that mutations in the gene KCND3 were found in six families in Asia, Europe and the United States that have been haunted by SCA. Their results will allow for a better understanding of why nerves in the brain’s movement-controlling centre die, and how new DNA mapping techniques can find the causes of other diseases that run in families.
Margit Burmeister, Ph.D., a geneticist at University of Michigan Health System (U-M), helped lead the work and stressed that the gene could not have been found without a great deal of DNA detective work and the cooperation of the families who volunteered to let researchers map all the DNA of multiple members of their family tree. ‘We combined traditional genetic linkage analysis in families with inherited diseases with whole exome sequencing of an individual’s DNA, allowing us to narrow down and ultimately identify the mutation,’ she says. ‘This new type of approach has already resulted in many new gene identifications, and will bring in many more.’
The gene is very important as it manages the production of a protein that allows nerve cells to ‘talk’ to one another through the flow of potassium. Pinpointing its role as a cause of ataxia will now allow more people with ataxia to learn the exact cause of their disease, give a very specific target for new treatments, and perhaps allow the families to stop the disease from affecting future generations.
U-M neurologist Vikram Shakkottai, M.D., Ph.D., an ataxia specialist and co-author on the paper, also notes that the new genetic information will help patients find out the specific cause of their disease. He and his colleagues are already working to find drugs that might alter potassium flow, and provide a treatment for a group of diseases that currently are only treated with supportive care such as physical activity and balance training as patients deteriorate. ‘Many of the families who come to our clinic for treatment don’t have a recognised genetic mutation, so it’s important to find new genetic mutations to explain their symptoms,’ says Shakkottai. ‘But at the same time, this research is helping us understand a common mechanism of nerve cell dysfunction in progressive and non-progressive disease.’
Their findings, however, are not restricted to ataxia. The researchers were also able to show that when KCND3 is mutated, it causes poor communication between nerve cells in the cerebellum as well as the death of those cells. This discovery could aid research on other neurological disorders involving balance and movement.
The Dutch team, which published its findings about KCND3 at the same time, studied families in the Netherlands and found that mutations in the gene are responsible for SCA19, the cause of which had until now been a mystery. ‘In other words, mutations in this gene are not uncommon and present all over the world,’ says Burmeister. ‘This means that in the future, this gene should be tested for mutations as part of a clinical genetic test panel for patients with ataxia symptoms. Because a generation can be skipped, it may even be relevant in some sporadic cases - those where the patient isn’t aware of any other family members with a similar disease.’
Source: Cordis News
Helium reveals gibbon’s soprano skill
Apes are unlikely to become virtuosos at the opera house, but gibbons have naturally mastered some of the vocal techniques that human sopranos rely on, scientists in Japan report.
The research shows that, like humans, gibbons use a ‘source–filter’ mode of sound generation. The sound originates from the creatures’ vocal folds as a mixture of different harmonics, which are multiples of the frequency at which the vocal folds vibrate. The resonant frequencies of the vocal tract then determine which of these harmonics are projected. By altering the position of the mouth, lips and teeth, humans vary these resonant frequencies to make the different sounds required for speech.
The gibbon’s melodious calling bears many similarities to the techniques of human singers. Like professional sopranos, gibbons tune the resonant frequency of their vocal tract to the pitch frequency generated by the vocal folds to amplify the sound. Acoustic physicist Joe Wolfe of the University of New South Wales in Sydney, Australia, says that this type of “resonance tuning” is something that comes fairly easily to human singers and is key to their ability to project their voice over a loud orchestra.
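The source-filter mechanism and the resonance tuning described above lend themselves to a toy numerical sketch. The snippet below is purely illustrative, not the researchers' model: it assumes a single Lorentzian-shaped resonance (real vocal tracts have several formants), and the fundamental frequency, resonance position and bandwidth are invented values.

```python
# Toy illustration of the source-filter model: the vocal folds emit
# harmonics (integer multiples of the fundamental frequency), and the
# vocal tract's resonance boosts whichever harmonics fall near it.

def harmonic_amplitudes(f0, n_harmonics, resonance_f, bandwidth):
    """Relative amplitude of each harmonic after one resonant filter.

    The filter is a simple Lorentzian peak centred at resonance_f;
    real vocal tracts have several such resonances (formants).
    """
    amplitudes = []
    for n in range(1, n_harmonics + 1):
        f = n * f0  # frequency of the n-th harmonic
        gain = 1.0 / (1.0 + ((f - resonance_f) / bandwidth) ** 2)
        amplitudes.append((f, gain))
    return amplitudes

# Untuned call: the vocal tract resonance sits between harmonics.
untuned = harmonic_amplitudes(400, 5, resonance_f=1100, bandwidth=150)

# "Resonance tuning", as sopranos and gibbons do: move the resonance
# onto the fundamental itself, so the first harmonic is amplified most.
tuned = harmonic_amplitudes(400, 5, resonance_f=400, bandwidth=150)

print(max(untuned, key=lambda fg: fg[1]))  # loudest harmonic when untuned
print(max(tuned, key=lambda fg: fg[1]))    # fundamental dominates when tuned
```

In the untuned call the third harmonic (1200 Hz, nearest the 1100 Hz resonance) comes out loudest; after tuning, the fundamental itself carries the power, which is what lets the sound project.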
22 August 2012
Scientists have found a switch in the brain which may explain why smoking cannabis causes psychosis and addiction in more than one in ten users.
The team, at Aberdeen University, found a genetic difference in the switch whose distribution may have been shaped by early humans’ encounters with the drug in prehistoric times. The difference may also explain why some people could be more susceptible to conditions such as obesity.

The researchers, at the university’s Kosterlitz Centre for Therapeutics, studied genetic differences around a gene called ‘CNR1’, which produces the brain’s cannabinoid receptors. These receptors control parts of the brain involved in memory, mood, appetite and pain.
Cannabinoid receptors activate these areas of the brain when they are triggered by naturally occurring chemicals in the body known as endocannabinoids. Chemicals found in cannabis and ‘skunk’ mimic the action of endocannabinoids. It is known that cannabis has pain-relieving and anti-inflammatory properties which can help treat diseases such as multiple sclerosis and arthritis.
However, developing drugs from cannabis to treat these conditions is hampered by the fact that such drugs will have psychoactive side effects - and smoked cannabis can cause addiction and psychosis in up to 12 per cent of users.
Dr Alasdair MacKenzie, who led the research, said:
“We looked at one specific genetic difference in CNR1 because we know it is linked to obesity and addiction. What we found was a mutation that caused a change in the genetic switch for the gene itself - a switch that is very ancient and has remained relatively unchanged in over three hundred million years of evolution, since before the time of the dinosaurs.
“These genetic ‘switches’ regulate the gene itself, ensuring that it is turned on or off in the right place at the right time and in the right amount. It is normally thought that mutations cause disease by reducing the function of the gene, or the switch that controls it. In this case however, the mutation actually increased the activity of the switch in parts of the brain that control appetite and pain, and also, and most especially, in the part of the brain called the hippocampus, which is affected in psychosis.”
He added: “We know that this overactive switch is relatively rare in Europeans, but is quite common in African populations. But we were all once African, so something must have decreased it in our early ancestors who left Africa and migrated through Central Asia towards Europe and the north. One possibility we are keen to explore is that once in Central Asia these early migrants came into contact with the cannabis plant, which we know was endemic across that area at that time.
“It is possible that the side effects of taking cannabis were such that people with the mutation were not so effective in producing and raising children. Therefore, over the generations the numbers of people with the mutation decreased.
“This work is at a very early stage however, and there are likely to be more exciting discoveries - not only on how these differences came about, but also about the role of this genetic switch in health and disease.”
Co-researcher Dr Scott Davidson said: “Further analysis of this mutation will help us to understand many of the side effects which are associated with cannabis use, such as addiction and psychosis.”

Professor Ruth Ross, head of the Kosterlitz Centre and an internationally recognised expert in cannabis pharmacology, said: “Previously in drug research, attempts to detect the causes of adverse drug reactions have focused on the genes themselves.
“Our study is one of the first to explore the possibility that changes in gene switches are involved in causing side effects to drugs. We believe this approach will be crucially important in the future development of more effective personalised medicine, with fewer side effects.”
One question that is intriguing the research team is why this overactive genetic switch evolved in the first place.
A collaborative research effort by scientists at the University of North Carolina School of Medicine, Duke University, and University College London in the UK sheds new light on alcohol-related birth defects.
The project, led by Kathleen K. Sulik, PhD, a professor in the Department of Cell and Developmental Biology and the Bowles Center for Alcohol Studies at UNC, could help enhance how doctors diagnose birth defects caused by alcohol exposure in the womb. The findings also illustrate how the precise timing of that exposure could determine the specific kinds of defects.

“We now know that maternal alcohol use is the leading known and preventable cause of birth defects and mental disability in the United States,” Sulik said. “Alcohol’s effects can cause a range of cognitive, developmental and behavioral problems that typically become evident during childhood, and last a lifetime.”
Fetal alcohol syndrome (FAS) is at the severe end of fetal alcohol spectrum disorders (FASD). First described in 1972, FAS is recognized by a specific pattern of facial features: small eyelid openings, a smooth philtrum (absence of the vertical groove between the nose and upper lip), and a thin upper lip border.
In its full-blown state, FAS affects roughly 1 in 750 live births in the U.S. And while clinicians typically look for those classical facial features in making a diagnosis, within the broader classification of FASD “adverse outcomes vary considerably and most individuals don’t exhibit the facial characteristics that currently define FAS,” said the study’s lead author Robert J. Lipinski, PhD, a postdoctoral scientist in Sulik’s lab. “This study could expand the base of diagnostic criteria used by clinicians who suspect problems caused by maternal alcohol use.”
In their animal-based studies, the Sulik lab team has collaborated with co-author G. Allan Johnson, PhD, and his group at Duke University’s Center for In Vivo Microscopy. Johnson, professor of radiology and physics, has developed new imaging tools with spatial resolution up to a million times higher than clinical magnetic resonance imaging (MRI). These include small-bore tools suitable for imaging fetal mice that are only 15 mm long.
To quantify facial shape from MRI data, the study team turned to co-author Peter Hammond, a professor of computational biology at UCL’s Institute of Child Health, in London. Hammond invented powerful new techniques for 3D shape analysis that have already proven successful in objectively defining facial shape changes in humans.
In the study, described in the August 22, 2012 issue of the online journal PLOS ONE, Lipinski and Sulik treated one group of mice with alcohol on their seventh day of pregnancy, a time corresponding to the third week of pregnancy in humans. A second group of mice was treated just 36 hours later, approximating the fourth week of human pregnancy. The amount of alcohol given was large, “high doses that most women wouldn’t achieve unless they were alcoholic and had a tolerance for alcohol,” Sulik said.
Near the end of pregnancy, the fetuses were then imaged at Duke University. These 3D data sets showed individual brain regions, as well as accurate and detailed facial surfaces, from which Hammond and research assistant and co-author Michael Suttie performed shape analyses.
The team found that the earlier alcohol exposure time elicited the classic FAS facial features, including characteristic abnormalities of the upper lip and eyes. What they observed in fetuses exposed just 36 hours later, however, was a surprise. These mice exhibited unique and in some cases opposing facial patterns, such as a shortened upper lip with a philtrum present, and a brain that appeared wide at the front rather than too narrow.
“Overall, the results of our studies show that alcohol can cause more than one pattern of birth defects, and that the type and extent of brain abnormalities—which are the most devastating manifestation of prenatal alcohol exposure—in some cases may be predicted by specific facial features,” Sulik said. “And, importantly, alcohol can cause tremendously devastating and permanent damage at a time in development when most women don’t recognize that they’re pregnant.”
Source: Newswise
August 22, 2012
Animals that literally have holes in their brains can go on to behave as normal adults if they’ve had the benefit of a little cognitive training in adolescence. That’s according to new work in the August 23 Neuron, a Cell Press publication, featuring an animal model of schizophrenia in which rats with particular neonatal brain injuries develop schizophrenia-like symptoms.
"The brain can be loaded with all sorts of problems," said André Fenton of New York University. "What this work shows is that experience can overcome those disabilities."
Fenton’s team made the discovery completely by accident. His team was interested in what Fenton considers a core problem in schizophrenia: the inability to sift through confusing or conflicting information and focus on what’s relevant.
"As you walk through the world, you might be focused on a phone conversation, but there are also kids in the park and cars and other distractions," he explained. "These information streams are all competing for our brain to process them. That’s a really challenging situation for someone with schizophrenia."
Fenton and his colleagues developed a laboratory test of cognitive control needed for that kind of focus. In the test, rats had to learn to avoid a foot shock while they were presented with conflicting information. Normal rats can manage that task quickly. Rats with brain lesions can also manage this task, but only up until they become young adults—the equivalent of an 18- or 20-year-old person—when signs of schizophrenia typically set in.
While that was good to see, Fenton says, it wasn’t really all that surprising. But then some unexpected circumstances in the lab led them to test animals with adolescent experience in the cognitive control test again, once they had grown into adults.
These rats should have shown cognitive control deficits similar to those of rats that had not received prior cognitive training, or so the researchers thought. Instead, they were just fine. Their schizophrenia-like symptoms had somehow been averted.
Fenton believes their early training for focus forged some critical neural connections, allowing the animals to compensate for the injury still present in their brains in adulthood. Not only were the animals’ behaviors normalized with training, but the patterns of activity in their brains were also.
The finding is consistent with the notion that mental disorders are the consequence of problems in brain development that may have begun years earlier. It raises the optimistic hope that the right kinds of experiences at the right time could change the future by enabling people to better manage their diseases and better function in society. Adolescence, when the brain undergoes significant change and maturation, might be a prime time for such training.
"You may have a damaged brain, but the consequences of that damage might be overcome without changing the damage itself," Fenton says. "You could target schizophrenia, but other disorders aren’t very different" - take autism or depression, for example.
And really, in this world of infinite distraction, couldn’t we all use a little more cognitive control?
Source: medicalxpress.com
22 August 2012 by Ewen Callaway
Genome study may explain links between paternal age and conditions such as autism.

Older fathers’ sperm have more mutations — as do their children.
V. Peñafiel/Flickr/GETTY
In the 1930s, the pioneering geneticist J. B. S. Haldane noticed a peculiar inheritance pattern in families with long histories of haemophilia. The faulty mutation responsible for the blood-clotting disorder tended to arise on the X chromosomes that fathers passed to their daughters, rather than on those that mothers passed down. Haldane subsequently proposed that children inherit more mutations from their fathers than their mothers, although he acknowledged that “it is difficult to see how this could be proved or disproved for many years to come”.
22 August 2012 by Kai Kupferschmidt
In June of last year, a 43-year-old woman was admitted to the Clinical Center of the National Institutes of Health in Bethesda, Maryland, for a lung disease. Doctors knew she was carrying a highly resistant form of a deadly bacterium known as Klebsiella pneumoniae—although it didn’t make her sick—and they placed her in isolation. When the woman was discharged, no one else appeared to have become infected. A few weeks later, however, another patient was found to be carrying the bacterium, and over the next 3 months, 12 more intensive care patients contracted it. Six died as a direct result of the infection.
Doctors could not make sense of the outbreak with the usual methods: A survey of bed locations showed that the first patient had had no direct contact with any of the others and, in theory, Klebsiella might have been introduced into the hospital multiple times. So physicians turned to the bacterium’s genome for answers. The approach, known as genomic epidemiology, helped them track the path of the microbe, contain the disease, and save lives, according to a new study.

Tracking a killer. Full-genome sequencing revealed the movements of Klebsiella (shown) within one hospital. Credit: Image courtesy of Adrian Zelazny
Genomic epidemiology makes use of the fact that when bacteria divide, they accumulate mutations. As a result, the bacterial genome differs slightly—often by just one or two letters of genetic code, or base pairs—from one patient to the next. By fully sequencing the genomes of patients’ bacteria and finding these minute differences, researchers can track microbial movements with unprecedented precision. The technique has already been used to reconstruct the spread of methicillin-resistant Staphylococcus aureus (MRSA) around the world and to pinpoint the origin of a cholera outbreak in Haiti.
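The core logic is simple enough to sketch. In the toy example below, the sequences, patient labels and nearest-neighbour "likely source" heuristic are all invented for illustration; real analyses compare millions of base pairs and use proper phylogenetic methods.

```python
# Toy sketch of genomic epidemiology: isolates whose genomes differ by the
# fewest base pairs are the most plausible direct transmission links.
# Sequences here are short, made-up alignments, not real Klebsiella genomes.

def snp_distance(a, b):
    """Count positions at which two aligned sequences differ."""
    assert len(a) == len(b), "sequences must be aligned to equal length"
    return sum(1 for x, y in zip(a, b) if x != y)

isolates = {
    "patient_1": "ACGTACGTAC",  # index case
    "patient_2": "ACGTACGTAT",  # 1 SNP away from patient 1
    "patient_3": "ACGAACGTAT",  # 1 SNP from patient 2, 2 from patient 1
}

def likely_source(name, isolates):
    """Closest other isolate by SNP distance - a crude transmission guess."""
    others = (k for k in isolates if k != name)
    return min(others, key=lambda k: snp_distance(isolates[k], isolates[name]))

print(likely_source("patient_3", isolates))  # patient_2
```

With genome-scale sequences, the same pairwise-distance idea underpins the transmission reconstructions used in investigations like the one in Bethesda.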
It also helped the doctors at the hospital in Bethesda. Comparing the genomes from all patients showed that the female patient admitted in June had indeed initiated the outbreak; the researchers showed that the bacteria had been transmitted from her to other patients three times independently. Apparently, transmission occurred in ways the researchers didn’t understand, says Tara Palmore, an infectious disease physician at the hospital. “When we realized there was more than met the eye, we started testing everyone in the hospital,” she says. That helped identify four more infected patients outside the intensive care unit, the scientists report online today in Science Translational Medicine. They were quickly isolated, which Palmore believes prevented further spread.
Just how the microbes were transmitted is still unclear. Palmore assumes that the bacteria mainly traveled on the hands of doctors. But the clinic had stationed a person outside the isolation rooms to make sure everyone who entered followed a hygiene regimen 24/7. That suggests that bacteria might have established colonies on surfaces or medical equipment and spread that way as well. “The conventional wisdom is that Klebsiellas do not really survive in the environment, but we found them in six sink drains and a ventilator,” Palmore says.
"This small study demonstrates the potential power of whole genome sequencing for outbreak investigation and surveillance," says Sharon Peacock, a microbiologist at the University of Cambridge in the United Kingdom who was not involved in the work. And infectious disease specialist Dag Harmsen of the University Clinic of Münster in Germany says it is "further proof that the time is ripe for using genomic sequencing of pathogens in a hospital setting." The paper also highlights the dangers of resistant Gram-negative bacteria like K. pneumoniae, he adds. In many patients, the bacteria were not susceptible to any available antibiotic, not even colistin, an old compound used only when all else fails. "This is even more dramatic than MRSA, because you have nothing left to treat the patients with," Harmsen says. Since the outbreak, every patient at the hospital is checked for such dangerous pathogens; one more resistant Klebsiella case - although a different strain - has been found so far.
Genomic epidemiology could make it easier for hospitals to deal with similar outbreaks, Palmore says. “A lot of academic centers have the ability to do this now,” she says. The cost is becoming less of an issue; during last year’s outbreak, scientists still paid about $2000 per genome sequenced; now that would be closer to $500. But Peacock cautions that it still takes bioinformatics specialists several weeks to interpret the data. “This technology will not be applicable to routine clinical practice until automated interpretation tools become available.”
ScienceDaily (Aug. 22, 2012) — A low dose of the sedative clonazepam alleviated autistic-like behavior in mice with a mutation that causes Dravet syndrome in humans, University of Washington researchers have shown.

(Credit: © Vasiliy Koval / Fotolia)
Dravet syndrome is an infant seizure disorder accompanied by developmental delays and behavioral symptoms that include autistic features. It usually arises spontaneously, from a gene mutation present in the affected child but not found in either parent.
Studies of mice with a similar gene mutation are revealing the overly excited brain circuits behind the autistic traits and cognitive impairments common in this condition. The research report appears in the Aug. 23 issue of Nature. Dr William Catterall, professor and chair of pharmacology at the UW, is the senior author.
Dravet syndrome mutations cause loss-of-function of the human gene called SCN1A. People or mice with two copies of the mutation do not survive infancy; one copy results in major disability and sometimes early death. The mutation causes malformation in one type of sodium ion channels, the tiny pores in nerve cells that produce electrical signals by gating the flow of sodium ions.
The Catterall lab is studying these defective ion channels and their repercussions on cell-to-cell signaling in the brain. The researchers are also documenting the behavior of mice with this mutation, compared with their unaffected peers. Their findings may help explain how the sporadic gene mutations that cause Dravet syndrome lead to its symptoms of cognitive deficit and autistic behaviors.
Scientists at Georgia State University have found that the ability to hear is lessened when, as a result of injury, a region of the brain responsible for processing sounds receives both visual and auditory inputs.
Yu-Ting Mao, a former graduate student under Sarah L. Pallas, professor of neuroscience, explored how neuroplasticity - the brain’s ability to change - affects the processing of sounds when both visual and auditory information is sent to the auditory thalamus.
The study was published in the Journal of Neuroscience.
The auditory thalamus is the region of the brain responsible for carrying sound information to the auditory cortex, where sound is processed in detail.
When a person or animal loses input from one of the senses, such as hearing, the region of the brain that processes that information does not become inactive, but instead gets rewired with input from other sensory systems.
In the case of this study, early brain injury resulted in visual inputs into the auditory thalamus, which altered how the auditory cortex processes sounds.
The cortical “map” for discriminating different sound frequencies was significantly disrupted, she explained.
“One of the possible reasons the sound frequency map is so disrupted is that visual responsive neurons are sprinkled here and there, and we also have a lot of single neurons that respond to both light and sound,” Pallas said. “So those strange neurons sprinkled there probably keeps the map from forming properly.”
Mao also discovered reduced sensitivity and slower responses of neurons in the auditory cortex to sound.
Finally, the neurons in the auditory cortex were less sharply tuned to different frequencies of sound.
“Generally, individual neurons will be pretty sensitive to one sound frequency that we call their ‘best frequency,’” Pallas said. “We found that they would respond to a broader range of frequencies after the rewiring with visual inputs.”
While Pallas’ research seeks to create a basic understanding of brain development, knowledge gained from her lab’s studies may help give people who are deaf or blind, or who have suffered brain injuries, ways to keep visual and auditory functions from being compromised.
“Usually we think of plasticity as a good thing, but in this case, it’s a bad thing,” she said. “We would like to limit the plasticity so that we can keep the function that’s supposed to be there.”
Source: Georgia State University
22 August 2012 by Jim Giles
More than a year after it won the quiz show Jeopardy!, IBM’s supercomputer is learning how to help doctors diagnose patients
It is more than a year since Watson, IBM’s famous supercomputer, opened a new frontier for artificial intelligence by beating human champions of the quiz show Jeopardy!. Now Watson is learning to use its language skills to help doctors diagnose patients.
Progress is most advanced in cancer care, where IBM is working with several US hospitals to build a virtual physicians’ assistant. “It’s a machine that can read everything and forget nothing,” says Larry Norton, a doctor at the Memorial Sloan-Kettering Cancer Center in New York, who is collaborating with IBM.
When playing Jeopardy!, Watson analysed each question in a bid to guess what it was about. Then it looked for possible answers in its database, made up of sources such as encyclopaedias, scoring each according to the evidence associated with it and answering with the highest rated answer. The system takes a similar approach when dealing with medical questions, although in this case it draws on information from medical journals and clinical guidelines.
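The score-and-rank loop just described - generate candidate answers, score each by its supporting evidence, answer with the top score - can be caricatured in a few lines of Python. Everything below (the corpus, the word-overlap scoring rule, the question) is invented for illustration and bears no resemblance to Watson's actual evidence engine.

```python
# Caricature of evidence-based question answering: pull candidates from a
# tiny "database", score each by how much of the question its evidence
# supports, and answer with the highest-rated candidate.

corpus = {
    "aspirin": "pain fever inflammation blood thinner",
    "insulin": "hormone blood sugar glucose diabetes",
    "penicillin": "antibiotic bacterial infection mold",
}

def score(question, evidence):
    """Fraction of the question's words supported by a candidate's evidence."""
    q_words = set(question.lower().split())
    e_words = set(evidence.split())
    return len(q_words & e_words) / len(q_words)

def answer(question):
    """Rank every candidate by its evidence score and return the best."""
    ranked = sorted(corpus, key=lambda c: score(question, corpus[c]),
                    reverse=True)
    return ranked[0]

print(answer("which hormone regulates blood glucose in diabetes"))  # insulin
```

A medical deployment swaps the toy corpus for journals, guidelines and patient records, and reports a confidence level alongside each ranked answer rather than a single winner.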
To test the system, Watson was first tasked with answering questions taken from Doctor’s Dilemma, a competition for trainee doctors that takes place at the annual meeting of the American College of Physicians. Watson was given 188 questions that it had not seen before and achieved around 50 per cent accuracy - not bad for an early test, but hardly ideal (Artificial Intelligence, doi.org/h6m).
To improve, Watson is now absorbing records - tens of thousands at Sloan-Kettering alone - of treatments and outcomes associated with individual patients. Given data on a new patient, Watson looks for information on those with similar symptoms, as well as the treatments that have been most successful. The idea is that it will give doctors a range of possible diagnoses and treatment options, each with an associated level of confidence. The result will be a system that its creators say can suggest nuanced treatment plans that take into account factors like drug interactions and a patient’s medical history.
William Audeh, a doctor at Cedars-Sinai Medical Center in Los Angeles, who is working with IBM, says the last few months have involved “filling Watson’s brain” with medical data. Watson is answering basic questions based on the treatment guidelines that are published by medical societies and is showing “very positive” results, he adds.
The technology is particularly useful in oncology because doctors struggle to keep up with the explosion of genomic and molecular data generated about each cancer type. This means it can take years for findings to translate into medical practice. By contrast, Watson can absorb new results and relay them to doctors quickly, together with an estimate of their potential usefulness. “Watson really has great potential,” says Audeh. “Cancer needs it most because it’s becoming so complicated so quickly.”
The IBM system could also approve treatment requests more quickly. At WellPoint, one of the largest insurers in the US, nurses use guidelines and patient history to determine if a request is in line with company policy. Nurses are now training Watson by feeding it test requests and observing the answers. Progress is good and the system could be deployed next year, says WellPoint’s Cindy Wakefield. “Now it can take up to a couple of days,” she says. “We hope Watson can return the accurate recommendation in a matter of minutes.”
Source: NewScientist
August 21, 2012 by Kathleen Raven
Stem cell treatment could lower inflammation levels and demonstrate whether autism is an autoimmune disease

Families with autistic children must navigate a condition where questions outnumber the answers, and therapies remain sparse and largely ineffective. To address this situation, a clinical trial being conducted by the Sutter Neuroscience Institute in Sacramento, California, began recruiting participants today for a highly experimental stem cell therapy for autism. The institute plans to find 30 autistic children between ages 2 and 7 with cord blood banked at the privately run Cord Blood Registry, located about 100 miles west of the institute. Already one other clinical trial, with 37 total participants between ages 3 and 12 years old, has been completed in China. The researchers affiliated with Beike Biotechnology in Shenzhen, the firm that sponsored the study, have not yet published any papers from that trial, which used stem cells from donated cord blood. Mexican researchers are currently recruiting kids for yet another type of autism stem cell trial, one that will harvest cells from the participants’ fat tissue.

But for each of these officially registered trials, many more undocumented stem cell therapy treatments take place for clients who are willing to pay enough. “Our research is important because many people are going to foreign countries and spending a lot of money on therapy that may not be valid,” says Michael Chez, a pediatric neurologist and lead investigator of the study at Sutter.
A major difference between the Sutter trial and those in China is that Chez’s will use each child’s own stem cells, rather than those from a donor. Chez hypothesizes that one way autologous stem cell infusion might work is by reducing inflammation within the body’s immune system. This would be consistent with previous research suggesting that autism may be an autoimmune disease. “One of our exploratory goals will be to look at inflammatory markers in cells,” he says.
The study’s primary goal, however, will be assessing changes in patients’ speaking and understanding of vocabulary. For each individual, researchers will create a baseline benchmark that establishes current skill levels. The group will be evenly divided, with one initially receiving an infusion of their own, unmodified cord blood stem cells and the other a placebo treatment of saline injection. Six months later, all of the children will be tested on their ability to comprehend and form words. The groups will then be switched. In the course of the 13-month-long study, both groups will receive only one stem cell therapy infusion.
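The blinded crossover design described above can be sketched as a simple schedule. The month offsets come from the article (infusion at baseline, assessment and crossover at six months); the participant IDs, function name, and random split are illustrative assumptions, not the trial's actual randomization procedure.

```python
import random

def crossover_schedule(participant_ids, seed=0):
    """Illustrative crossover schedule: randomly split participants into two
    arms; arm 1 gets the cord-blood infusion first and arm 2 gets the saline
    placebo, then the arms switch at month 6. Everyone receives exactly one
    real infusion over the course of the study."""
    rng = random.Random(seed)  # fixed seed so the sketch is reproducible
    ids = list(participant_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    arm1, arm2 = ids[:half], ids[half:]
    return {
        "month 0": {"cells": arm1, "saline": arm2},
        "month 6": {"cells": arm2, "saline": arm1},  # arms switch
    }

schedule = crossover_schedule(range(30))  # the trial plans 30 children
print(len(schedule["month 0"]["cells"]), len(schedule["month 0"]["saline"]))
```

The crossover means each child serves as his or her own control: the six-month assessment compares the infused arm against the placebo arm before anyone is switched.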
Not all stem cell scientists who study neurodevelopmental diseases are ready to invest great hope that the autism stem cell trial will succeed. “I wish I could tell you I’m optimistic about the end results,” says James Carroll, a pediatric neurologist at the Georgia Health Sciences University in Augusta who began a clinical trial two years ago to better understand how stem cell therapy affects patients with cerebral palsy. “But so far we have not seen any kind of miraculous recovery in our cerebral palsy patients. I would be delighted if that changes.”
Members of the stem cell therapy patient community think Chez will have no shortage of volunteers for the trial. Jeremy Lowey, who lives in Sacramento and has struggled with a rare condition known as non-verbal learning disorder, arranged for his own stem cell therapy treatment in India last year, which he called life-changing. He receives numerous Facebook requests from parents of autistic children who are curious to know more. He always begins his conversations by saying, “Go slowly and think hard about your decision.”
Source: Scientific American
ScienceDaily (Aug. 21, 2012) — Together with his team, Prof. Christoph Ploner, director of the Department of Neurology at the Virchow campus, examined a professional cellist who suffered from encephalitis caused by a herpes virus. As a result of the inflammation, the patient developed serious disturbances in memory.
Both his memory for the past (retrograde amnesia), as well as the acquisition of new information (anterograde amnesia) were affected. Whereas the patient was unable to recount any events from his private or professional life, or remember any of his friends or relatives, he retained a completely intact musical memory. Furthermore, he was still able to sight-read and play the cello.
For the systematic examination of his musical memory, Dr. Carsten Finke, Nazli Esfahani and Prof. Christoph Ploner developed various tests that take the beginning of his amnesia into account. In comparison to amateur musicians and professional musicians from the Berlin Philharmonic, the patient showed a normal musical memory in all tests. He not only remembered music pieces from the past, but was also able to retain music he had never heard before.
"The findings show that musical memory is organized at least partially independent of the hippocampus, a brain structure that is central to memory formation," says Carsten Finke, the primary author of the study. "It is possible that the enormous significance of music throughout all times and in all cultures contributed to the development of an independent memory for music."
Carsten Finke and his colleagues hope that the intact musical memory in patients with amnesia can be used to stimulate other memory content. In this way, perhaps a particular melody can be connected to a person or an everyday task, for example taking medicine.
Source: Science Daily
ScienceDaily (Aug. 21, 2012) — New magnetic resonance imaging (MRI) research shows that changes in brain blood flow associated with vein abnormalities are not specific for multiple sclerosis (MS) and do not contribute to its severity, despite what some researchers have speculated. Results of the research are published online in the journal Radiology.
"MRI allowed an accurate evaluation of cerebral blood flow that was crucial for our results," said Simone Marziali, M.D., from the Department of Diagnostic Imaging at the University of Rome Tor Vergata in Rome.
MS is a disease of the central nervous system in which the body’s immune system attacks the nerves. There are different types of MS, and symptoms and severity vary widely. Recent reports suggest a highly significant association between MS and chronic cerebrospinal venous insufficiency (CCSVI), a condition characterized by compromised blood flow in the veins that drain blood from the brain. This strong correlation has generated substantial attention from the scientific community and the media in recent years, raising the possibility that MS can be treated with endovascular procedures like stent placement. However, the role of brain blood flow alterations on MS patients is still unclear.
To investigate this further, Italian researchers compared brain blood flow in 39 MS patients and 26 healthy control participants. Twenty-five of the MS patients and 14 of the healthy controls were positive for CCSVI, based on Color-Doppler-Ultrasound (CDU) findings. The researchers used dynamic susceptibility contrast-enhanced (DSC) MRI to assess blood flow in the brains of the study groups. DSC MR imaging offers more accurate assessment of brain blood flow than that of CDU. MRI and CDU were used to assess two different anatomical structures.
While CCSVI-positive patients showed decreased cerebral blood flow and volume compared with their CCSVI-negative counterparts, there was no significant interaction between MS and CCSVI for any of the blood flow parameters. Furthermore, the researchers did not find any correlation between the cerebral blood flow and volume in the brain’s white matter and the severity of disability in MS patients.
The results suggest that CCSVI is not a pathological condition correlated with MS, according to Dr. Marziali, but probably just an epiphenomenon — an accessory process occurring in the course of a disease that is not necessarily related to the disease. This determination is important because, to date, studies of the prevalence of CCSVI in MS patients have provided inconclusive results.
"This study clearly demonstrates the important role of MRI in defining and understanding the causes of MS," Dr. Marziali said. "I believe that, in the future, it will be necessary to use powerful and advanced diagnostic tools to obtain a better understanding of this and other diseases still under study."
Source: Science Daily
Does some fine madness yield great artists, writers, and scientists? The evidence is growing for a significant link between bipolar disorder and creative temperament and achievement.

People with bipolar disorder swing repeatedly from depression to euphoria and hyperactivity, or intensely irritable mood states. Sometimes likened to being on an emotional rollercoaster, each swing up then down affects one’s behaviour, energy levels, thought patterns and sleep.
Also known as manic-depressive illness, bipolar disorder is strongly genetically linked, passing down through each generation of an affected family. It is fairly common and very treatable with modern medicines and psychotherapy.
21 August 2012 by Lois Rogers
Thousands of otherwise healthy people put up with a level of sleep deprivation that would drive the rest of us insane. But they are not the usual candidates for insomnia, such as shift workers or those with severe mental illness. Instead, they belong to a newly identified group of people born without the ‘comfort’ genes needed for easy sleep.

This means they are immune to the feeling of warmth and relaxation which sends an average person off to sleep within 15 minutes. Their genes are designed instead to maintain a state of mental alertness. This makes normal, prolonged sleep impossible so they sleep fitfully, in only short bursts. Even then, their lack of ‘comfort’ genes may mean they struggle to get comfortable, fussing about the bedding or finding their sleeping position.
There are other so-called insomnia genes — some cause repeated periods of wakefulness in the small hours of the night or at the slightest disturbance, or drive an affected person to leap out of bed raring to start the day at 4am, but leave them exhausted by 4pm. Until recently, insomnia was considered a purely psychological complaint triggered by stress, grief, or sleep disruption as a result of shift work or jet lag.
But doctors are now unravelling the genetic explanation of why at least one-third of us have intermittent or constant sleep problems. It is already thought there could be six or more different types of insomnia linked to genes, raising the prospect of drugs that block the effect of the chemical signals those genes produce.
ScienceDaily (Aug. 21, 2012) — Working with units of material so small that it would take 50,000 to make up one drop, scientists are developing the profiles of the contents of individual brain cells in a search for the root causes of chronic pain, memory loss and other maladies that affect millions of people.
They described the latest results of this one-by-one exploration of cells or “neurons” from among the millions present in an animal brain at the 244th National Meeting & Exposition of the American Chemical Society (ACS), the world’s largest scientific society. The meeting, expected to attract almost 14,000 scientists and others from around the world, continues in Philadelphia through Thursday, with 8,600 presentations on new discoveries in science and other topics.
Jonathan Sweedler, Ph.D., a pioneer in the field, explained in a talk at the meeting that knowledge of the chemistry occurring in individual brain cells would provide the deepest possible insights into the causes of certain diseases and could point toward new ways of diagnosis and treatment. Until recently, however, scientists have not had the technology to perform such neuron-by-neuron research.
"Most of our current knowledge about the brain comes from studies in which scientists have been forced to analyze the contents of multiple nerve cells, and, in effect, average the results," Sweedler said. He is with the University of Illinois at Urbana-Champaign and also serves as editor-in-chief of Analytical Chemistry, which is among ACS’ more than 40 peer-reviewed scientific journals. “That approach masks the sometimes-dramatic differences that can exist even between nerve cells that are shoulder-to-shoulder together. Suppose that only a few cells in that population are changing, perhaps as a disease begins to take root or starts to progress or a memory forms and solidifies. Then we would miss those critical changes by averaging the data.”
However, scientists have found it difficult to analyze the minute amounts of material inside single brain cells. Those amounts are in the so-called “nanoliter” range, units so small that it would take roughly 355 million nanoliters to fill a 12-ounce soft-drink can. Sweedler’s group spent much of the past decade developing the technology to analyze the chemicals found in individual cells — a huge feat with a potentially big pay-off. “We are using our new approaches to understand what happens in learning and memory in the healthy brain, and we want to better understand how long-lasting, chronic pain develops,” he said.
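The scale of these volumes can be checked with a quick unit conversion. The can size (12 US fluid ounces, about 355 mL) comes from the article; the 0.05 mL figure for a single drop is a common rule-of-thumb assumption.

```python
# Sanity-checking the nanoliter scale with simple unit conversions.
# Assumes a 12-US-fluid-ounce can and a ~0.05 mL drop (rule of thumb).

ML_PER_FL_OZ = 29.5735   # millilitres per US fluid ounce
NL_PER_ML = 1_000_000    # nanolitres per millilitre

can_ml = 12 * ML_PER_FL_OZ       # ≈ 354.9 mL in a soft-drink can
can_nl = can_ml * NL_PER_ML      # ≈ 3.55e8 nL, i.e. ~355 million nanolitres
drop_nl = 0.05 * NL_PER_ML       # ≈ 50,000 nL in one drop

print(round(can_nl / 1e6))       # millions of nanolitres per can -> 355
print(int(round(drop_nl)))       # nanolitres per drop -> 50000
```

The drop figure matches the "50,000 to make up one drop" framing earlier in the article: a nanoliter is a twenty-millionth of a millilitre short of nothing, yet it is the working sample size for a single neuron.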
The 85 billion neurons in the brain are highly interconnected, forming an intricate communications network that makes the complexity of the Internet pale in comparison. The neural net’s chemical signaling agents and electrical currents orchestrate a person’s personality, thoughts, consciousness and memories. These connections are different from person to person and change over the course of a lifetime, depending on one’s experiences. Even now, no one fully understands how these processes happen.
To get a handle on these complex workings, Sweedler’s team and others have zeroed in on small sections of the central nervous system ― the brain and spinal cord ― using stand-ins for humans such as sea slugs and laboratory rats. Sweedler’s new methods enable scientists to actually select areas of the nervous system, spread out the individual neurons onto a glass surface, and one-by-one analyze the proteins and other substances inside each cell.
One major goal is to see how the chemical make-up of nerve cells changes during pain and other disorders. Pain from disease or injuries, for instance, is a huge global challenge, responsible for 40 million medical appointments annually in the United States alone.
Sweedler reported that some of the results are surprising, including tests on cells in an area of the nervous system involved in the sensation of pain. Analysis of the minute amounts of material inside the cells showed that the vast majority of cells undergo no detectable change after a painful event. The chemical imprint of pain occurs in only a few cells. Finding out why could point scientists toward ways of blocking those changes and in doing so, could lead to better ways of treating pain.
Source: Science Daily
Controlling the amount of oxygen that stem cells are exposed to can significantly increase the effectiveness of a procedure meant to combat an often fatal form of muscular dystrophy, according to Purdue University research.
A genetic mutation in patients with Duchenne muscular dystrophy causes the constant breakdown of muscles and gradual depletion of stem cells that are responsible for repairing the damage and progressive muscle wasting. A healthy stem cell tends to duplicate in a regular pattern that creates one copy of itself that continues to function as a stem cell, and a differentiated cell, which performs a specific function. In a healthy person, a torn or damaged muscle would be repaired through this process.

Stem cell therapy - implanting healthy stem cells to combat tissue wasting - has shown promise against muscular dystrophy and other neurodegenerative diseases, but few of the implanted stem cells survive the procedure. Shihuan Kuang, a Purdue assistant professor of animal sciences, and Weiyi Liu, a postdoctoral research associate, showed that survival of implanted muscle stem cells could be increased by as much as fivefold in a mouse model if the cells are cultured under oxygen levels similar to those found in human muscles.
"Stem cells survive in a microenvironment in the body that has a low oxygen level," Kuang said. "But when we culture cells, there is a lot of oxygen around the petri dish. We wanted to see if less oxygen could mimic that microenvironment. When we did that, we saw that more stem cells survived the transplant."
Liu thinks that’s because the stem cells grown in higher oxygen levels acclimate to their surroundings. When they’re injected into muscles with lower oxygen levels, they essentially suffocate.
"By contrast, in our study the cells become used to the host environment when they are conditioned under low oxygen levels prior to transplantation," Liu said.
In the mouse model, Kuang and Liu saw more stem cells survive the transplants, and those stem cells retained their ability to duplicate themselves.
"When we lower the oxygen level, we can also maintain the self-renewal process," Kuang said. "If these stem cells self-renew, they should never be used up and should continue to repair damaged muscle."
The findings, reported in the journal Development, show promise for increasing the effectiveness of stem cell therapy for patients with Duchenne muscular dystrophy, which affects about one in 3,500 boys starting at about 3-5 years old. The disease, which confines almost all patients to wheelchairs by their 20s, is often fatal as the muscles that control breathing and eating deteriorate.
Source: Purdue University
ScienceDaily (Aug. 21, 2012) — How abnormal protein deposits in the brains of Alzheimer’s patients disrupt the signalling between nerve cells has now been reported by researchers in Bochum and Munich, led by Dr. Thorsten Müller from the Medizinisches Proteom-Center of the Ruhr-Universität, in the journal Molecular and Cellular Proteomics. They varied the amount of APP protein and related proteins associated with Alzheimer’s disease in cell cultures, and then analysed how this manipulation affected other proteins in the cell. The result: the amount of APP present was related to the amount of an enzyme that is essential for the production of neurotransmitters and therefore for communication amongst nerve cells.

Mass spectrometer: The proteins are injected into the apparatus via a very thin needle. (Credit: © RUB-Pressestelle, Marion Nelle)
Proteomics: analysing all the proteins of the cells at once
Amyloid plaques are a characteristic feature of Alzheimer’s disease. They consist largely of cleavage products of the so-called amyloid precursor protein APP, which occur in excess in the brains of Alzheimer’s patients. What role APP plays in healthy people and why the abnormal accumulation of amyloid disrupts the regular functioning of the brain is still largely unclear. To understand the function of APP, the RUB researchers established a new cell model. The new cells produced only a very small amount of APP. What impact this had on all the other proteins of these cells was examined by the researchers through the use of mass spectrometry, among other things. With this method they identified over 2000 proteins and determined their concentrations. They were looking specifically for molecules whose concentrations in the newly established low-APP cells were different than in the reference cells that contained normal amounts of APP.
Abnormal protein able to curb neurotransmitter production
"One candidate has particularly caught our attention, this being the enzyme methionine adenosyltransferase II, alpha, MAT2A for short," Thorsten Müller said. Among other things, the enzyme is crucially involved in the production of neurotransmitters. Low-APP cells contained less MAT2A than the reference cells. To confirm the connection between the "Alzheimer’s protein" APP and the neurotransmitter-producing MAT2A, the team studied tissue samples from the brains of deceased Alzheimer’s patients and from healthy individuals. In the tissue of the Alzheimer’s patients there was less MAT2A than in the healthy samples. These results suggest that APP and MAT2A concentrations are related and are linked to the synthesis of neurotransmitters. "Our results point to a new mechanism by which the defective cleavage of the APP protein in Alzheimer’s disease could be directly related to altered neurotransmitter production," Müller said. "As a result, the signal transduction of nerve cells could be disrupted, which, over an extended period, could possibly also cause the death of cells."
Source: Science Daily
Aug. 20, 2012 by Quinn Eastman
People with Parkinson’s disease performed markedly better on a test of working memory after a night’s sleep, and sleep disorders can interfere with that benefit, researchers have shown.

The ability of sleep to improve scores on a test of working memory specifically depends on how much slow wave sleep Parkinson’s patients obtain, researchers have found.
While the classic symptoms of Parkinson’s disease include tremors and slow movements, Parkinson’s can also affect someone’s memory, including “working memory.” Working memory is defined as the ability to temporarily store and manipulate information, rather than simply repeat it. The use of working memory is important in planning, problem solving and independent living.
The findings underline the importance of addressing sleep disorders in the care of patients with Parkinson’s, and indicate that working memory capacity in patients with Parkinson’s potentially can be improved with training. The results also have implications for the biology of sleep and memory.
The results were published this week in the journal Brain.
"It was known already that sleep is beneficial for memory, but here, we’ve been able to analyze what aspects of sleep are required for the improvements in working memory performance," says postdoctoral fellow Michael Scullin, who is the first author of the paper. The senior author is Donald Bliwise, professor of neurology at Emory University School of Medicine.
The performance boost from sleep was linked with the amount of slow wave sleep, or the deepest stage of sleep. Several research groups have reported that slow wave sleep is important for synaptic plasticity, the ability of brain cells to reorganize and make new connections.
Sleep apnea, the disruption of sleep caused by obstruction of the airway, interfered with sleep’s effects on memory. Study participants who showed signs of sleep apnea, if it was severe enough to lower their blood oxygen levels for more than five minutes, did not see a working memory test boost.
In this study, participants took a “digit span test,” in which they had to repeat a list of numbers forward and backward. The test was conducted in an escalating fashion: the list grew incrementally until the participant made a mistake. Participants took the digit span test eight times during a 48-hour period, four times on the first day and four on the second. In between, they slept.
Repeating numbers in the original order is a test of short-term memory, while repeating the numbers in reverse order is a test of working memory.
"Repeating the list in reverse order requires some effort to manipulate the numbers, not just spit them back out again," Scullin says. "It’s also a purely verbal test, which is important when working with a population that may have motor impairments."
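The escalating forward/backward procedure described above can be sketched as follows. The starting length of 3 and the cap of 12 are common conventions assumed here, not necessarily the study's exact protocol, and the "respondent" is modelled as a callable so the scoring logic stands alone.

```python
import random

def digit_span(respond, backward=False, start_len=3, max_len=12, seed=0):
    """Escalating digit-span test: present ever-longer digit lists until the
    respondent errs; return the longest length repeated correctly.
    `respond` is a callable taking the presented list and returning the
    respondent's answer (which must be reversed if `backward` is True)."""
    rng = random.Random(seed)  # fixed seed so runs are reproducible
    span = 0
    for length in range(start_len, max_len + 1):
        digits = [rng.randrange(10) for _ in range(length)]
        target = digits[::-1] if backward else digits
        if respond(digits) != target:
            break  # first mistake ends the escalation
        span = length
    return span

# A perfect respondent on the backward test reverses the list exactly,
# so the score is simply the maximum list length.
print(digit_span(lambda d: d[::-1], backward=True))  # -> 12
```

Note how the backward condition changes only the target, not the presentation: that is why it isolates the manipulation component (working memory) from simple storage (short-term memory).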
Fifty-four study participants had Parkinson’s disease, and 10 had dementia with Lewy bodies, a more advanced condition in which patients may have hallucinations or fluctuating cognition as well as motor symptoms. Those who had dementia with Lewy bodies saw no working memory boost from the night’s rest. As expected, their baseline level of performance was lower than the Parkinson’s group’s.
Participants with Parkinson’s who were taking dopamine-enhancing medications saw their performance on the digit span test jump up between the fourth and fifth test. On average, they could remember one more number backwards. The ability to repeat numbers backward improved, even though the ability to repeat numbers forward did not.
Patients needed to be taking dopamine-enhancing medications to see the most performance benefit from sleep. Patients not taking dopamine medications, even though they had generally had Parkinson’s for less time, did not experience as much of a performance benefit. This may reflect a role for dopamine, an important neurotransmitter, in memory.
Scullin and Bliwise are planning an expanded study of sleep and working memory, in healthy elderly people as well as patients with neurodegenerative diseases.
"Many elderly people go through a decline in how much slow wave sleep they experience, and this may be a significant contributor to working memory difficulties," Scullin says.
Source: Emory
20 August 2012 by Kayt Sukel
Some people can recall what happened on almost every day of their lives. Unlocking their secrets could shed light on the way all our memories work

IT WAS an email that memory researcher James McGaugh found hard to believe. The sender, a 34-year-old housewife named Jill Price, was claiming that she could recall key events on any date back to when she was about 12, as well as what she herself had done each day.
"Some people call me the human calendar," she wrote, "while others run out of the room in fear. But the one reaction I get from everyone who finds out about this ‘gift’ is amazement. I run my entire life through my head every day and it drives me crazy!"
McGaugh invited Price to his lab, making sure he had to hand a copy of 20th Century Day by Day, a book that lists important events by date. He opened the book to random pages and asked Price what had happened on those days. “Whether it was a plane crash or some elections or a movie star doing an outrageous thing, she was dead on,” he recalls. “Time and time again.”
That was in June 2000. McGaugh’s group has worked closely with Price ever since, and has discovered she is one of a select few with similar abilities. These individuals are neither autistic savants nor masters of mnemonic-based tricks of recall, yet they can remember key events from almost every day of their lives. Learning more about their abilities and how their brains are wired should lead to insights into the nature of human memory.
Intrigued by McGaugh’s findings, I arranged to visit his lab at the University of California, Irvine, to find out how these people live with such unusual abilities - and what it is like for the researchers working with them. “It never ceases to amaze me,” says McGaugh’s colleague, Aurora LePort. “Some of them can remember every day you give them.” She says studying people whose powers of recall seem to be enhanced, rather than impaired, offers us a new tool to explore memory.
It is certainly fair to say that most of our knowledge of memory derives from looking at memory loss. The classic case is that of Henry Molaison (better known as “HM”), who had surgery nearly 60 years ago to treat severe epilepsy. In a misguided attempt to remove the source of the seizures, several parts of the brain were cut out, including both hippocampi, curled up ridges on either side of the brain.
For HM, the consequences were catastrophic. Although he could still recall his early life, he was no longer able to lay down memories of things that happened to him after the surgery. Every day, the researchers studying his condition had to introduce themselves anew. Intriguingly, though, he could perform tasks that used short-term memory, like retaining a phone number for a few minutes.
Thanks to HM and many other people with neurological problems caused by head injuries and strokes, we now know that there are different kinds of remembering. Our short-term memories last up to about a minute, unless they are reinforced, or “rehearsed” through further repetition. While much about the neuroscience of memory remains mysterious, our hippocampi seem to be involved in turning these fleeting impressions into long-term memories, which are thought to be stored in the temporal lobes on either side of the brain.
Long-term memories can be subdivided into semantic ones to do with concepts, such as the fact that London is the UK capital, and autobiographical memories, about everyday events that we experience. Price has no special abilities with regard to her short-term or semantic memory, but when it comes to autobiographical memory, her scores are off the chart.