ScienceDaily (July 20, 2012) — Conditions such as Parkinson’s disease are a result of pathogenic changes to proteins. In the neurodegenerative condition of Parkinson’s disease, which is currently incurable, the alpha-synuclein protein changes and becomes pathological. Until now, there have not been any antibodies that could help to demonstrate the change in alpha-synuclein associated with the disease. An international team of experts led by Gabor G. Kovacs from the Clinical Institute of Neurology at the MedUni Vienna has now discovered a new antibody that actually possesses this ability.
"It opens up new possibilities for the development of a diagnostic test for Parkinsonism," says Kovacs, highlighting the importance of this discovery. "This new antibody will enable us to find the pathological conformation in bodily fluids such as blood or CSF." A clinical study involving around 200 patients is already underway, and the first definitive results are expected at the end of 2012. The tests being carried out in collaboration with the University Department of Neurology, led by Walter Pirker, are designed to determine the extent to which the new antibody can be used as an early diagnostic tool in order to understand the condition better and be able to treat it more effectively.
A step towards a blood test for Parkinson’s

With Parkinsonism, the diseased form of alpha-synuclein, which has the same primary structure as the healthy form, undergoes an “abnormal fold.” Says Kovacs: “Until now, however, it was not possible to distinguish between the two.” Previous immunodiagnostic techniques only allowed the general presence of alpha-synuclein to be confirmed. The new monoclonal antibody, however, which the researchers at the MedUni Vienna have developed in collaboration with the German biotech firm Roboscreen, is able to detect a strategic part of the protein responsible for the structural changes. The results of the study have now been published in the journal Acta Neuropathologica.
Says Kovacs: “It is still not possible to say whether or not we will be able to diagnose Parkinson’s from a blood test, but this discovery certainly represents a major step in that direction.” Theoretically, it should be possible to diagnose Parkinson’s disease five to eight years before it develops.
In Austria, there are between 15,000 and 16,000 people living with Parkinson’s syndrome. Its frequency increases with age. As society becomes older, Parkinson’s disease, a degenerative condition of the brain, will become an increasingly widespread problem.
Source: Science Daily
ScienceDaily (July 20, 2012) — Scientists at the University of Manchester have uncovered how the internal mechanisms in nerve cells wire the brain. The findings open up new avenues in the investigation of neurodegenerative diseases by analysing the cellular processes underlying these conditions.

Illustration of spectraplakins in axonal growth organising microtubules. (Credit: Image courtesy of University of Manchester)
Dr Andreas Prokop and his team at the Faculty of Life Sciences have been studying the growth of axons, the thin cable-like extensions of nerve cells that wire the brain. If axons don’t develop properly this can lead to birth disorders, mental and physical impairments and the gradual decay of brain capacity during aging.
Axon growth is directed by the hand-shaped growth cone, which sits at the tip of the axon. It is well documented how growth cones perceive signals from the outside to follow pathways to specific targets, but very little is known about the internal machinery that dictates their behaviour.
Dr Prokop has been studying the key driver of growth cone movements: the cytoskeleton. The cytoskeleton helps to maintain a cell’s shape and is made up of two types of protein filament, actin and microtubules. Microtubules are the key driving force of axon growth, whilst actin helps to regulate the direction in which the axon grows.
Dr Prokop and his team used fruit flies to analyse how actin and microtubule proteins combine in the cytoskeleton to coordinate axon growth. They focussed on the multifunctional proteins called spectraplakins which are essential for axonal growth and have known roles in neurodegeneration and wound healing of the skin.
What the team demonstrate in this recent paper is that spectraplakins link microtubules to actin to help them extend in the direction the axon is growing. If this link is missing, microtubule networks show disorganised, criss-crossed arrangements instead of parallel bundles, and axon growth is hampered.
By understanding the molecular detail of these interactions the team made a second important finding. Spectraplakins collect not only at the tip of microtubules but also along the shaft, which helps to stabilise them and ensure they act as a stable structure within the axon.
This additional function of spectraplakins relates them to a class of microtubule-binding proteins that includes Tau, an important but still poorly understood player in neurodegenerative diseases such as Alzheimer’s. In support of the authors’ findings, another publication has just shown that the human spectraplakin Dystonin causes neurodegeneration when its linkage to microtubules is disrupted.
Talking about his research, Dr Prokop said: “Understanding cytoskeletal machinery at the cell level is a holy grail of current cell research that will have powerful clinical applications. The cytoskeleton is crucially involved in virtually all aspects of a cell’s life, including cell shape changes, cell division, cell movement, contacts and signalling between cells, and dynamic transport events within cells. Accordingly, the cytoskeleton lies at the root of many brain disorders. Therefore, deciphering the principles of cytoskeletal machinery during the fundamental process of axon growth will essentially help research into the causes of a broad spectrum of diseases. Spectraplakins lie at the heart of this machinery and our research opens up new avenues for its investigation.”
Dr Prokop’s paper in the Journal of Neuroscience also demonstrates the power of the fruit fly Drosophila as a research model. The team was able to replicate its findings on axon growth in mice, which suggests the findings can be translated to humans.
Dr Prokop points out that fruit flies provide an ideal means of making sense of these findings and can help to unravel the many mysteries of neurodegeneration.
Dr Prokop continues: “Understanding how spectraplakins perform their cellular functions has important implications for basic as well as biomedical research. Thus, besides their roles during axon growth, spectraplakins of mice and humans are clinically important for a number of conditions and processes including skin blistering, neuro-degeneration, wound healing, synapse formation and neuron migration during brain development. Understanding spectraplakins in one biological process will instruct research on the other clinically relevant roles of these proteins.”
Source: Science Daily
ScienceDaily (July 19, 2012) — While clinical trial results are being released regarding drugs intended to decrease amyloid production — thought to contribute to decline in Alzheimer’s disease — clinical trials of drugs targeting other disease proteins, such as tau, are in their initial phases.
Penn Medicine research presented July 19 at the 2012 Alzheimer’s Association International Conference (AAIC) shows that an anti-tau treatment called epothilone D (EpoD) was effective in preventing and slowing the progression of Alzheimer’s disease in animal models, improving neuron function and cognition as well as decreasing tau pathology.
By targeting tau, the drug aims to stabilize microtubules, which support cell structure and transport essential nutrients and information within neurons. When tau malfunctions, microtubules break down and tau accumulates into tangles.
"This drug effectively hits a tau target by correcting tau loss of function, thereby stabilizing microtubules and offsetting the loss of tau due to its formation into neurofibrillary tangles in animal models, which suggests that this could be an important option to mediate tau function in Alzheimer’s and other tau-based neurodegenerative diseases," said John Trojanowski, MD, PhD, professor of Pathology and Laboratory Medicine in the Perelman School of Medicine at the University of Pennsylvania. "In addition to drugs targeting amyloid, which may not work in advanced Alzheimer’s disease, our hope is that this and other anti-tau drugs can be tested in people with Alzheimer’s disease to determine whether stabilizing microtubules damaged by malfunctioning tau protein may improve clinical and pathological outcomes."
The drug, identified through Penn’s Center for Neurodegenerative Disease Research (CNDR) Drug Discovery Program, was previously shown to prevent further neurological damage and improve cognitive performance in animal models. The Penn research team includes senior investigator Bin Zhang, MD, and Kurt Brunden, PhD, director of Drug Discovery at CNDR.
Bristol-Myers Squibb, which developed and owns the rights to the drug, has started enrolling patients in a phase I clinical trial in people with mild Alzheimer’s disease.
Source: Science Daily
July 19, 2012
Korean scientists have used tiny stars, squares and triangles as a toolkit to create live neural circuits in a dish.
They hope the shapes can be used to create a reproducible neural circuit model that could be used for learning and memory studies as well as drug screening applications; the shapes could also be integrated into the latest neural tissue scaffolds to aid the regeneration of neurons at injured sites in the body, such as the spinal cord.
Published today in the Journal of Neural Engineering, the study, by researchers at the Korea Advanced Institute of Science and Technology (KAIST), found that triangles were the most effective shape for helping to facilitate the growth of axons and guide them onto specific paths to form a complete circuit.
Co-author of the study, Professor Yoonkey Nam, said: “Eventually, we want to know if we can design a neural tissue model that biologically mimics some neural circuits in our brain.”
A neuron is an electrically excitable cell that processes and transmits information around the body. The neuron is composed of three main parts: a cell body (or soma), dendrites and an axon, which extends from the soma and links to other cells, creating a network.
When axons grow they are usually guided by proteins. Many researchers have been trying to re-create this key process in a dish by manipulating nerve cells from rat brains.
As nerve cells are usually just a few tens of micrometres in size, the challenge associated with creating a live neural network is firstly positioning cells in desired locations and, secondly, making connections between these cells by guiding the axons in designated directions.
The researchers investigated whether two star shapes, five regular shapes (square, circle, triangle, pentagon and hexagon) and three different sizes of isosceles triangles could guide axons in designated directions. Each shape was the size of a single cell and was replicated to form an array which was printed onto a glass surface.
Each of the arrays had an overall size of 1cm-by-1cm with a gap of 10 micrometres between each shape. Hippocampal neurons were taken from rats and plated onto the patterned surfaces. The neurons were fluorescently labelled with dyes so that images could be taken of their growth.
The researchers found that triangles were the most efficient shape for encouraging the growth and guidance of an axon. The key was the angle at the points where two of the triangle’s sides meet, known as the vertices: the smaller the vertex angle, the greater the chance the triangle had of inducing growth.
"Based on our results, we are suggesting a new design principle for guiding axons in a dish. We can control axonal growth in a certain direction by placing a sharp triangle pointing in that direction. A neuron that adheres to the triangle will then grow an axon in the direction of the sharp vertex.
"Overall, we integrated microtechnology with neurobiology to find a new engineering solution," continued Professor Nam.
Provided by Institute of Physics
Source: medicalxpress.com
ScienceDaily (July 19, 2012) — A joint study carried out by The University of Nottingham and the multinational food company Unilever has found for the first time that fat in food can reduce activity in several areas of the brain which are responsible for processing taste, aroma and reward.
The research, now available in the Springer journal Chemosensory Perception, provides the food industry with better understanding of how in the future it might be able to make healthier, less fatty food products without negatively affecting their overall taste and enjoyment. Unveiled in 2010, Unilever’s Sustainable Living Plan sets out its ambition to help hundreds of millions of people improve their diet around the world within a decade.
This fascinating three-year study investigated how the brains of a group of participants in their 20s would respond to changes in the fat content of four different fruit emulsions they tasted while in an MRI scanner. All four samples were of the same thickness and sweetness, but one contained flavour with no fat, while the other three contained fat with different flavour-release properties.
The research found that the areas of the participants’ brains responsible for the perception of flavour — such as the somatosensory cortices and the anterior, mid and posterior insula — were significantly more activated by the non-fatty sample than by the fatty emulsions, even though the perceived flavour was the same. It is important to note that increased activation in these brain areas does not necessarily result in increased perception of flavour or reward.
Dr Joanne Hort, Associate Professor in Sensory Science at The University of Nottingham said: “This is the first brain study to assess the effect of fat on the processing of flavour perception and it raises questions as to why fat emulsions suppress the cortical response in brain areas linked to the processing of flavour and reward. It also remains to be determined what the implications of this suppressive effect are on feelings of hunger, satiety and reward.”
Unilever food scientist Johanneke Busch, based at the company’s Research & Development laboratories in Vlaardingen, Netherlands added: “There is more to people’s enjoyment of food than the product’s flavour — like its mouthfeel, its texture and whether it satisfies hunger, so this is a very important building block for us to better understand how to innovate and manufacture healthier food products which people want to buy.”
Source: Science Daily
July 19, 2012 By Emily Martinez
(Medical Xpress) — UT Dallas researchers recently demonstrated how nerve stimulation paired with specific experiences, such as movements or sounds, can reorganize the brain. This technology could lead to new treatments for stroke, tinnitus, autism and other disorders.

Dr. Michael Kilgard helped lead a team that paired vagus nerve stimulation with physical movement to improve brain function.
In a related paper, UT Dallas neuroscientists showed that they could alter the speed at which the brain works in laboratory animals by pairing stimulation of the vagus nerve with fast or slow sounds.
A team led by Dr. Robert Rennaker and Dr. Michael Kilgard looked at whether repeatedly pairing vagus nerve stimulation with a specific movement would change neural activity within the laboratory rats’ primary motor cortex. To test the hypothesis, they paired the vagus nerve stimulation with movements of the forelimb in two groups of rats. The results were published in a recent issue of Cerebral Cortex.
After five days of stimulation and movement pairing, the researchers examined brain activity in response to the stimulation. The rats that received the training along with the stimulation displayed large changes in the organization of the brain’s movement-control system. The animals receiving identical motor training without stimulation pairing did not exhibit any brain changes, or plasticity.
People who suffer strokes or brain trauma often undergo rehabilitation that includes repeated movement of the affected limb in an effort to regain motor skills. It is believed that repeated use of the affected limb causes reorganization of the brain essential to recovery. The recent study suggests that pairing vagus nerve stimulation with standard therapy may result in more rapid and extensive reorganization of the brain, offering the potential for speeding and improving recovery following stroke, said Rennaker, associate professor in The University of Texas at Dallas’ School of Behavioral and Brain Sciences.
“Our goal is to use the brain’s natural neuromodulatory systems to enhance the effectiveness of standard therapies,” Rennaker said. “Our studies in sensory and motor cortex suggest that the technique has the potential to enhance treatments for neurological conditions ranging from chronic pain to motor disorders. Future studies will investigate its effectiveness in treating cognitive impairments.”
July 19, 2012
(Medical Xpress) — When learning to master complex movements such as those required in surgery, is being physically guided by an expert more effective than learning through trial and error?

Dr. George Van Doorn and a participant in the fMRI
New research by Monash University’s Departments of Psychological Studies and Physiology challenges earlier claims that externally guided (or passive) movement is a superior learning method to self-generated (or active) movement.
In the first study of its kind, researchers discovered that different brain regions become active depending on the type of movement used. Lead researcher Dr. George Van Doorn, head of Psychological Studies, said the findings did not support the view that passive movement was a more effective way to learn.
“There has been much debate over the last 30 years about which form of movement is better,” Dr. Van Doorn said. “We found that active movements result in greater activation in brain areas implicated in higher-order processes such as monitoring and controlling goal-directed behaviour, attention, execution of movements, and error detection.
“Passive movements, in contrast, produced greater activity in areas associated with touch perception, length discrimination, tactile object recognition, and the attenuation of sensory inputs.”
People were tested while making movements themselves, and while being guided.
“Whilst inside a functional Magnetic Resonance Imaging (fMRI) machine, people either freely moved their index finger around a two-dimensional, raised-line pattern, to measure self-generated touch, or had an experimenter guide their finger around the pattern, to measure externally generated touch. Using the fMRI, we found that different brain regions become active depending on the type of movement used,” Dr. Van Doorn said.
Dr. Van Doorn said touch was becoming a popular area of investigation, with more scientists contributing to understanding about this important, though under-acknowledged, sensory system.
All researchers involved in this study are located at Monash University’s Gippsland campus. The study findings were presented at EuroHaptics 2012, a major international conference and the primary European meeting for researchers in the field of human haptic sensing and touch-enabled computer applications.
Provided by Monash University
Source: medicalxpress.com
ScienceDaily (July 19, 2012) — By decoding brain activity, scientists were able to “see” that two monkeys were planning to approach the same reaching task differently — even before they moved a muscle.

The obstacle-avoidance task is a variation on the center-out reaching task in which an obstacle sometimes prevents the monkey from moving directly to the target. The monkey must first place a cursor (yellow) on the central target (purple). This was the starting position. After the first hold, a second target appeared (green). After the second hold an obstacle appeared (red box). After the third hold, the center target disappeared, indicating a “go” for the monkey, which then moved the cursor out and around the obstacle to the target. (Credit: Moran/Pearce)
Anyone who has looked at the jagged recording of the electrical activity of a single neuron in the brain must have wondered how any useful information could be extracted from such a frazzled signal.
But over the past 30 years, researchers have discovered that clear information can be obtained by decoding the activity of large populations of neurons.
Now, scientists at Washington University in St. Louis, who were decoding brain activity while monkeys reached around an obstacle to touch a target, have come up with two remarkable results.
Their first result was one they had designed their experiment to achieve: they demonstrated that multiple parameters can be embedded in the firing rate of a single neuron and that certain types of parameters are encoded only if they are needed to solve the task at hand.
Their second result, however, was a complete surprise. They discovered that the population vectors could reveal different planning strategies, allowing the scientists, in effect, to read the monkeys’ minds.
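As an illustrative aside, the population-vector idea behind such decoding can be sketched in a few lines: each neuron is assigned a preferred movement direction, and the population's estimate of the intended movement is the sum of those directions weighted by how far each neuron's firing rate sits above baseline. The preferred directions, baseline and tuning values below are invented for illustration; real decoders are fit to recorded spike trains, and the article does not describe the Washington University team's actual decoder.

```python
import math

# Hypothetical preferred directions (radians) for a tiny population of
# cosine-tuned neurons; real values would come from recordings.
preferred_dirs = [0.0, math.pi / 2, math.pi, 3 * math.pi / 2]
BASELINE = 10.0  # assumed baseline firing rate, Hz

def firing_rate(pref, movement_dir, modulation=8.0):
    """Cosine tuning: a neuron fires fastest for movements along its
    preferred direction (Georgopoulos-style model)."""
    return BASELINE + modulation * math.cos(movement_dir - pref)

def decode(rates):
    """Population vector: sum each neuron's preferred direction,
    weighted by its rate above baseline, then take the angle."""
    x = sum((r - BASELINE) * math.cos(p) for r, p in zip(rates, preferred_dirs))
    y = sum((r - BASELINE) * math.sin(p) for r, p in zip(rates, preferred_dirs))
    return math.atan2(y, x)

true_dir = math.pi / 3  # intended movement direction: 60 degrees
rates = [firing_rate(p, true_dir) for p in preferred_dirs]
decoded = decode(rates)
print(round(math.degrees(decoded), 1))  # recovers ~60.0
```

With perfectly cosine-tuned, evenly spaced neurons the decoded angle matches the intended one exactly; with noisy recorded rates the estimate degrades gracefully, which is why averaging over large populations yields clear information from individually "frazzled" signals.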
ScienceDaily (July 18, 2012) — Researchers at Oregon Health & Science University School of Dentistry have discovered that TDP-43, a protein strongly linked to ALS (amyotrophic lateral sclerosis) and other neurodegenerative diseases, appears to activate a variety of different molecular pathways when genetically manipulated. The findings have implications for understanding and possibly treating ALS and neurodegenerative diseases such as Alzheimer’s and Parkinson’s.
ALS affects two in 100,000 adults in the United States annually and the prognosis for patients is grim. The new discovery is published online in G3: Genes, Genomes, Genetics (and in the July 2012 print issue of G3).
Using a fruit fly model, the OHSU team genetically increased or eliminated TDP-43 to study its effect on the central nervous system. By using massively parallel sequencing to profile gene expression in the central nervous system, the team found that the loss of TDP-43 results in widespread gene activation and altered splicing, much of which is reversed by rescue of TDP-43 expression. Although previous studies have implicated both the absence and the overexpression of TDP-43 in ALS, the OHSU study showed little overlap in gene expression between these two manipulations, suggesting that the bulk of the affected genes differ.
"Our data suggest that TDP-43 plays a role in synaptic transmission, synaptic release and endocytosis," said Dennis Hazelett, Ph.D., lead author of the study. "We also uncovered a potential novel regulation of several pathways, many targets of which appear to be conserved."
Source: Science Daily
ScienceDaily (July 18, 2012) — Researchers from the University of Medicine and Dentistry of New Jersey (UMDNJ), collaborating with scientists from Northwestern University in Illinois, have provided direct experimental evidence that diabetes is linked to the onset of Alzheimer’s disease. The study, published online this week in the Journal of Alzheimer’s Disease, used an experimental model that shows potential as an important new tool for investigations of Alzheimer’s disease and of drugs being developed to treat Alzheimer’s.
UMDNJ researchers Peter Frederikse, PhD, and Chinnaswamy Kasinathan, PhD, collaborated with William Klein, PhD, at Northwestern University, to build on prior studies from the Klein lab and others that indicated close links between Alzheimer’s disease and diabetes. Working with Claudine Bitel and Rajesh Kaswala, students at UMDNJ, the researchers tested whether untreated diabetes would provide a physiological model of Alzheimer neuropathology.
"The results were striking," Frederikse said. "Because we used diabetes as an instigator of the disease, our study shows — for the first time directly — the link between Alzheimer’s and diabetes."
The researchers found substantial increases in amyloid beta peptide pathology — a hallmark of Alzheimer’s disease — in the brain cortex and hippocampus concurrent with diabetes. They also found significant amyloid beta pathology in the retina. By contrast, when diabetes was not present, no observable pathology was detected in either the brain or the retina.
"Second, our study examined the retina, which is considered an extension of the brain, and is more accessible for diagnostic exams," Frederikse added. "Our findings indicate that scientists may be able to follow the onset and progression of Alzheimer’s disease through retinal examination, which could provide a long sought after early-warning sign of the disease."
This experimental model replicated the spontaneous formation of amyloid beta “oligomer” assemblies in the brain and retina, which may help to explain one of the most widely recognized symptoms of Alzheimer’s. “This is exciting,” Klein said. “Oligomers are the neurotoxins now regarded as causing Alzheimer’s disease memory loss. What could cause them to appear and build up in late-onset Alzheimer’s disease has been a mystery, so these new findings with diabetes represent an important step.”
Previous research indicated that insulin plays an important role in the formation of memories. Once attached to neurons, oligomers cause insulin receptors to be eliminated from the surface membranes, contributing to insulin resistance in the brain. This launches a vicious cycle in which diabetes induces oligomer accumulation which makes neurons even more insulin resistant.
"In light of the near epidemic increases in Alzheimer’s disease and diabetes today, developing a physiological model of Alzheimer neuropathology has been an important goal," Kasinathan added. "It allows us to identify a potential biomarker for Alzheimer’s disease and may also make important contributions to Alzheimer drug testing and development."
Source: Science Daily
7/18/2012
Metabolic syndrome, a term used to describe a combination of risk factors that often lead to heart disease and type 2 diabetes, seems to be linked to lower blood flow to the brain, according to research by the University of Wisconsin School of Medicine and Public Health.
Dr. Barbara Bendlin, a researcher at the Wisconsin Alzheimer’s Disease Research Center and an assistant professor of medicine (geriatrics) at the UW School of Medicine and Public Health, said that study participants with multiple risk factors connected to metabolic syndrome (including abdominal obesity, high blood pressure, high blood sugar and high cholesterol) averaged 15 percent less blood flow to the brain than those in a control group, according to brain scans measuring cerebral blood flow.
"We thought the cerebral blood flow measurements of the metabolic syndrome group would be lower, but it was striking how much lower it was," said Bendlin.
Although lower blood flow could result in an eventual reduction in memory skills, Bendlin said it is not known if people with metabolic syndrome will get Alzheimer’s disease.
"Having metabolic syndrome at middle age does have an effect on the brain, and there is some suggestion that if you have lower blood flow, certain types of memory functions are reduced," she said. "The key will be to follow these people over time, because we want to know if lower blood flow will lead to a gradual loss of memory and cognitive skills. But it’s too early to say if these people will develop Alzheimer’s."
The study, presented today at the Alzheimer’s Association International Conference in Vancouver, British Columbia, involved 71 middle-aged people recruited from the Wisconsin Registry for Alzheimer’s Prevention (WRAP). Of this group, 29 met the criteria for metabolic syndrome and 42 did not.
Bendlin said the next steps will be to conduct additional brain scans on people with metabolic syndrome to get more specifics on why they have reduced cerebral blood flow.
"By comparing people with metabolic syndrome with those who don’t have it, we can’t tell which of the risk factors are worst," she said. "Is having a high blood-glucose level worse than having high blood pressure, or is it different than having abdominal obesity? All of these risk factors have been linked to increased risk for dementia, but they are clustered together. If we knew which ones were the worst, those would be the ones to target with specific treatments."
Source: Bio-Medicine
July 18, 2012
Drugs used to treat Attention Deficit Hyperactivity Disorder (ADHD) do not appear to have long-term effects on the brain, according to new animal research from Wake Forest Baptist Medical Center.
As many as five to seven percent of elementary school children are diagnosed with ADHD, a behavioral disorder that causes problems with inattentiveness, over-activity, impulsivity, or a combination of these traits. Many of these children are treated with psychostimulant drugs, and while doctors and scientists know a lot about how these drugs work and their effectiveness, little is known about their long-term effects.
Linda Porrino, Ph.D., professor and chair of the Department of Physiology and Pharmacology, along with fellow professor Michael A. Nader, Ph.D., both of Wake Forest Baptist, and colleagues conducted an animal study to determine what the long-lasting effects may be. Their findings were surprising, said Porrino. “We know that the drugs used to treat ADHD are very effective, but there have always been concerns about the long-lasting effects of these drugs,” Porrino said.
"We didn’t know whether taking these drugs over a long period could harm brain development in some way or possibly lead to abuse of drugs later in adolescence."
Findings from the Wake Forest Baptist research are published online this month in the journal Neuropsychopharmacology.
The researchers studied 16 juvenile non-human primates whose ages were equivalent to those of 6- to 10-year-old humans. Eight animals were in a control group that did not receive any drug treatment; the other eight were treated with a therapeutic-level dose of an extended-release form of Ritalin, or methylphenidate (MPH), for more than a year, which is equivalent to about four years in children. Imaging of the animals’ brains, both before and after the study, was conducted in both groups to measure brain chemistry and structure. The researchers also looked at developmental milestones to address concerns that ADHD drugs adversely affect physical growth.
Once the MPH treatment and imaging studies concluded, the animals were given the opportunity to self-administer cocaine over several months. Nader measured their propensity to acquire the drug, looking at how rapidly and in what amounts, to provide an index of vulnerability to substance abuse in adolescence. As reported in the research paper, they found no differences between the groups – monkeys treated with Ritalin during adolescence were no more vulnerable to later drug use than the control animals.
"After one year of drug therapy, we found no long-lasting effects on the neurochemistry of the brain and no changes in the structure of the developing brain. There was also no increase in susceptibility to drug abuse later in adolescence," Porrino said. "We were very careful to give the drugs in the same doses that would be given to children. One of the great advantages of our study is that it’s directly translatable to children."
Porrino said non-human primates provide exceptional models for developmental research because they undergo relatively long childhood and adolescent periods marked by hormonal and physiological maturation much like humans.
"Our study showed that long-term therapeutic use of drugs to treat ADHD does not cause long-term negative effects on the developing brain, and importantly, it doesn’t put children at risk for substance abuse later in adolescence," she said.
One of the exciting things about this research, Porrino said, is that a “sister” study was conducted simultaneously at Johns Hopkins with slightly older animals and different drugs, and the findings were similar. “We feel very confident of the results because we have replicated each other’s studies within the same time frame and gotten similar results,” she said. “We think that’s pretty powerful and reassuring.”
Provided by Wake Forest University Baptist Medical Center
Source: medicalxpress.com
July 18, 2012
A new guideline released by the American Academy of Neurology recommends several treatments for people with Huntington’s disease who experience chorea—jerky, random, uncontrollable movements that can make everyday activities challenging. The guideline is published in the July 18, 2012, online issue of Neurology.
"Chorea can be disabling, worsen weight loss and increase the risk of falling," said guideline lead author Melissa Armstrong, MD, MSc, with the University of Maryland Department of Neurology and a member of the American Academy of Neurology.
Huntington’s disease is a complex disease with physical, cognitive and behavioral symptoms. The new guideline addresses only one aspect of the disease that may require treatment.
The guideline found that the drugs tetrabenazine (TBZ), riluzole and amantadine can be helpful and the drug nabilone may also be considered to treat chorea. The medications riluzole, amantadine and nabilone are not often prescribed for Huntington’s disease.
"People with Huntington’s disease who have chorea should discuss with their doctors whether treating chorea is a priority. Huntington’s disease is complex with a wide range of sometimes severe symptoms and treating other symptoms may be a higher priority than treating chorea," said Armstrong.
Armstrong adds that it is important for patients to understand that their doctors may try drugs not recommended in this guideline to treat chorea. More research is needed to know if drugs such as those used for psychosis are effective; however, doctors may prescribe them on the basis of past clinical experience.
Provided by American Academy of Neurology
Source: medicalxpress.com
July 18, 2012
Sleep deprivation in the first few hours after exposure to a significantly stressful threat actually reduces the risk of Post-Traumatic Stress Disorder (PTSD), according to a study by researchers from Ben-Gurion University of the Negev (BGU) and Tel Aviv University.
The new study was published in the international scientific journal Neuropsychopharmacology. In a series of experiments, it showed that sleep deprivation of approximately six hours immediately after exposure to a traumatic event reduces the development of post-trauma-like behavioral responses. Sleep deprivation in the first hours after stress exposure might therefore represent a simple yet effective intervention for PTSD.
The research was conducted by Prof. Hagit Cohen, director of the Anxiety and Stress Research Unit at BGU’s Faculty of Health Sciences, in collaboration with Prof. Joseph Zohar of Tel Aviv University.
Approximately 20 percent of people exposed to a severe traumatic event, such as a car or work accident, terrorist attack or war, cannot carry on with their normal lives. These people retain the memory of the event for many years; it causes considerable difficulties in daily functioning and, in extreme cases, may render the individual completely dysfunctional.
"Often those close to someone exposed to a traumatic event, including medical teams, seek to relieve the distress and assume that it would be best if the person could rest and 'sleep on it,'" says Prof. Cohen. "Since memory is a significant component in the development of post-traumatic symptoms, we decided to examine the various effects of sleep deprivation immediately after exposure to trauma."
In the experiments, rats that underwent sleep deprivation after exposure to trauma (predator scent stress exposure), later did not exhibit behavior indicating memory of the event, while a control group of rats that was allowed to sleep after the stress exposure did remember, as shown by their post trauma-like behavior.
"As is the case for human populations exposed to severe stress, 15 to 20 percent of the animals develop long-term disruptions in their behavior," says Cohen. "Our research method for this study is, we believe, a breakthrough in biomedical research."
A pilot study in humans is currently being planned. The studies were funded by an Israel Academy of Science and Humanities grant and the Israel Ministry of Health.
Provided by American Associates, Ben-Gurion University of the Negev
Source: medicalxpress.com
July 18, 2012
(Phys.org) — New research at the Hebrew University of Jerusalem sheds light on pluripotency—the ability of embryonic stem cells to renew themselves indefinitely and to differentiate into all types of mature cells. Solving this problem, which is a major challenge in modern biology, could expedite the use of embryonic stem cells in cell therapy and regenerative medicine. If scientists can replicate the mechanisms that make pluripotency possible, they could create cells in the laboratory which could be implanted in humans to cure diseases characterized by cell death, such as Alzheimer’s, Parkinson’s, diabetes and other degenerative diseases.
To shed light on these processes, researchers in the lab of Dr. Eran Meshorer, in the Department of Genetics at the Hebrew University’s Alexander Silberman Institute of Life Sciences, are combining molecular, microscopic and genomic approaches. Meshorer’s team is focusing on epigenetic pathways—which cause biological changes without a corresponding change in the DNA sequence—that are specific to embryonic stem cells.
The molecular basis for epigenetic mechanisms is chromatin, which is composed of a cell’s DNA and structural and regulatory proteins. In groundbreaking research performed by Shai Melcer, a PhD student in the Meshorer lab, the mechanisms which support an “open” chromatin conformation in embryonic stem cells were examined. The researchers found that chromatin is less condensed in embryonic stem cells, allowing them the flexibility or “functional plasticity” to turn into any kind of cell.
A distinct pattern of chemical modifications of chromatin structural proteins (referred to as the acetylation and methylation of histones) enables a looser chromatin configuration in embryonic stem cells. During the early stages of differentiation, this pattern changes to facilitate chromatin compaction.
But even more interestingly, the authors found that a nuclear lamina protein, lamin A, is also a part of the secret. In all differentiated cell types, lamin A binds compacted domains of chromatin and anchors them to the cell’s nuclear envelope. Lamin A is absent from embryonic stem cells and this may enable the freer, more dynamic chromatin state in the cell nucleus. The authors believe that chromatin plasticity is tantamount to functional plasticity since chromatin is made up of DNA that includes all genes and codes for all proteins in any living cell. Understanding the mechanisms that regulate chromatin function will enable intelligent manipulations of embryonic stem cells in the future.
"If we can apply this new understanding about the mechanisms that give embryonic stem cells their plasticity, then we can increase or decrease the dynamics of the proteins that bind DNA and thereby increase or decrease the cells’ differentiation potential," concludes Dr. Meshorer. “This could expedite the use of embryonic stem cells in cell therapy and regenerative medicine, by enabling the creation of cells in the laboratory which could be implanted in humans to cure diseases characterized by cell death, such as Alzheimer’s, Parkinson’s, diabetes and other degenerative diseases.”
Source: phys.org
ScienceDaily (July 17, 2012) — The ability of infants to recognize speech is more sophisticated than previously known, researchers in New York University’s Department of Psychology have found. Their study, which appears in the journal Developmental Psychology, showed that infants, as early as nine months old, could make distinctions between speech and non-speech sounds in both humans and animals.

"Our results show that infant speech perception is resilient and flexible," explained Athena Vouloumanos, an assistant professor at NYU and the study’s lead author. "This means that our recognition of speech is more refined at an earlier age than we’d thought."
It is well-known that adults’ speech perception is fine-tuned — they can detect speech among a range of ambiguous sounds. But much less is known about the capability of infants to make similar assessments. Understanding when these abilities become instilled would shed new light on how early in life we develop the ability to recognize speech.
To gauge the ability to perceive speech at an early age, the researchers examined the responses of infants, approximately nine months old, to recorded human and parrot speech and non-speech sounds. Human (an adult female voice) and parrot speech sounds included the words “truck,” “treat,” “dinner,” and “two.” The adult non-speech sounds were whistles and a clearing of the throat, while the parrot non-speech sounds were squawks and chirps. The recorded parrot speech sounds were those of Alex, an African Gray parrot that had the ability to talk and reason and whose behaviors were studied by psychology researcher Irene Pepperberg.
Since infants cannot verbally communicate their recognition of speech, the researchers employed a commonly used method to measure this process: looking longer at what they find either interesting or unusual. Under this method, looking longer at a visual paired with a sound may be interpreted as a reflection of recognition. In this study, sounds were paired with a series of visuals: a checkerboard-like image, adult female faces, and a cup.
The results showed that infants listened longer to human speech than to human non-speech sounds regardless of the visual stimulus, revealing an ability to recognize human speech independent of context.
Their findings on non-human speech were more nuanced. When paired with human-face visuals or human artifacts like cups, the infants listened to parrot speech longer than they did non-speech, such that their preference for parrot speech was similar to their preference for human speech sounds. However, this did not occur in the presence of other visual stimuli. In other words, infants were able to distinguish animal speech from non-speech, but only in some contexts.
"Parrot speech is unlike human speech, so the results show infants have the ability to detect different types of speech, even if they need visual cues to assist in this process," explained Vouloumanos.
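The looking-time logic described above (longer looking as a proxy for recognition) can be illustrated with a toy sketch; the function name and all per-trial numbers below are hypothetical, not the researchers' actual data or analysis.

```python
from statistics import mean

def preference_score(speech_times, nonspeech_times):
    """Mean looking time (seconds) to speech trials minus non-speech trials.

    In looking-time paradigms, a positive difference is read as longer
    attention to speech, the behavioral proxy for recognition.
    """
    return mean(speech_times) - mean(nonspeech_times)

# Entirely hypothetical per-trial looking times (seconds) for one infant.
human_speech = [8.2, 7.9, 8.5]
human_nonspeech = [5.1, 4.8, 5.4]

print(preference_score(human_speech, human_nonspeech))  # positive: longer looking at speech
```

Averaged over many infants and computed separately for each visual pairing, this kind of difference score is what underlies the context-dependence conclusion reported above.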
Source: Science Daily
ScienceDaily (July 17, 2012) — Johns Hopkins researchers say they have discovered a cause-and-effect relationship between two well-established biological risk factors for schizophrenia previously believed to be independent of one another.
The findings could eventually lead researchers to develop better drugs to treat the cognitive dysfunction associated with schizophrenia and possibly other mental illnesses.
Researchers have long studied the role played in the brain’s neurons by the Disrupted-in-Schizophrenia 1 (DISC1) gene, mutations in which have one of the strongest links to an increased risk of developing the debilitating psychiatric illness.
In a study published in the journal Molecular Psychiatry, the laboratory of Mikhail V. Pletnikov, M.D., Ph.D., in collaboration with the laboratory of Solomon H. Snyder, M.D., D.Sc., instead looked at the role the DISC1 gene plays in glial cells known as astrocytes, a kind of support cell in the brain that helps neurons communicate with one another.
"Abnormalities in glia cells could be as important as abnormalities in neuronal cells themselves," says Pletnikov, an associate professor of psychiatry and behavioral sciences at the Johns Hopkins University School of Medicine, and the study’s leader. "Most gene work has been done with neurons. But we also need to understand a lot more about the role that genetic mutations in glia cells play because neuron-glia interaction appears crucial in ensuring the brain operates normally."
Besides the paranoia and hallucinations that characterize the disease, schizophrenics have cognitive deficits, leaving them unable to think clearly or organize their thoughts and behavior.
Previous studies found that one of the roles of astrocytes is to secrete the neurotransmitter D-serine, which helps promote the transmission of glutamate in the brain, believed to be a key to cognitive function. Schizophrenics have decreased glutamate transmission. It appears, Pletnikov says, that people with DISC1 mutations associated with the psychiatric illness are faster to metabolize D-serine, which leads to a decrease in the apparently crucial transmitter.
In clinical trials, other researchers are trying to boost D-serine levels in people with schizophrenia to see if they can boost cognitive function.
In the new study, the Johns Hopkins researchers found that DISC1 is directly involved in regulating the production of D-serine by the enzyme known as serine racemase.
The researchers found that DISC1 normally binds to serine racemase and stabilizes it. The mutant DISC1 in patients with schizophrenia cannot bind with serine racemase, and instead destabilizes and destroys it. The result is a deficiency of D-serine.
The Hopkins researchers bred mice with the mutant DISC1 protein expressed only in astrocytes and, as predicted, the animals had decreased levels of D-serine. These mice also showed abnormal behavior “consistent with schizophrenia,” Pletnikov says. For example, the rodents showed sensitivity to psycho-stimulants that target glutamate transmission. By treating the mice with D-serine, the scientists were able to ameliorate the schizophrenic-like symptoms. Mice without the DISC1 mutation in astrocytes had normal D-serine levels.
Pletnikov says that in the future, researchers hope that they can target the unstable junction between the abnormal DISC1 and serine racemase. If drugs, for example, can be found to increase glutamate transmission in humans, doctors may be able to improve cognitive function in schizophrenics. He says a DISC1 mutation may also be an important risk factor in other psychiatric disorders.
"Abnormal glutamate transmission is believed to be present in patients with bipolar disorder, major depression and possibly anxiety disorders, so our findings could apply to other psychiatric diseases," he says.
Source: Science Daily
ScienceDaily (July 17, 2012) — Scientists have discovered two genetic variants associated with the substantial, rapid weight gain occurring in nearly half the patients treated with antipsychotic medications, according to two studies involving the Centre for Addiction and Mental Health (CAMH).
These results could eventually be used to identify which patients have the variations, enabling clinicians to choose strategies to prevent this serious side-effect and offer more personalized treatment.
"Weight gain occurs in up to 40 per cent of patients taking medications called second-generation or atypical antipsychotics, which are used because they’re effective in controlling the major symptoms of schizophrenia," says CAMH Scientist Dr. James Kennedy, senior author on the most recent study published online in the Archives of General Psychiatry.
This weight gain can lead to obesity, type 2 diabetes, heart problems and a shortened life span. “Identifying genetic risks leading to these side-effects will help us prescribe more effectively,” says Dr. Kennedy, head of the new Tanenbaum Centre for Pharmacogenetics, which is part of CAMH’s Campbell Family Mental Health Research Institute. Currently, CAMH screens for two other genetic variations that affect patients’ responses to psychiatric medications.
Each study identified a different variation near the melanocortin-4 receptor (MC4R) gene, which is known to be linked to obesity.
In the Archives of General Psychiatry study, people carrying two copies of a variant gained about three times as much weight as those with one or no copies, after six to 12 weeks of treatment with atypical antipsychotics. (The difference was approximately 6 kg versus 2 kg.) The study had four patient groups: two from the U.S., one in Germany and one from a larger European study.
"The weight gain was associated with this genetic variation in all these groups, which included pediatric patients with severe behaviour or mood problems, and patients with schizophrenia experiencing a first episode or who did not respond to other antipsychotic treatments," says CAMH Scientist Dr. Daniel Müller. "The results from our genetic analysis combined with this diverse set of patients provide compelling evidence for the role of this MC4R variant. Our research group has discovered other gene variants associated with antipsychotic-induced weight gain in the past, but this one appears to be the most compelling finding thus far."
Three of the four groups had never previously taken atypical antipsychotics. Different groups were treated with drugs such as olanzapine, risperidone, aripiprazole or quetiapine, and compliance was monitored to ensure the treatment regimen was followed. Weight and other metabolic-related measures were taken at the start and during treatment.
A genome-wide association study was conducted on pediatric patients by the study’s lead researcher, Dr. Anil Malhotra, at the Zucker Hillside Hospital in Glen Oaks, NY. In this type of study, variations are sought across a person’s entire set of genes to identify those associated with a particular trait. The result pointed to the MC4R gene.
This gene’s role in antipsychotic-induced weight gain had been identified in a CAMH study published earlier this year in The Pharmacogenomics Journal, involving Drs. Müller and Kennedy, and conducted by PhD student Nabilah Chowdhury. They found a different variation on MC4R that was linked to the side-effect.
For both studies, CAMH researchers did genotyping experiments to identify the single changes to the sequence of the MC4R gene — known as single nucleotide polymorphisms (SNPs) — related to the drug-induced weight gain side-effect.
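As a toy illustration of the genotype-stratified comparison described above (weight gain grouped by copies of a risk variant), here is a minimal sketch; the cohort is entirely hypothetical and only loosely mirrors the roughly 6 kg versus 2 kg contrast reported earlier.

```python
from collections import defaultdict
from statistics import mean

def weight_gain_by_genotype(patients):
    """Average observed weight gain (kg) by number of risk-allele copies (0, 1 or 2)."""
    groups = defaultdict(list)
    for copies, gain_kg in patients:
        groups[copies].append(gain_kg)
    return {copies: mean(gains) for copies, gains in sorted(groups.items())}

# Hypothetical (risk_allele_copies, weight_gain_kg) observations.
cohort = [(0, 1.8), (0, 2.1), (1, 2.0), (1, 2.3), (2, 5.8), (2, 6.2)]
print(weight_gain_by_genotype(cohort))  # carriers of two copies gain roughly 3x more
```

A real pharmacogenetic analysis would of course involve many more patients, covariates such as drug and dose, and formal association statistics rather than raw group means.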
The MC4R gene encodes a receptor involved in the brain pathways regulating weight, appetite and satiety. “We don’t know exactly how the atypical antipsychotics disrupt this pathway, or how this variation affects the receptor,” says Dr. Müller. “We need further studies to validate this result and eventually turn this into a clinical application.”
Source: Science Daily
ScienceDaily (July 17, 2012) — Researchers at the University of Colorado School of Medicine have found a drug that boosts memory function in those with Down syndrome, a major milestone in the treatment of this genetic disorder that could significantly improve quality of life.
"Before now there had never been any positive results in attempts to improve cognitive abilities in persons with Down syndrome through medication," said Alberto Costa, MD, Ph.D., who led the four-year study at the CU School of Medicine. "This is the first time we have been able to move the needle at all and that means improvement is possible."
The study was published July 17 in the journal Translational Psychiatry.
Costa, an associate professor of medicine, and his colleagues studied 38 adolescents and young adults with Down syndrome. Half took the drug memantine, used to treat Alzheimer’s disease, and the others took a placebo.
Costa’s research team hypothesized that memantine, which improved memory in mice with Down syndrome, could increase test scores of young adults with the disorder in the area of spatial and episodic memory, functions associated with the hippocampus region of the brain.
Participants underwent a 16-week course of either memantine or a placebo while scientists compared the adaptive and cognitive function of the two groups.
ScienceDaily (July 17, 2012) — A buildup of sodium in the brain detected by magnetic resonance imaging (MRI) may be a biomarker for the degeneration of nerve cells that occurs in patients with multiple sclerosis (MS), according to a new study published online in the journal Radiology.
The study found that patients with early-stage MS showed sodium accumulation in specific brain regions, while patients with more advanced disease showed sodium accumulation throughout the whole brain. Sodium buildup in motor areas of the brain correlated directly to the degree of disability seen in the advanced-stage patients.
"A major challenge with multiple sclerosis is providing patients with a prognosis of disease progression," said Patrick Cozzone, Ph.D., director emeritus of the Center for Magnetic Resonance in Biology and Medicine, a joint unit of National Center for Scientific Research (CNRS) and Aix-Marseille University in Marseille, France. "It’s very hard to predict the course of the disease."
In MS, the body’s immune system attacks the protective sheath (called myelin) that covers nerve cells, or neurons, in the brain and spinal cord. The scarring affects the neurons’ ability to conduct signals, causing neurological and physical disability. The type and severity of MS symptoms, as well as the progression of the disease, vary from one patient to another.
Dr. Cozzone, along with Wafaa Zaaraoui, Ph.D., research officer at CNRS, Jean-Philippe Ranjeva, Ph.D., professor in neuroscience at Aix-Marseille University and a European team of interdisciplinary researchers used 3 Tesla (3T) sodium MRI to study relapsing-remitting multiple sclerosis (RRMS), the most common form of the disease in which clearly defined attacks of worsening neurologic function are followed by periods of recovery. Sodium MRI produces images and information on the sodium content of cells in the body.
"We collaborated for two years with chemists and physicists to develop techniques to perform 3T sodium MRI on patients," Dr. Zaaraoui said. "To better understand this disease, we need to probe new molecules. The time has come for probing brain sodium concentrations."
Using specially developed hardware and software, the researchers conducted sodium MRI on 26 MS patients, including 14 with early-stage RRMS (less than five years in duration) and 12 with advanced disease (longer than five years), and 15 age- and sex-matched control participants.
In the early-stage RRMS patients, sodium MRI revealed abnormally high concentrations of sodium in specific brain regions, including the brainstem, cerebellum and temporal pole. In the advanced-stage RRMS patients, abnormally high sodium accumulation was widespread throughout the whole brain, including normal appearing brain tissue.
"In RRMS patients, the amount of sodium accumulation in gray matter associated with the motor system was directly correlated to the degree of patient disability," Dr. Zaaraoui said.
Current treatments for MS are only able to slow the progress of the disease. The use of sodium accumulation as a biomarker of neuron degeneration may assist pharmaceutical companies in developing and assessing potential treatments.
"Brain sodium MR imaging can help us to better understand the disease and to monitor the occurrence of neuronal injury in MS patients and possibly in patients with other brain disorders," Dr. Ranjeva said.
Source: Science Daily
ScienceDaily (July 17, 2012) — Using adult stem cells, Johns Hopkins researchers and a consortium of colleagues nationwide say they have generated the type of human neuron specifically damaged by Parkinson’s disease (PD) and used various drugs to stop the damage.
Their experiments on cells in the laboratory, reported in the July 4 issue of the journal Science Translational Medicine, could speed the search for new drugs to treat the incurable neurodegenerative disease, but also, they say, may lead them back to better ways of using medications that previously failed in clinical trials.
"Our study suggests that some failed drugs should actually work if they were used earlier, and especially if we could diagnose PD before tremors and other symptoms first appear," says one of the study’s leaders, Ted M. Dawson, M.D., Ph.D., a professor of neurology at the Johns Hopkins University School of Medicine.
Dawson and his colleagues, working as part of a National Institute of Neurological Disorders and Stroke consortium, created three lines of induced pluripotent stem (iPS) cells derived from the skin cells of adults with PD. Two of the cell lines had the mutated LRRK2 gene, a hallmark of the most common genetic cause of PD. Induced pluripotent stem cells are adult cells that have been genetically reprogrammed to their most primitive state. Under the right circumstances, they can develop into most or all of the 200 cell types in the human body.
ScienceDaily (July 17, 2012) — A stroke can weaken one side of the body, raising the dangerous possibility of unstable walking and debilitating falls. Physical therapy can help patients learn to shift their body weight slightly to the weaker, stroke-affected side to help regain balance, but for some patients, the weakness returns after their therapy ends.
University of Illinois at Chicago physical therapy professor Alexander Aruin has developed an inexpensive, simple way to deal with the problem, training the brain to rebalance body weight using a simple shoe insole he calls a “compelled body weight shift.” It slightly lifts and tilts the body toward the stroke-affected side, restoring balance without the patient having to think about it.
Aruin along with colleagues at UIC and Marianjoy Rehabilitation Hospital in Wheaton, Ill., studied two patient groups: one group at UIC who just had strokes, and one at Marianjoy who had strokes over a year ago.
"We tried a purely biomechanical approach," Aruin said. "We mechanically lifted the healthy side so the patient cannot resist. The mechanics force body weight to where it is distributed almost 50/50. When patients ambulate in such a condition, they learn how to bear weight equally through both extremities. It’s quite simple."
The two test groups followed slightly different protocols and were tested for various lengths of time. Their results were measured against those of control groups, who did not receive the small therapeutic shoe insole, which is less than half an inch thick. Patients in all groups also received standard post-stroke physical therapy.
After the testing period ended, patients stopped using the insole. About three months afterward they were tested again to see if they retained the ability to keep their balance. Aruin and his colleagues found that physical therapy helped both the insole-user and control groups, but the insole group got an added boost.
"They showed more symmetrical body weight distribution and bore more weight on their affected side, and their gait velocity improved," he said. "The outcome looks promising. The technique is very simple and inexpensive and has potential, which is exciting."
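The 50/50 weight-distribution goal described above can be expressed as a simple symmetry measure; the sketch below is purely illustrative and is not the instrumentation or metric used in the study.

```python
def symmetry_index(affected_kg, unaffected_kg):
    """Fraction of total body weight borne on the stroke-affected side.

    0.5 corresponds to the even 50/50 distribution the insole training targets.
    """
    return affected_kg / (affected_kg + unaffected_kg)

# Hypothetical force-plate readings (kg) for a 70 kg patient.
print(symmetry_index(28.0, 42.0))  # 0.4: only 40% of weight on the affected side
print(symmetry_index(34.0, 36.0))  # ~0.486: approaching the 50/50 target
```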
Aruin hopes other physical therapists use the simple devices on stroke patients to see if they too benefit from it. His associates are also considering ways to use the insole to improve posture in post-stroke patients.
Source: Science Daily
ScienceDaily (July 17, 2012) — Scientists at the California Institute of Technology (Caltech) pioneered the study of the link between irregularities in the immune system and neurodevelopmental disorders such as autism a decade ago. Since then, studies of postmortem brains and of individuals with autism, as well as epidemiological studies, have supported the correlation between alterations in the immune system and autism spectrum disorder.

What has remained unanswered, however, is whether the immune changes play a causative role in the development of the disease or are merely a side effect. Now a new Caltech study suggests that specific changes in an overactive immune system can indeed contribute to autism-like behaviors in mice, and that in some cases, this activation can be related to what a developing fetus experiences in the womb.
The results appear in a paper this week in the Proceedings of the National Academy of Sciences (PNAS).
"We have long suspected that the immune system plays a role in the development of autism spectrum disorder," says Paul Patterson, the Anne P. and Benjamin F. Biaggini Professor of Biological Sciences at Caltech, who led the work. "In our studies of a mouse model based on an environmental risk factor for autism, we find that the immune system of the mother is a key factor in the eventual abnormal behaviors in the offspring."
The first step in the work was establishing a mouse model that tied the autism-related behaviors together with immune changes. Several large epidemiological studies — including one that involved tracking the medical history of every person born in Denmark between 1980 and 2005 — have found a correlation between viral infection during the first trimester of a mother’s pregnancy and a higher risk for autism spectrum disorder in her child. To model this in mice, the researchers injected pregnant mothers with a viral mimic that triggered the same type of immune response a viral infection would.
"In mice, this single insult to the mother translates into autism-related behavioral abnormalities and neuropathologies in the offspring," says Elaine Hsiao, a graduate student in Patterson’s lab and lead author of the PNAS paper.
The team found that the offspring exhibit the core behavioral symptoms associated with autism spectrum disorder — repetitive or stereotyped behaviors, decreased social interactions, and impaired communication. In mice, this translates to such behaviors as compulsively burying marbles placed in their cage, excessive self-grooming, choosing to spend time alone or with a toy rather than interacting with a new mouse, and vocalizing ultrasonically less often or in an altered way compared to typical mice.
Next, the researchers characterized the immune system of the offspring of mothers that had been infected and found that the offspring display a number of immune changes. Some of those changes parallel those seen in people with autism, including decreased levels of regulatory T cells, which play a key role in suppressing the immune response. Taken together, the observed immune alterations add up to an immune system in overdrive — one that promotes inflammation.
"Remarkably, we saw these immune abnormalities in both young and adult offspring of immune-activated mothers," Hsiao says. "This tells us that a prenatal challenge can result in long-term consequences for health and development."
With the mouse model established, the group was then able to test whether the offspring’s immune problems contribute to their autism-related behaviors. In the most revealing test of this hypothesis, the researchers were able to correct many of the autism-like behaviors in the offspring of immune-activated mothers by giving the offspring a bone-marrow transplant from typical mice. The normal stem cells in the transplanted bone marrow not only replenished the immune system of the host animals but also ameliorated their autism-like behavioral impairments.
The researchers emphasize that because the work was conducted in mice, the results cannot be readily extrapolated to humans, and they certainly do not suggest that bone-marrow transplants should be considered as a treatment for autism. They also have yet to establish whether it was the infusion of stem cells or the bone-marrow transplant procedure itself — complete with irradiation — that corrected the behaviors.
However, Patterson says, the results do suggest that immune irregularities in children could be an important target for innovative immune manipulations in addressing the behaviors associated with autism spectrum disorder. By correcting these immune problems, he says, it might be possible to ameliorate some of the classic developmental delays seen in autism.
In future studies, the researchers plan to examine the effects of highly targeted anti-inflammatory treatments on mice that display autism-related behaviors and immune changes. They are also interested in considering the gastrointestinal (GI) bacteria, or microbiota, of such mice. Coauthor Sarkis Mazmanian, a professor of biology at Caltech, has shown that gut bacteria are intimately tied to the function of the immune system. He and Patterson are investigating whether changes to the microbiota of these mice might also influence their autism-related behaviors.
Source: Science Daily
July 17, 2012
(Medical Xpress) — You’re headed out the door and you realize you don’t have your car keys. After a few minutes of rifling through pockets, checking the seat cushions and scanning the coffee table, you find the familiar key ring and off you go. Easy enough, right? What you might not know is that the task that took you a couple seconds to complete is a task that computers — despite decades of advancement and intricate calculations — still can’t perform as efficiently as humans: the visual search.

Pictured is part of the research team in front of the magnetic resonance imaging device at the UCSB Brain Imaging Center. From left to right: researcher Tim Preston; associate professor of psychological and brain sciences Barry Giesbrecht; and professor of psychological and brain sciences Miguel P. Eckstein. Not pictured: Koel Das, now a faculty member at the Indian Institute of Science in Bangalore, Karnataka, India; and lead author Fei Guo, now in the software industry. Credit: UCSB
"Our daily lives are comprised of little searches that are constantly changing, depending on what we need to do," said Miguel Eckstein, UC Santa Barbara professor of psychological and brain sciences and co-author of the recently released paper "Feature-Independent Neural Coding of Target Detection during Search of Natural Scenes," published in the Journal of Neuroscience. "So the idea is, where does that take place in the brain?"
A large part of the human brain is dedicated to vision, with different parts involved in processing the many visual properties of the world. Some parts are stimulated by color, others by motion, yet others by shape.
However, those parts of the brain tell only a part of the story. What Eckstein and co-authors wanted to determine was how we decide whether the target object we are looking for is actually in the scene, how difficult the search is, and how we know we’ve found what we wanted.
They found their answers in the dorsal frontoparietal network, a region of the brain that roughly corresponds to the top of one’s head, and is also associated with properties such as attention and eye movements. In the parts of the human brain used earlier in the processing stream, regions stimulated by specific features like color, motion, and direction are a major part of the search. However, in the dorsal frontoparietal network, activity is not confined to any specific features of the object.
"It’s flexible," said Eckstein. Using 18 observers, an MRI machine, and hundreds of photos of scenes flashed before the observers with instructions to look for certain items, the scientists monitored their subjects’ brain activity. By watching the intraparietal sulcus (IPS), located within the dorsal frontoparietal network, the researchers were able to note not only whether their subjects found the objects, but also how confident they were in their finds.
The IPS region would be stimulated even if the object was not there, said Eckstein, but the pattern of activity would differ from the pattern seen when the object was actually present in the scene. The pattern of activity was consistent, even though the 368 different objects the subjects searched for were defined by very different visual features. This, Eckstein said, indicates that the IPS did not rely on the presence of any fixed feature to determine the presence or absence of various objects. Other visual regions did not show this consistent pattern of activity across objects.
"As you go further up in processing, the neurons are less interested in a specific feature, but they’re more interested in whatever is behaviorally relevant to you at the moment," said Eckstein. Thus, a search for an apple, for instance, would make red, green, and rounded shapes relevant. If the search were for your car keys, the intraparietal sulcus would now be interested in gold, silver, and key-type shapes and not interested in green, red, and rounded shapes.
"For visual search to be efficient, we want those visual features related to what we are looking for to elicit strong responses in our brain and not others that are not related to our search, and are distracting," Eckstein added. "Our results suggest that this is what is achieved in the intraparietal sulcus, and allows for efficient visual search."
For Eckstein and colleagues, these findings are just the tip of the iceberg. Future research will dig more deeply into the seemingly simple yet essential ability of humans to do a visual search and how they can use the layout of a scene to guide their search.
"What we’re trying to really understand is what other mechanisms or strategies the brain has to make searches efficient and easy," said Eckstein. "What part of the brain is doing that?"
Provided by University of California - Santa Barbara
Source: medicalxpress.com
July 16, 2012 By Maureen Salamon
(HealthDay) — Evidence is building that poor sleep patterns may do more than make you cranky: The amount and quality of shuteye you get could be linked to mental deterioration and Alzheimer’s disease, four new studies suggest.

Inadequate shuteye associated with mental decline in four new studies.
Too little or too much sleep was equated with two years’ brain aging in one study. A separate study concluded that people with sleep apnea — disrupted breathing during sleep — were more than twice as likely to develop mild thinking problems or dementia compared to problem-free sleepers. Yet another suggested that excessive daytime sleepiness may predict diminished memory and thinking skills, known as cognitive decline, in older people.
"Whether sleep changes, such as sleep apnea or disturbances, are signs of a decline to come or the cause of decline is something we don’t know, but these four studies … shed further light that this is an area we need to look into more," said Heather Snyder, senior associate director of medical and scientific relations for the Alzheimer’s Association in Chicago, who was not involved in the studies.
The studies are scheduled for presentation Monday at the Alzheimer’s Association annual meeting in Vancouver.
The largest of the studies, which examined data on more than 15,000 women in the U.S. Nurses’ Health Study, suggested that those who slept five hours a day or less, or nine hours a day or more, had lower average mental functioning than participants who slept seven hours per day. Too much or too little sleep was cognitively equivalent to aging by two years, according to the research, which followed the women over 14 years beginning in middle age.
The study also observed that women whose sleep duration changed by two hours or more a day from mid- to later life had worse brain function than participants with no change in sleep duration — a finding that held true regardless of how long they usually slept at the beginning of the study.
ScienceDaily (July 16, 2012) — Post-traumatic stress disorder (PTSD) is more treatable than previously thought. A novel method has shown to be remarkably effective. The method, called Narrative Exposure Therapy (NET), is an intervention aimed at reducing symptoms of post-traumatic stress.
In an on-going Norwegian study, exposure therapy has been used with asylum seekers and refugees who have survived the ordeal of torture.
"According to previous studies, these patients do not benefit from traditional psychological therapy. In our study, however, 60 per cent show a marked improvement, and approximately 20 per cent show no symptoms of PTSD after treatment," says Håkon Stenmark, a PhD candidate at the Norwegian University of Science and Technology’s Department of Neuroscience, who conducted the study in collaboration with colleague and fellow PhD candidate Joar Øverås Halvorsen.
Describing traumatic events
"Narrative" simply means telling a story. In exposure therapy the patient constructs a narration of his life while focusing on a detailed report of traumatic experiences. In a typical therapy session, the patient is given a rope to symbolize his or her life, from early childhood up to the present date.
The patient then describes the events in his life, good and bad, in chronological order. For every good memory the patient places a flower on the rope, and for every bad memory, a stone.
"I was blindfolded and seated in the prison’s interrogation room. I received multiple blows all over my body, and had no way of anticipating where I would be beaten next," the patient recalls with great difficulty.
The therapist is sitting at the opposite end of the table, listening attentively. Everything is written down, as it might prove useful later. The written account may be used in an application for asylum, or even as documentation for Amnesty International.
"Electrodes were fastened to my toes, and I was told I would be given electric shocks. The next thing I knew, a skinny man with a cigarette in his mouth turned the knob. The pain was excruciating, and my whole body tensed up."
"This is just one example. Although the patients are of different nationalities, and have been subjected to different kinds of torture, they share similar stories," Stenmark says.
Flashbacks and learning problems
Torture can result in a range of symptoms, depending on the method of torture as well as the duration of the ordeal. Nonetheless, symptoms typically fall into three main categories: ‘reliving’ the event, avoidance and arousal. “A patient who is reliving torture may have flashbacks of the event, or episodes of repeated nightmares. Avoidance reactions are typically displayed as an extreme fear of the police or anybody who might resemble the abuser. People with these symptoms will try to isolate themselves and avoid people in general. Symptoms of arousal may result in difficulties concentrating, irritability, or having trouble falling or staying asleep,” Stenmark explains.
The classic symptom of PTSD is an inability to concentrate. As a consequence, sufferers often have learning difficulties and end up losing their jobs.
The brain’s “alarm system”
Existing trials are showing promising results with regards to exposure therapy. But why the method works in the first place, and the exact mechanisms behind it, have yet to be verified.
The most prominent theory is that exposure therapy changes the way fear is ‘wired’ in the memory. Simply stated, there is a part of the brain known as the brain’s ‘alarm system’, which enables us to respond to dangerous stimuli.
"During therapy the patient describes the traumatic event in a safe setting, while re-experiencing his or her emotions. In the process, the patient learns that the memories are not dangerous in themselves. The event was threatening when it occurred, but the memory the patient has today is not," Stenmark explains.
The goal of exposure therapy is to reduce the overall symptoms of PTSD, thereby increasing levels of functioning. Stenmark stresses that this is especially important for asylum seekers and refugees, as they often face additional challenges in Norwegian society.
Narrative exposure therapy was developed by trauma specialists working in refugee camps in Africa and Asia. To date, exposure therapy is not widely used in other parts of the world, which makes Øverås Halvorsen and Stenmark’s study the largest of its kind in the western world.
Source: Science Daily
ScienceDaily (July 16, 2012) — While Spider-Man is capturing the imagination of theatergoers, real-life spider men in Upstate New York are working intently to save a young boy’s life.

UB researchers are developing a treatment for muscular dystrophy using a peptide found in the venom of a Chilean rose tarantula. (Credit: Image courtesy of University at Buffalo)
It all began in 2009, when Jeff Harvey, a stockbroker from the Buffalo suburbs, discovered that his grandson, JB, had Duchenne muscular dystrophy. The disease is fatal. It strikes only boys, causing their muscles to waste away.
Hoping to help his grandson, Harvey searched Google for promising muscular dystrophy treatments and, in a moment of serendipity, stumbled upon University at Buffalo scientist Frederick Sachs, PhD.
Sachs was a professor of physiology and biophysics who had been studying the medical benefits of venom. In the venom of the Chilean rose tarantula, he and his colleagues discovered a protein that held promise for keeping muscular dystrophy at bay. Specifically, the protein helped stop muscle cells from deteriorating.
Within months of getting in touch, Harvey and Sachs co-founded Tonus Therapeutics, a pharmaceutical company devoted to developing the protein as a drug. Though the treatment has yet to be tested in humans, it has helped dystrophic mice gain strength in preliminary experiments.
The therapy is not a cure. But if it works in humans, it could extend the lives of children like JB for years — maybe even decades.
Success can’t come quickly enough.
JB, now four, can’t walk down the stairs alone. When he runs, he waddles. He receives physical therapy and takes steroids as a treatment. While playing tee ball one recent day, he confided to his grandfather, “When I grow up, I want to be a baseball player.” It was a heartbreaking moment.
"Oh, I would be thrilled if you could be a baseball player," Harvey remembers replying. He’s doing everything he can to make sure that JB — and other boys like him — can live out their dreams.
Source: Science Daily
July 16th, 2012
Spinal Muscular Atrophy affects one in 6,000 children and has no known cure.
A team of University of Missouri researchers has found that introducing a missing gene into the central nervous system could help extend the lives of patients with Spinal Muscular Atrophy (SMA) – the leading genetic cause of infantile death in the world.
SMA is a rare genetic disease that affects one in 6,000 children, who often die young because there is no cure. Children who inherit SMA are missing a gene that produces a protein which directs nerves in the spine to give commands to muscles.
The MU team, led by Christian Lorson, professor in the Department of Veterinary Pathobiology and the Department of Molecular Microbiology and Immunology, introduced the missing gene into mice born with SMA through two different methods: intravenously and directly into the mice’s central nervous systems. While both methods were effective in extending the lives of the mice, Lorson found that introducing the missing gene directly into the central nervous system extended the lives of the mice longer.

Mice born with spinal muscular atrophy typically only live five or six days. Researchers introduced the SMN gene into the mice’s central nervous systems and were able to extend their lives 10-25 days longer. The mice in the picture have spinal muscular atrophy.
“Typically, mice born with SMA only live five or six days, but by introducing the missing SMN gene into the mice’s central nervous systems, we were able to extend their lives 10-25 days longer than SMA mice who go untreated,” said Lorson, who works in the MU Bond Life Sciences Center and the College of Veterinary Medicine. “While this system is still not perfect, what our study did show is that the direct administration of the missing gene into the central nervous system provides some degree of rescue and a profound extension of survival.”
There are several different types of SMA that appear in humans, depending on the age that symptoms begin to appear. Lorson believes that introducing the missing gene through the central nervous system is a way to potentially treat humans regardless of what SMA type they have.
“This is a treatment method that is very close to being a reality for human patients,” Lorson said. “Clinical trials of SMA treatment using gene therapy are likely to begin in the next 12-18 months, barring any unforeseen problems.”
Source: Neuroscience News
July 16, 2012
Can you teach an old dog (or human) new tricks? Yes, but it might take time, practice, and hard work before he or she gets it right, according to Hans Schroder and colleagues from Michigan State University in the US. Their work shows that when rules change, our attempts to control our actions are accompanied by a loss of attention to detail. Their work is published online in the Springer journal Cognitive, Affective, & Behavioral Neuroscience.
In order to adapt to changing conditions, humans need to be able to modify their behavior successfully. Overriding the rules we adhere to on a daily basis requires substantial attention and effort, and we do not always get it right the first time. When we switch between two or more tasks, we are slower and more likely to commit errors, which suggests switching tasks is a costly process. This may explain why it is so hard to learn from our mistakes when rules change.
The authors explain: “Switching the rules we use to perform a task makes us less aware of our mistakes. We therefore have a harder time learning from them. That’s because switching tasks is mentally taxing and costly, which leads us to pay less attention to the detail and therefore make more mistakes.”
A total of 67 undergraduates took part in the study. They were asked to wear a cap, which recorded electrical activity in the brain. They then performed a computer task that is easy to make mistakes on. Specifically, the participants were shown letter strings like “MMMMM” or “NNMNN” and were told to follow a simple rule: if ‘M’ is in the middle, press the left button; if ‘N’ is in the middle, press the right button. After they had followed this rule for almost 50 trials, they were instructed to perform the same task, but with the rules reversed i.e. now if ‘M’ is in the middle, press the right button; and if ‘N’ is in the middle, press the left button.
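The stimulus-response mapping in this task is simple enough to sketch in code. The snippet below is only an illustration of the task logic described above, not the authors' experimental software; the function name and button labels are chosen for this sketch.

```python
def correct_response(stimulus, reversed_rule=False):
    """Return the correct button press for a letter string.

    Original rule: middle letter 'M' -> 'left', 'N' -> 'right'.
    After the reversal, the mapping flips.
    """
    middle = stimulus[len(stimulus) // 2]  # the task depends only on the middle letter
    mapping = {"M": "left", "N": "right"}
    if reversed_rule:
        mapping = {"M": "right", "N": "left"}
    return mapping[middle]

# The two example strings from the article:
assert correct_response("MMMMM") == "left"
assert correct_response("NNMNN") == "left"   # middle letter is 'M', flankers are distractors
assert correct_response("NNMNN", reversed_rule=True) == "right"
```

Note that in strings like "NNMNN" the flanking letters conflict with the middle letter, which is what makes the task error-prone even before the rule reversal is introduced.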
When the rules were reversed, participants made more consecutive errors: they were more likely to get it wrong twice in a row, showing they were less apt to bounce back and learn from their mistakes. Reversing the rules also produced more control-related brain activity and less brain activity associated with error awareness.
These results suggest that when rules are reversed, our brain works harder to juggle the two rules - the new rule and the old rule - and stay focused on the new rule. When we spend brain energy juggling these two rules, we have less brain power available for recognizing our mistakes.
Provided by Springer
Source: medicalxpress.com
ScienceDaily (July 16, 2012) — Far from processing every word we read or hear, our brains often do not even notice key words that can change the whole meaning of a sentence, according to new research from the Economic and Social Research Council (ESRC).
After a plane crash, where should the survivors be buried?
If you are considering where the most appropriate burial place should be, you are not alone. Scientists have found that around half the people asked this question, answer it as if they were being asked about the victims not the survivors.
Similarly, when asked “Can a man marry his widow’s sister?” most people answer “yes” — effectively answering that it would indeed be possible for a dead man to marry his bereaved wife’s sister.
What makes researchers particularly interested in people’s failure to notice words that actually don’t make sense, so-called semantic illusions, is that these illusions challenge traditional models of language processing, which assume that we build understanding of a sentence by deeply analysing the meaning of each word in turn.
Instead semantic illusions provide a strong line of evidence that the way we process language is often shallow and incomplete.
Professor Leuthold at University of Glasgow led a study using electroencephalography (EEG) to explore what is happening in our brains when we process sentences containing semantic illusions.
By analysing the patterns of brain activity when volunteers read or listened to sentences containing hard-to-detect semantic anomalies — words that fit the general context even though they do not actually make sense — the researchers found that when a volunteer was tricked by the semantic illusion, their brain had not even noticed the anomalous word.
Analyses of brain activity also revealed that we are more likely to use this type of shallow processing under conditions of higher cognitive load — that is, when the task we are faced with is more difficult or when we are dealing with more than one task at a time.
The research findings not only provide a better understanding of the processes involved in language comprehension but, according to Professor Leuthold, knowing what is happening in the brain when mistakes occur can help us to avoid pitfalls, such as missing critical information in textbooks or legal documents, and to communicate more effectively.
There are a number of tricks we can use to make sure we get the correct message across: “We know that we process a word more deeply if it is emphasised in some way. So, for example in a news story, a newsreader can stress important words that may otherwise be missed and these words can be italicised to make sure we notice them when reading,” said Professor Leuthold.
The way we construct sentences can also help reduce misunderstandings, he explained: “It’s a good idea to put important information first because we are more likely to miss unusual words when they are near the end of a sentence. Also, we often use an active sentence construction such as ‘Bob ate the apple’ because we make far more mistakes answering questions about a sentence with a passive construction — for example ‘The apple was eaten by Bob’.”
The study findings also suggest that we should avoid multi-tasking when we are reading or listening to an important message: “For example, talking to someone on the phone while driving on a busy motorway or in town, or doing some homework while listening to the news might lead to more shallow processing,” said Professor Leuthold.
Source: Science Daily
July 16, 2012
A nationwide consortium of scientists at 20 institutions, led by a principal faculty member at the Harvard Stem Cell Institute (HSCI), has used stem cells to take a major step toward developing personalized medicine to treat Parkinson’s disease.

This study points the way to screening patients with Parkinson’s for their particular variation of the disease, and then treating them with drugs shown effective to work on that variation, rather than trying to treat all patients with the same drugs, as is generally done now, notes Ole Isacson, a leader of the study. Credit: B. D. Colen/Harvard Staff
In part supported by the Harvard Miller Consortium for the Development of Nervous System Therapies, the team of scientists created induced pluripotent stem cells (iPS cells) from the skin cells of patients and at-risk individuals carrying genetic mutations implicated in Parkinson’s disease, and used those cells to derive neural cells, providing a platform for studying the disease in human cells outside of patients.
In a paper published in the journal Science Translational Medicine, the researchers report that although approximately 15 genetic mutations are linked to forms of Parkinson’s, many seem to affect the mitochondria, the structures that produce most of a cell’s energy.
“This is the first comprehensive study of how human neuronal cells can be models of Parkinson’s, and how it might be treated,” said Ole Isacson, a leader of the study, an HSCI principal faculty member, and a Harvard Medical School professor of neurology, based at McLean Hospital’s Neuroregeneration Laboratory.
The researchers determined that certain compounds or drugs could reverse some signs of disease in cultured cells with specific genetic mutations, but not in cells with other types of mutations, lending substance to the idea of developing drugs tailored to patients or individuals at risk for particular genetic forms of Parkinson’s.
The study was launched with federal stimulus funding provided by the National Institutes of Health (NIH) and was continued with funding from HSCI.
“These findings suggest new opportunities for clinical trials of Parkinson’s disease, wherein cell reprogramming technology could be used to identify the patients most likely to respond to a particular intervention,” said Margaret Sutherland, a program director at NIH’s National Institute of Neurological Disorders and Stroke, in a press release.
The new research indicates that compounds that previously have shown promise in treating Parkinson’s in animal studies, including the antioxidant coenzyme Q10, together with the immunosuppressant rapamycin, have differing levels of effectiveness on various genetic forms of Parkinson’s.
Researchers hope that such findings can provide the basis for more specific drugs for individuals with sporadic forms of Parkinson’s.
As Isacson explained in an interview, this study points the way to screening patients with Parkinson’s for their particular variation of the disease, and then treating them with drugs shown effective to work on that variation, rather than trying to treat all patients with the same drugs, as is generally done now.
“We believe that using human stem cells to study the disease is the correct way to go,” Isacson said. “We have the cell type most vulnerable to the disease in a dish. We can study the most vulnerable cells and compare them to the least vulnerable cells. Traditionally, in neurology,” he said, “all patients with the same disease get the same drugs. But they may have the disease for different reasons. This gives us a way to tease out those different reasons, and find different ways to treat them.”
Provided by Harvard University
Source: medicalxpress.com
July 16, 2012
For more than 20 years, doctors have been using cells from blood that remains in the placenta and umbilical cord after childbirth to treat a variety of illnesses, from cancer and immune disorders to blood and metabolic diseases.

This microscope image shows a colony of neurons derived from cord-blood cells using stem cell reprogramming technology. The green and red glow indicates that the cells are producing protein markers found in neurons, evidence that the cord-blood cells did in fact morph into neurons. The blue glow marks the nuclei of the neurons. Credit: Image: Courtesy of Alessandra Giorgetti
Now, scientists at the Salk Institute for Biological Studies have found a new way, using a single protein known as a transcription factor, to convert cord blood (CB) cells into neuron-like cells that may prove valuable for the treatment of a wide range of neurological conditions, including stroke, traumatic brain injury and spinal cord injury.
The researchers demonstrated that these CB cells, which come from the mesoderm, the middle embryonic germ layer, can be switched to ectodermal cells, cells of the outer germ layer from which brain, spinal and nerve cells arise. “This study shows for the first time the direct conversion of a pure population of human cord blood cells into cells of neuronal lineage by the forced expression of a single transcription factor,” says Juan Carlos Izpisua Belmonte, a professor in Salk’s Gene Expression Laboratory, who led the research team. The study, a collaboration with Fred H. Gage, a professor in Salk’s Laboratory of Genetics, and his team, was published on July 16 in the Proceedings of the National Academy of Sciences.
"Unlike previous studies, where multiple transcription factors were necessary to convert skin cells into neurons, our method requires only one transcription factor to convert CB cells into functional neurons," says Gage.
The Salk researchers used a retrovirus to introduce Sox2, a transcription factor that acts as a switch in neuronal development, into CB cells. After culturing them in the laboratory, they discovered colonies of cells expressing neuronal markers. Using a variety of tests, they determined that the new cells, called induced neuronal-like cells (iNC), could transmit electrical impulses, signaling that the cells were mature and functional neurons. Additionally, they transferred the Sox2-infused CB cells to a mouse brain and found that they integrated into the existing mouse neuronal network and were capable of transmitting electrical signals like mature functional neurons.
"We also show that the CB-derived neuronal cells can be expanded under certain conditions and still retain the ability to differentiate into more mature neurons both in the lab and in a mouse brain," says Mo Li, a scientist in Belmonte’s lab and a co-first author on the paper with Alessandra Giorgetti, of the Center for Regenerative Medicine, in Barcelona, and Carol Marchetto of Gage’s lab. "Although the cells we developed were not of a specific lineage (for example, motor neurons or mid-brain neurons), we hope to generate clinically relevant neuronal subtypes in the future."
Importantly, says Marchetto, “We could use these cells in the future for modeling neurological diseases such as autism, schizophrenia, Parkinson’s or Alzheimer’s disease.”
Cord blood cells, says Giorgetti, offer a number of advantages over other types of stem cells. First, they are not embryonic stem cells and thus they are not controversial. They are more plastic, or flexible, than adult stem cells from sources like bone marrow, which may make them easier to convert into specific cell lineages. The collection of CB cells is safe and painless and poses no risk to the donor, and they can be stored in blood banks for later use.
"If our protocol is developed into a clinical application, it could aid in future cell-replacement therapies," says Li. "You could search all the cord blood banks in the country to look for a suitable match."
Provided by Salk Institute
Source: medicalxpress.com
ScienceDaily (July 16, 2012) — A team of scientists at The New York Stem Cell Foundation (NYSCF) Laboratory led by Scott Noggle, PhD, NYSCF-Charles Evans Senior Research Fellow for Alzheimer’s Disease, has developed the first cell-based model of Alzheimer’s disease (AD) by reprogramming skin cells of Alzheimer’s patients to become brain cells that are affected in Alzheimer’s. This will allow researchers to work directly on living brain cells suffering from Alzheimer’s, which until now had not been possible. Andrew Sproul, PhD, a postdoctoral associate in Dr. Noggle’s laboratory, will present this work on July 19 at the Alzheimer’s Association International Conference (AAIC) held in Vancouver.
Dr. Noggle and his team reprogrammed skin cell samples taken from twelve patients diagnosed with early-onset Alzheimer’s and from healthy, genetically related individuals into induced pluripotent stem (iPS) cells, which can differentiate into any cell type. The team of scientists used these iPS cells to create cholinergic basal forebrain neurons, the brain cells that are affected in Alzheimer’s. These cells recapitulate the features and cellular-level functions of patients suffering from Alzheimer’s, a devastating disease that affects millions of people globally but for which there is currently no effective treatment.
NYSCF has pioneered the creation of disease models based on the derivation of human cells. Four years ago, a NYSCF-funded team created a cell-based model for ALS, or motor neuron disease, the first patient-specific stem cells created for any disease. The cell-based model for Alzheimer’s builds on this earlier work.
"Patient-derived AD cells will prove invaluable for future research advances, as they already have with patient-derived ALS cells," said NYSCF CEO Susan Solomon. "They will be a critical tool in the drug discovery process, as potential drugs could be tested directly on these cells. Although research on animals has provided valuable insight into AD, we aren’t mice, and animals don’t properly reflect the features of the disease we are trying to cure. As we work to find new drugs and treatments, our research should focus on actual human sufferers of Alzheimer’s disease," emphasized Ms. Solomon.
This cell-based model has already led to important findings. Preliminary results of this NYSCF research, done in collaboration with Sam Gandy, MD, PhD, an international expert in the pathology of Alzheimer’s at Mount Sinai School of Medicine, demonstrated differences in cellular function in Alzheimer’s patients. Specifically, Alzheimer’s neurons produce more of the toxic form of beta amyloid, the protein fragment that makes up amyloid plaques, than disease-free neurons do.
"iPS cell technology, along with whole genome sequencing, provides our best chance at unravelling the causes of common forms of Alzheimer’s disease," noted Dr. Gandy.
"This collaboration is a great example of how NYSCF is bringing together experts in stem cell technology and clinicians to save and enhance lives by finding better treatments," Ms. Solomon explained.
The research to be reported at the AAIC by Andrew Sproul focused on stem cell models of individuals with presenilin-1 (PSEN1) mutations, a genetic cause of AD. As Dr. Sproul has said, this cell-based model could “revolutionize how we discover drugs to potentially cure Alzheimer’s disease.”
Source: Science Daily
July 16, 2012
Mayo Clinic researchers who created a map of the brain have found that activity lingers longer in certain areas of the brain in people with Alzheimer’s than it does in healthy people. The results suggest that varying brain activity may reduce the risk of Alzheimer’s disease. The study, “Non-stationarity in the ‘Resting Brain’s’ Modular Architecture,” was presented at the Alzheimer’s Association International Conference and recently published in the journal PLoS One.
Researchers compared brain activity to a complex network, with multiple objects sharing information along pathways.
"Our understanding of those objects and pathways is limited," says lead author David T. Jones, M.D. "There are regions in the brain that correspond to those objects, and we are not really clear exactly what those are. We need a good mapping or atlas of those regions that make up the network in the brain, which is part of what we were doing in this study."
Researchers examined 892 cognitively normal people taking part in the Mayo Clinic Study of Aging, and set out to create an active map of their brains while the participants were “at rest,” not engaged in a specific task. To do this, they employed task-free functional magnetic resonance imaging to construct an atlas of 68 functional regions of the brain, analogous to the cities on a road map.
Researchers filled in the roads between these regions by creating dynamic graphic representations of brain connectivity within a sliding time window.
This analysis revealed that there are many roads that can be used to exchange information in the brain, and that the brain uses different roads at different times. The question to answer then, said Dr. Jones, is whether Alzheimer’s patients use this map and these roads differently than their healthy peers do.
"What we found in this study was that Alzheimer’s patients tended to spend more time using some roads and less time using other roads, biasing one over the other," he says.
While more research is needed, the researchers say one implication is that how we use our brains may protect us from Alzheimer’s. Dr. Jones says exercise, education, and social contacts may help balance activity in the brain.
"Diversifying the mental space that you explore may actually decrease your risk for Alzheimer’s," he says.
Provided by Mayo Clinic
Source: medicalxpress.com