Neuroscience

Articles and news from the latest research reports.

Posts tagged science

14 notes

Front-most part of the cortex involved in making short-term predictions about what will happen next

June 19, 2012

Researchers at the University of Iowa, together with colleagues from the California Institute of Technology and New York University, have discovered how a part of the brain helps predict future events from past experiences. The work sheds light on the function of the front-most part of the frontal lobe, known as the frontopolar cortex, an area of the cortex uniquely well developed in humans in comparison with apes and other primates.

The image shows the overlap of lesions for eight subjects superimposed on a template brain — red indicates maximum overlap (seven subjects) and dark blue is minimum overlap (one subject). The patient group was selected for lesions that include frontopolar cortex, but the lesions almost invariably extended outside to other parts of anterior prefrontal cortex. Credit: Christopher Kovach, University of Iowa

Making the best possible decisions in a changing and unpredictable environment is an enormous challenge. Not only does it require learning from past experience, but it also demands anticipating what might happen under previously unencountered circumstances. Past research from the UI Department of Neurology was among the first to show that damage to certain parts of the frontal lobe can cause severe deficits in decision making in rapidly changing environments. The new study from the same department on a rare group of patients with damage to the very frontal part of their brains reveals a critical aspect of how this area contributes to decision making. The findings were published June 19 in the Journal of Neuroscience.

"We gave the patients four slot machines from which to pick in order to win money. Unbeknownst to the patients, the probability of getting money from a particular slot machine gradually and unpredictably changed during the experiment. Finding the strategy that pays the most in the long run is a surprisingly difficult problem to solve, and one we hypothesized would require the frontopolar cortex,” explains Christopher Kovach, Ph.D., a UI post-doctoral fellow in neurosurgery and first author of the study.

Contrary to the authors’ initial expectation, the patients actually did quite well on the task, winning as much money, on average, as healthy control participants.

"But when we compared their behavior to that of subjects with intact frontal lobe, we found they used a different set of assumptions about how the payoffs changed over time,” Kovach says. “Both groups based their decisions on how much they had recently won from each slot machine, but healthy comparison subjects pursued a more elaborate strategy, which involved predicting the direction that payoffs were moving based on recent trends. This points towards a specific role for the frontopolar cortex in extrapolating recent trends.”

Kovach’s colleague and study author Ralph Adolphs, Ph.D., professor of neuroscience and psychology at the California Institute of Technology, adds that the study results “argue that the frontopolar cortex helps us to make short-term predictions about what will happen next, a strategy particularly useful in environments that change rapidly — such as the stock market or most social settings.”

Adolphs also holds an adjunct appointment in the UI Department of Neurology.

The study’s innovative approach to understanding the function of this part of the brain uses model-based analyses of the behavior of patients with specific and precisely characterized areas of brain damage. These patients are members of the UI’s world-renowned Iowa Neurological Patient Registry, which was established in 1982 and has more than 500 active members with selective forms of damage, or lesions, to one or two defined regions in the brain.

"The University of Iowa is one of the few places in the world where you could carry out this kind of study, since it requires carefully assessed patients with damage to specific parts of their brain," says study author Daniel Tranel, Ph.D., UI professor of neurology and psychology and director of the UI Division of Behavioral Neurology and Cognitive Neuroscience.

In a final twist to the findings, the strategy taken by the lesion patients was actually slightly better than the one used by the comparison subjects: the task was designed so that the trends in the payoffs were, in fact, random and uninformative.

"The healthy comparison subjects seemed to perceive trends in what was just random noise," Kovach says.

This implies that the functions of the frontopolar cortex, which support more complex and detailed models of the environment, at times come with a downside: setting up mistaken assumptions.
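
For readers who want to see the logic of the task, here is a minimal sketch of a "restless" four-armed bandit in the spirit of the experiment. Everything in it (the learning rate, the drift size, and the two update rules) is an illustrative assumption rather than the study's actual model; it simply contrasts a learner that tracks recent payoffs with one that also extrapolates recent trends.

    import numpy as np

    rng = np.random.default_rng(0)

    def play_bandit(n_trials=200, n_arms=4, use_trend=False, alpha=0.3, drift=0.03):
        """Restless bandit: each arm's payoff probability drifts as a random walk.
        The learner tracks a recency-weighted value per arm; optionally it also
        extrapolates a smoothed estimate of each arm's recent trend."""
        p = rng.uniform(0.2, 0.8, n_arms)   # hidden payoff probabilities
        value = np.full(n_arms, 0.5)        # learned value per arm
        trend = np.zeros(n_arms)            # smoothed recent prediction error
        winnings = 0.0
        for _ in range(n_trials):
            estimate = value + trend if use_trend else value
            arm = int(np.argmax(estimate))
            reward = float(rng.random() < p[arm])
            winnings += reward
            err = reward - value[arm]
            value[arm] += alpha * err                            # delta-rule update
            trend[arm] = (1 - alpha) * trend[arm] + alpha * err  # trend estimate
            p = np.clip(p + drift * rng.standard_normal(n_arms), 0.05, 0.95)
        return winnings

    for label, flag in [("values only", False), ("values + trend", True)]:
        runs = [play_bandit(use_trend=flag) for _ in range(500)]
        print(f"{label}: mean winnings {np.mean(runs):.1f}")

Because the payoffs here follow a pure random walk, the trend term carries no real signal, so on average the fancier learner gains nothing, mirroring what the study found in its healthy comparison subjects.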

"To the best of my knowledge this is the first study which links a normal tendency to see a nonexistent pattern in random noise, a type of cognitive bias, to a particular brain region," Kovach notes.

The researchers next want to investigate other parts of the frontal cortex, and have also begun to record activity directly from the brains of neurosurgical patients to see how single cells respond during decision making. The work is also important for understanding the difficulties in decision making seen in disorders such as addiction.

Provided by University of Iowa

Source: medicalxpress.com

Filed under science neuroscience brain psychology

10 notes

First example of a heritable abnormality affecting semantic cognition found

June 19, 2012

Four generations of a single family have been found to possess an abnormality within a specific brain region which appears to affect their ability to recall verbal material, a new study by researchers at the University of Bristol and University College London has found.

This is the first suggestion of such a heritable abnormality in otherwise healthy humans, and it has important implications for our understanding of the genetic basis of cognition.

Dr Josie Briscoe of Bristol’s School of Experimental Psychology and colleagues at the Institute of Child Health in London studied eight members of a single family, spanning a wide range of ages, who, despite all having high levels of intelligence, have since childhood experienced profound difficulties in recalling sentences and prose, along with language difficulties in listening comprehension and in naming less common objects.

While their conversation is articulate and engaging, they can experience the inability to ‘find’ a particular word or topic – a phenomenon similar to the ‘tip-of-the-tongue’ problem experienced by many people. They also report associated problems such as struggling to follow a narrative thread while reading or watching television drama.

Dr Briscoe said: “With their consent, we conducted a number of standard memory and language tests on the affected members of the family. These showed they had difficulty repeating longer sentences correctly and learning words in lists and pairs. This suggests their difficulties lie in semantic cognition: the way people construct and generate meaning from words, objects and ideas.”

"Given the very wide variation in age, the coherence of their difficulties in semantic cognition was remarkable."

The researchers also used Magnetic Resonance Imaging (MRI) to study the brains of the affected family members and found they had reduced grey matter in the posterior inferior portion of the temporal lobe, a brain area known to be involved in semantic cognition.

Dr Briscoe said: “These brain abnormalities were surprising to find in healthy people, particularly in the same family, although similar brain regions have been implicated in research with older adults with neurological problems that are linked to semantic cognition.”

"Our findings have uncovered a potential causal link between anomalous neuroanatomy and semantic cognition in a single family. Importantly, the pattern of inheritance appears as a potentially dominant trait. This may well prove to be the first example of a heritable, highly specific abnormality affecting semantic cognition in humans.”

Provided by University of Bristol

Source: medicalxpress.com

Filed under science neuroscience brain psychology

11 notes

'Hallucinating' robots arrange objects for human use

June 18, 2012 By Bill Steele

(Phys.org) — If you hire a robot to help you move into your new apartment, you won’t have to send out for pizza. But you will have to give the robot a system for figuring out where things go. The best approach, according to Cornell researchers, is to ask “How will humans use this?”

A robot populates a room with imaginary human stick figures in order to decide where objects should go to suit the needs of humans.

Researchers in the Personal Robotics Lab of Ashutosh Saxena, assistant professor of computer science, have already taught robots to identify common objects, pick them up and place them stably in appropriate locations. Now they’ve added the human element by teaching robots to “hallucinate” where and how humans might stand, sit or work in a room, and place objects in their usual relationship to those imaginary people.

Their work will be reported at the International Symposium on Experimental Robotics, June 21 in Quebec, and the International Conference on Machine Learning, June 29 in Edinburgh, Scotland.

Previous work on robotic placement, the researchers note, has relied on modeling relationships between objects. A keyboard goes in front of a monitor, and a mouse goes next to the keyboard. But that doesn’t help if the robot puts the monitor, keyboard and mouse at the back of the desk, facing the wall.

Above left, random placing of objects in a scene puts food on the floor, shoes on the desk and a laptop teetering on the top of the fridge. Considering the relationships between objects (upper right) is better, but the laptop is facing away from a potential user and the food is higher than most humans would like. Adding human context (lower left) makes things more accessible. Lower right: how an actual robot carried it out. (Personal Robotics Lab)

Relating objects to humans not only avoids such mistakes but also makes computation easier, the researchers said, because each object is described in terms of its relationship to a small set of human poses, rather than to the long list of other objects in a scene. A computer learns these relationships by observing 3-D images of rooms with objects in them, in which it imagines human figures, placing them in practical relationships with objects and furniture. You don’t put a sitting person where there is no chair. You can put a sitting person on top of a bookcase, but there are no objects there for the person to use, so that possibility is ignored. The computer calculates the distance of objects from various parts of the imagined human figures, and notes the orientation of the objects.

Eventually it learns commonalities: There are lots of imaginary people sitting on the sofa facing the TV, and the TV is always facing them. The remote is usually near a human’s reaching arm, seldom near a standing person’s feet. “It is more important for a robot to figure out how an object is to be used by humans, rather than what the object is. One key achievement in this work is using unlabeled data to figure out how humans use a space,” Saxena said.

In a new situation, the robot places human figures in a 3-D image of a room, locating them in relation to the objects and furniture already there. “It puts a sample of human poses in the environment, then figures out which ones are relevant and ignores the others,” Saxena explained. It decides where new objects should be placed in relation to the human figures, and carries out the action.
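
As a rough sketch of the idea (not the authors' actual algorithm), one can score a candidate placement spot by how close it falls to a comfortable reaching distance from the nearest sampled human pose. The pose samples, the preferred distance, and the Gaussian scoring rule below are all invented for illustration.

    import numpy as np

    rng = np.random.default_rng(1)

    def placement_score(spot, poses, preferred_dist=0.6, tol=0.3):
        """Score a candidate spot: highest when it sits near the preferred
        distance (in meters) from the closest imagined human pose."""
        d = np.linalg.norm(poses - spot, axis=1).min()
        return np.exp(-((d - preferred_dist) ** 2) / (2 * tol ** 2))

    # Sample imagined human poses on the floor plan of a 4 m x 4 m room,
    # then evaluate candidate spots where the object could be placed.
    poses = rng.uniform(0.0, 4.0, size=(20, 2))
    spots = rng.uniform(0.0, 4.0, size=(50, 2))

    best = max(spots, key=lambda s: placement_score(s, poses))
    print("best placement (x, y):", best)

In the real system the poses come from learned 3-D scene models and are filtered for relevance; the sketch keeps only the core idea of relating an object to a handful of human poses instead of to every other object.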

The researchers tested their method using images of living rooms, kitchens and offices from the Google 3-D Warehouse, and later, images of local offices and apartments. Finally, they programmed a robot to carry out the predicted placements in local settings. Volunteers who were not associated with the project rated the placement of each object for correctness on a scale of 1 to 5.

Comparing various algorithms, the researchers found that placements based on human context were more accurate than those based solely on relationships between objects, but the best results of all came from combining human context with object-to-object relationships, with an average score of 4.3. Some tests were done in rooms with furniture and some objects, others in rooms where only a major piece of furniture was present. The object-only method performed significantly worse in the latter case because there was no context to use. “The difference between previous works and our [human to object] method was significantly higher in the case of empty rooms,” Saxena reported.

Provided by Cornell University

Source: phys.org

Filed under science neuroscience robotics

13 notes

Robots Get a Feel for the World

June 18, 2012

Robots equipped with tactile sensors are able to identify materials through touch, paving the way for more useful prostheses.

What does a robot feel when it touches something? Little or nothing until now. But with the right sensors, actuators and software, robots can be given the sense of feel, or at least the ability to identify different materials by touch.

Researchers at the University of Southern California’s Viterbi School of Engineering published a study today in Frontiers in Neurorobotics showing that a specially designed robot can outperform humans in identifying a wide range of natural materials according to their textures, paving the way for advancements in prostheses, personal assistive robots and consumer product testing.

The robot was equipped with a new type of tactile sensor built to mimic the human fingertip. It also used a newly designed algorithm to make decisions about how to explore the outside world by imitating human strategies. The sensor is also capable of other human-like sensations: it can tell where and in which direction forces are applied to the fingertip, and even sense the thermal properties of an object being touched.

Like the human finger, the group’s BioTac® sensor has a soft, flexible skin over a liquid filling. The skin even has fingerprints on its surface, greatly enhancing its sensitivity to vibration. As the finger slides over a textured surface, the skin vibrates in characteristic ways. These vibrations are detected by a hydrophone inside the bone-like core of the finger. The human finger uses similar vibrations to identify textures, but the robot finger is even more sensitive.

[Video: Robots Get a Feel for the World]
What does a robot feel when it touches something? Little or nothing until now. Researchers at the USC Viterbi School of Engineering published a study in Frontiers in Neurorobotics showing that specially designed robots can be taught to feel even more than humans. Vimeo video by USC Viterbi.

When humans try to identify an object by touch, they use a wide range of exploratory movements based on their prior experience with similar objects. A famous theorem by 18th century mathematician Thomas Bayes describes how decisions might be made from the information obtained during these movements. Until now, however, there was no way to decide which exploratory movement to make next. The article, authored by Professor of Biomedical Engineering Gerald Loeb and recently graduated doctoral student Jeremy Fishel, describes their new theorem for solving this general problem as “Bayesian Exploration.”
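
A toy version of that decision loop might look like the sketch below. It is only a guess at the structure (the feature model, noise level, and movement set are made up, not BioTac data): maintain a posterior over candidate materials and, before each touch, pick the movement whose simulated outcome is expected to shrink the posterior's entropy the most.

    import numpy as np

    rng = np.random.default_rng(2)

    # Hypothetical model: each movement m yields a noisy scalar feature whose
    # mean depends on the true material (learned during training).
    n_materials, n_moves = 5, 3
    means = rng.uniform(0, 10, size=(n_moves, n_materials))
    sigma = 1.0

    def likelihood(x, m):
        """Likelihood of feature x under each material, for movement m."""
        return np.exp(-(x - means[m]) ** 2 / (2 * sigma ** 2))

    def expected_entropy(belief, m, n_sim=200):
        """Expected posterior entropy if movement m is performed next."""
        h = 0.0
        for _ in range(n_sim):
            hypo = rng.choice(n_materials, p=belief)            # sample a hypothesis
            x = means[m, hypo] + sigma * rng.standard_normal()  # simulate an outcome
            post = belief * likelihood(x, m)
            post /= post.sum()
            h -= (post * np.log(post + 1e-12)).sum() / n_sim
        return h

    belief = np.full(n_materials, 1.0 / n_materials)
    true_material = 3
    for _ in range(5):  # about five exploratory movements, as in the study
        m = min(range(n_moves), key=lambda mv: expected_entropy(belief, mv))
        x = means[m, true_material] + sigma * rng.standard_normal()
        belief *= likelihood(x, m)
        belief /= belief.sum()

    print("identified material:", int(belief.argmax()),
          "confidence:", round(float(belief.max()), 2))

Picking the movement that minimizes expected posterior entropy is one standard reading of "Bayesian exploration"; the paper's actual criterion may differ in detail.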

Built by Fishel, the specialized robot was trained on 117 common materials gathered from fabric, stationery and hardware stores. When confronted with one material at random, the robot could correctly identify the material 95% of the time, after intelligently selecting and making an average of five exploratory movements. It was only rarely confused by pairs of similar textures that human subjects making their own exploratory movements could not distinguish at all.

Tactile sensors that mimic fingertips enable robots to identify materials through touch better than humans can. Image from press release by USC Viterbi School of Engineering.

So, is touch another task that humans will outsource to robots? Fishel and Loeb point out that while their robot is very good at identifying which textures are similar to each other, it has no way to tell what textures people will prefer. Instead, they say this robot touch technology could be used in human prostheses or to assist companies who employ experts to assess the feel of consumer products and even human skin.

Source: Neuroscience News

Filed under science neuroscience robotics

11 notes

Children, Brain Development and the Criminal Law

ScienceDaily (June 18, 2012) — The legal system needs to take greater account of new discoveries in neuroscience showing how a difficult childhood can affect the development of a young person’s brain, which can increase the risk of adolescent crime, according to researchers.

The research will be presented as part of an Economic and Social Research Council seminar series in conjunction with the Parliamentary Office of Science and Technology.

Neuroscientists have recently shown that early adversity — such as a very chaotic and frightening home life — can result in a young child becoming hypervigilant to potential threats in their environment. This appears to influence the development of brain connectivity and functions.

Such children may come to adolescence with brain systems that are set differently, and this may increase their likelihood of taking impulsive risks. For many young offenders such early adversity is a common experience, and it may increase both their vulnerability to mental health problems and also their risk of problem behaviours.

These insights, from a team led by Dr Eamon McCrory, University College London, are part of a wave of neuroscientific research questions that have potential implications for the legal system.

Other research by Dr Seena Fazel of Oxford University has shown that while social disadvantage is a major risk factor for offending, a Traumatic Brain Injury (TBI) — from an accident or assault — significantly increases the risk of involvement in violent crime. Professor Huw Williams of the University of Exeter has similarly shown that around 45 per cent of young offenders have TBI histories, and that more injuries are associated with greater violence.

Professor Williams said: “The latest message from neuroscience is that young people who suffer troubled childhoods may experience a kind of ‘triple whammy’. A difficult social background may put them at greater risk of offending and influence their brain development early on in childhood in a way that increases risky behaviour. This can then increase their chances of experiencing an injury to their brains that would compromise their ability to stay in school or contribute to society still further.”

Professor Williams wants to see better communication between neuroscientists, clinicians and lawyers so that research findings like these lead to changes in the legal system. “There is a big gap between research conducted by neuroscientists and the realities of the day to day work of the justice system,” he said. “Although criminal behaviour results from a complex interplay of a host of factors, neuroscientists and clinicians are identifying key risk factors that — if addressed — could reduce crime. Investment in earlier, focussed interventions may offset the costs of years of custody and social violence.”

Dr Eileen Vizard, a prominent adolescent forensic psychiatrist, will talk at the event Neuroscience, Children and the Law about how the criminal justice system needs to be changed to provide age-appropriate sentencing for children as young as ten years old, whilst also providing for the welfare needs of these deprived children. Laura Hoyano — a leading expert on vulnerable people in criminal courts — will discuss the problems children face when testifying in criminal courts.

Source: Science Daily

Filed under science neuroscience psychology brain

13 notes

Clues to Nervous System Evolution Found in Nerve-Less Sponge

ScienceDaily (June 18, 2012) — UC Santa Barbara scientists turned to the simple sponge to find clues about the evolution of the complex nervous system and found that, but for a mechanism that coordinates the expression of genes that lead to the formation of neural synapses, sponges and the rest of the animal world may not be so distant after all. Their findings, titled “Functionalization of a protosynaptic gene expression network,” are published in the Proceedings of the National Academy of Sciences.

The genes of Amphimedon queenslandica, a marine sponge native to the Great Barrier Reef, Australia, have been fully sequenced, allowing the researchers to monitor gene expression for signs of neural development. (Credit: UCSB)

"If you’re interested in finding the truly ancient origins of the nervous system itself, we know where to look," said Kenneth Kosik, Harriman Professor of Neuroscience Research in the Department of Molecular, Cellular & Developmental Biology, and co-director of UCSB’s Neuroscience Research Institute.

That place, said Kosik, is the evolutionary period of time when virtually the rest of the animal kingdom branched off from a common ancestor it shared with sponges, the oldest known animal group with living representatives. Something must have happened to spur the evolution of the nervous system, a characteristic shared by creatures ranging from simple jellyfish and hydra to complex humans, according to Kosik.

A previous sequencing of the genome of Amphimedon queenslandica — a sponge that lives in Australia’s Great Barrier Reef — showed that it contained the same genes that lead to the formation of synapses, the highly specialized characteristic component of the nervous system that sends chemical and electrical signals between cells. Synapses are like microprocessors, said Kosik, explaining that they carry out many sophisticated functions: They send and receive signals, and they also change behaviors with interaction — a property called “plasticity.”

"Specifically, we were hoping to understand why the marine sponge, despite having almost all the genes necessary to build a neuronal synapse, does not have any neurons at all," said the paper’s first author, UCSB postdoctoral researcher Cecilia Conaco, from the UCSB Department of Molecular, Cellular, and Developmental Biology (MCDB) and Neuroscience Research Institute (NRI). "In the bigger scheme of things, we were hoping to gain an understanding of the various factors that contribute to the evolution of these complex cellular machines."

This time the scientists, including Danielle Bassett, from the Department of Physics and the Sage Center for the Study of the Mind, and Hongjun Zhou and Mary Luz Arcila, from NRI and MCDB, examined the sponge’s RNA (ribonucleic acid), a macromolecule that controls gene expression. They followed the activity of the genes that encode for the proteins in a synapse throughout the different stages of the sponge’s development.

"We found a lot of them turning on and off, as if they were doing something," said Kosik. However, compared to the same genes in other animals, which are expressed in unison, suggesting a coordinated effort to make a synapse, the ones in sponges were not coordinated.

"It was as if the synapse gene network was not wired together yet," said Kosik. The critical step in the evolution of the nervous system as we know it, he said, was not the invention of a gene that created the synapse, but the regulation of preexisting genes that were somehow coordinated to express simultaneously, a mechanism that took hold in the rest of the animal kingdom.

The work isn’t over, said Kosik. Plans for future research include a deeper look at some of the steps that lead to the formation of the synapse, and a study of the changes in nervous systems after they began to evolve.

"Is the human brain just a lot more of the same stuff, or has it changed in a qualitative way?" he asked.

Source: Science Daily

Filed under science neuroscience evolution psychology nervous system

2 notes

Diabetes, poor glucose control associated with greater cognitive decline in older adults

June 18, 2012

Among well-functioning older adults without dementia, diabetes mellitus (DM) and poor glucose control among those with DM are associated with worse cognitive function and greater cognitive decline, according to a report published Online First by Archives of Neurology, a JAMA Network publication.

Findings from previous studies have suggested an association between diabetes mellitus and an increased risk of cognitive impairment and dementia, including Alzheimer disease, but this association continues to be debated and less is known regarding incident DM in late life and cognitive function over time, the authors write as background in the study.

Kristine Yaffe, M.D., of the University of California, San Francisco and the San Francisco VA Medical Center, and colleagues evaluated 3,069 patients (mean age, 74.2 years; 42 percent black; 52 percent female) who completed the Modified Mini-Mental State Examination (3MS) and Digit Symbol Substitution Test (DSST) at baseline and selected intervals over 10 years.

At study baseline, 717 patients (23.4 percent) had prevalent DM and 2,352 (76.6 percent) were without DM, 159 of whom developed DM during follow-up. Patients with prevalent DM at baseline had lower 3MS and DSST scores than patients without DM, and the analysis showed similar patterns for 9-year decline, with participants with prevalent DM showing significant decline on both the 3MS and DSST compared with those without DM.

Also, among participants with prevalent DM at baseline, higher levels of hemoglobin A1c (HbA1c) were associated with lower 3MS and DSST scores. However, after adjusting for age, sex, race and education, scores remained significantly lower for those with mid (7 percent to 8 percent) and high (greater than or equal to 8 percent) HbA1c levels on the 3MS but were no longer significant for the DSST.

"This study supports the hypothesis that older adults with DM have reduced cognitive function and that poor glycemic control may contribute to this association,” the authors conclude. “Future studies should determine if early diagnosis and treatment of DM lessen the risk of developing cognitive impairment and if maintaining optimal glucose control helps mitigate the effect of DM on cognition.”

Provided by JAMA and Archives Journals

Source: medicalxpress.com

Filed under science neuroscience brain alzheimer

13 notes

Highways of the brain: High-cost and high-capacity

June 18, 2012

A new study proposes a communication routing strategy for the brain that mimics the American highway system, with the bulk of the traffic leaving the local and feeder neural pathways to spend as much time as possible on the longer, higher-capacity passages through an influential network of hubs, the so-called rich club.

The study, published this week online in the Early Edition of the Proceedings of the National Academy of Sciences, involves researchers from Indiana University and the University Medical Center Utrecht in the Netherlands and advances their earlier findings that showed how select hubs in the brain not only are powerful in their own right but have numerous and strong connections between each other.

The current study characterizes the influential network within the rich club as the “backbone” for global brain communication. It is a costly network in terms of the energy and space consumed, said Olaf Sporns, professor in the Department of Psychological and Brain Sciences at IU Bloomington, but one with a big pay-off: providing quick and effective communication between billions and billions of brain cells.

"Until now, no one knew how central the brain’s rich club really was," Sporns said. "It turns out the rich club is always right in the middle when it comes to how brain regions talk to each other. It absorbs, transforms and disseminates information. This underscores its importance for brain communication.”

In earlier work, using diffusion imaging, the researchers found a group of 12 strongly interconnected bihemispheric hub regions, comprising the precuneus, superior frontal and superior parietal cortex, as well as the subcortical hippocampus, putamen and thalamus. Together, these regions form the brain’s “rich club.” Most of these areas are engaged in a wide range of complex behavioral and cognitive tasks, rather than more specialized processing such as vision and motor control.

For the current study, Martijn van den Heuvel, a professor at the Rudolf Magnus Institute of Neuroscience at University Medical Center Utrecht, used diffusion tensor imaging data for two sets of 40 healthy subjects to map the large-scale connectivity structure of the brain. The cortical sheet was divided into 1,170 regions, and then pathways between the regions were reconstructed and measured. As in the previous study, the rich club nodes were widely distributed and had up to 40 percent more connectivity compared to other areas.

The connections measured — almost 700,000 in total — were classified in one of three ways: as rich club connections if they connected nodes within the rich club; as feeder connections if they connected a non-rich club node to a rich club node; and as local connections if they connected non-rich club nodes. Rich club connections made up the majority of all long-distance connections. The study also found that connections classified as rich club connections were used more heavily for communication than feeder and local connections. A path analysis showed that when a minimally short path is traced from one area of the brain to another, it travels through the rich club network 69 percent of the time, even though the network accounts for only 10 percent of the brain.
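
The same classification and path counting can be sketched with a generic graph library. The toy network below is a stand-in for the diffusion-imaging connectome, with its six highest-degree nodes playing the role of the rich club; all sizes are invented for illustration.

    import networkx as nx
    from collections import Counter

    # Toy stand-in for the connectome: a small-world graph whose
    # highest-degree nodes play the role of the "rich club".
    G = nx.connected_watts_strogatz_graph(60, 4, 0.2, seed=3)
    rich_club = set(sorted(G.nodes, key=G.degree, reverse=True)[:6])

    def edge_class(u, v):
        """Classify an edge by how many endpoints lie in the rich club."""
        inside = (u in rich_club) + (v in rich_club)
        return {2: "rich club", 1: "feeder", 0: "local"}[inside]

    print(Counter(edge_class(u, v) for u, v in G.edges))

    # Fraction of shortest paths passing through at least one rich-club node.
    pairs = [(u, v) for u in G for v in G if u < v]
    hits = sum(any(n in rich_club for n in nx.shortest_path(G, u, v))
               for u, v in pairs)
    print(f"{100 * hits / len(pairs):.0f}% of shortest paths touch the rich club")

On a real connectome the edges would be weighted by measured connection strength and the rich club defined by the twelve hub regions described above, but the bookkeeping is the same.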

A common pattern in communication paths spanning long distances, Sporns said, was that such paths involved sequences of steps leading across local, feeder, rich club, feeder and back to local connections. In other words, he said, many communication paths first traveled toward the rich club before reaching their destinations.

"It is as if the rich club acts as an attractor for signal traffic in the brain," Sporns said. "It soaks up information which is then integrated and sent back out to the rest of the brain."

Van den Heuvel agreed.

"It’s like a big ‘neuronal magnet’ for communication and information integration in our brains," he said. "Seeking out the rich club may offer a strategy for neurons and brain regions to find short communication paths across the brain, and might provide insight into how our brain manages to be so highly efficient."

From an evolutionary standpoint, it was important for the brain to minimize energy consumption and wiring volume, but if these were the only factors, there would be no rich club because of the extra resources it requires, Sporns said. The rich club is expensive, at least in terms of wiring volume, and perhaps also in terms of metabolic cost. The trade-off for higher cost, Sporns said, is higher performance — the integration of diverse signals and the ability to select short paths across the network.

Brain neurons don’t have maps; how do they find paths to get in touch? Perhaps the rich club helps with this, offering the brain’s neurons and regions a way to communicate efficiently based on a routing strategy that involves the rich club.

People use related strategies to navigate social networks.

"Strangely, neurons may solve their communication problems just like the people to which they belong," Sporns said.

Provided by Indiana University

Source: medicalxpress.com

Filed under science neuroscience brain psychology

11 notes

Coenzyme Q10 study indicates promise in Huntington’s treatment

June 18, 2012

A new study shows that the compound Coenzyme Q10 (CoQ) reduces oxidative damage, a key finding that hints at its potential to slow the progression of Huntington disease. The discovery, which appears in the inaugural issue of the Journal of Huntington’s Disease, also points to a new biomarker that could be used to screen experimental treatments for this and other neurological disorders.

"This study supports the hypothesis that CoQ exerts antioxidant effects in patients with Huntington’s disease and therefore is a treatment that warrants further study," says University of Rochester Medical Center Kevin M. Biglan, M.D., M.P.H., lead author of the study. “As importantly, it has provided us with a new method to evaluate the efficacy of potential new treatments.”

Huntington’s disease (HD) is a genetic, progressive neurodegenerative disorder that impacts movement, behavior and cognition, and generally results in death within 20 years of the disease’s onset. While the precise causes and mechanism of the disease are not completely understood, scientists believe that one of its important triggers is a genetic mutation that produces protein deposits in brain cells. It is believed that these deposits – through a chain of molecular events – inhibit the cell’s ability to meet its energy demands, resulting in oxidative stress and, ultimately, cellular death.

Scientists had previously identified a correlation between a specific byproduct of oxidative damage to DNA, called 8-hydroxy-2’-deoxyguanosine (8OHdG), and the presence of oxidative stress in brain cells. 8OHdG can be detected in a person’s blood, meaning that it could serve as a convenient and accessible biomarker for the disease. Researchers have also been evaluating the compound CoQ as a possible treatment for HD because of its ability to support the function of mitochondria – the tiny power plants that provide cells with energy – and counter oxidative stress.

The study’s authors evaluated a series of blood samples from 20 individuals with HD who had previously undergone treatment with CoQ in a clinical trial titled Pre-2Care. While that trial showed that CoQ alleviated some symptoms of the disease, it was not known what impact – if any – the treatment had at the molecular level in the brain. Upon analysis, the authors found that 8OHdG levels dropped by 20 percent in individuals who had been treated with CoQ.

CoQ is currently being evaluated in a Phase 3 clinical trial, which is the largest therapeutic clinical study to date for HD. The trial – called 2Care – is being run by the Huntington Study Group, an international network of investigators.

"Identifying treatments that slow the progression or delay the onset of Huntington’s disease is a major focus of the medical community," said Biglan. "This study demonstrates that 80HdG could be an ideal marker to identify the presence oxidative injury and whether or not treatment is having an impact."

Provided by University of Rochester Medical Center

Source: medicalxpress.com

Filed under science neuroscience brain huntington psychology

10 notes

Device implanted in brain has therapeutic potential for Huntington’s disease

June 18, 2012

Studies suggest that neurotrophic factors, which play a role in the development and survival of neurons, have significant therapeutic and restorative potential for neurologic diseases such as Huntington’s disease. However, clinical applications are limited because these proteins cannot easily cross the blood brain barrier, have a short half-life, and cause serious side effects. Now, a group of scientists has successfully treated neurological symptoms in laboratory rats by implanting a device to deliver a genetically engineered neurotrophic factor directly to the brain. They report on their results in the latest issue of Restorative Neurology and Neuroscience.

The tip of the EC biodelivery system, a straw-like device that is implanted in the brain of patients, contains living cells which are genetically modified to produce a therapeutic factor. The membrane enclosing the cells allows the factor to flow out of the device and into the patient’s brain tissue. This way, areas deep within the brain affected by Huntington’s disease can be treated to delay or prevent the disease. Credit: Jens Tornøe, NsGene A/S, Ballerup, Denmark

Researchers used Encapsulated Cell (EC) biodelivery, a platform which can be applied using conventional minimally invasive neurosurgical procedures to target deep brain structures with therapeutic proteins. “Our study adds to the continually increasing body of preclinical and clinical data positioning EC biodelivery as a promising therapeutic delivery method for larger biomolecules. It combines the therapeutic advantages of gene therapy with the well-established safety of a retrievable implant,” says lead investigator Jens Tornøe, NsGene A/S, Ballerup, Denmark.

Investigators made a catheter-like device consisting of a hollow fiber membrane encapsulating a polymeric “scaffold,” which provides a surface area to which neurotrophic factor-producing cells can attach. When implanted in the brain, the membrane allows the neurotrophic factor to flow out of the device, as well as allowing nutrients in. Dr. Tornøe and his colleagues used the neurotrophic factor Meteorin, which plays a role in the development of striatal projection neurons, whose degeneration is a hallmark of Huntington’s disease. The scientists engineered ARPE-19 cells to produce Meteorin and used those that produced high levels of Meteorin in their experiment.

The EC biodelivery devices were implanted in the brains of rats, followed by injection with quinolinic acid (QA), a potent neurotoxin that causes excitotoxicity, a component of Huntington’s disease. They tested three different implant types: devices filled with the high-producing ARPE-19 cells (EC-Meteorin), devices with unmodified ARPE-19 cells (ARPE-19), and devices without cells. Motor dysfunction was tested immediately prior to injection with QA and at two and four weeks after injection.

The research team found that the EC-Meteorin devices significantly protected against QA-induced toxicity. Rats with EC-Meteorin devices manifested near normal neurological performance and significantly reduced loss of brain cells from the QA injection compared to controls. Analysis of the Meteorin-treated brains showed a markedly reduced striatal lesion size. The EC biodelivery devices were found to produce stable or even increasing levels of Meteorin throughout the study. Meteorin diffused readily from the biodelivery device to the striatal tissue.

"Huntington’s disease can be diagnosed with high accuracy by genetic testing. Pre-symptomatic administration of a safe therapeutic treatment providing sustained delay or prevention of disease would be of great benefit to patients," says Dr. Tornøe. "With additional functional and safety data, tests in animals larger than the rat to study distribution, and more accurate disease models to evaluate the therapeutic potential of Meteorin, we anticipate that EC biodelivery can be developed as a platform technology for targeted therapy in patients with Huntington’s disease."

Provided by IOS Press

Source: medicalxpress.com

Filed under science neuroscience brain psychology huntington
