Posts tagged psychology

June 19, 2012
Why do some people excel in sports, music and managing companies? New research points to uniquely high mind-brain development in those who excel.

“What we have found is an astonishing integration of brain functioning in high performers compared to average-performing controls,” said Fred Travis, Ph.D., director of the Center for Brain, Consciousness, and Cognition at Maharishi University of Management in Fairfield, Iowa.
He claims this research is the “first in the world to show that there is a brain measure of effective leadership.”
In the study, published in the journal Cognitive Processing, researchers found that 20 top-level managers scored higher on three measures — the Brain Integration Scale, Gibbs’s Socio-moral Reasoning questionnaire, and an inventory of peak experiences — compared to 20 low-level managers who served as controls.
“The current understanding of high performance is fragmented,” said co-researcher Harald Harung, Ph.D., of the Oslo and Akershus University College of Applied Sciences in Norway.
“What we have done in our research is to use quantitative and neurophysiological research methods on topics that so far have been dominated by psychology.”
The researchers carried out four studies comparing world-class performers to average performers. This recent study and two others examined top performers in management, sports and classical music. A number of years ago Harung and his colleagues published a study on a variety of professions, such as public administration, management, sports, arts, and education.
The studies used electroencephalography (EEG) to examine the extent of integration and development of several brain processes.
ScienceDaily (June 19, 2012) — Human brains process large and small numbers of objects using two different mechanisms, but infants have not yet developed the ability to make those two processes work together, according to new research from the University of Missouri.
"This research was the first to show the inability of infants in a single age group to discriminate large and small sets in a single task," said Kristy vanMarle, assistant professor of psychological sciences in the College of Arts and Science. "Understanding how infants develop the ability to represent and compare numbers could be used to improve early education programs."
The MU study found that infants consistently chose the larger of two groups of food items when both sets were larger or smaller than four, just as an adult would. Unlike adults, the infants showed no preference for the larger group when choosing between one large and one small set. The results suggest that at age one, infants have not yet integrated two mental functions: the ability to estimate the number of items in a set at a glance, and the ability to visually track small sets of objects.
In vanMarle’s study, 10- to 12-month-old infants were presented with two opaque cups. Different numbers of pieces of breakfast cereal were hidden in each cup, while the infants observed, and then the infants were allowed to choose a cup. Four comparisons were tested between different combinations of large and small sets. Infants consistently chose two food items over one and eight items over four, but chose randomly when asked to compare two versus four and two versus eight.
"Being unable to determine that eight is larger than two would put an organism at a serious disadvantage," vanMarle said. "However, ongoing studies in my lab suggest that the capacity to compare small and large sets seems to develop before age two."
The ability to make judgments about the relative number of objects in a group has old evolutionary roots. Dozens of species, including some fish, monkeys and birds have shown the ability to recognize numerical differences in laboratory studies. VanMarle speculated that being unable to compare large and small sets early in infancy may not have been problematic during human evolution because young children probably received most of their food and protection from caregivers. Infants’ survival didn’t depend on determining which bush had the most berries or how many predators they just saw, she said.
"In the modern world there are educational programs that claim to give children an advantage by teaching them arithmetic at an early age," said vanMarle. "This research suggests that such programs may be ineffective simply because infants are unable to compare some numbers with others."
Source: Science Daily
June 19, 2012
Pathological rage can be blocked in mice, researchers have found, suggesting potential new treatments for severe aggression, a widespread trait characterized by sudden violence, explosive outbursts and hostile overreactions to stress.
In a study appearing today in the Journal of Neuroscience, researchers from the University of Southern California and Italy identify a critical neurological factor in aggression: a brain receptor that malfunctions in overly hostile mice. When the researchers shut down the brain receptor, which also exists in humans, the excess aggression completely disappeared.
The findings are a significant breakthrough in developing drug targets for pathological aggression, a component in many common psychological disorders including Alzheimer’s disease, autism, bipolar disorder and schizophrenia.
"From a clinical and social point of view, reactive aggression is absolutely a major problem," said Marco Bortolato, lead author of the study and research assistant professor of pharmacology and pharmaceutical sciences at the USC School of Pharmacy. “We want to find the tools that might reduce impulsive violence.”
A large body of independent research, including past work by Bortolato and senior author Jean Shih, USC University Professor and Boyd & Elsie Welin Professor in Pharmacology and Pharmaceutical Sciences at USC, has identified a specific genetic predisposition to pathological aggression: low levels of the enzyme monoamine oxidase A (MAO A). Both male humans and mice with a congenital deficiency of the enzyme respond violently to stress.
"The same type of mutation that we study in mice is associated with criminal, very violent behavior in humans. But we really didn’t understand why that is," Bortolato said.
Bortolato and Shih worked backwards to replicate elements of human pathological aggression in mice, including not just low enzyme levels but also the interaction of genetics with early stressful events such as trauma and neglect during childhood.
"Low levels of MAO A are one basis of the predisposition to aggression in humans. The other is an encounter with maltreatment, and the combination of the two factors appears to be deadly: it results consistently in violence in adults," Bortolato said.
The researchers show that in excessively aggressive rodents that lack MAO A, high levels of electrical stimulus are required to activate a specific brain receptor in the pre-frontal cortex. Even when this brain receptor does work, it stays active only for a short period of time.
"The fact that blocking this receptor moderates aggression is why this discovery has so much potential. It may have important applications in therapy," Bortolato said. "Whatever the ways environment can persistently affect behavior — and even personality over the long term — behavior is ultimately supported by biological mechanisms."
Importantly, the aggression receptor, known as NMDA, is also thought to play a key role in helping us make sense of multiple, coinciding streams of sensory information, according to Bortolato.
The researchers are now studying the potential side effects of drugs that reduce the activity of this receptor.
"Aggressive behaviors have a profound socio-economic impact, yet current strategies to reduce these staggering behaviors are extremely unsatisfactory," Bortolato said. "Our challenge now is to understand what pharmacological tools and what therapeutic regimens should be administered to stabilize the deficits of this receptor. If we can manage that, this could truly be an important finding."
Provided by University of Southern California
Source: medicalxpress.com
June 19, 2012
Researchers at the University of Iowa, together with colleagues from the California Institute of Technology and New York University, have discovered how a part of the brain helps predict future events from past experiences. The work sheds light on the function of the front-most part of the frontal lobe, known as the frontopolar cortex, an area of the cortex uniquely well developed in humans in comparison with apes and other primates.

The image shows the overlap of lesions for eight subjects superimposed on a template brain — red indicates maximum overlap (seven subjects) and dark blue is minimum overlap (one subject). The patient group was selected for lesions that include frontopolar cortex, but the lesions almost invariably extended outside to other parts of anterior prefrontal cortex. Credit: Christopher Kovach, University of Iowa
Making the best possible decisions in a changing and unpredictable environment is an enormous challenge. Not only does it require learning from past experience, but it also demands anticipating what might happen under previously unencountered circumstances. Past research from the UI Department of Neurology was among the first to show that damage to certain parts of the frontal lobe can cause severe deficits in decision making in rapidly changing environments. The new study from the same department on a rare group of patients with damage to the very frontal part of their brains reveals a critical aspect of how this area contributes to decision making. The findings were published June 19 in the Journal of Neuroscience.
"We gave the patients four slot machines from which to pick in order to win money. Unbeknownst to the patients, the probability of getting money from a particular slot machine gradually and unpredictably changed during the experiment. Finding the strategy that pays the most in the long run is a surprisingly difficult problem to solve, and one we hypothesized would require the frontopolar cortex,” explains Christopher Kovach, Ph.D., a UI post-doctoral fellow in neurosurgery and first author of the study.
Contrary to the authors’ initial expectation, the patients actually did quite well on the task, winning as much money, on average, as healthy control participants.
"But when we compared their behavior to that of subjects with intact frontal lobe, we found they used a different set of assumptions about how the payoffs changed over time,” Kovach says. “Both groups based their decisions on how much they had recently won from each slot machine, but healthy comparison subjects pursued a more elaborate strategy, which involved predicting the direction that payoffs were moving based on recent trends. This points towards a specific role for the frontopolar cortex in extrapolating recent trends.”
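The contrast between the two strategies can be sketched as a toy simulation of a "restless" four-armed bandit. This is only an illustrative sketch: the update rules, learning rate, and drift parameters below are assumptions invented for the example, not those of the actual experiment.

```python
import random

def run(trend_weight, n_trials=1000, seed=1):
    """Play a four-armed bandit whose payoff probabilities drift randomly.

    Each arm's value estimate is a recency-weighted average of its rewards.
    With trend_weight == 0 the agent simply favors recently rewarding arms
    (loosely, the lesion group's strategy); with trend_weight > 0 it also
    extrapolates the recent direction of change (loosely, the healthy
    comparison group's trend-chasing strategy). Returns total reward won.
    """
    rng = random.Random(seed)
    probs = [rng.uniform(0.2, 0.8) for _ in range(4)]  # hidden payoff rates
    est = [0.5] * 4    # current value estimates
    prev = [0.5] * 4   # previous estimates, used for the trend term
    total = 0
    for _ in range(n_trials):
        # score each arm by its estimate plus an optional trend extrapolation
        scores = [e + trend_weight * (e - p) for e, p in zip(est, prev)]
        arm = scores.index(max(scores))
        reward = 1 if rng.random() < probs[arm] else 0
        total += reward
        prev[arm] = est[arm]
        est[arm] += 0.3 * (reward - est[arm])  # recency-weighted update
        # payoff probabilities take an unpredictable random-walk step
        probs = [min(0.95, max(0.05, p + rng.uniform(-0.05, 0.05)))
                 for p in probs]
    return total
```

Because the drift in this sketch is pure noise, the trend term has no real signal to exploit, which foreshadows the study's later twist about trend-chasing in random payoffs.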
Kovach’s colleague and study author Ralph Adolphs, Ph.D., professor of neuroscience and psychology at the California Institute of Technology, adds that the study results “argue that the frontopolar cortex helps us to make short-term predictions about what will happen next, a strategy particularly useful in environments that change rapidly — such as the stock market or most social settings.”
Adolphs also holds an adjunct appointment in the UI Department of Neurology.
The study’s innovative approach to understanding the function of this part of the brain uses model-based analyses of behavior of patients with specific and precisely characterized areas of brain damage. These patients are members of the UI’s world-renowned Iowa Neurological Patient Registry, which was established in 1982 and has more than 500 active members with selective forms of damage, or lesions, to one or two defined regions in the brain.
"The University of Iowa is one of the few places in the world where you could carry out this kind of study, since it requires carefully assessed patients with damage to specific parts of their brain," says study author Daniel Tranel, Ph.D., UI professor of neurology and psychology and director of the UI Division of Behavioral Neurology and Cognitive Neuroscience.
In a final twist to the finding, the strategy taken by lesion patients was actually slightly better than the one used by comparison subjects: the task was designed so that the trends in the payoffs were, in fact, random and uninformative.
"The healthy comparison subjects seemed to perceive trends in what was just random noise," Kovach says.
This implies that the functions of the frontopolar cortex, which support more complex and detailed models of the environment, at times come with a downside: setting up mistaken assumptions.
"To the best of my knowledge this is the first study which links a normal tendency to see a nonexistent pattern in random noise, a type of cognitive bias, to a particular brain region," Kovach notes.
The researchers next want to investigate other parts of the frontal cortex, and have also begun to record activity directly from the brains of neurosurgical patients to see how single cells respond during decision making. The work is also important for understanding the difficulties in decision making seen in disorders such as addiction.
Provided by University of Iowa
Source: medicalxpress.com
June 19, 2012
Four generations of a single family have been found to possess an abnormality within a specific brain region which appears to affect their ability to recall verbal material, a new study by researchers at the University of Bristol and University College London has found.
This is the first suggestion of such a heritable abnormality in otherwise healthy humans, and it has important implications for our understanding of the genetic basis of cognition.
Dr Josie Briscoe of Bristol’s School of Experimental Psychology and colleagues at the Institute of Child Health in London studied eight members of a single family, spanning a wide range of ages, who despite all having high levels of intelligence have, since childhood, experienced profound difficulties in recalling sentences and prose, along with language difficulties in listening comprehension and in naming less common objects.
While their conversation is articulate and engaging, they can experience the inability to ‘find’ a particular word or topic – a phenomenon similar to the ‘tip-of-the-tongue’ problem experienced by many people. They also report associated problems such as struggling to follow a narrative thread while reading or watching television drama.
Dr Briscoe said: “With their consent, we conducted a number of standard memory and language tests on the affected members of the family. These showed they had difficulty repeating longer sentences correctly and learning words in lists and pairs. This suggests their difficulties lie in semantic cognition: the way people construct and generate meaning from words, objects and ideas.”
"Given the very wide variation in age, the coherence of their difficulties in semantic cognition was remarkable."
The researchers also used Magnetic Resonance Imaging (MRI) to study the brains of the affected family members and found they had reduced grey matter in the posterior inferior portion of the temporal lobe, a brain area known to be involved in semantic cognition.
Dr Briscoe said: “These brain abnormalities were surprising to find in healthy people, particularly in the same family, although similar brain regions have been implicated in research with older adults with neurological problems that are linked to semantic cognition.”
"Our findings have uncovered a potential causal link between anomalous neuroanatomy and semantic cognition in a single family. Importantly, the pattern of inheritance appears as a potentially dominant trait. This may well prove to be the first example of a heritable, highly specific abnormality affecting semantic cognition in humans.”
Provided by University of Bristol
Source: medicalxpress.com
ScienceDaily (June 18, 2012) — The legal system needs to take greater account of new discoveries in neuroscience showing how a difficult childhood can affect the development of a young person’s brain and increase the risk of adolescent crime, according to researchers.
The research will be presented as part of an Economic and Social Research Council seminar series in conjunction with the Parliamentary Office of Science and Technology.
Neuroscientists have recently shown that early adversity — such as a very chaotic and frightening home life — can result in a young child becoming hypervigilant to potential threats in their environment. This appears to influence the development of brain connectivity and functions.
Such children may come to adolescence with brain systems that are set differently, and this may increase their likelihood of taking impulsive risks. For many young offenders such early adversity is a common experience, and it may increase both their vulnerability to mental health problems and also their risk of problem behaviours.
These insights, from a team led by Dr Eamon McCrory, University College London, are part of a wave of neuroscientific research questions that have potential implications for the legal system.
Other research by Dr Seena Fazel of Oxford University has shown that while social disadvantage is a major risk factor for offending, a Traumatic Brain Injury (TBI) — from an accident or assault — significantly increases the risk of involvement in violent crime. Professor Huw Williams, at the University of Exeter, has similarly shown that around 45 per cent of young offenders have TBI histories, and that more injuries are associated with greater violence.
Professor Williams said: “The latest message from neuroscience is that young people who suffer troubled childhoods may experience a kind of ‘triple whammy’. A difficult social background may put them at greater risk of offending and influence their brain development early on in childhood in a way that increases risky behaviour. This can then increase their chances of experiencing an injury to their brains that would compromise their ability to stay in school or contribute to society still further.”
Professor Williams wants to see better communication between neuroscientists, clinicians and lawyers so that research findings like these lead to changes in the legal system. “There is a big gap between research conducted by neuroscientists and the realities of the day to day work of the justice system,” he said. “Although criminal behaviour results from a complex interplay of a host of factors, neuroscientists and clinicians are identifying key risk factors that — if addressed — could reduce crime. Investment in earlier, focussed interventions may offset the costs of years of custody and social violence.”
Dr Eileen Vizard, a prominent adolescent forensic psychiatrist, will talk at the event Neuroscience, Children and the Law about how the criminal justice system needs to change to provide age-appropriate sentencing for children as young as ten years old, while also providing for the welfare needs of these deprived children. Laura Hoyano — a leading expert on vulnerable people in criminal courts — will discuss the problems children face when testifying in criminal courts.
Source: Science Daily
ScienceDaily (June 18, 2012) — UC Santa Barbara scientists turned to the simple sponge to find clues about the evolution of the complex nervous system and found that, but for a mechanism that coordinates the expression of genes that lead to the formation of neural synapses, sponges and the rest of the animal world may not be so distant after all. Their findings, titled “Functionalization of a protosynaptic gene expression network,” are published in the Proceedings of the National Academy of Sciences.

The genes of Amphimedon queenslandica, a marine sponge native to the Great Barrier Reef, Australia, have been fully sequenced, allowing the researchers to monitor gene expression for signs of neural development. (Credit: UCSB)
"If you’re interested in finding the truly ancient origins of the nervous system itself, we know where to look," said Kenneth Kosik, Harriman Professor of Neuroscience Research in the Department of Molecular, Cellular & Developmental Biology, and co-director of UCSB’s Neuroscience Research Institute.
That place, said Kosik, is the evolutionary period when virtually the rest of the animal kingdom branched off from a common ancestor it shared with sponges, the oldest known animal group with living representatives. Something must have happened to spur the evolution of the nervous system, a characteristic shared by creatures ranging from simple jellyfish and hydra to complex humans, according to Kosik.
A previous sequencing of the genome of Amphimedon queenslandica — a sponge that lives in Australia’s Great Barrier Reef — showed that it contains the same genes that lead to the formation of synapses, the highly specialized characteristic component of the nervous system that sends chemical and electrical signals between cells. Synapses are like microprocessors, said Kosik, explaining that they carry out many sophisticated functions: They send and receive signals, and they also change behaviors with interaction — a property called “plasticity.”
"Specifically, we were hoping to understand why the marine sponge, despite having almost all the genes necessary to build a neuronal synapse, does not have any neurons at all," said the paper’s first author, UCSB postdoctoral researcher Cecilia Conaco, from the UCSB Department of Molecular, Cellular, and Developmental Biology (MCDB) and Neuroscience Research Institute (NRI). "In the bigger scheme of things, we were hoping to gain an understanding of the various factors that contribute to the evolution of these complex cellular machines."
This time the scientists, including Danielle Bassett, from the Department of Physics and the Sage Center for the Study of the Mind, and Hongjun Zhou and Mary Luz Arcila, from NRI and MCDB, examined the sponge’s RNA (ribonucleic acid), a macromolecule that controls gene expression. They followed the activity of the genes that encode for the proteins in a synapse throughout the different stages of the sponge’s development.
"We found a lot of them turning on and off, as if they were doing something," said Kosik. However, compared to the same genes in other animals, which are expressed in unison, suggesting a coordinated effort to make a synapse, the ones in sponges were not coordinated.
"It was as if the synapse gene network was not wired together yet," said Kosik. The critical step in the evolution of the nervous system as we know it, he said, was not the invention of a gene that created the synapse, but the regulation of preexisting genes that were somehow coordinated to express simultaneously, a mechanism that took hold in the rest of the animal kingdom.
The work isn’t over, said Kosik. Plans for future research include a deeper look at some of the steps that lead to the formation of the synapse; and a study of the changes in nervous systems after they began to evolve.
"Is the human brain just a lot more of the same stuff, or has it changed in a qualitative way?" he asked.
Source: Science Daily
June 18, 2012
A new study proposes a communication routing strategy for the brain that mimics the American highway system, with the bulk of the traffic leaving the local and feeder neural pathways to spend as much time as possible on the longer, higher-capacity passages through an influential network of hubs, the so-called rich club.

The study, published this week online in the Early Edition of the Proceedings of the National Academy of Sciences, involves researchers from Indiana University and the University Medical Center Utrecht in the Netherlands and advances their earlier findings that showed how select hubs in the brain not only are powerful in their own right but have numerous and strong connections between each other.
The current study characterizes the influential network within the rich club as the “backbone” for global brain communication. The network is costly in terms of the energy and space it consumes, said Olaf Sporns, professor in the Department of Psychological and Brain Sciences at IU Bloomington, but it comes with a big pay-off: quick and effective communication between billions and billions of brain cells.
"Until now, no one knew how central the brain’s rich club really was," Sporns said. "It turns out the rich club is always right in the middle when it comes to how brain regions talk to each other. It absorbs, transforms and disseminates information. This underscores its importance for brain communication.”
In earlier work, using diffusion imaging, the researchers found a group of 12 strongly interconnected bihemispheric hub regions, comprising the precuneus, superior frontal and superior parietal cortex, as well as the subcortical hippocampus, putamen and thalamus. Together, these regions form the brain’s “rich club.” Most of these areas are engaged in a wide range of complex behavioral and cognitive tasks, rather than more specialized processing such as vision and motor control.
For the current study, Martijn van den Heuvel, a professor at the Rudolf Magnus Institute of Neuroscience at University Medical Center Utrecht, used diffusion tensor imaging data for two sets of 40 healthy subjects to map the large-scale connectivity structure of the brain. The cortical sheet was divided into 1,170 regions, and then pathways between the regions were reconstructed and measured. As in the previous study, the rich club nodes were widely distributed and had up to 40 percent more connectivity compared to other areas.
The connections measured — almost 700,000 in total — were classified in one of three ways: as rich club connections if they connected nodes within the rich club; as feeder connections if they connected a non-rich club node to a rich club node; and as local connections if they connected non-rich club nodes. Rich club connections made up the majority of all long-distance neural pathways. The study also found that connections classified as rich club connections were used more heavily for communication than other feeder and local connections. A path analysis showed that when a minimally short path is traced from one area of the brain to another, it travels through the rich club network 69 percent of the time, even though the network accounts for only 10 percent of the brain.
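The three-way edge classification and the shortest-path analysis above can be illustrated on a tiny invented graph. The node labels, edges, and "rich club" membership here are made up for the example; the actual study worked with 1,170 regions and roughly 700,000 measured connections.

```python
from collections import deque

def classify(edge, rich):
    """Label an edge rich-club, feeder, or local, per the study's scheme."""
    a, b = edge
    if a in rich and b in rich:
        return "rich-club"
    if a in rich or b in rich:
        return "feeder"
    return "local"

def shortest_path(adj, src, dst):
    """Breadth-first search; returns one shortest path as a node list."""
    prev = {src: None}
    q = deque([src])
    while q:
        u = q.popleft()
        if u == dst:
            path = []
            while u is not None:
                path.append(u)
                u = prev[u]
            return path[::-1]
        for v in adj[u]:
            if v not in prev:
                prev[v] = u
                q.append(v)
    return None

# toy network: nodes 0 and 1 form a "rich club" bridging two local clusters
rich = {0, 1}
edges = [(0, 1), (0, 2), (0, 3), (1, 4), (1, 5), (2, 3), (4, 5)]
adj = {n: [] for n in range(6)}
for a, b in edges:
    adj[a].append(b)
    adj[b].append(a)

labels = [classify(e, rich) for e in edges]
path = shortest_path(adj, 2, 4)          # cross-cluster route: [2, 0, 1, 4]
touches_rich = any(n in rich for n in path)
```

In this toy network every shortest path between the two clusters is forced through the rich-club nodes, a miniature version of the 69 percent figure reported in the study.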
A common pattern in communication paths spanning long distances, Sporns said, was that such paths involved sequences of steps leading across local, feeder, rich club, feeder and back to local connections. In other words, he said, many communication paths first traveled toward the rich club before reaching their destinations.
"It is as if the rich club acts as an attractor for signal traffic in the brain," Sporns said. "It soaks up information which is then integrated and sent back out to the rest of the brain."
Van den Heuvel agreed.
"It’s like a big ‘neuronal magnet’ for communication and information integration in our brains," he said. "Seeking out the rich club may offer a strategy for neurons and brain regions to find short communication paths across the brain, and might provide insight into how our brain manages to be so highly efficient."
From an evolutionary standpoint, it was important for the brain to minimize energy consumption and wiring volume, but if these were the only factors, there would be no rich club because of the extra resources it requires, Sporns said. The rich club is expensive, at least in terms of wiring volume, and perhaps also in terms of metabolic cost. The trade-off for higher cost, Sporns said, is higher performance — the integration of diverse signals and the ability to select short paths across the network.
“Brain neurons don’t have maps; how do they find paths to get in touch? Perhaps the rich club helps with this, offering the brain’s neurons and regions a way to communicate efficiently based on a routing strategy that involves the rich club.”
People use related strategies to navigate social networks.
"Strangely, neurons may solve their communication problems just like the people to which they belong," Sporns said.
Provided by Indiana University
Source: medicalxpress.com
June 18, 2012
A new study shows that the compound Coenzyme Q10 (CoQ) reduces oxidative damage, a key finding that hints at its potential to slow the progression of Huntington’s disease. The discovery, which appears in the inaugural issue of the Journal of Huntington’s Disease, also points to a new biomarker that could be used to screen experimental treatments for this and other neurological disorders.
"This study supports the hypothesis that CoQ exerts antioxidant effects in patients with Huntington’s disease and therefore is a treatment that warrants further study," says University of Rochester Medical Center neurologist Kevin M. Biglan, M.D., M.P.H., lead author of the study. “As importantly, it has provided us with a new method to evaluate the efficacy of potential new treatments.”
Huntington’s disease (HD) is a genetic, progressive neurodegenerative disorder that impacts movement, behavior, and cognition, and generally results in death within 20 years of the disease’s onset. While the precise causes and mechanism of the disease are not completely understood, scientists believe that one of its important triggers is a genetic “stutter” which produces abnormal protein deposits in brain cells. It is believed that these deposits – through a chain of molecular events – inhibit the cell’s ability to meet its energy demands, resulting in oxidative stress and, ultimately, cellular death.
Scientists had previously identified a correlation between a marker of DNA damage, called 8-hydroxy-2’-deoxyguanosine (8OHdG), and the presence of oxidative stress in brain cells. 8OHdG can be detected in a person’s blood, meaning that it could serve as a convenient and accessible biomarker for the disease. Researchers have also been evaluating the compound Coenzyme Q10 as a possible treatment for HD because of its ability to support the function of mitochondria – the tiny power plants that provide cells with energy – and counter oxidative stress.
The study’s authors evaluated a series of blood samples from 20 individuals with HD who had previously undergone treatment with CoQ in a clinical trial titled Pre-2Care. While these studies showed that CoQ alleviated some symptoms of the disease, it was not known what impact – if any – the treatment had at the molecular level in the brain. Upon analysis, the authors found that 8OHdG levels dropped by 20 percent in individuals who had been treated with CoQ.
CoQ is currently being evaluated in a Phase 3 clinical trial, the largest therapeutic clinical study to date for HD. The trial – called 2Care – is being run by the Huntington Study Group, an international network of investigators.
"Identifying treatments that slow the progression or delay the onset of Huntington’s disease is a major focus of the medical community," said Biglan. "This study demonstrates that 8OHdG could be an ideal marker to identify the presence of oxidative injury and whether or not treatment is having an impact."
Provided by University of Rochester Medical Center
Source: medicalxpress.com
June 18, 2012
Studies suggest that neurotrophic factors, which play a role in the development and survival of neurons, have significant therapeutic and restorative potential for neurologic diseases such as Huntington’s disease. However, clinical applications are limited because these proteins cannot easily cross the blood brain barrier, have a short half-life, and cause serious side effects. Now, a group of scientists has successfully treated neurological symptoms in laboratory rats by implanting a device to deliver a genetically engineered neurotrophic factor directly to the brain. They report on their results in the latest issue of Restorative Neurology and Neuroscience.

The tip of the EC biodelivery system, a straw-like device that is implanted in the brain of patients, contains living cells which are genetically modified to produce a therapeutic factor. The membrane enclosing the cells allows the factor to flow out of the device and into the patient’s brain tissue. This way, areas deep within the brain affected by Huntington’s disease can be treated to delay or prevent the disease. Credit: Jens Tornøe, NsGene A/S, Ballerup, Denmark
Researchers used Encapsulated Cell (EC) biodelivery, a platform which can be applied using conventional minimally invasive neurosurgical procedures to target deep brain structures with therapeutic proteins. “Our study adds to the continually increasing body of preclinical and clinical data positioning EC biodelivery as a promising therapeutic delivery method for larger biomolecules. It combines the therapeutic advantages of gene therapy with the well-established safety of a retrievable implant,” says lead investigator Jens Tornøe, NsGene A/S, Ballerup, Denmark.
Investigators made a catheter-like device consisting of a hollow fiber membrane encapsulating a polymeric “scaffold,” which provides a surface area to which neurotrophic factor-producing cells can attach. When implanted in the brain, the membrane allows the neurotrophic factor to flow out of the device, as well as allowing nutrients in. Dr. Tornøe and his colleagues used the neurotrophic factor Meteorin, which plays a role in the development of striatal projection neurons, whose degeneration is a hallmark of Huntington’s disease. The scientists engineered ARPE-19 cells to produce Meteorin and used those that produced high levels of Meteorin in their experiment.
The EC biodelivery devices were implanted in the brains of rats followed by injection with quinolinic acid (QA), a potent neurotoxin that causes excitotoxicity, a component of Huntington’s disease. They tested three different implant types: devices filled with the high-producing ARPE-19 cells (EC-Meteorin), devices with unmodified ARPE-19 cells (ARPE-19), and devices without cells. Motor dysfunction was tested immediately prior to injection with QA and at two and four weeks after injection.
The research team found that the EC-Meteorin devices significantly protected against QA-induced toxicity. Rats with EC-Meteorin devices manifested near normal neurological performance and significantly reduced loss of brain cells from the QA injection compared to controls. Analysis of the Meteorin-treated brains showed a markedly reduced striatal lesion size. The EC biodelivery devices were found to produce stable or even increasing levels of Meteorin throughout the study. Meteorin diffused readily from the biodelivery device to the striatal tissue.
"Huntington’s disease can be diagnosed with high accuracy by genetic testing. Pre-symptomatic administration of a safe therapeutic treatment providing sustained delay or prevention of disease would be of great benefit to patients," says Dr. Tornøe. "With additional functional and safety data, tests in animals larger than the rat to study distribution, and more accurate disease models to evaluate the therapeutic potential of Meteorin, we anticipate that EC biodelivery can be developed as a platform technology for targeted therapy in patients with Huntington’s disease."
Provided by IOS Press
Source: medicalxpress.com