Posts tagged neuroscience

ScienceDaily (June 18, 2012) — The legal system needs to take greater account of new discoveries in neuroscience showing how a difficult childhood can affect the development of a young person’s brain and increase the risk of adolescent crime, according to researchers.
The research will be presented as part of an Economic and Social Research Council seminar series in conjunction with the Parliamentary Office of Science and Technology.
Neuroscientists have recently shown that early adversity — such as a very chaotic and frightening home life — can result in a young child becoming hypervigilant to potential threats in their environment. This appears to influence the development of brain connectivity and functions.
Such children may come to adolescence with brain systems that are set differently, and this may increase their likelihood of taking impulsive risks. For many young offenders such early adversity is a common experience, and it may increase both their vulnerability to mental health problems and also their risk of problem behaviours.
These insights, from a team led by Dr Eamon McCrory, University College London, are part of a wave of neuroscientific research questions that have potential implications for the legal system.
Other research by Dr Seena Fazel of Oxford University has shown that while social disadvantage is a major risk factor for offending, a Traumatic Brain Injury (TBI) — from an accident or assault — significantly increases the risk of involvement in violent crime. Professor Huw Williams, at the University of Exeter, has similarly shown that around 45 per cent of young offenders have TBI histories, and that more injuries are associated with greater violence.
Professor Williams said: “The latest message from neuroscience is that young people who suffer troubled childhoods may experience a kind of ‘triple whammy’. A difficult social background may put them at greater risk of offending and influence their brain development early on in childhood in a way that increases risky behaviour. This can then increase their chances of experiencing an injury to their brains that would compromise their ability to stay in school or contribute to society still further.”
Professor Williams wants to see better communication between neuroscientists, clinicians and lawyers so that research findings like these lead to changes in the legal system. “There is a big gap between research conducted by neuroscientists and the realities of the day to day work of the justice system,” he said. “Although criminal behaviour results from a complex interplay of a host of factors, neuroscientists and clinicians are identifying key risk factors that — if addressed — could reduce crime. Investment in earlier, focussed interventions may offset the costs of years of custody and social violence.”
Dr Eileen Vizard, a prominent adolescent forensic psychiatrist, will talk at the event Neuroscience, Children and the Law about how the criminal justice system needs to be changed to provide age-appropriate sentencing for children as young as ten years old, whilst also providing for the welfare needs of these deprived children. Laura Hoyano — a leading expert on vulnerable people in criminal courts — will discuss the problems children face when testifying in criminal courts.
Source: Science Daily
ScienceDaily (June 18, 2012) — UC Santa Barbara scientists turned to the simple sponge to find clues about the evolution of the complex nervous system and found that, but for a mechanism that coordinates the expression of genes that lead to the formation of neural synapses, sponges and the rest of the animal world may not be so distant after all. Their findings, titled “Functionalization of a protosynaptic gene expression network,” are published in the Proceedings of the National Academy of Sciences.

The genes of Amphimedon queenslandica, a marine sponge native to the Great Barrier Reef, Australia, have been fully sequenced, allowing the researchers to monitor gene expression for signs of neural development. (Credit: UCSB)
"If you’re interested in finding the truly ancient origins of the nervous system itself, we know where to look," said Kenneth Kosik, Harriman Professor of Neuroscience Research in the Department of Molecular, Cellular & Developmental Biology, and co-director of UCSB’s Neuroscience Research Institute.
That place, said Kosik, is the evolutionary period when virtually the rest of the animal kingdom branched off from a common ancestor it shared with sponges, the oldest known animal group with living representatives. Something must have happened to spur the evolution of the nervous system, a characteristic shared by creatures ranging from simple jellyfish and hydra to complex humans, according to Kosik.
A previous sequencing of the genome of the Amphimedon queenslandica — a sponge that lives in Australia’s Great Barrier Reef — showed that it contained the same genes that lead to the formation of synapses, the highly specialized characteristic component of the nervous system that sends chemical and electrical signals between cells. Synapses are like microprocessors, said Kosik, explaining that they carry out many sophisticated functions: They send and receive signals, and they also change behaviors with interaction — a property called “plasticity.”
"Specifically, we were hoping to understand why the marine sponge, despite having almost all the genes necessary to build a neuronal synapse, does not have any neurons at all," said the paper’s first author, UCSB postdoctoral researcher Cecilia Conaco, from the UCSB Department of Molecular, Cellular, and Developmental Biology (MCDB) and Neuroscience Research Institute (NRI). "In the bigger scheme of things, we were hoping to gain an understanding of the various factors that contribute to the evolution of these complex cellular machines."
This time the scientists, including Danielle Bassett, from the Department of Physics and the Sage Center for the Study of the Mind, and Hongjun Zhou and Mary Luz Arcila, from NRI and MCDB, examined the sponge’s RNA (ribonucleic acid), a macromolecule that controls gene expression. They followed the activity of the genes that encode for the proteins in a synapse throughout the different stages of the sponge’s development.
"We found a lot of them turning on and off, as if they were doing something," said Kosik. However, compared to the same genes in other animals, which are expressed in unison, suggesting a coordinated effort to make a synapse, the ones in sponges were not coordinated.
"It was as if the synapse gene network was not wired together yet," said Kosik. The critical step in the evolution of the nervous system as we know it, he said, was not the invention of a gene that created the synapse, but the regulation of preexisting genes that were somehow coordinated to express simultaneously, a mechanism that took hold in the rest of the animal kingdom.
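The coordination the researchers looked for can be illustrated with a toy calculation (the numbers below are hypothetical and not the study’s data): genes that are “wired together” show highly correlated expression profiles across developmental stages, while genes that turn on and off independently do not.

```python
# Toy sketch: mean pairwise Pearson correlation of gene expression
# profiles measured across developmental stages (hypothetical values).
from itertools import combinations
from math import sqrt

def pearson(x, y):
    # Pearson correlation coefficient of two equal-length sequences.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def mean_pairwise_correlation(profiles):
    # Average correlation over every pair of genes in the network.
    pairs = list(combinations(profiles.values(), 2))
    return sum(pearson(x, y) for x, y in pairs) / len(pairs)

# Coordinated network: the genes rise and fall together (as in other animals).
coordinated = {
    "geneA": [1, 3, 7, 9],
    "geneB": [2, 4, 8, 10],
    "geneC": [0, 2, 6, 8],
}
# Uncoordinated: each gene is "doing something" on its own schedule,
# as observed in the sponge.
uncoordinated = {
    "geneA": [1, 9, 2, 8],
    "geneB": [7, 1, 9, 2],
    "geneC": [3, 8, 1, 9],
}

print(mean_pairwise_correlation(coordinated))    # close to 1.0
print(mean_pairwise_correlation(uncoordinated))  # much lower
```

A high mean correlation is what a “wired together” network looks like; the sponge’s synapse genes would land near the uncoordinated end of this spectrum.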
The work isn’t over, said Kosik. Plans for future research include a deeper look at some of the steps that lead to the formation of the synapse; and a study of the changes in nervous systems after they began to evolve.
"Is the human brain just a lot more of the same stuff, or has it changed in a qualitative way?" he asked.
Source: Science Daily
June 18, 2012
Among well-functioning older adults without dementia, diabetes mellitus (DM) and poor glucose control among those with DM are associated with worse cognitive function and greater cognitive decline, according to a report published Online First by Archives of Neurology, a JAMA Network publication.
Findings from previous studies have suggested an association between diabetes mellitus and an increased risk of cognitive impairment and dementia, including Alzheimer disease, but this association continues to be debated and less is known regarding incident DM in late life and cognitive function over time, the authors write as background in the study.
Kristine Yaffe, M.D., of the University of California, San Francisco and the San Francisco VA Medical Center, and colleagues evaluated 3,069 patients (mean age, 74.2 years; 42 percent black; 52 percent female) who completed the Modified Mini-Mental State Examination (3MS) and Digit Symbol Substitution Test (DSST) at baseline and selected intervals over 10 years.
At study baseline, 717 patients (23.4 percent) had prevalent DM and 2,352 (76.6 percent) were without DM, 159 of whom developed DM during follow-up. Patients who had prevalent DM at baseline had lower 3MS and DSST test scores than patients without DM, and results from analysis show similar patterns for 9-year decline with participants with prevalent DM showing significant decline on both the 3MS and DSST compared with those without DM.
Also, among participants with prevalent DM at baseline, higher levels of hemoglobin A1c (HbA1c) were associated with lower 3MS and DSST scores. However, after adjusting for age, sex, race and education, scores remained significantly lower for those with mid (7 percent to 8 percent) and high (greater than or equal to 8 percent) HbA1c levels on the 3MS but were no longer significant for the DSST.
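The HbA1c groupings described above can be written as a simple categorization. The mid (7 to 8 percent) and high (8 percent or greater) cut-points are stated in the article; treating everything below 7 percent as "low" is an assumption.

```python
# Minimal sketch of the article's HbA1c groupings.
# Mid (7-8%) and high (>= 8%) cut-points are from the text;
# "low" for values below 7% is an assumption.
def hba1c_group(hba1c_percent):
    if hba1c_percent >= 8.0:
        return "high"
    if hba1c_percent >= 7.0:
        return "mid"
    return "low"

print(hba1c_group(6.4))  # low
print(hba1c_group(7.5))  # mid
print(hba1c_group(8.2))  # high
```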
"This study supports the hypothesis that older adults with DM have reduced cognitive function and that poor glycemic control may contribute to this association,” the authors conclude. “Future studies should determine if early diagnosis and treatment of DM lessen the risk of developing cognitive impairment and if maintaining optimal glucose control helps mitigate the effect of DM on cognition.”
Provided by JAMA and Archives Journals
Source: medicalxpress.com
June 18, 2012
A new study proposes a communication routing strategy for the brain that mimics the American highway system, with the bulk of the traffic leaving the local and feeder neural pathways to spend as much time as possible on the longer, higher-capacity passages through an influential network of hubs, the so-called rich club.

The study, published this week online in the Early Edition of the Proceedings of the National Academy of Sciences, involves researchers from Indiana University and the University Medical Center Utrecht in the Netherlands and advances their earlier findings that showed how select hubs in the brain not only are powerful in their own right but have numerous and strong connections between each other.
The current study characterizes the influential network within the rich club as the “backbone” for global brain communication. It is a costly network in terms of the energy and space consumed, said Olaf Sporns, professor in the Department of Psychological and Brain Sciences at IU Bloomington, but one with a big pay-off: providing quick and effective communication between billions and billions of brain cells.
"Until now, no one knew how central the brain’s rich club really was," Sporns said. "It turns out the rich club is always right in the middle when it comes to how brain regions talk to each other. It absorbs, transforms and disseminates information. This underscores its importance for brain communication.”
In earlier work, using diffusion imaging, the researchers found a group of 12 strongly interconnected bihemispheric hub regions, comprising the precuneus, superior frontal and superior parietal cortex, as well as the subcortical hippocampus, putamen and thalamus. Together, these regions form the brain’s “rich club.” Most of these areas are engaged in a wide range of complex behavioral and cognitive tasks, rather than more specialized processing such as vision and motor control.
For the current study, Martijn van den Heuvel, a professor at the Rudolf Magnus Institute of Neuroscience at University Medical Center Utrecht, used diffusion tensor imaging data for two sets of 40 healthy subjects to map the large-scale connectivity structure of the brain. The cortical sheet was divided into 1,170 regions, and then pathways between the regions were reconstructed and measured. As in the previous study, the rich club nodes were widely distributed and had up to 40 percent more connectivity compared to other areas.
The connections measured — almost 700,000 in total — were classified in one of three ways: as rich club connections if they connected nodes within the rich club; as feeder connections if they connected a non-rich club node to a rich club node; and as local connections if they connected non-rich club nodes. Rich club connections made up the majority of all long-distance neural pathways. The study also found that connections classified as rich club connections were used more heavily for communication than feeder and local connections. A path analysis showed that when a minimally short path is traced from one area of the brain to another, it travels through the rich club network 69 percent of the time, even though the network accounts for only 10 percent of the brain.
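The classification scheme above can be sketched on a toy graph (the eight nodes and edges below are hypothetical, not the study’s 1,170-region connectome): label each edge by how many of its endpoints lie in the rich club, then trace a shortest path between two peripheral nodes and see which kinds of connections it uses.

```python
# Toy sketch of rich-club / feeder / local edge classification and
# shortest-path routing. Graph and node names are hypothetical.
from collections import deque

edges = [
    ("R1", "R2"), ("R2", "R3"), ("R1", "R3"),  # rich-club core
    ("a", "R1"), ("b", "R2"), ("c", "R3"),     # feeder links to periphery
    ("a", "b2"),                               # a purely local link
]
rich_club = {"R1", "R2", "R3"}

def classify(u, v):
    # Count how many endpoints are rich-club members: 0 -> local,
    # 1 -> feeder, 2 -> rich-club connection.
    inside = (u in rich_club) + (v in rich_club)
    return ["local", "feeder", "rich-club"][inside]

# Build an undirected adjacency map.
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

def shortest_path(src, dst):
    # Plain breadth-first search; returns one shortest path as node list.
    prev, queue, seen = {}, deque([src]), {src}
    while queue:
        node = queue.popleft()
        if node == dst:
            path = [node]
            while path[-1] != src:
                path.append(prev[path[-1]])
            return path[::-1]
        for nxt in adj[node]:
            if nxt not in seen:
                seen.add(nxt)
                prev[nxt] = node
                queue.append(nxt)

# A shortest path between periphery nodes "a" and "c" must cross the core:
path = shortest_path("a", "c")
print(path)  # ['a', 'R1', 'R3', 'c']
print([classify(u, v) for u, v in zip(path, path[1:])])
```

The printed sequence — feeder, rich-club, feeder — is exactly the “traveled toward the rich club before reaching their destinations” pattern the next paragraph describes.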
A common pattern in communication paths spanning long distances, Sporns said, was that such paths involved sequences of steps leading across local, feeder, rich club, feeder and back to local connections. In other words, he said, many communication paths first traveled toward the rich club before reaching their destinations.
"It is as if the rich club acts as an attractor for signal traffic in the brain," Sporns said. "It soaks up information which is then integrated and sent back out to the rest of the brain."
Van den Heuvel agreed.
"It’s like a big ‘neuronal magnet’ for communication and information integration in our brains," he said. "Seeking out the rich club may offer a strategy for neurons and brain regions to find short communication paths across the brain, and might provide insight into how our brain manages to be so highly efficient."
From an evolutionary standpoint, it was important for the brain to minimize energy consumption and wiring volume, but if these were the only factors, there would be no rich club because of the extra resources it requires, Sporns said. The rich club is expensive, at least in terms of wiring volume, and perhaps also in terms of metabolic cost. The trade-off for higher cost, Sporns said, is higher performance — the integration of diverse signals and the ability to select short paths across the network.
“Brain neurons don’t have maps; how do they find paths to get in touch? Perhaps the rich club helps with this, offering the brain’s neurons and regions a way to communicate efficiently based on a routing strategy that involves the rich club.”
People use related strategies to navigate social networks.
"Strangely, neurons may solve their communication problems just like the people to whom they belong," Sporns said.
Provided by Indiana University
Source: medicalxpress.com
June 18, 2012
A new study shows that the compound Coenzyme Q10 (CoQ) reduces oxidative damage, a key finding that hints at its potential to slow the progression of Huntington disease. The discovery, which appears in the inaugural issue of the Journal of Huntington’s Disease, also points to a new biomarker that could be used to screen experimental treatments for this and other neurological disorders.
"This study supports the hypothesis that CoQ exerts antioxidant effects in patients with Huntington’s disease and therefore is a treatment that warrants further study," says University of Rochester Medical Center neurologist Kevin M. Biglan, M.D., M.P.H., lead author of the study. “As importantly, it has provided us with a new method to evaluate the efficacy of potential new treatments.”
Huntington’s disease (HD) is a genetic, progressive neurodegenerative disorder that impairs movement, behavior, and cognition, and generally results in death within 20 years of the disease’s onset. While the precise causes and mechanism of the disease are not completely understood, scientists believe that one of its important triggers is a genetic “stutter” which produces abnormal protein deposits in brain cells. It is believed that these deposits – through a chain of molecular events – inhibit the cell’s ability to meet its energy demands, resulting in oxidative stress and, ultimately, cellular death.
Scientists had previously identified a correlation between a modified DNA base, 8-hydroxy-2’-deoxyguanosine (8-OHdG), and the presence of oxidative stress in brain cells. 8-OHdG can be detected in a person’s blood, meaning that it could serve as a convenient and accessible biomarker for the disease. Researchers have also been evaluating the compound Coenzyme Q10 as a possible treatment for HD because of its ability to support the function of mitochondria – the tiny power plants that provide cells with energy – and counter oxidative stress.
The study’s authors evaluated a series of blood samples from 20 individuals with HD who had previously undergone treatment with CoQ in a clinical trial titled Pre-2Care. While these studies showed that CoQ alleviated some symptoms of the disease, it was not known what impact – if any – the treatment had at the molecular level in the brain. Upon analysis, the authors found that 8-OHdG levels dropped by 20 percent in individuals who had been treated with CoQ.
CoQ is currently being evaluated in a Phase 3 clinical trial, which is the largest therapeutic clinical study to date for HD. The trial – called 2Care – is being run by the Huntington Study Group, an international network of investigators.
"Identifying treatments that slow the progression or delay the onset of Huntington’s disease is a major focus of the medical community," said Biglan. "This study demonstrates that 8-OHdG could be an ideal marker to identify the presence of oxidative injury and whether or not a treatment is having an impact."
Provided by University of Rochester Medical Center
Source: medicalxpress.com
June 18, 2012
Studies suggest that neurotrophic factors, which play a role in the development and survival of neurons, have significant therapeutic and restorative potential for neurologic diseases such as Huntington’s disease. However, clinical applications are limited because these proteins cannot easily cross the blood-brain barrier, have a short half-life, and can cause serious side effects. Now, a group of scientists has successfully treated neurological symptoms in laboratory rats by implanting a device to deliver a genetically engineered neurotrophic factor directly to the brain. They report on their results in the latest issue of Restorative Neurology and Neuroscience.

The tip of the EC biodelivery system, a straw-like device that is implanted in the brain of patients, contains living cells which are genetically modified to produce a therapeutic factor. The membrane enclosing the cells allows the factor to flow out of the device and into the patient’s brain tissue. This way, areas deep within the brain affected by Huntington’s disease can be treated to delay or prevent the disease. Credit: Jens Tornøe, NsGene A/S, Ballerup, Denmark
Researchers used Encapsulated Cell (EC) biodelivery, a platform which can be applied using conventional minimally invasive neurosurgical procedures to target deep brain structures with therapeutic proteins. “Our study adds to the continually increasing body of preclinical and clinical data positioning EC biodelivery as a promising therapeutic delivery method for larger biomolecules. It combines the therapeutic advantages of gene therapy with the well-established safety of a retrievable implant,” says lead investigator Jens Tornøe, NsGene A/S, Ballerup, Denmark.
Investigators made a catheter-like device consisting of a hollow fiber membrane encapsulating a polymeric “scaffold,” which provides a surface area to which neurotrophic factor-producing cells can attach. When implanted in the brain, the membrane allows the neurotrophic factor to flow out of the device, as well as allowing nutrients in. Dr. Tornøe and his colleagues used the neurotrophic factor Meteorin, which plays a role in the development of striatal projection neurons, whose degeneration is a hallmark of Huntington’s disease. The scientists engineered ARPE-19 cells to produce Meteorin and used those that produced high levels of Meteorin in their experiment.
The EC biodelivery devices were implanted in the brains of rats followed by injection with quinolinic acid (QA), a potent neurotoxin that causes excitotoxicity, a component of Huntington’s disease. They tested three different implant types: devices filled with the high-producing ARPE-19 cells (EC-Meteorin), devices with unmodified ARPE-19 cells (ARPE-19), and devices without cells. Motor dysfunction was tested immediately prior to injection with QA and at two and four weeks after injection.
The research team found that the EC-Meteorin devices significantly protected against QA-induced toxicity. Rats with EC-Meteorin devices manifested near normal neurological performance and significantly reduced loss of brain cells from the QA injection compared to controls. Analysis of the Meteorin-treated brains showed a markedly reduced striatal lesion size. The EC biodelivery devices were found to produce stable or even increasing levels of Meteorin throughout the study. Meteorin diffused readily from the biodelivery device to the striatal tissue.
"Huntington’s disease can be diagnosed with high accuracy by genetic testing. Pre-symptomatic administration of a safe therapeutic treatment providing sustained delay or prevention of disease would be of great benefit to patients," says Dr. Tornøe. "With additional functional and safety data, tests in animals larger than the rat to study distribution, and more accurate disease models to evaluate the therapeutic potential of Meteorin, we anticipate that EC biodelivery can be developed as a platform technology for targeted therapy in patients with Huntington’s disease."
Provided by IOS Press
Source: medicalxpress.com
June 18, 2012
New pictures from the University of Iowa show what it looks like when a person runs out of patience and loses self-control.

This image shows brain activity when people exert self-control. Credit: University of Iowa
A study by University of Iowa neuroscientist and neuro-marketing expert William Hedgcock confirms previous studies that show self-control is a finite commodity that is depleted by use. Once the pool has dried up, we’re less likely to keep our cool the next time we’re faced with a situation that requires self-control.
But Hedgcock’s study is the first to actually show it happening in the brain using fMRI images that scan people as they perform self-control tasks. The images show the anterior cingulate cortex (ACC)—the part of the brain that recognizes a situation in which self-control is needed and says, “Heads up, there are multiple responses to this situation and some might not be good”—fires with equal intensity throughout the task.
However, the dorsolateral prefrontal cortex (DLPFC)—the part of the brain that manages self-control and says, “I really want to do the dumb thing, but I should overcome that impulse and do the smart thing”—fires with less intensity after prior exertion of self-control.

This image shows brain activity after people have been engaged in self-control tasks long enough that self-control resources have been depleted. Credit: University of Iowa
He said that loss of activity in the DLPFC might be the person’s self-control draining away. The stable activity in the ACC suggests people have no problem recognizing a temptation. Although they keep fighting, they have a harder and harder time not giving in.
That would explain why someone who works very hard not to take seconds of lasagna at dinner winds up taking two pieces of cake at dessert. The study could also modify previous thinking that considered self-control to be like a muscle. Hedgcock says his images seem to suggest that it is more like a pool that can be drained by use and then replenished through time in a lower-conflict environment, away from temptations that require its use.
The researchers gathered their images by placing subjects in an MRI scanner and then had them perform two self-control tasks—the first involved ignoring words that flashed on a computer screen, while the second involved choosing preferred options. The study found the subjects had a harder time exerting self-control on the second task, a phenomenon called “regulatory depletion.” Hedgcock says that the subjects’ DLPFCs were less active during the second self-control task, suggesting it was harder for the subjects to overcome their initial response.
Hedgcock says the study is an important step in trying to determine a clearer definition of self-control and to figure out why people do things they know aren’t good for them. One possible implication is crafting better programs to help people who are trying to break addictions to things like food, shopping, drugs, or alcohol. Some therapies now help people break addictions by focusing on the conflict-recognition stage and encouraging the person to avoid situations where that conflict arises. For instance, an alcoholic should stay away from places where alcohol is served.
But Hedgcock says his study suggests new therapies might be designed by focusing on the implementation stage instead. For instance, he says dieters sometimes offer to pay a friend if they fail to implement control by eating too much food, or the wrong kind of food. That penalty adds a real consequence to their failure to implement control and increases their odds of choosing a healthier alternative.
The study might also help people who suffer from a loss of self-control due to birth defect or brain injury.
"If we know why people are losing self-control, it helps us design better interventions to help them maintain control," says Hedgcock, an assistant professor in the Tippie College of Business marketing department and the UI Graduate College’s Interdisciplinary Graduate Program in Neuroscience.
Provided by University of Iowa
Source: medicalxpress.com
June 18, 2012
Fear conditioning using sound and taste aversion, as applied to mice, has revealed interesting information about the basis of memory allocation.

Credit: Thinkstock
The European project ‘Cellular mechanisms underlying formation of the fear memory trace in the mouse amygdala’ (FEAR Memory TRACE) is investigating memory allocation and the recruitment of certain neurons to encode a memory. By studying conditioned fear memory in response to an auditory stimulus, the researchers have delved into pathological emotional states and the neural mechanisms involved in memory allocation, retrieval and extinction.
Prior research has revealed that the conditioned fear response in mice is located in a specific bundle of neurons in the amygdala. Memory allocation modulation is due to expression of the transcription factor, cyclic adenosine 3’, 5’-monophosphate response element binding protein (CREB) and possibly neuronal excitability.
FEAR Memory TRACE focused on the electrophysiological properties of neurons encoding the same memory. The project also aimed to ascertain the biophysical mechanisms in the plasticity changes recorded in the specific set of neurons in the fear memory trace.
To record information on auditory fear conditioning and conditioned taste aversion, the scientists performed intra-amygdala surgery with viral vectors and ran electrophysiological experiments to detect neuronal excitability.
In neural control experiments, viral transfection was used to deliver CREB tagged with green fluorescent protein together with the gene for channelrhodopsin-2. Combined, these two elements caused firing in specific nerve cells. Molecular techniques included western blotting for protein detection, genotyping and viral DNA preparation.
Behavioural tests of long- and short-term memory in mice, involving fear conditioning and taste aversion, showed increased memory performance at the three-hour point rather than the five-hour point. The intrinsic excitability of neurons in mice receiving both the shock and the tone was increased at three hours, but not five, compared to mice that received only the tone.
As the project continues to its close in two years, the aim is to identify biophysical mechanisms involved in recruiting neurons that compete with each other for a specific memory. FEAR Memory TRACE will also develop computational models to assess the role of these mechanisms in memory performance.
Information on biochemical processes in neural mechanisms has wide application in many clinical situations including patients suffering memory loss, such as stroke victims. Fear response manipulation can be applied in treatment of neuroses and phobias.
Provided by CORDIS
Source: medicalxpress.com
British researchers create robot that can learn simple words by conversing with humans
In an attempt to replicate the early experiences of infants, researchers in England have created a robot that can learn simple words in minutes just by having a conversation with a human.
The three-foot-tall robot, named DeeChee, was built to produce any syllable in the English language. But it knew no words at the outset of the study, speaking only babble phrases like “een rain rain mahdl kross.”
During the experiment, a human volunteer attempted to teach the robot simple words for shapes and colors by using them repeatedly in regular speech.
June 17, 2012
A collaborative research team led by Professor Tadashi ISA from the National Institute for Physiological Sciences, the National Institutes of Natural Sciences, Fukushima Medical University and Kyoto University has developed a “double viral vector transfection technique” which can deliver genes to a specific neural circuit by combining two new kinds of gene transfer vectors. With this method, they found that “indirect pathways”, which were suspected to have been left behind when the direct connection from the brain to the motor neurons that control muscles was established in the course of evolution, actually play an important role in highly developed dexterous hand movements. This study was supported by the Strategic Research Program for Brain Sciences of the MEXT of Japan. The result will be published in Nature (June 17th, advance online publication).
It is said that the higher primates, including human beings, accomplished explosive evolution by acquiring the ability to move their hands skillfully. This ability to move individual fingers has been thought to result from the evolution of a direct connection from the cerebrocortical motor area to the motor neurons of the spinal cord which control the muscles. In animals with less dexterous hands, such as cats or rats, the cortical motor area is connected to the motor neurons only through interneurons of the spinal cord. Such an “indirect pathway” remains in us primates, without its functions being fully understood. Is this phylogenetically old circuit still in operation, or is it suppressed because it would interfere? The debate had remained unresolved.
The collaborative research team, led by Professor Tadashi ISA and Project Assistant Professor Masaharu KINOSHITA from the National Institute for Physiological Sciences, the National Institutes of Natural Sciences, Fukushima Medical University and Kyoto University, developed the “double viral vector transfection technique”, which can deliver genes to a specific neural circuit by combining two new kinds of gene transfer vectors.
With this method, they succeeded in the selective and reversible suppression of the propriospinal neurons (spinal interneurons mediating the indirect connection from the cortical motor area to spinal motor neurons). The results revealed that the “indirect pathways” play an important role in dexterous hand movements, finally bringing a long-standing debate to a close.
The key component of this discovery was the “double viral vector transfection technique”, in which one vector is retrogradely transported from the terminal zone back to the neuronal cell bodies and the other is transfected at the location of the cell bodies. The expression of the target gene is regulated only in cells doubly transfected by the two vectors. Using this technique, they succeeded in suppressing the propriospinal neurons selectively and reversibly.
Such an operation was possible in mice, in which inheritable genetic manipulation of germline cells is feasible, but had been impossible in primates until now.
This method is expected to enable further development of gene therapy targeted to specific neural circuits.
Professor Tadashi ISA says: “This newly developed double viral vector transfection technique can be applied to gene therapy of the human central nervous system, as we too are higher primates. And this discovery reverses the general idea that the spinal cord is only a reflex pathway, showing that it also plays a pivotal role in integrating the complex neural signals which enable dexterous movements.”
Provided by National Institute for Physiological Sciences
Source: medicalxpress.com