Posts tagged science

An international team of researchers has identified a previously unknown neurodegenerative disorder and discovered that it is caused by a single mutation that arose in one individual born in Turkey, during the height of the Ottoman Empire, about 16 generations ago.

(Image caption: An fMRI scan of the brain of a patient with CLP1 mutation reveals severe atrophy of the brainstem (red line) and cerebellum (blue) as well as lack of formation of the corpus callosum (green), which connects both sides of the cerebrum (yellow), which is also atrophied. The lines outline approximately the expected sizes of the brain areas. A study traced the mutation to a single individual born in Turkey during the Ottoman Empire, some 16 generations ago.)
The genetic cause of the rare disorder was discovered during a massive analysis of the individual genomes of thousands of Turkish children suffering from neurological disorders.
“The more we learn about basic mechanisms behind rare forms of neurodegeneration, the more novel insights we can gain into more common diseases such as Alzheimer’s or Lou Gehrig’s Disease,” said Murat Gunel, the Nixdorff-German Professor of Neurosurgery, and professor of genetics and neurobiology at Yale.
Gunel is a senior co-author of one of two papers published in the April 24 issue of the journal Cell that document the devastating effects of a mutation in the CLP1 gene. Gunel and colleagues at Yale Center for Mendelian Genomics along with Joseph Gleeson’s group at University of California-San Diego compared DNA sequencing results of more than 2,000 children from different families with neurodevelopmental disorders. In four apparently unrelated families, they identified the exact same mutation in the CLP1 gene. Working with the Frank Bass group from the Netherlands, the researchers also studied how CLP1 mutations interfered with the transfer of information encoded within genes to cells’ protein-making machinery.
The discovery of the identical mutation in seemingly unrelated families originally from eastern Turkey suggested an ancestral mutation, dating back several generations, noted the researchers.
Affected children suffer from intellectual disability, seizures, and delayed or absent mental and motor development, and their imaging studies show atrophy affecting the cerebral cortex, cerebellum, and the brain stem.
The second Cell paper, by researchers from Baylor College of Medicine and Austria, also found the identical founder mutation in CLP1 in another 11 children from an additional five families originally from eastern Turkey.
Gunel said that the high prevalence of consanguineous marriages [between closely related people] in Turkey and the Middle East leads to these rare recessive genetic neurodegenerative disorders. Affected children inherit mutations in the same gene from both of their parents, who are closely related to each other, such as first cousins. Without consanguinity between parents, children are very unlikely to inherit two mutations in the same gene.
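The arithmetic behind that last point can be sketched with a standard population-genetics formula. This is purely illustrative: the allele frequency below is an assumption chosen for the example, not a measured value for CLP1.

```python
# Illustrative only: probability that a child is homozygous for a rare
# recessive allele, comparing unrelated parents with first-cousin parents.
# P(affected) = q^2 + F*q*(1-q), where q is the allele frequency and F is
# the inbreeding coefficient (F = 1/16 for children of first cousins).

def affected_risk(q, inbreeding_f=0.0):
    return q**2 + inbreeding_f * q * (1 - q)

q = 1e-4  # assumed allele frequency, for illustration only
unrelated = affected_risk(q)                    # q^2 = 1e-8
cousins = affected_risk(q, inbreeding_f=1 / 16)
print(cousins / unrelated)  # roughly a 600-fold higher risk
```

Even with a generous assumed allele frequency, the inbreeding term dominates, which is why these recessive disorders cluster in populations where consanguineous marriage is common.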
“By dissecting the genetic basis of these neurodevelopmental disorders, we are gaining fundamental insight into basic physiological mechanisms important for human brain development and function,” Gunel said. “We learn a lot about normal biology by studying what happens when things go wrong.”
(Source: news.yale.edu)

Oops! Researchers find neural signature for mistake correction
Culminating an eight-year search, scientists at the RIKEN-MIT Center for Neural Circuit Genetics captured an elusive brain signal underlying memory transfer and, in doing so, pinpointed the first neural circuit for “oops”—the precise moment when one becomes consciously aware of a self-made mistake and takes corrective action.
The findings, published in Cell, verified a 20-year-old hypothesis on how brain areas communicate. In recent years, researchers have been pursuing a class of ephemeral brain signals called gamma oscillations, millisecond-scale bursts of synchronized wave-like electrical activity that pass through brain tissue like ripples on a pond. In 1993, German scientist Wolf Singer proposed that gamma waves enable binding of memory associations. For example, in a process called working memory, animals store and recall short-term memory associations when exploring the environment.
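As a toy illustration of what a “gamma band” means in practice, the sketch below estimates spectral power in the 30–100 Hz range for a synthetic signal with and without a brief gamma-frequency burst. The signals, sampling rate, and band edges here are invented for illustration; real analyses band-pass filter recorded local field potentials.

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    """Crude DFT-based power estimate in [f_lo, f_hi] Hz (illustration only)."""
    n = len(signal)
    power = 0.0
    for k in range(n // 2):
        freq = k * fs / n
        if f_lo <= freq <= f_hi:
            re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(signal))
            im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(signal))
            power += (re * re + im * im) / n**2
    return power

fs = 500                                    # 500 Hz sampling, 1 s of data
t = [i / fs for i in range(fs)]
theta = [math.sin(2 * math.pi * 8 * x) for x in t]   # slow 8 Hz background
burst = [theta[i] + (0.5 * math.sin(2 * math.pi * 60 * x) if 0.4 <= x < 0.6 else 0.0)
         for i, x in enumerate(t)]                   # brief 60 Hz "gamma" burst

# The 200 ms burst shows up as extra power in the 30-100 Hz band:
print(band_power(burst, fs, 30, 100) > band_power(theta, fs, 30, 100))  # True
```

The key point the study exploits is exactly this contrast: a short-lived rise in gamma-band power against the slower ongoing rhythms.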
In 2006, the MIT team under the direction of Nobel Laureate Susumu Tonegawa began a study to understand working memory in mice. They trained animals to navigate a T maze and turn left or right at a junction for an associated food reward. They found that working memory required communication between two brain areas, the hippocampus and entorhinal cortex, but how mice knew the correct direction and the neural signal for memory transfer of this event remained unclear.
The study’s lead author Jun Yamamoto noticed that mice sometimes made mistakes, turning in the wrong direction then pausing, and turning around to go in the correct direction, trials he termed “oops” in his lab notebook. Intrigued, he recorded neural activity in the circuit and observed a burst of gamma waves just before the “oops” moment. He also saw gamma waves when mice chose the correct direction, but not when they failed to choose the correct direction or did not correct their mistakes.
The critical experiment was to block gamma oscillations and see whether this prevented mice from making correct decisions. To do this, the researchers created a transgenic mouse expressing a light-activated protein called archaerhodopsin (ArchT) in the hippocampus. Using an optic fiber implanted in the brain, light was flashed into the hippocampal-entorhinal circuit, shutting off gamma activity. Indeed, the mice could no longer accurately choose the right direction, and the number of “oops” events decreased.
The findings provide strong evidence of a role for gamma oscillations in cognition, and raise the prospect of their involvement in other behaviors requiring retrieval and evaluation of working memory. This may open the door to a class of behaviors called metacognition, or “thinking about thinking”—the self-monitoring of one’s actions. Regarding the appearance of gamma oscillations in the “oops” cases, Dr. Tonegawa stated, “Our data suggest that animals consciously monitor whether their behavioral choices are correct and use memory recall to improve their outcomes.”
(Image caption: Channelrhodopsins before (upper left) and after (lower right) molecular engineering, shown superimposed over an image of a mammalian neuron. In the upper left opsin, the red color shows negative charges spanning the opsin that facilitated the flow of positive (stimulatory) ions through the channel into neurons. In the newly engineered channels (lower right), those negative charges have been changed to positive (blue), allowing the negatively charged inhibitory chloride ions to flow through. Credit: Andre Berndt, Soo Yeun Lee, Charu Ramakrishnan, and Karl Deisseroth.)
Researchers Build New “Off Switch” to Shut Down Neural Activity
Nearly a decade ago, the era of optogenetics was ushered in with the development of channelrhodopsins, light-activated ion channels that can, with the flick of a switch, instantaneously turn on neurons in which they are genetically expressed. What has lagged behind, however, is the ability to use light to inactivate neurons with an equal level of reliability and efficiency. Now, Howard Hughes Medical Institute (HHMI) scientists have used an analysis of channelrhodopsin’s molecular structure to guide a series of genetic mutations to the ion channel that grant the power to silence neurons with an unprecedented level of control.
The new structurally engineered channel at last gives neuroscientists the tools to both activate and inactivate neurons in deep brain structures using dim pulses of externally projected light. HHMI early career scientist Karl Deisseroth and his colleagues at Stanford University published their findings April 25, 2014 in the journal Science. “We’re excited about this increased light sensitivity of inhibition in part because we think it will greatly enhance work in large-brained organisms like rats and primates,” he says.
First discovered in unicellular green algae in 2002, channelrhodopsins function as photoreceptors that guide the microorganisms’ movements in response to light. In a landmark 2005 study, Deisseroth and his colleagues described a method for expressing the light-sensitive proteins in mouse neurons. By shining a pulse of blue light on those neurons, the researchers showed they could reliably induce the ion channel at channelrhodopsin’s core to open up, allowing positively charged ions to rush into the cell and trigger action potentials. Channelrhodopsins have since been used in hundreds of research projects investigating the neurobiology of everything from cell dynamics to cognitive functions.
A few years later came the deployment of halorhodopsins, light-sensitive proteins selective for the negatively charged ion chloride. These proteins, derived from halobacteria, provided researchers with a tool for the light-controlled inactivation of neurons. A major limitation of these proteins, however, is their inefficiency. Unlike channelrhodopsin, halorhodopsin is an ion pump, meaning that only one chloride ion moves across the neuron’s membrane per photon of light. “What that translates into is you get partial inhibition,” Deisseroth says. “You can inhibit neurons, but in the living animal it’s not always complete.”
Searches for a naturally occurring light-sensitive channel with a pore permeable to negatively charged ions have come up empty handed. “We searched,” Deisseroth says. “We did big genomic searches and found many interesting channelrhodopsins and lots of pumps, but we never found an inhibitory channel in nature.”
The team’s fruitless exploration led them to try modifying the molecular structure of channelrhodopsin so that its pore would shuttle negative ions into the cell. “To do that you need to know what the channel pore looks like at the angstrom level,” Deisseroth says. “What we really needed was the high-resolution crystal structure.” In 2012, working with a group in Japan, Deisseroth and his colleagues captured the structure of a chimera of channelrhodopsin called C1C2 using X-ray crystallography.
A molecular analysis of channelrhodopsin’s pore suggested that swapping out certain negatively charged amino acid residues lining the pore with positive residues could reverse the electrostatic potential of the channel, making it more conductive to negatively charged ions such as chloride. To achieve this molecular switcheroo, the researchers performed dozens of single site-directed mutations. Several mutations conferred selectivity for chloride, but the channels failed to conduct current. So, the team screened hundreds of combinations of mutations. “In a systematic process we found first a combination of four mutations, and then a group of five mutations, that seemed to change selectivity,” says Deisseroth. “We put those together into a nine-fold mutated channel and that one, amazingly, was chloride selective.”
Not only does the new channel—dubbed iC1C2 for “inhibitory C1C2”—allow the selective passage of chloride ions, it greatly reduces the likelihood of action potentials by making the neuron more “leaky,” a function not possible in ion pumps like halorhodopsin.
Deisseroth’s team made a final mutation to a cysteine residue in iC1C2 that makes the channel both bi-stable and orders of magnitude more sensitive to light. When activated by blue light, the mutated channels remain open for up to minutes at a time, while exposing the channels to red light makes them close quickly. This level of long-term control is useful in developmental studies where events play out over minutes to hours. The long channel open times also mean that neurons can essentially integrate chloride currents over longer time scales and, therefore, weaker light can be used to inhibit the neurons. Increased light sensitivity translates to less light-induced damage to neural tissue, the ability to reach deep brain structures, and the possibility of controlling brain functions that involve large regions of the brain.
“This is something we’ve sought for many years and it’s really the culmination of many streams of work in the lab—crystal structure work, mutational work, behavioral work —all of which have come together here,” Deisseroth says.
Better-educated people appear to be significantly more likely to recover from a moderate to severe traumatic brain injury (TBI), suggesting that a brain’s “cognitive reserve” may play a role in helping people get back to their previous lives, new Johns Hopkins research shows.

The researchers, reporting in the journal Neurology, found that those with the equivalent of at least a college education are seven times more likely than those who didn’t finish high school to be disability-free one year after a TBI serious enough to warrant inpatient time in a hospital and rehabilitation facility.
The findings, while new among TBI investigators, mirror those in Alzheimer’s disease research, in which higher educational attainment — believed to be an indicator of a more active, or more effective, use of the brain’s “muscles” and therefore its cognitive reserve — has been linked to slower progression of dementia.
“After this type of brain injury, some patients experience lifelong disability, while others with very similar damage achieve a full recovery,” says study leader Eric B. Schneider, Ph.D., an epidemiologist at the Johns Hopkins University School of Medicine’s Center for Surgical Trials and Outcomes Research. “Our work suggests that cognitive reserve — the brain’s ability to be resilient in the face of insult or injury — could account for the difference.”
Schneider conducted the research in conjunction with Robert D. Stevens, M.D., a neuro-intensive care physician with Johns Hopkins’ Department of Anesthesiology and Critical Care Medicine.
For the study, the researchers examined 769 patients enrolled in the TBI Model Systems database, an ongoing multi-center cohort study funded by the National Institute on Disability and Rehabilitation Research. The patients had been hospitalized with a moderate to severe TBI and subsequently admitted to a rehabilitation facility.
Of the 769 patients, 219 — or 27.8 percent — were free of any detectable disability one year after their injury. Twenty-three patients who didn’t complete high school — 9.7 percent of those at that education level — recovered, while 136 patients with between 12 and 15 years of schooling — 30.8 percent of those at that educational level — did. Nearly 40 percent of patients — 76 of the 194 — who had 16 or more years of education fully recovered.
Schneider says researchers don’t currently understand the biological mechanisms that might account for the link between years of schooling and improved recovery.
“People with increased cognitive reserve capabilities may actually heal in a different way that allows them to return to their pre-injury function and/or they may be able to better adapt and form new pathways in their brains to compensate for the injury,” Schneider says. “Further studies are needed to not only find out, but also to use that knowledge to help people with less cognitive reserve.”
Meanwhile, he says, “What we learned may point to the potential value of continuing to educate yourself and engage in cognitively intensive activities. Just as we try to keep our bodies strong in order to help us recover when we are ill, we need to keep the brain in the best shape it can be.”
Adds Stevens: “Understanding the underpinnings of cognitive reserve in terms of brain biology could generate ideas on how to enhance recovery from brain injury.”
(Source: hopkinsmedicine.org)
Bionic ear technology used for gene therapy
Researchers at UNSW have for the first time used electrical pulses delivered from a cochlear implant to administer gene therapy, successfully regrowing auditory nerves.
The research also heralds a possible new way of treating a range of neurological disorders, including Parkinson’s disease, and psychiatric conditions such as depression through this novel way of delivering gene therapy.
The research is published today in the prestigious journal Science Translational Medicine.
“People with cochlear implants do well with understanding speech, but their perception of pitch can be poor, so they often miss out on the joy of music,” says UNSW Professor Gary Housley, who is the senior author of the research paper.
“Ultimately, we hope that after further research, people who depend on cochlear implant devices will be able to enjoy a broader dynamic and tonal range of sound, which is particularly important for our sense of the auditory world around us and for music appreciation,” says Professor Housley, who is also the Director of the Translational Neuroscience Facility at UNSW Medicine.
The research, which has the support of Cochlear Limited through an Australian Research Council Linkage Project grant, has been five years in development.
The work centres on regenerating surviving nerves after age-related or environmental hearing loss, using existing cochlear technology. The cochlear implants are “surprisingly efficient” at localised gene therapy in the animal model, when a few electric pulses are administered during the implant procedure.
“This research breakthrough is important because while we have had very good outcomes with our cochlear implants so far, if we can get the nerves to grow close to the electrodes and improve the connections between them, then we’ll be able to have even better outcomes in the future,” says Jim Patrick, Chief Scientist and Senior Vice-President, Cochlear Limited.
It has long been established that the auditory nerve endings regenerate if neurotrophins – a naturally occurring family of proteins crucial for the development, function and survival of neurons – are delivered to the auditory portion of the inner ear, the cochlea.
But until now, research had stalled because safe, localised delivery of the neurotrophins could not be achieved by drug delivery, nor by viral-based gene therapy.
Professor Housley and his team at UNSW developed a way of using electrical pulses delivered from the cochlear implant to deliver the DNA to the cells close to the array of implanted electrodes. These cells then produce neurotrophins.
“No-one had tried to use the cochlear implant itself for gene therapy,” says Professor Housley. “With our technique, the cochlear implant can be very effective for this.”
While the neurotrophin production dropped away after a couple of months, Professor Housley says ultimately the changes in the hearing nerve may be maintained by the ongoing neural activity generated by the cochlear implant.
“We think it’s possible that in the future this gene delivery would only add a few minutes to the implant procedure,” says the paper’s first author, Jeremy Pinyon, whose PhD is based on this work. “The surgeon who installs the device would inject the DNA solution into the cochlea and then fire electrical impulses to trigger the DNA transfer once the implant is inserted.”
Integration of this technology into other ‘bionic’ devices such as electrode arrays used in deep brain stimulation (for the treatment of Parkinson’s disease and depression, for example) could also afford opportunities for safe, directed gene therapy of complex neurological disorders.
"Our work has implications far beyond hearing disorders,” says co-author Associate Professor Matthias Klugmann, from the UNSW Translational Neuroscience Facility research team. “Gene therapy has been suggested as a treatment concept even for devastating neurological conditions and our technology provides a novel platform for safe and efficient gene transfer into tissues as delicate as the brain.”
(Image caption: A solar flare erupts on the far right side of the sun, in this image captured by NASA’s Solar Dynamics Observatory. The flare peaked at 6:34 p.m. EDT on March 12, 2014. Credit: NASA)
Some Astronauts at Risk for Cognitive Impairment
Johns Hopkins scientists report that rats exposed to high-energy particles, simulating conditions astronauts would face on a long-term deep space mission, show lapses in attention and slower reaction times, even when the radiation exposure is in extremely low dose ranges.
The cognitive impairments — which affected a large subset, but far from all, of the animals — appear to be linked to protein changes in the brain, the scientists say. The findings, if found to hold true in humans, suggest it may be possible to develop a biological marker to predict sensitivity to radiation’s effects on the human brain before deployment to deep space. The study, funded by NASA’s National Space Biomedical Research Institute, is described in the April issue of the journal Radiation Research.
When astronauts are outside the Earth’s magnetic field, spaceships provide only limited shielding from radiation exposure, explains study leader Robert D. Hienz, Ph.D., an associate professor of behavioral biology at the Johns Hopkins University School of Medicine. If they take space walks or work outside their vehicles, they will be exposed to the full effects of radiation from solar flares and intergalactic cosmic rays, he says. And since neither the moon nor Mars has a planet-wide magnetic field, astronauts will be exposed to relatively high radiation levels even when they land on these surfaces.
But not everyone will be affected the same way, his experiments suggest. “In our radiated rats, we found that 40 to 45 percent had these attention-related deficits, while the rest were seemingly unaffected,” Hienz says. “If the same proves true in humans and we can identify those more susceptible to radiation’s effects before they are harmfully exposed, we may be able to mitigate the damage.”
If a biomarker can be identified for humans, it could have even broader implications in determining the best course of treatment for patients receiving radiotherapy for brain tumors or identifying which patients may be more at risk from radiation-based medical treatments, the investigators note.
Previous research has tested how well radiation-exposed rats do with basic learning tasks and mazes, but this new Johns Hopkins study focused on tests that closely mimic the self-tests of fitness for duty currently used by astronauts on the International Space Station prior to mission-critical events such as space walks. Similar fitness tests are also used for soldiers, airline pilots and long-haul truckers.
In one such test, an astronaut sees a blank screen on a handheld device and is instructed to tap the screen when an LED counter lights up. The normal reaction time should be less than 300 milliseconds. The rats in the experiment are similarly taught to touch a light-up key with their noses and are then tested to see how quickly they react.
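Scoring such a test is simple arithmetic. As a minimal sketch, the snippet below counts “lapses” against the roughly 300-millisecond threshold mentioned above; the reaction times are invented for illustration, not data from the study.

```python
LAPSE_THRESHOLD_MS = 300  # reaction times above this count as lapses

def lapse_rate(reaction_times_ms):
    """Fraction of trials in which the response exceeded the threshold."""
    lapses = sum(1 for t in reaction_times_ms if t > LAPSE_THRESHOLD_MS)
    return lapses / len(reaction_times_ms)

trial = [250, 280, 310, 240, 450, 260, 290, 330, 270, 255]  # made-up data, ms
print(lapse_rate(trial))  # 0.3
```

A rising lapse rate over repeated sessions is the kind of attention deficit the study tracked in the radiation-sensitive animals.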
To conduct the new study, rats were first trained for the tests and then taken to Brookhaven National Laboratory on Long Island in Upton, N.Y., where a collider produces the high-energy proton and heavy ion radiation particles that normally occur in space. The rats’ heads were exposed to varying levels of radiation that astronauts would normally receive during long-duration missions, while other rats were given sham exposures.
Once the rats returned to Johns Hopkins, they were tested every day for 250 days. The radiation-sensitive animals (19 of 46) all showed evidence of impairment that began at 50 to 60 days post-exposure and remained through the end of the study.
Lapses in attention occurred in 64 percent of the sensitive animals, elevated impulsive responding in 45 percent, and slower reaction times in 27 percent. The impairments were not dependent on radiation dose. Additionally, some of the rats never recovered from their deficits, while others showed partial recovery over time.
The radiation-sensitive rats that received higher doses of radiation had a higher concentration of transporters for the neurotransmitter dopamine, which plays a role in vigilance and attention, says Catherine M. Davis, Ph.D., a postdoctoral fellow in the Department of Psychiatry and Behavioral Sciences and the study’s first author.
The dopamine transport system appears impaired in radiation-sensitive rats because the neurotransmitter is most likely not removed in the manner it should be for the brain to function properly, she says. Humans with genetic differences related to dopamine transport, she adds, have been shown to do worse on the type of mental fitness tests given to the astronauts and rats alike.
Davis says she wouldn’t want to see radiation-sensitive astronauts kept from future missions to the moon or Mars, but she would want those astronauts to be prepared to take special precautions to protect their brains, such as wearing extra shielding or not performing space walks.
“As with other areas of personalized medicine, we would seek to create individual treatment and prevention plans for astronauts we believe would be more susceptible to cognitive deficits from radiation exposure,” she says.
Current astronauts are not as exposed to the damaging effects of radiation, Davis says, because the International Space Station flies in an orbit low enough that the Earth’s magnetic field continues to provide protection.
While the Johns Hopkins team studies the likely effects of radiation on the brain during a deep space mission, other NASA-funded research groups are looking at the potential effects of radiation on other parts of the body and on whether it increases cancer risks.
Airport security-style technology could help doctors decide on stroke treatment
A new computer program could help doctors predict which patients might suffer potentially fatal side-effects from a key stroke treatment.
The program, which assesses brain scans using pattern recognition software similar to that used in airport security and passport control, has been developed by researchers at Imperial College London. Results of a pilot study funded by the Wellcome Trust that used the software are published in the journal Neuroimage Clinical.
Stroke affects over 15 million people each year worldwide. Ischemic strokes, the most common type, occur when small clots interrupt the blood supply to the brain. The most effective treatment is intravenous thrombolysis, in which a chemical that breaks up or ‘busts’ the clots is injected into the blood vessels, allowing blood to flow again.
However, because intravenous thrombolysis effectively thins the blood, it can cause harmful side effects in about six per cent of patients, who suffer bleeding within the skull. This often worsens the disability and can cause death.
Clinicians attempt to identify patients most at risk of bleeding on the basis of several signs assessed from brain scans. However, these signs can often be very subtle and human judgements about their presence and severity tend to lack accuracy and reliability.
In the new study, researchers trained a computer program to recognise patterns in the brain scans that represent signs such as brain-thinning or diffuse small-vessel narrowing, in order to predict the likelihood of bleeding. They then pitted the automated pattern recognition software against radiologists’ ratings of the scans. The computer program predicted the occurrence of bleeding with 74 per cent accuracy compared to 63 per cent for the standard prognostic approach.
Dr Paul Bentley from the Department of Medicine, lead author of the study, said: “For each patient that doctors see, they have to weigh up whether the benefits of a treatment will outweigh the risks of side effects. Intravenous thrombolysis carries the risk of very severe side effects for a small proportion of patients, so having the best possible information on which to base our decisions is vital. Our new study is a pilot but it suggests that ultimately doctors might be able to use our pattern recognition software, alongside existing methods, in order to make more accurate assessments about who is most at risk and treat them accordingly. We are now planning to carry out a much larger study to more fully assess its potential.”
The research team conducted a retrospective analysis of computerized tomography (CT) scans from 116 patients. These are scans that use x-rays to produce ‘virtual slices’ of the brain. All the patients had suffered ischemic strokes and undergone intravenous thrombolysis in Charing Cross Hospital. In the sample the researchers included scans from 16 patients who had subsequently developed serious bleeding within the brain.
Without knowing the outcomes of the treatment, three independent experts examined the scans and used standard prognostic tools to predict whether patients would develop bleeding after treatment.
In parallel the computer program directly assessed and classified the patterns of the brain scans to produce its own predictions.
Researchers evaluated the performance of both approaches by comparing their predictions of bleeding with the actual experiences of the patients.
Using a statistical test the research showed the computer program predicted the occurrence of bleeding with 74 per cent accuracy compared to 63 per cent for the standard prognostic approach.
The researchers also gave the computer a series of ‘identity parades’ by asking the software to choose which patient out of ten scans went on to suffer bleeding. The computer correctly identified the patient 56 per cent of the time while the standard approach was correct 31 per cent of the time.
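The Imperial software itself is not public, but accuracy figures like those above are computed in the standard way: the fraction of patients whose outcome was predicted correctly. A minimal sketch with made-up labels:

```python
def accuracy(predicted, actual):
    """Fraction of cases where the prediction matches the actual outcome."""
    return sum(p == a for p, a in zip(predicted, actual)) / len(actual)

# Made-up example: 1 = bleeding occurred after thrombolysis, 0 = no bleeding
actual     = [0, 0, 1, 0, 1, 0, 0, 0, 1, 0]
software   = [0, 0, 1, 0, 1, 0, 1, 0, 1, 0]  # hypothetical model output
clinicians = [0, 1, 0, 0, 1, 0, 1, 0, 1, 1]  # hypothetical expert ratings

print(accuracy(software, actual))    # 0.9
print(accuracy(clinicians, actual))  # 0.6
```

Note that with only 16 bleeds among 116 patients, raw accuracy can flatter a method that under-calls bleeding, which is why the researchers also ran the ten-scan “identity parade” comparison.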
The researchers are keen to explore whether their software could also be used to identify stroke patients who might be helped by intravenous thrombolysis but are not currently offered this treatment. At present only about 20 per cent of patients with strokes are treated using intravenous thrombolysis, as doctors usually exclude those with particularly severe strokes or patients who suffered the stroke more than four and a half hours before arriving at hospital. The researchers believe that their software has the potential to help doctors identify which of those patients are at low risk of suffering side effects and hence might benefit from treatment.
Novel compound halts cocaine addiction and relapse behaviors
A novel compound that targets an important brain receptor has a dramatic effect against a host of cocaine addiction behaviors, including relapse behavior, a University at Buffalo animal study has found.
The research provides strong evidence that this may be a novel lead compound for treating cocaine addiction, for which no effective medications exist.
The UB research was published as an online preview article in Neuropsychopharmacology last week.
In the study, the compound, RO5263397, severely blunted a broad range of cocaine addiction behaviors.
“This is the first systematic study to convincingly show that RO5263397 has the potential to treat cocaine addiction,” said Jun-Xu Li, MD, PhD, senior author and assistant professor of pharmacology and toxicology in the UB School of Medicine and Biomedical Sciences.
“Our research shows that trace amine associated receptor 1 – TAAR 1—holds great promise as a novel drug target for the development of novel medications for cocaine addiction,” he said.
TAAR 1 is a novel receptor in the brain that is activated by minute amounts of brain chemicals called trace amines.
The findings are especially important, Li added, since despite many years of research, there are no effective medications for treating cocaine addiction.
The compound targets TAAR 1, which is expressed in key drug reward and addiction regions of the brain.
“Because TAAR 1 anatomically and neurochemically is closely related to dopamine – one of the key molecules in the brain that contributes to cocaine addiction – and is thought to be a ‘brake’ on dopamine activity, drugs that stimulate TAAR 1 may be able to counteract cocaine addiction,” Li explained.
The UB research tested this hypothesis by using RO5263397, a newly developed TAAR 1 agonist (a drug that stimulates TAAR 1 receptors), in animal models of human cocaine abuse.
One of the ways that researchers test the rewarding effects of cocaine in animals is called conditioned place preference. In this type of test, the animal’s persistence in returning to, or staying at, a physical location where the drug was given is interpreted as indicating that the drug has rewarding effects.
In the UB study, RO5263397 dramatically blocked cocaine’s rewarding effects.
“When we give the rats RO5263397, they no longer perceive cocaine as rewarding, suggesting that the primary effect that drives cocaine addiction in humans has been blunted,” said Li.
The compound also markedly blunted cocaine relapse in the animals.
“Cocaine users often stay clean for some time, but may relapse when they re-experience cocaine or hang out in old cocaine use environments,” said Li. “We found that RO5263397 markedly blocked the ability of cocaine or cocaine-related cues to prime relapse behavior.
“Also, when we measured how hard the animals were willing to work to get an injection of cocaine, RO5263397 reduced the animals’ motivation to get cocaine,” said Li. “This compound makes rats less willing to work for cocaine, which led to decreased cocaine use.”
The UB researchers plan to continue studying RO5263397, especially its effectiveness and mechanisms in curbing relapse to cocaine addiction.
(Image: Shutterstock)
Neuroscientists have discovered a brain pathway that underlies the emotional behaviours critical for survival.

New research by the University of Bristol, published in the Journal of Physiology, has identified a chain of neural connections which links central survival circuits to the spinal cord, causing the body to freeze when experiencing fear.
Understanding how these central neural pathways work is a fundamental step towards developing effective treatments for emotional disorders such as anxiety, panic attacks and phobias.
An important brain region responsible for how humans and animals respond to danger is known as the PAG (periaqueductal grey), and it can trigger responses such as freezing, a high heart rate, an increase in blood pressure and the urge for fight or flight.
This latest research has discovered a brain pathway leading from the PAG to a highly localised part of the cerebellum, called the pyramis. The research went on to show that the pyramis is involved in generating freezing behaviour when central survival networks are activated during innate and learnt threatening situations.
The pyramis may therefore serve as an important point of convergence for different survival networks in order to react to an emotionally challenging situation.
Dr Stella Koutsikou, first author of the study and Research Associate in the School of Physiology and Pharmacology at the University of Bristol, said: “There is a growing consensus that understanding the neural circuits underlying fear behaviour is a fundamental step towards developing effective treatments for behavioural changes associated with emotional disorders.”
Professor Bridget Lumb, Professor of Systems Neuroscience, added: “Our work introduces the novel concept that the cerebellum is a promising target for therapeutic strategies to manage dysregulation of emotional states such as panic disorders and phobias.”
The researchers involved in this work are all members of Bristol Neuroscience which fosters interactions across one of the largest communities of neuroscientists in the UK.
Professor Richard Apps said: “This is a great example of how Bristol Neuroscience brings together expertise in different fields of neuroscience leading to exciting new insights into brain function.”
A study of older adults at increased risk for Alzheimer’s disease shows that moderate physical activity may protect brain health and stave off shrinkage of the hippocampus – the brain region responsible for memory and spatial orientation that is attacked first in Alzheimer’s disease. Dr. J. Carson Smith, a kinesiology researcher in the University of Maryland School of Public Health who conducted the study, says that while all of us will lose some brain volume as we age, those with an increased genetic risk for Alzheimer’s disease typically show greater hippocampal atrophy over time. The findings are published in the open-access journal Frontiers in Aging Neuroscience.

"The good news is that being physically active may offer protection from the neurodegeneration associated with genetic risk for Alzheimer’s disease," Dr. Smith suggests. "We found that physical activity has the potential to preserve the volume of the hippocampus in those with increased risk for Alzheimer’s disease, which means we can possibly delay cognitive decline and the onset of dementia symptoms in these individuals. Physical activity interventions may be especially potent and important for this group."
Dr. Smith and colleagues, including Dr. Stephen Rao from the Cleveland Clinic, tracked four groups of healthy older adults ages 65-89, who had normal cognitive abilities, over an 18-month period and measured the volume of their hippocampus (using structural magnetic resonance imaging, or MRI) at the beginning and end of that time period. The groups were classified both for low or high Alzheimer’s risk (based on the absence or presence of the apolipoprotein E epsilon 4 allele) and for low or high physical activity levels.
Of all four groups studied, only those at high genetic risk for Alzheimer’s who did not exercise experienced a decrease in hippocampal volume (3 percent) over the 18-month period. All other groups, including those at high risk for Alzheimer’s but who were physically active, maintained the volume of their hippocampus.
"This is the first study to look at how physical activity may impact the loss of hippocampal volume in people at genetic risk for Alzheimer’s disease," says Dr. Kirk Erickson, an associate professor of psychology at the University of Pittsburgh. "There are no other treatments shown to preserve hippocampal volume in those that may develop Alzheimer’s disease. This study has tremendous implications for how we may intervene, prior to the development of any dementia symptoms, in older adults who are at increased genetic risk for Alzheimer’s disease."
Individuals were classified as high risk for Alzheimer’s if a DNA test identified a genetic marker – one or two copies of the apolipoprotein E epsilon 4 (APOE-e4) allele on chromosome 19 – which increases the risk of developing the disease. Physical activity levels were measured using a standardized survey, with low activity defined as two or fewer days/week of low-intensity activity, and high activity as three or more days/week of moderate to vigorous activity.
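The study's two grouping criteria can be sketched as simple classification rules. This is an illustrative summary only; the function names and the handling of in-between cases are assumptions, not the study's actual survey instrument:

```python
# Illustrative sketch of the grouping criteria described above.
# Names are hypothetical; intermediate activity patterns are collapsed
# into "low" here as a simplification.

def alzheimers_risk_group(apoe_e4_copies: int) -> str:
    """High genetic risk if one or two copies of the APOE-e4 allele
    are present; low risk if none."""
    return "high" if apoe_e4_copies >= 1 else "low"

def activity_group(days_per_week: int, intensity: str) -> str:
    """High activity: three or more days/week of moderate to vigorous
    activity. Otherwise classified as low activity."""
    if days_per_week >= 3 and intensity in ("moderate", "vigorous"):
        return "high"
    return "low"

# Crossing the two rules yields the study's four groups.
print(alzheimers_risk_group(1), activity_group(2, "low"))       # high low
print(alzheimers_risk_group(0), activity_group(4, "vigorous"))  # low high
```

Crossing the two binary classifications produces the four groups compared in the study; per the results above, only the high-risk/low-activity group showed hippocampal shrinkage.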
"We know that the majority of people who carry the E4 allele will show substantial cognitive decline with age and may develop Alzheimer’s disease, but many will not. So, there is reason to believe that there are other genetic and lifestyle factors at work," Dr. Smith says. "Our study provides additional evidence that exercise plays a protective role against cognitive decline and suggests the need for future research to investigate how physical activity may interact with genetics and decrease Alzheimer’s risk."
Dr. Smith has previously shown that a walking exercise intervention for patients with mild cognitive decline improved cognitive function by improving the efficiency of brain activity associated with memory. He is planning to conduct a prescribed exercise intervention in a population of healthy older adults with genetic and other risk factors for Alzheimer’s disease and to measure the impact on hippocampal volume and brain function.
(Source: umdrightnow.umd.edu)