Posts tagged science

UCSB Study of Cocaine Addiction Reveals Targets for Treatment
Scientists at UC Santa Barbara are researching cocaine addiction, a widespread problem that, along with other addictions, costs billions of dollars in damage to individuals, families, and society. Laboratory studies at UCSB have revealed that the diminished brain function and learning impairment caused by cocaine addiction can be treated, and that learning can be restored.
Karen Szumlinski, a professor in the Department of Psychological & Brain Sciences at UCSB, and her colleagues Osnat Ben-Shahar and Tod Kippin have worked in the field of addiction for many years. Senior author of a paper on this topic published recently in The Journal of Neuroscience, Szumlinski is particularly interested in the part of the brain called the prefrontal cortex, the seat of “executive function,” or decision-making. This area is responsible for directing and controlling one’s behavior appropriately.
With her research team, Szumlinski discovered that a drug that stimulates a certain type of glutamate receptor, when delivered to the prefrontal cortex, could reverse the learning impairment in rats with simulated cocaine addiction.
"Needless to say, this (the prefrontal cortex) is one of the last parts of the brain to develop, and, of relevance to our students, continues to develop through about age 25 to 28," said Szumlinski.
Szumlinski explained that in the prefrontal cortex there seems to be “hypo-frontality,” or reduced functioning, in drug addicts, as well as in patients with a range of neuropsychiatric diseases, including schizophrenia, depression, and attention deficit disorder.
Szumlinski calls the prefrontal cortex a late-developing brain area that is critical for making proper decisions, and inhibiting behavior. “You damage this brain region and you lose the ability to self-regulate, you make impulsive decisions like engaging in risky sexual behavior or drug-taking, you basically go off the deep end in terms of function,” she said. “So we were very much interested in how drugs of abuse impact the prefrontal cortex, given that human drug addicts show deficits in this brain area when you put them into a scanner. They show hypo-activity.” She said this hypo-activity, or hypo-frontality, might relate to a neurotransmitter that scientists know is involved in exciting the brain.
A key question, according to Szumlinski, is this: “Was that hypo-frontality there in the first place, and that’s why they became an addict; or did the drugs change their prefrontal cortex, to cause it to become hypo-functioning and thus they’re not able to control their drug use? You can’t parse that out in humans. So that’s why we turn then to animal models of the disorder, and we do have this rat model that we use in the paper.”
Szumlinski pointed out a key difficulty in the development of treatments for addiction: There is little money targeted to the study of this disease. Hence, in addition to studying the brain mechanisms involved, she is joining forces with researchers who study other, better-funded neurological diseases to help find cures. She hopes that government approval of new drugs for these other diseases would eventually make them available for clinical trials studying their effects on cocaine addiction.
Research update: Imaging fish in 3-D
Zebrafish larvae — tiny, transparent and fast-growing vertebrates — are widely used to study development and disease. However, visually examining the larvae for variations caused by drugs or genetic mutations is an imprecise, painstaking and time-consuming process.
Engineers at MIT have now built an automated system that can rapidly produce 3-D, micron-resolution images of thousands of zebrafish larvae and precisely analyze their physical traits. The system, described in the Feb. 12 edition of Nature Communications, offers a comprehensive view of how potential drugs affect vertebrates, says Mehmet Fatih Yanik, senior author of the paper.
“Complex processes involving organs cannot be accurately recapitulated in cell culture today. Existing 3-D tissue models are still far too simple to model live animals,” says Yanik, an MIT associate professor of electrical engineering and computer science and biological engineering. “In whole animals, the biology is far more complicated.”
Lead authors of the paper are MIT graduate student Carlos Pardo-Martin and Amin Allalou, a visiting student at MIT. Other authors are MIT senior research scientist Peter Eimon, MIT intern Jaime Medina, and Carolina Wahlby of the Broad Institute.
Zebrafish are genetically similar to humans and have many of the same developmental pathways, so scientists often use them to model human diseases including cancer, diabetes, Parkinson’s disease and autism.
Using the new technology, researchers can grow larvae in tiny wells and flow them through a channel to an imaging platform. Once there, the embryos are rotated and 320 images are taken from different angles, allowing 3-D reconstructions to be made using optical projection tomography (OPT). Getting larvae to the platform takes about 15 seconds, and the imaging takes only 2.5 seconds. This allows hundreds or thousands of larvae to be imaged within hours.
In a 2010 paper, Yanik’s team described the system that transports the embryos to the imaging platform, which they combined with high-resolution two-dimensional imaging. In the latest version, they developed a high-speed OPT imaging technique, which takes hundreds of two-dimensional images and subsequently generates a 3-D image, similar to a CT scan.
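The OPT step works on the same principle as a CT scan: many lower-dimensional projections taken at different angles are combined into one reconstruction. The following is a minimal, purely illustrative sketch of unfiltered back-projection in Python; the toy grid size, viewing angles, and “feature” position are all invented for the example, and the team’s actual pipeline is far more sophisticated (micron resolution, filtered reconstruction, 320 views):

```python
import math

# Toy object: a 9x9 grid with a single bright voxel standing in for a larval feature.
N = 9
obj = [[0.0] * N for _ in range(N)]
obj[2][6] = 1.0  # hypothetical feature position (row 2, column 6)

def project(image, theta):
    """1-D parallel projection of `image` at angle `theta` (radians):
    each pixel's value is accumulated into the detector bin that its
    rotated coordinate falls into (nearest-neighbour binning)."""
    c = (N - 1) / 2.0
    bins = [0.0] * N
    for y in range(N):
        for x in range(N):
            # rotate (x, y) about the centre; keep the detector-axis coordinate
            t = (x - c) * math.cos(theta) + (y - c) * math.sin(theta)
            i = int(round(t + c))
            if 0 <= i < N:
                bins[i] += image[y][x]
    return bins

def back_project(projections, thetas):
    """Smear each projection back across the grid and sum: the crude,
    unfiltered core of tomographic reconstruction."""
    recon = [[0.0] * N for _ in range(N)]
    c = (N - 1) / 2.0
    for bins, theta in zip(projections, thetas):
        for y in range(N):
            for x in range(N):
                t = (x - c) * math.cos(theta) + (y - c) * math.sin(theta)
                i = int(round(t + c))
                if 0 <= i < N:
                    recon[y][x] += bins[i]
    return recon

thetas = [k * math.pi / 16 for k in range(16)]  # 16 viewing angles over a half-turn
recon = back_project([project(obj, th) for th in thetas], thetas)

# The original feature's pixel receives a contribution from every angle,
# so it accumulates the maximum possible evidence in the reconstruction.
peak_val = max(max(row) for row in recon)
print(recon[2][6] == peak_val)  # -> True
```

Real OPT additionally filters each projection before back-projecting, which removes the blur this naive version leaves around the feature.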
They also created a computer algorithm that can measure hundreds of traits and use that information to create a comprehensive phenotype map — the overall description of an organism’s characteristics — for each larva. This enables rapid and detailed studies of how different drugs affect those phenotypes.
“You could probably look at almost any organ or tissue that you’re interested in,” Eimon says. “It gives researchers a way to rapidly measure and quantify and put numbers on the kinds of phenotypes and gene-expression patterns that they’ve been looking at for years and years.”
In this study, the researchers focused on the craniofacial skeleton, which is analogous to the human skull. They measured the length and volume of each of the bones that make up this structure, as well as the angles between the bones.
Each embryo was imaged five days after being treated with one of nine different teratogens — drugs that cause developmental abnormalities. The researchers compared their results with the drugs’ known effects and found that they were very consistent. They also obtained high-resolution, 3-D images of the craniofacial skeletons, which are less than a millimeter long.
“Now that we’re able to load the animals, and we can image them really quickly, and we have a way to start looking at the information, the sky’s the limit,” Pardo-Martin says. “What we have to do now is ask the big questions, because the technology has advanced.”
This kind of analysis could be very valuable for drug developers who need to efficiently screen thousands of drug candidates. It could also be used to study hard-to-detect changes in phenotype caused by genetic mutations, says Joseph Fetcho, a professor of neurobiology and behavior at Cornell University.
“A really high-throughput way to assess phenotype is very important for measuring small effects on the development of an organism,” says Fetcho, who was not part of the research team. “You can see what the phenotype looks like in a large population and quantify it in a very rigorous way.”
For many patients with difficult-to-treat neuropathic pain, deep brain stimulation (DBS) can lead to long-term improvement in pain scores and other outcomes, according to a study in the February issue of Neurosurgery, official journal of the Congress of Neurological Surgeons. The journal is published by Lippincott Williams & Wilkins, a part of Wolters Kluwer Health.
About two-thirds of eligible patients who undergo DBS achieve significant and lasting benefits in terms of pain, quality of life, and overall health, according to the report by Sandra G.J. Boccard, PhD, and colleagues of University of Oxford, led by Tipu Aziz FMedSci and Alex Green, MD. Some outcomes show continued improvement after the first year, according to the new report, which is one of the largest studies of DBS for neuropathic pain performed to date.
Most Patients Benefit from DBS for Neuropathic Pain
The authors reviewed their 12-year experience with DBS for neuropathic pain. Neuropathic pain is a common and difficult-to-treat type of pain caused by nerve damage, seen in patients with trauma, diabetes, and other conditions. Phantom limb pain after amputation is an example of neuropathic pain.
In DBS, a small electrode is surgically placed in a precise location in the brain. A mild electrical current is delivered to stimulate that area of the brain, with the goal of interrupting abnormal activity. Deep brain stimulation has become a standard and effective treatment for movement disorders such as Parkinson’s disease. Although DBS has also been used to treat various types of chronic pain, its role in patients with neuropathic pain remains unclear.
Between 1999 and 2011, the authors’ program evaluated 197 patients with chronic neuropathic pain for eligibility for DBS. Of these, 85 patients proceeded to DBS treatment. The remaining patients did not receive DBS—most commonly because they were unable to secure funding from the U.K. National Health Service or decided not to undergo electrode placement surgery.
The patients who underwent DBS were 60 men and 25 women, average age 52 years. Stroke was the most common cause of neuropathic pain, followed by head and face pain, spinal disease, amputation, and injury to nerves from the upper spinal cord (brachial plexus).
In 74 patients, a trial of DBS produced sufficient pain relief to proceed with implantation of an electrical pulse generator. Of 59 patients with sufficient follow-up data, 39 had significant improvement in their overall health status up to four years later. Thus, 66 percent of patients “gained benefit and efficacy” by undergoing DBS.
Benefits Vary by Cause; Some Outcomes Improve with Time
The benefits of DBS varied for patients with different causes of neuropathic pain. Treatment was beneficial for 89 percent of patients with amputation and 70 percent of those with stroke, compared with 50 percent of those with brachial plexus injury.
On average, scores on a 10-point pain scale (with 10 indicating the most severe pain) decreased from about 8 to 4 within the first three months, remaining about the same with longer follow-up. Continued follow-up in a small number of patients suggested further improvement in other outcomes, including quality-of-life scores.
Deep brain stimulation has long been regarded as potentially useful for patients with severe neuropathic pain that is not relieved by other treatments. However, because of the difficulties of performing studies of this highly specialized treatment, there has been relatively little research to confirm its benefits; only about 1,500 patients have been treated worldwide. The new study—accounting for about five percent of all reported patients—used up-to-date DBS technologies, imaging, and surgical techniques.
Dr. Boccard and coauthors acknowledge some important limitations of their study—especially the lack of complete patient follow-up. However, they believe their experience is sufficiently encouraging to warrant additional studies, especially with continued advances in stimulation approaches and technology. The researchers conclude, “Clinical trials retaining patients in long-term follow-up are desirable to confirm findings from prospectively assessed case series.”
(Source: eurekalert.org)
Research team discovers: poor learners’ brains do not process sensory information sufficiently
The reason why some people are worse at learning than others has been revealed by a research team from Berlin, Bochum, and Leipzig, operating within the framework of the Germany-wide network “Bernstein Focus State Dependencies of Learning”. They have discovered that the main problem is not that learning processes are inefficient per se, but that the brain insufficiently processes the information to be learned.
The scientists trained the subjects’ sense of touch to be more sensitive. In subjects who responded well to the training, the EEG revealed characteristic changes in brain activity, more specifically in the alpha waves. These alpha waves show, among other things, how effectively the brain exploits the sensory information needed for learning. “An exciting question now is to what extent the alpha activity can be deliberately influenced with biofeedback”, says PD Dr. Hubert Dinse from the Neural Plasticity Lab of the Ruhr-Universität Bochum. “This could have enormous implications for therapy after brain injury or, quite generally, for the understanding of learning processes.”
The research team from the Ruhr-Universität, the Humboldt Universität zu Berlin, Charité – Universitätsmedizin Berlin and the Max Planck Institute (MPI) for Human Cognitive and Brain Sciences reported their findings in the Journal of Neuroscience.
Learning without attention: passive training of the sense of touch
How well we learn depends on genetic aspects, the individual brain anatomy, and, not least, on attention. “In recent years we have established a procedure with which we trigger learning processes in people that do not require attention”, says Hubert Dinse. The researchers were, therefore, able to exclude attention as a factor. They repeatedly stimulated the participants’ sense of touch for 30 minutes by electrically stimulating the skin of the hand. Before and after this passive training, they tested the so-called “two-point discrimination threshold”, a measure of the sensitivity of touch. For this, they applied gentle pressure to the hand with two needles and determined the smallest distance between the needles at which the participant still perceived them as separate stimuli. On average, the passive training improved the discrimination threshold by twelve percent—but not in all of the 26 participants. Using EEG, the team studied why some people learned better than others.
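The threshold computation itself is simple arithmetic: find the smallest needle separation a participant can still resolve, then compare the value before and after training. A toy sketch in Python (all trial values are invented, chosen so the example reproduces the twelve percent average reported above):

```python
# Hypothetical trial records for one participant: (needle separation in mm,
# perceived as two distinct points?), before and after passive training.
before = [(4.0, True), (3.5, True), (3.0, True), (2.5, True), (2.0, False)]
after  = [(3.5, True), (3.0, True), (2.5, True), (2.2, True), (2.0, False)]

def discrimination_threshold(trials):
    """Smallest separation still perceived as two distinct stimuli."""
    return min(d for d, separate in trials if separate)

t0 = discrimination_threshold(before)  # 2.5 mm before training
t1 = discrimination_threshold(after)   # 2.2 mm after training

# A lower threshold means a finer sense of touch, so improvement is the
# relative reduction of the threshold.
improvement = (t0 - t1) / t0 * 100
print(f"{improvement:.0f}%")  # -> 12%
```

Real psychophysical procedures estimate this threshold statistically (e.g., from a staircase of many trials) rather than from a single minimum, but the before/after comparison works the same way.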
Imaging the brain state using EEG: the alpha waves are decisive
The cooperation partners from Berlin and Leipzig, PD Dr. Petra Ritter, Dr. Frank Freyer, and Dr. Robert Becker, recorded the subjects’ spontaneous EEG before and during passive training. They then identified the components of brain activity related to improvement in the discrimination test. Decisive was the alpha activity, i.e., brain activity in the frequency range of 8 to 12 hertz. The higher the alpha activity before the passive training, the better the subjects learned. Likewise, the more the alpha activity decreased during passive training, the more easily they learned. These effects occurred in the somatosensory cortex, the brain region where the sense of touch is processed.
Researchers seek new methods for therapy
“How the alpha rhythm manages to affect learning is something we are investigating with computer models”, says PD Dr. Petra Ritter, head of the working group “Brain Modes” at the MPI Leipzig and the Berlin Charité. “Only when we understand the complex information processing in the brain can we intervene in these processes specifically to treat disorders”, she adds. New therapies are the aim of the cooperation network that Ritter coordinates, of the international “Virtual Brain” project on which her team collaborates, and of the Neural Plasticity Lab headed by Hubert Dinse at the RUB.
Learning is dependent on access to sensory information
A high level of alpha activity counts as a marker of the readiness of the brain to exploit new incoming information. Conversely, a strong decrease of alpha activity during sensory stimulation counts as an indicator that the brain processes stimuli particularly efficiently. The results, therefore, suggest that perception-based learning is highly dependent on how accessible the sensory information is. The alpha activity, as a marker of constantly changing brain states, modulates this accessibility.

Brain imaging research shows how unconscious processing improves decision-making
When faced with a difficult decision, it is often suggested to “sleep on it” or take a break from thinking about the decision in order to gain clarity.
But new brain imaging research from Carnegie Mellon University, published in the journal “Social Cognitive and Affective Neuroscience,” finds that the brain regions responsible for making decisions continue to be active even when the conscious brain is distracted with a different task. The research provides some of the first evidence showing how the brain unconsciously processes decision information in ways that lead to improved decision-making.
"This research begins to chip away at the mystery of our unconscious brains and decision-making," said J. David Creswell, assistant professor of psychology in CMU’s Dietrich College of Humanities and Social Sciences and director of the Health and Human Performance Laboratory. "It shows that brain regions important for decision-making remain active even while our brains may be simultaneously engaged in unrelated tasks, such as thinking about a math problem. What’s most intriguing about this finding is that participants did not have any awareness that their brains were still working on the decision problem while they were engaged in an unrelated task."
Scientists Discover How Animals Taste, and Avoid, High Salt Concentrations
For consumers of the typical Western diet—laden with levels of salt detrimental to long-term health—it may be hard to believe that there is such a thing as an innate aversion to very high concentrations of salt.
But Charles Zuker, PhD, and colleagues at Columbia University Medical Center have discovered how the tongue detects high concentrations of salt (think seawater levels, not potato chips), the first step in a salt-avoiding behavior common to most mammals.
The findings, which were published online in the journal Nature, could serve as a springboard for the development of taste modulators to help control the appetite for a high-salt diet and reduce the ill effects of too much sodium.
The sensation of saltiness is unique among the five basic tastes. Whereas mammals are always attracted to the tastes of sweet and umami, and repelled by sour and bitter, their behavioral response to salt dramatically changes with concentration.
“Salt taste in mammals can trigger two opposing behaviors,” said Dr. Zuker, professor in the Departments of Biochemistry & Molecular Biophysics and of Neuroscience at Columbia University College of Physicians & Surgeons. “Mammals are attracted to low concentrations of salt; they will choose a salty solution over a salt-free one. But they will reject highly concentrated salt solutions, even when salt-deprived.”
Over the past 15 years, the receptors and other cells on the tongue responsible for detecting sweet, sour, bitter, and umami tastes—as well as low concentrations of salt—have been uncovered largely through the efforts of Dr. Zuker and his collaborator Nicholas Ryba from the National Institute of Dental and Craniofacial Research.
“But we didn’t understand what was behind the aversion to high concentrations of salt,” said Yuki Oka, a postdoctoral fellow in Dr. Zuker’s laboratory and the lead author of the study.
The researchers expected high-salt receptors to reside in cells committed only to detecting high salt. “Over the years our studies have shown that each taste quality—sweet, bitter, sour, umami, and low-salt—is mediated by different cells,” Dr. Ryba said. “So we thought there must be different taste receptor cells for high-salt. But unexpectedly, Dr. Oka found high salt is mediated by cells we already knew.”

An interdisciplinary team of researchers from the University of Texas Medical Branch at Galveston and the University of Houston has found a new way to influence the vital serotonin signaling system — possibly leading to more effective medications with fewer side effects.
Scientists have linked malfunctions in serotonin signaling to a wide range of health issues, everything from depression and addictions to epilepsy and obesity and eating disorders. Much of their attention has focused on complex proteins called serotonin receptors, which are located in the cell membrane. Each receptor has a so-called “active site” specially suited to bond with a serotonin molecule; when that bond is formed, the receptor changes shape, transmitting a signal to the cell’s interior.
Traditional drug discovery efforts target interactions that take place at such active sites. But a receptor’s behavior can also be changed by additional proteins that bind to the receptor at locations quite distant (in molecular terms) from the active site, in a process called “allosteric regulation” — the mechanism examined by the UTMB-UH team for one specific and highly significant kind of serotonin receptor, designated the 5-HT2C.
“This is a whole new way of thinking about this system, targeting these interactions,” said UTMB professor Kathryn Cunningham, senior author of a paper on the research now online in the Journal of Neuroscience. “Basically, we’ve created a new series of molecules and validated that we can use them to change the way the receptor functions both in vitro and in vivo, through an allosteric effect.”

Blood May Hold Clues to Risk of Memory Problems After Menopause
New Mayo Clinic research suggests that blood may hold clues to whether post-menopausal women may be at an increased risk for areas of brain damage that can lead to memory problems and possibly increased risk of stroke. The study shows that blood’s tendency to clot may contribute to areas of brain damage called white matter hyperintensities. The findings are published in the Feb. 13 online issue of Neurology, the medical journal of the American Academy of Neurology.
The study involved 95 women with an average age of 53 who recently went through menopause. The women had magnetic resonance imaging, or MRIs, taken of their brains at the start of the study. They then received a placebo, oral hormone therapy or the hormone skin patch. They had MRIs periodically over the next four years.
During the study, women with higher levels of thrombogenic microvesicles—small particles shed from platelets that make blood more likely to clot—were likelier to show greater increases in the amount of white matter hyperintensities (seen as concentrated white areas on an MRI scan), which may lead to memory loss.
"This study suggests that the tendency of the blood to clot may contribute to a cascade of events leading to the development of brain damage in women who have recently gone through menopause," says study author Kejal Kantarci, M.D., of Mayo Clinic. "Preventing the platelets from developing these microvesicles could be a way to stop the progression of white matter hyperintensities in the brain."
All of the women had white matter hyperintensities at the start of the study. The amount increased by an average volume of 63 cubic millimeters at 18 months, 122 cubic millimeters at three years and 155 cubic millimeters at four years.

A team of political scientists and neuroscientists has shown that liberals and conservatives use different parts of the brain when they make risky decisions, and these regions can be used to predict which political party a person prefers. The new study suggests that while genetics or parental influence may play a significant role, being a Republican or Democrat changes how the brain functions.
Dr. Darren Schreiber, a researcher in neuropolitics at the University of Exeter, has been working in collaboration with colleagues at the University of California, San Diego on research that explores the differences in the way the brain functions in American liberals and conservatives. The findings are published in the journal PLOS ONE on 13 February.
In a prior experiment, participants had their brain activity measured as they played a simple gambling game. Dr. Schreiber and his UC San Diego collaborators were able to look up the political party registration of the participants in public records. Using this new analysis of 82 people who performed the gambling task, the academics showed that Republicans and Democrats do not differ in the risks they take. However, there were striking differences in the participants’ brain activity during the risk-taking task.
Democrats showed significantly greater activity in the left insula, a region associated with social and self-awareness. Meanwhile Republicans showed significantly greater activity in the right amygdala, a region involved in the body’s fight-or-flight system. These results suggest that liberals and conservatives engage different cognitive processes when they think about risk.
In fact, brain activity in these two regions alone can be used to predict whether a person is a Democrat or Republican with 82.9% accuracy. By comparison, the longstanding traditional model in political science, which uses the party affiliation of a person’s mother and father to predict the child’s affiliation, is only accurate about 69.5% of the time. And another model based on the differences in brain structure distinguishes liberals from conservatives with only 71.6% accuracy.
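Prediction accuracies like the 82.9% figure above typically come from fitting a classifier to the measured brain activity and scoring its predictions. As a purely illustrative sketch (the data, group separations, and model are invented here; the paper’s actual classifier and preprocessing differ), a two-feature logistic regression in plain Python:

```python
import math
import random

random.seed(0)

# Hypothetical data standing in for the study's measurements: each subject is
# (left-insula activity, right-amygdala activity), label 1 = Democrat,
# 0 = Republican. All distribution parameters are invented for illustration.
def make_subject(is_democrat):
    if is_democrat:
        return (random.gauss(1.0, 0.8), random.gauss(0.0, 0.8)), 1
    return (random.gauss(0.0, 0.8), random.gauss(1.0, 0.8)), 0

data = [make_subject(i % 2 == 0) for i in range(82)]  # 82 subjects, as in the study

# Plain logistic regression fitted by stochastic gradient descent.
w = [0.0, 0.0, 0.0]  # bias, insula weight, amygdala weight

def predict(x):
    z = w[0] + w[1] * x[0] + w[2] * x[1]
    return 1.0 / (1.0 + math.exp(-z))  # probability the subject is a Democrat

for _ in range(2000):
    for x, y in data:
        err = y - predict(x)  # gradient of the log-likelihood per sample
        w[0] += 0.05 * err
        w[1] += 0.05 * err * x[0]
        w[2] += 0.05 * err * x[1]

accuracy = sum((predict(x) > 0.5) == (y == 1) for x, y in data) / len(data)
print(round(accuracy, 3))  # high on this synthetic, well-separated data
```

A study reporting a figure like 82.9% would evaluate the classifier on held-out subjects (e.g., by cross-validation) rather than on its own training data, as this toy version does.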
The model also outperforms models based on differences in genes. Dr. Schreiber said: “Although genetics have been shown to contribute to differences in political ideology and strength of party politics, the portion of variation in political affiliation explained by activity in the amygdala and insula is significantly larger, suggesting that affiliating with a political party and engaging in a partisan environment may alter the brain, above and beyond the effect of heredity.”
These results may pave the way for new research on voter behaviour, yielding better understanding of the differences in how liberals and conservatives think. According to Dr. Schreiber: “The ability to accurately predict party politics using only brain activity while gambling suggests that investigating basic neural differences between voters may provide us with more powerful insights than the traditional tools of political science.”
Gene thought to be linked to Alzheimer’s is marker for only mild impairment
Defying the widely held belief that a specific gene is the biggest risk factor for Alzheimer’s disease, two Cornell developmental psychologists and their colleagues report that people with that gene are more likely to develop mild cognitive impairment — but not Alzheimer’s.
The study suggests that older adults with healthy brain function can get genetic tests to predict increased risk of future mild cognitive impairment. However, once they are impaired cognitively, the tests won’t predict their likelihood of developing Alzheimer’s.
"Right now, genetic tests are used in exactly the opposite way. That is, healthy people don’t get the tests to predict their risk of mild cognitive impairment, but impaired people get them to predict their risk of Alzheimer’s disease," said Charles Brainerd, professor of human development and the study’s lead co-author with Valerie Reyna, professor of human development. "So, impaired people think that tests will tell them if they are at increased risk of Alzheimer’s, which they won’t. And healthy people think that tests won’t tell them whether they are at increased risk of cognitive impairment, which they will."
The researchers describe their findings in the January issue of Neuropsychology (27:1).
The work builds on previous research by Brainerd and associates that suggested the ε4 allele of the APOE genotype increases the risk of mild cognitive impairment as well as Alzheimer’s.
The researchers analyzed data from the only nationally representative dataset of its kind, the National Institute on Aging’s Aging, Demographics and Memory Study. They looked at data from 418 people over age 70 to see if those who carried the allele were more likely to develop mild cognitive impairment compared with those who did not have the allele. They also looked at whether ε4 carriers with mild cognitive impairment were more likely to develop Alzheimer’s disease compared with non-carriers with mild cognitive impairment.
They found that healthy ε4 carriers were 58 percent more likely to develop mild cognitive impairment than non-carriers. However, ε4 carriers with mild cognitive impairment developed Alzheimer’s at the same rate as non-carriers.