Neuroscience

June 2012

Brain Structure Helps Guide Behavior by Anticipating Changing Demands

ScienceDaily (June 24, 2012) — Every day the human brain is presented with tasks ranging from the trivial to the complex. How much mental effort and attention are devoted to each task is usually determined in a split second and without conscious awareness. Now a study from Massachusetts General Hospital (MGH) researchers finds that a structure deep within the brain, believed to play an important role in regulating conscious control of goal-directed behavior, helps to optimize behavioral responses by predicting how difficult upcoming tasks will be. The report is receiving advance online publication in Nature.

"The dorsal anterior cingulate cortex (dACC), which lies deep beneath the outer layer of the frontal lobes, is part of an ancient and enigmatic part of the brain," says Emad Eskandar, MD, of the MGH Department of Neurosurgery, senior author of the Nature paper. “Some have speculated that it plays a role in detecting errors or monitoring for conflicting demands, but exactly how it contributes to regulating behavioral responses is unclear, so we used a variety of scientific techniques to get a better picture of its function.”

The study enrolled six participants who were scheduled to undergo cingulotomy — a procedure in which a small, precisely placed lesion is created within the ACC — to treat severe obsessive-compulsive disorder (OCD) that had not responded to other types of treatment. A standard part of the cingulotomy procedure involves microelectrode recordings of the activity of single neurons in the area where the lesion is to be placed. To evaluate dACC function, the investigators recorded activity from several neurons within the structure while participants performed a behavioral task testing their reactions to visual images.

The task presented participants with a random series of images, each showing three numerals drawn from 0, 1, 2, and 3. In each image, two of the numerals were identical, and the remaining, unique numeral was always a 1, 2 or 3. Participants responded by pressing one of three buttons whose position indicated the identity of the unique numeral: the left button for 1, the middle for 2 and the right for 3. Each image was ranked in difficulty according to how strongly the position of the target numeral or the identity of the duplicated numerals could distract participants from the correct response. For example, when presented with 3-3-2, the correct response is the middle button, for the number 2; that image would be ranked more difficult than 3-2-3, in which the target numeral and the correct button occupy the same position.
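The stimulus-response logic of the task can be sketched in a few lines. This is an illustrative reconstruction, not the study's code, and the single congruent/incongruent score is a hypothetical simplification of the study's difficulty ranking:

```python
# Illustrative sketch of the odd-numeral task described above.
# The button mapping follows the article; the conflict score is a
# hypothetical simplification of the study's difficulty ranking.

def correct_button(triplet):
    """Return the button index (0=left, 1=middle, 2=right) for the
    numeral that appears exactly once; buttons encode identities 1-3."""
    for value in set(triplet):
        if triplet.count(value) == 1:
            return value - 1  # button position encodes the numeral's identity
    raise ValueError("triplet must contain exactly one unique numeral")

def conflict_score(triplet):
    """0 if the unique numeral sits at the position of its own button
    (congruent, e.g. 3-2-3); 1 if a duplicated numeral occupies that
    position instead, pulling the response toward the wrong button
    (incongruent, e.g. 3-3-2)."""
    button = correct_button(triplet)
    return 0 if triplet[button] == button + 1 else 1
```

Under this toy scoring, 3-3-2 scores 1 (harder) while 3-2-3 scores 0, matching the article's example.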

Functional magnetic resonance imaging (fMRI) of four participants performing the behavioral task prior to the cingulotomy procedure revealed that the task increased metabolic activity within the dACC, a result seen in previous fMRI studies. The fMRI images also revealed that responding to more difficult images produced greater activity levels within the dACC and in other structures known to be involved in decision making. Intraoperative microelectrode recordings of all participants demonstrated that this apparent increase in metabolic activity corresponded with an increase in neuronal activity, linking for the first time the increased activation revealed by fMRI with increased neuronal firing.

Analysis of individual neuron activity indicated that dACC neuronal activity remained elevated immediately after difficult trials. Moreover, participants’ reaction times showed that the difficulty of one trial affected the next: if consecutive trials were of the same difficulty level, reaction time was shorter; if they differed in difficulty, even when the second trial was easier, reaction time was longer. By anticipating the difficulty of upcoming tasks, the authors note, the dACC appears to speed responses when difficulty is constant and to slow them when demands change, in order to promote accuracy.

While behavioral tests conducted after the cingulotomy procedure — which destroys tissue within the dACC — did not indicate a change in participants’ ability to perform the test accurately, the impact of preceding trials on reaction time appeared to vanish. “Participants could still perform the task, but the dACC’s role of priming the system based on immediate prior experience was gone,” Eskandar explains. “We believe this result indicates an important role for the dACC in rapidly adjusting to different cognitive demands, possibly by recruiting other areas of the brain to solve particular problems.”

An associate professor of Surgery at Harvard Medical School, Eskandar adds that, while significant cognitive changes have not been reported in patients undergoing cingulotomy, the apparent role of the dACC in adapting to changing situations implies a possible role for the structure in several psychiatric disorders. “A lack of behavioral flexibility and adjustment is characteristic of OCD, for example. Whether or not our findings directly relate to these disorders remains to be determined, but we hope that continued study using complex tasks, such as the behavioral test used here, will be helpful in diagnosing or monitoring psychiatric disorders.”

Source: Science Daily

Jun 25, 2012 · 21 notes
#science #neuroscience #brain #psychology
Gene Mutations Cause Massive Brain Asymmetry

ScienceDaily (June 24, 2012) — Hemimegalencephaly is a rare but dramatic condition in which the brain grows asymmetrically, with one hemisphere becoming massively enlarged. Though frequently diagnosed in children with severe epilepsy, the cause of hemimegalencephaly is unknown and current treatment is radical: surgical removal of some or all of the diseased half of the brain.

This image depicts hemimegalencephaly. (Credit: UC San Diego School of Medicine)

In a paper published in the June 24, 2012 online issue of Nature Genetics, a team of doctors and scientists, led by researchers at the University of California, San Diego School of Medicine and the Howard Hughes Medical Institute, say de novo somatic mutations in a trio of genes that help regulate cell size and proliferation are likely culprits for causing hemimegalencephaly, though perhaps not the only ones.

De novo somatic mutations are genetic changes in non-sex cells that are neither possessed nor transmitted by either parent. The scientists’ findings — a collaboration between Joseph G. Gleeson, MD, professor of neurosciences and pediatrics at UC San Diego School of Medicine and Rady Children’s Hospital-San Diego; Gary W. Mathern, MD, a neurosurgeon at UC Los Angeles’ Mattel Children’s Hospital; and colleagues — suggest it may be possible to design drugs that inhibit or turn down signals from these mutated genes, reducing or even preventing the need for surgery.

Gleeson’s lab studied a group of 20 patients with hemimegalencephaly upon whom Mathern had operated, analyzing and comparing DNA sequences from removed brain tissue with DNA from the patients’ blood and saliva.

"Mathern had reported a family with identical twins, in which one had hemimegalencephaly and one did not. Since such twins share all inherited DNA, we got to thinking that there may be a new mutation that arose in the diseased brain that causes the condition," said Gleeson. Realizing they shared the same ideas about potential causes, the physicians set out to tackle this question using new exome sequencing technology, which allows sequencing of all of the protein-coding exons of the genome at the same time.

The researchers ultimately identified three gene mutations found only in the diseased brain samples. All three mutated genes had previously been linked to cancers.

"We found mutations in a high percentage of the cells in genes regulating the cellular growth pathways in hemimegalencephaly," said Gleeson. "These same mutations have been found in various solid malignancies, including breast and pancreatic cancer. For reasons we do not yet understand, our patients do not develop cancer, but rather this unusual brain condition. Either there are other mutations required for cancer propagation that are missing in these patients, or neurons are not capable of forming these types of cancers."

The mutations were found in 30 percent of the patients studied, indicating other factors are involved. Nonetheless, the researchers have begun investigating potential treatments that address the known gene mutations, with the clear goal of finding a way to avoid the need for surgery.

"Although counterintuitive, hemimegalencephaly patients are far better off following the functional removal or disconnection of the enlarged hemisphere," said Mathern. "Prior to the surgery, most patients have devastating epilepsy, with hundreds of seizures per day, completely resistant to even our most powerful anti-seizure medications. The surgery disconnects the affected hemisphere from the rest of the brain, causing the seizures to stop. If performed at a young age and with appropriate rehabilitation, most children suffer little language or cognitive delay, thanks to the neural plasticity of the remaining hemisphere."

But a less-invasive drug therapy would still be more appealing.

"We know that certain already-approved medications can turn down the signaling pathway used by the mutated genes in hemimegalencephaly," said lead author and former UC San Diego post-doctoral researcher Jeong Ho Lee, now at the Korea Advanced Institute of Science and Technology. "We would like to know if future patients might benefit from such a treatment. Wouldn’t it be wonderful if our results could prevent the need for such radical procedures in these children?"

Source: Science Daily

Jun 25, 2012 · 39 notes
#science #neuroscience #brain #psychology #genes
Neurons That Control Overeating Also Drive Appetite for Cocaine

ScienceDaily (June 24, 2012) — Researchers at Yale School of Medicine have zeroed in on a set of neurons in the part of the brain that controls hunger, and found that these neurons are not only associated with overeating, but also linked to non-food associated behaviors, like novelty-seeking and drug addiction.

A lean animal and a control were both exposed to a novelty item (center). The lean animal spent more time exploring the novelty, as shown by the higher concentration of yellow in the slide. (Credit: Image courtesy of Yale University)

Published in the June 24 online issue of Nature Neuroscience, the study was led by Marcelo O. Dietrich, postdoctoral associate, and Tamas L. Horvath, the Jean and David W. Wallace Professor of Biomedical Research and chair of comparative medicine at Yale School of Medicine.

In attempts to develop treatments for metabolic disorders such as obesity and diabetes, researchers have paid increasing attention to the brain’s reward circuits located in the midbrain, with the notion that in these patients, food may become a type of “drug of abuse” similar to cocaine. Dietrich notes, however, that this study flips the common wisdom on its head.

"Using genetic approaches, we found that increased appetite for food can actually be associated with decreased interest in novelty as well as in cocaine, and on the other hand, less interest in food can predict increased interest in cocaine," said Dietrich.

Horvath and his team studied two sets of transgenic mice. In one set, they knocked out a signaling molecule that controls hunger-promoting neurons in the hypothalamus. In the other set, they selectively eliminated those same neurons during development using diphtheria toxin. The mice were then given various non-invasive tests measuring how they responded to novelty and anxiety, and how they reacted to cocaine.

"We found that animals that have less interest in food are more interested in novelty-seeking behaviors and drugs like cocaine," said Horvath. "This suggests that there may be individuals with increased drive of the reward circuitry, but who are still lean. This is a complex trait that arises from the activity of the basic feeding circuits during development, which then impacts the adult response to drugs and novelty in the environment."

Horvath and his team argue that the hypothalamus, which controls vital functions such as body temperature, hunger, thirst, fatigue and sleep, is key to the development of higher brain functions. “These hunger-promoting neurons are critically important during development to establish the set point of higher brain functions, and their impaired function may be the underlying cause for altered motivated and cognitive behaviors,” he said.

"There is this contemporary view that obesity is associated with the increased drive of the reward circuitry," Horvath added. "But here, we provide a contrasting view: that the reward aspect can be very high, but subjects can still be very lean. At the same time, it indicates that a set of people who have no interest in food might be more prone to drug addiction."

Source: Science Daily

Jun 25, 2012 · 41 notes
#science #neuroscience #neuron #psychology #brain #addiction
Learn That Tune While Fast Asleep: Stimulation During Sleep Can Enhance Skill Learning

ScienceDaily (June 24, 2012) — Want to nail that tune that you’ve practiced and practiced? Maybe you should take a nap with the same melody playing during your sleep, new provocative Northwestern University research suggests.

Want to nail that tune that you’ve practiced and practiced? Maybe you should take a nap with the same melody playing during your sleep. (Credit: © Anton Maltsev / Fotolia)

The research builds on existing evidence suggesting that memories can be reactivated during sleep and that their storage can be strengthened in the process.

In the Northwestern study, research participants learned how to play two artificially generated musical tunes with well-timed key presses. Then while the participants took a 90-minute nap, the researchers presented one of the tunes that had been practiced, but not the other.

"Our results extend prior research by showing that external stimulation during sleep can influence a complex skill," said Ken A. Paller, professor of psychology in the Weinberg College of Arts and Sciences at Northwestern and senior author of the study.

By using EEG methods to record the brain’s electrical activity, the researchers ensured that the soft musical “cues” were presented during slow-wave sleep, a stage of sleep previously linked to cementing memories. Participants made fewer errors when pressing the keys to produce the melody that had been presented while they slept, compared to the melody not presented.

"We also found that electrophysiological signals during sleep correlated with the extent to which memory improved," said lead author James Antony of the Interdepartmental Neuroscience Program at Northwestern. "These signals may thus be measuring the brain events that produce memory improvement during sleep."

The age-old myth that you can learn a foreign language while you sleep is sure to come to mind, said Paul J. Reber, associate professor of psychology at Northwestern and a co-author of the study.

"The critical difference is that our research shows that memory is strengthened for something you’ve already learned," Reber said. "Rather than learning something new in your sleep, we’re talking about enhancing an existing memory by re-activating information recently acquired."

The researchers, he said, are now thinking about how their findings could apply to many other types of learning.

"If you were learning how to speak in a foreign language during the day, for example, and then tried to reactivate those memories during sleep, perhaps you might enhance your learning."

Paller said he hopes the study will help researchers learn more about the basic brain mechanisms that operate during sleep to preserve memory storage.

"These same mechanisms may not only allow an abundance of memories to be maintained throughout a lifetime, but they may also allow memory storage to be enriched through the generation of novel connections among memories," he said.

The study opens the door for future studies of sleep-based memory processing for many different types of motor skills, habits and behavioral dispositions, Paller said.

Source: Science Daily

Jun 25, 2012 · 45 notes
#science #neuroscience #learning #brain #psychology
Predicting Treatment Response in Central Nervous System Diseases: Simple Way of Avoiding Dangerous Side Effects?

ScienceDaily (June 23, 2012) — The commonly-used epilepsy drug, valproic acid (VPA), can have a highly beneficial effect on some babies born with spinal muscular atrophy (SMA), the number one genetic killer during early infancy. But in about two-thirds of such cases it is either damaging or simply has no effect. Now, for the first time, researchers have found a way to identify which patients are likely to respond well to VPA prior to starting treatment. Their results have major implications, not just for SMA patients, but for other conditions treated with the drug such as migraine and epilepsy, and may even provide the conditions for turning VPA non-responders into responders, the researchers say.

Dr. Lutz Garbes, from the Institute of Human Genetics, University of Cologne, Germany, will tell the annual conference of the European Society of Human Genetics on June 24 that he and his colleagues had analysed blood RNA samples from a small group of SMA patients who had been treated with VPA. They found, as expected, that only about one third of patients responded well. In an attempt to discover whether blood sampling was the most appropriate test method to use, they also looked at VPA response in another tissue — fibroblasts (a type of skin cell). They found that the response in blood and in skin was the same in 60% of cases.

The researchers then generated pluripotent stem cells from fibroblasts of both a VPA responder and a non-responder, and differentiated them into GABAergic neurons (neurons that produce the amino acid GABA, the chief inhibitory neurotransmitter in the mammalian nervous system). These neurons, when treated with VPA, exhibited a similar response to that previously found in blood and fibroblasts.

"This indicates for the first time that response to VPA is the same in blood and skin, and suggests that monitoring blood for VPA therapy is indeed feasible in central nervous system diseases," says Dr. Garbes. "But, even more importantly, by using the SMA patients’ fibroblasts we were able to identify a decisive factor in the suppression of the positive response to VPA treatment. Utilising transcriptome-wide microarray profiling*, we found that high levels of the fatty acid transporter protein CD36 are associated with the lack of positive response to treatment.

"The implications of this discovery are far-reaching. First, we have been able to prove that monitoring blood is a reliable method for doctors to determine response to VPA treatment in many central nervous system diseases, since our findings are not specific to SMA. Second, the identification of CD36 as the crucial factor in suppressing response to treatment provides a simple way of appraising whether a patient will respond to therapy before treatment starts. And third, in the long run we may find a way to target CD36 in order to be able to change a non-VPA responder into a responder."

Knowing that CD36 is a crucial factor here means that the current, potentially dangerous, ‘trial and error’ approach to VPA treatment is now obsolete, the researchers say. Screening of patients for CD36 prior to treatment would mean that only those who would respond positively to VPA would be given it. This is important because, in some cases, VPA can cause life-threatening side-effects such as impairment of liver, blood cell and pancreatic function, especially in those just starting the treatment. “But we still do not understand how CD36 suppresses response to VPA, only that it does so,” says Dr. Garbes. “A greater understanding of its effects could also lead to the detection of even better targets to overcome the problem.”

In the case of SMA, VPA works by inhibiting enzymes called histone deacetylases (HDACs), which are involved in regulating the packaging of DNA. HDACs produce a denser DNA packaging, reducing protein production from the affected genes. Other enzymes, called histone acetyltransferases (HATs), lead to a more relaxed DNA structure and thus more protein. By inhibiting HDACs, VPA shifts the packaging balance towards the more relaxed structure, so that genes are activated and proteins produced. In SMA, the crucial gene is SMN2, a copy of the disease-determining gene SMN1. In healthy individuals, SMN1 is the major source of SMN protein; SMN2 cannot fully compensate for the loss of SMN1 in SMA patients. Increasing SMN2 activity produces more SMN protein and ameliorates the condition.

"Avoiding needless VPA treatment of non-responders would have a major effect on healthcare costs and improve quality of life for patients," Dr. Garbes will say. "Half of the babies born with SMA will die within two years, but the other half can live to twenty or even longer, so this is an important finding for them. Our findings may also help identify patients who are candidates for VPA treatment in many other diseases of the central nervous system, some of them very common.

"In the EU, approximately 550 SMA babies are born each year, and there are about 311,000 new cases of epilepsy per year. It is estimated that, in Europe, migraine affects up to 28% of people at some time in their lives. We are happy that we may have been able to contribute to the development of personalised medicine for so many people," he will conclude.

*A transcriptome-wide microarray profile provides a way of identifying all the genes that are differentially expressed in distinct cell populations or subtypes, allowing the effects of treatment to be monitored.

Source: Science Daily

Jun 24, 2012 · 4 notes
#science #neuroscience #brain #psychology
New Approach to Diagnosing and Treating Dementia

ScienceDaily (June 22, 2012) — Some dementia patients show symptoms of a malfunctioning immune system and can receive appropriate treatment.

Scientists at Charité — Universitätsmedizin Berlin have identified a new therapeutic approach to dementia. Their study, published in the journal Neurology, shows that immune reactions against the body’s own nerve cells can be a cause of advanced dementia, and that an appropriate immunosuppressive therapy can be significantly effective.

Dementia burdens society with high costs, and those affected by it and their family members carry a tremendous psychosocial burden. Dementia is increasingly perceived as a sword of Damocles over an aging society due to its often unclear origin, difficult prevention and unsatisfactory therapies.

Together with his workgroup and cooperation partners in Germany and the US, Dr. Harald Prüß, a physician at the Klinik für Neurologie of the Charité, was able to show that dementia can also be caused by the immune system. As a symptom of an autoimmune disease, such dementia can thus be treated, an approach that diagnostic criteria have so far overlooked. A number of patients in this study who suffered from advanced memory loss had developed an immune response with antibodies against an ion channel in the brain, the so-called NMDA-type glutamate receptor. As a result, particular proteins in the nerve cell membrane are reduced, leading to characteristic disruption of nerve function and loss of synapses. Those affected exhibit memory problems and abnormalities in mood and emotion. Eliminating these antibodies through hemodialysis improved both the symptoms and the cerebral metabolism in the hippocampus region, a part of the brain that is central to memory performance and particularly affected by dementia.

"Through the study results, a completely new approach to diagnosing dementia can possibly result. At the moment we are working on a follow-up study with larger test groups in order to verify our approach even further," explains Harald Prüß. He adds: "The potential promise of this new approach is that completely new perspectives could result for an entire group of people suffering from dementia for whom no specific therapeutic option exists."

Source: Science Daily

Jun 24, 2012 · 16 notes
#science #neuroscience #brain #psychology #dementia
Information Flow in the Brain Is Not a 'One-Way Street'

ScienceDaily (June 22, 2012) — A longstanding question in brain research is how information is processed in the brain. Neuroscientists at the Charité — Universitätsmedizin Berlin, Cluster of Excellence NeuroCure and University of Newcastle have made a contribution towards answering this question. In a new study, they have shown that signals are generated not only in the cell body of nerve cells, but also in their output extension, the axon. A specific filter cell regulates signal propagation.

These findings have now been published in the journal Science.

Until now it has been assumed that information flow in nerve cells proceeds along a “one-way street”: electrical impulses are initiated at the cell body and propagate along the axon to the next neuron, where they are received by extensions, the dendrites, acting as antennae. However, the team led by Charité researchers Tengis Gloveli and Tamar Dugladze has demonstrated that this model needs to be revised. They discovered that signals can also be initiated in axons, i.e. outside the cell body. This happens during highly synchronous neuronal activity, for example in a state of heightened attention. Moreover, these axonally generated signals flow bidirectionally and represent a new principle of information processing: on the one hand, impulses propagate from their origin towards other nerve cells; on the other, the signals also backpropagate towards the cell body, i.e. in the “wrong direction” down the one-way street. A potential problem is that backpropagating signals could lead to excessive cell activation.

However, the researchers found that backpropagating signals do not reach the cell body under normal conditions. The reason for this, the scientists discovered, is a natural filter that prevents these signals from passing. “Axo-axonic cells, an inhibitory cell type, regulate signal propagation and thus occupy an outstanding strategic position,” explains Tamar Dugladze. Through the filter function, these cells allow signals initiated at the cell body to pass, but suppress backpropagating impulses generated in the axon. By this means, excessive activation of the cell body is prevented. In experiments, the scientists could show that when this filter function is deactivated, backpropagating signals are allowed to pass, resulting in higher cell activation.

These filter cells can become damaged in various neurological diseases. The consequent misregulation of signal flow, in turn, has fatal effects on information processing in the brain. “Results of this study shed new light on the central question of how signals are processed in the brain. In addition, these findings could help us better understand the development and progress of neuronal diseases such as epilepsy, which involves excessive hypersynchronous activity of large sets of neurons. This knowledge could open up new therapeutic approaches,” says Tengis Gloveli. The neuroscientists will therefore focus their future research on both basic understanding of the mechanisms of signal flow in the nervous system, and the relevance of these mechanisms in the genesis of epilepsy.

Source: Science Daily

Jun 24, 2012 · 27 notes
#science #neuroscience #brain #psychology
Most Commonly Mutated Gene in Cancer May Have a Role in Stroke

ScienceDaily (June 22, 2012) — The gene p53 is the most commonly mutated gene in cancer. p53 is dubbed the “guardian of the genome” because it blocks cells with damaged DNA from propagating and eventually becoming cancerous. However, new research led by Ute M. Moll, M.D., Professor of Pathology at Stony Brook University School of Medicine, and colleagues, uncovers a novel role for p53 beyond cancer in the development of ischemic stroke. The research team identified an unexpected critical function of p53 in activating necrosis, an irreversible form of tissue death, triggered during oxidative stress and ischemia.

Dr. Ute Moll, Professor of Pathology, has uncovered a novel role for p53 in the development of ischemic stroke. (Credit: Image courtesy of Stony Brook Medicine)

The findings are detailed online in Cell.

Ischemia-associated oxidative damage leads to irreversible necrosis, a major cause of catastrophic tissue loss, so elucidating its signaling mechanism is of paramount importance. p53 is a central cellular stress sensor that responds to multiple insults, including oxidative stress, and is known to orchestrate apoptotic and autophagic types of cell death. However, it was previously unknown whether p53 can also activate oxidative stress-induced necrosis, a regulated form of cell death that depends on the mitochondrial permeability transition pore (PTP).

"We identified an unexpected and critical function of p53 in activating necrosis: in response to oxidative stress in normal healthy cells, p53 accumulates in the mitochondrial matrix and triggers the opening of the PTP at the inner mitochondrial membrane, leading to collapse of the electrochemical gradient and cell necrosis," explains Dr. Moll. "p53 acts via physical interaction with the critical PTP regulator Cyclophilin D (CypD). This p53 action occurs in cultured cells and in ischemic stroke in mice."

Of note, they found in their model that when the destructive p53-CypD complex is blocked from forming by using Cyclosporine-A type inhibitors, the brain tissue is strongly protected from necrosis and stroke is prevented.

"The findings fundamentally expand our understanding of p53-mediated cell death networks," says Dr. Moll. "The data also suggest that acute temporary blockade of the destructive p53-CypD complex with clinically well-tolerated Cyclosporine A-type inhibitors may lead to a therapeutic strategy to limit the extent of an ischemic stroke in patients."

"p53 is one of the most important genes in cancer and by far the most studied," says Yusuf A. Hannun, M.D., Director of the Stony Brook University Cancer Center, Vice Dean for Cancer Medicine, and the Joel Kenny Professor of Medicine at Stony Brook. "Therefore, this discovery by Dr. Moll and her colleagues in defining the mechanism of a new p53 function and its importance in necrotic injury and stroke is truly spectacular."

Dr. Moll has studied p53 for 20 years in her Stony Brook laboratory. Her research has led to numerous discoveries about the function of p53 and two related genes. For example, previous to this latest finding regarding p53 and stroke, Dr. Moll identified that p73, a cousin to p53, steps in as a tumor suppressor gene when p53 is lost and can stabilize the genome. She found that p73 plays a major developmental role in maintaining the neural stem cell pool during brain formation and adult learning. Her work also helped to identify that another p53 cousin, called p63, has a critical surveillance function in the male germ line and likely contributed to the evolution of humans and great apes, enabling their long reproductive periods.

Source: Science Daily

Jun 24, 2012 · 9 notes
#science #neuroscience #brain #psychology #stroke #cancer
South African Daffodils May Be a Future Treatment for Depression

ScienceDaily (June 22, 2012) — Scientists have discovered that plant compounds from a South African flower may in time be used to treat diseases originating in the brain — including depression. At the University of Copenhagen, a number of these substances have now been tested in a laboratory model of the blood-brain barrier.


Crinum from South Africa. (Credit: Gary I. Stafford)

Scientists at the University of Copenhagen have previously documented that substances from the South African plant species Crinum and Cyrtanthus — akin to snowdrops and daffodils — have an effect on the mechanisms in the brain that are involved in depression. This research has now yielded further results, since a team based at the Faculty of Health and Medical Sciences has recently shown how several South African daffodils contain plant compounds whose characteristics enable them to negotiate the defensive blood-brain barrier that is a key challenge in all new drug development.

"Several of our plant compounds can probably be smuggled past the brain’s effective barrier proteins. We examined various compounds for their influence on the transporter proteins in the brain. This study was made in a genetically-modified cell model of the blood-brain barrier that contains high levels of the transporter P-glycoprotein. Our results are promising, and several of the chemical compounds studied should therefore be tested further, as candidates for long-term drug development," says Associate Professor Birger Brodin.

"The biggest challenge in medical treatment of diseases of the brain is that the drug cannot pass through the blood-brain barrier. The blood vessels of the brain are impenetrable for most compounds, one reason being the very active transporter proteins. You could say that the proteins pump the drugs out of the cells just as quickly as they are pumped in. So it is of great interest to find compounds that manage to ‘trick’ this line of defence."

The results of the study have been published in the Journal of Pharmacy and Pharmacology.

It will nonetheless be a long time before any possible new drug reaches our pharmacy shelves: “This is the first stage of a lengthy process, so it will take some time before we can determine which of the plant compounds can be used in further drug development,” says Birger Brodin.

Yet this does not curb his enthusiasm for the opportunities arising from interdisciplinary cooperation with organic scientists from the Department of Drug Design and Pharmacology and the Natural History Museum of Denmark.

"In my research group, we have had a long-term focus on the body’s barrier tissue — and in recent years particularly the transport of drug compounds across the blood-brain barrier. More than 90 per cent of all potential drugs fail the test by not making it through the barrier, or being pumped out as soon as they do get in. Studies of natural therapies are a valuable source of inspiration, giving us knowledge that can also be used in other contexts," Birger Brodin emphasises.

Source: Science Daily

Jun 24, 2012 · 43 notes
#science #neuroscience #brain #psychology #depression
Finding sounds in an audible haystack

June 22, 2012 By Virat Markandeya

Listening to a single voice in a crowded cocktail party sometimes seems like picking a needle out of a haystack, but new research shows that people may be better at this than expected.


New research shows that people can comprehend one sound among many.

The results surprised the University of Washington, Seattle, research team, which tested how well people could pick out one sound from a dense collection of noises.

The researchers asked ten subjects to listen to multiple streams of letters. A stream consisted of a repeating letter, for example, Q-Q-Q-Q. If four streams were played, the listener heard four different repeating letters, say, D, C, Q and J. The letters came fast —the time interval between each letter was just one-twelfth of a second.

In front of the listener was a computer screen. Before the start of each trial, the researchers put one of the four letters on the screen to prime the subject to focus on it. If he heard an oddball letter in that stream, such as R instead of Q, he was to press a button.

To make it easier on the listener, each letter stream carried a different pitch and came from a different location in the room. R was chosen as the oddball because it doesn’t rhyme with any other letter.  

"Unlike most experiments where you try to make it difficult for the listener to do the task, we tried to give every advantage we could," said Adrian K.C. Lee, a speech and hearing researcher at the university, who worked closely with Ross Maddox.

As expected, when the number of streams went up, the ability to discern the letter came down. But even with 12 streams the letter was identified correctly around 70 percent of the time.

"We expected that 12 streams would have broken the upper limits of the [subject’s hearing] system," said Lee. "It is surprising that even with twelve things coming at you at the same time you can lock on to one with reasonably high accuracy."

The work was presented last month at the Acoustics 2012 Hong Kong conference.

Down the line, the researchers want to use these experiments to design a way for paralyzed patients to control a wheelchair or a computer using brain signals. Such devices, called brain-computer interfaces, have mostly relied on visual or motor stimuli. Typically, a subject might focus on a visual cue or imagine making a movement. Using a machine that detects brain signals, such as an electroencephalogram, researchers would attempt to characterize the brain responses connected with that task and translate them into commands. Focusing on an auditory signal likewise produces brain signals that can be characterized. However, the current study did not look at brain signals.

A very practical reason to look at auditory interfaces is that eye-gaze control — on which visually-controlled interfaces are based — is often absent in people in a late stage of a neurodegenerative disease, said Martijn Schreuder, a researcher at the Berlin Institute of Technology.

Schreuder, who has worked on an interface where subjects spelled words by focusing on particular sounds, pointed out that auditory interfaces allow someone who is completely blind to communicate.

Schreuder said Lee’s work provides hints on “whether or not it’s good or bad to have different [audio] streams or whether it is good to have a quicker repetition or not.” To his knowledge, this is the first time researchers have gone up to 12 streams. Previous research included only two streams.

The other part Schreuder found interesting was how quickly the listeners learned how to discriminate between letter streams.

"There is a difference between being able to spell one letter every two minutes or spelling three letters per minute, which is the range [brain-computer interfaces] go," Schreuder said. "So if one selection takes 20 seconds, it’s worse than if it goes 10 seconds."

The University of Washington researchers are planning follow-up experiments to directly investigate how the brain responds to audio streams.

Provided by Inside Science News Service

Source: medicalxpress.com

Jun 23, 2012 · 12 notes
#science #neuroscience #brain #psychology #hearing
Remembering to Forget

June 22nd, 2012

New research suggests that it is possible to suppress emotional autobiographical memories. The study published this month by psychologists at the University of St Andrews reveals that individuals can be trained to forget particular details associated with emotional memories.

The important findings may offer exciting new potential for therapeutic interventions for individuals suffering from emotional disorders, such as depression and post-traumatic stress disorder.

The research showed that although individuals could still accurately recall the cause of the event, they could be trained to forget the consequences and personal meaning associated with the memory.

The work was carried out by researchers Dr Saima Noreen and Professor Malcolm MacLeod of the University’s School of Psychology. Lead author Dr Noreen explained, “The ability to remember and interpret emotional events from our personal past forms the basic foundation of who we are as individuals.


Research is starting to show that autobiographical memories may be forgotten. This image is adapted from a photograph of a painting. Both are in the public domain. The original painting is translated as The Break-Up Letter and was painted by Alfred Émile Léopold Stevens (ca 1867).

“These novel findings show that individuals can be trained to not think about memories that have personal relevance and significance to them and provide the most direct evidence to date that we possess some kind of control over autobiographical memory.”

The research involved participants generating emotional memories in response to generic cue words, such as theatre, barbecue, wildlife etc. Participants were asked to recall the cause of the event, the consequence of the event and the personal meaning they derived from the event.

Subjects were then asked to provide a single word that was personal to them which reminded them of the memory. In a subsequent session, participants were shown the cue and personal word pairings and were asked to either recall the memory associated with the word pair or to not think about the associated memory.

Interestingly, the findings revealed that whilst the entire autobiographical episode was not forgotten, the details associated with the memory were. Specifically, individuals could remember what caused the event, but were able to forget what happened and how it made them feel.

Co-author Professor MacLeod commented, “The capacity to engage in this kind of intentional forgetting may be critical to our ability to maintain coherent images about who we are and what we are like”.

Source: Neuroscience News

Jun 23, 2012 · 146 notes
#science #neuroscience #psychology #brain #memory
'Trust' hormone oxytocin found at heart of rare genetic disorder

June 22, 2012

The hormone oxytocin - often referred to as the “trust” hormone or “love hormone” for its role in stimulating emotional responses - plays an important role in Williams syndrome (WS), according to a study published June 12, 2012, in PLoS One.

The study, a collaboration between scientists at the Salk Institute for Biological Studies and the University of Utah, found that people with WS released surges of the hormones oxytocin and arginine vasopressin (AVP) when exposed to emotional triggers.

The findings may help in understanding human emotional and behavioral systems and lead to new treatments for devastating illnesses such as WS, post-traumatic stress disorder, anxiety and possibly even autism.

“Williams syndrome results from a very clear genetic deletion, allowing us to explore the genetic and neuronal basis of social behavior,” says Ursula Bellugi, the director of Salk’s Laboratory for Cognitive Neuroscience and a co-author on the paper. “This study provides us with crucial information about genes and brain regions involved in the control of oxytocin and vasopressin, hormones that may play important roles in other disorders.”

WS arises from a faulty recombination event during the development of sperm or egg cells. As a result, virtually everyone with WS has exactly the same set of genes missing (25 to 28 genes are missing from one of two copies of chromosome 7). There also are rare cases of individuals who retain one or more genes that most people with the disorder have lost.

To children with WS, people are much more comprehensible than inanimate objects. Despite myriad health problems, they are extremely gregarious, irresistibly drawn to strangers, and insistent on making eye contact. They have an affinity for music. But they also experience heightened anxiety, have an average IQ of 60, have severe visuospatial problems, and suffer from cardiovascular and other health issues. Despite their desire to befriend people, they have difficulty creating and maintaining social relationships, a difficulty that is poorly understood and that also afflicts many people without WS.

In the new study, led by Dr. Julie R. Korenberg, a University of Utah professor and Salk adjunct professor, the scientists conducted a trial with 21 participants, 13 who have WS and a control group of eight people without the disorder. The participants were evaluated at the Cedars-Sinai Medical Center in Los Angeles. Because music is a known strong emotional stimulus, the researchers asked participants to listen to music.

Before the music was played, the participants’ blood was drawn to determine a baseline level for oxytocin, and those with WS had three times as much of the hormone as those without the syndrome. Blood also was drawn at regular intervals while the music played and was analyzed afterward to check for real-time, rapid changes in the levels of oxytocin and AVP. Other studies have examined how oxytocin affects emotion when artificially introduced into people, such as through nasal sprays, but this is one of the first significant studies to measure naturally occurring changes in oxytocin levels in rapid, real time as people undergo an emotional response.

There was little outward response to the music, but when the blood samples were analyzed, the researchers were pleasantly surprised. The analyses showed that among WS participants, oxytocin levels, and to a lesser degree AVP, had not only increased but also begun to fluctuate, while among those without WS both oxytocin and AVP levels remained largely unchanged as they listened to music.

Korenberg believes the blood analyses strongly indicate that oxytocin and AVP are not regulated correctly in people with WS, and that the behavioral characteristics unique to people with WS are related to this problem.

"This shows that oxytocin quite likely is very involved in emotional response," Korenberg says.

To ensure accuracy of results, those taking the test also were asked to place their hands in 60-degree Fahrenheit water to test for negative stress, and the same results were produced as when they listened to music. Those with WS experienced an increase in oxytocin and AVP, while those without the syndrome did not.

In addition to listening to music, study participants already had taken three social behavior tests that evaluate willingness to approach and speak to strangers, emotional states, and various areas of adaptive and problem behavior. Those test results suggest that increased levels of oxytocin are linked to both increased desire to seek social interaction and decreased ability to process social cues, a double-edged message that may be very useful at times, for example, during courtship, but damaging at others, as in WS.

"The association between abnormal levels of oxytocin and AVP and altered social behaviors found in people with Williams Syndrome points to surprising, entirely unsuspected deleted genes involved in regulation of these hormones and human sociability," Korenberg said. "It also suggests that the simple characterization of oxytocin as ‘the love hormone’ may be an overreach. The data paint a far more complicated picture."

In particular, the study results indicate that the missing genes affect the release of oxytocin and AVP through the hypothalamus and the pituitary gland. About the size of a pearl, the hypothalamus is located just above the brain stem and produces hormones that control body temperature, hunger, thirst, mood, sex drive, and sleep, as well as the release of hormones from many glands, including the pituitary. The pituitary gland, about the size of a pea, controls many other glands responsible for hormone secretion.

Overall, the researchers say, their findings paint a very hopeful picture, and the study holds promise for speeding progress in treating WS, and perhaps autism and anxiety, through regulation of oxytocin and vasopressin, key players in the human brain and emotion.

Provided by Salk Institute

Source: medicalxpress.com

Jun 23, 2012 · 47 notes
#WS #brain #neuroscience #oxytocin #science #psychology
Balancing connections for proper brain function

June 22, 2012

Neuropsychiatric conditions such as autism, schizophrenia and epilepsy involve an imbalance between two types of synapses in the brain: excitatory synapses that release the neurotransmitter glutamate, and inhibitory synapses that release the neurotransmitter GABA. Little is known about the molecular mechanisms underlying development of inhibitory synapses, but a research team from Japan and Canada has reported that a molecular signal between adjacent neurons is required for the development of inhibitory synapses.


Figure 1: Compared with the brains of normal animals (left), mice lacking the Slitrk3 gene (right) have a reduced density of inhibitory synapses in the hippocampus. Reproduced from Ref. 1 © 2012 Jun Aruga, RIKEN Brain Science Institute

In earlier work, the researchers—led by Jun Aruga of the RIKEN Brain Science Institute, Wako, and Ann Marie Craig of the University of British Columbia, Vancouver—showed that a membrane protein called Slitrk2 organizes signaling molecules at synapses. They therefore tested whether five related proteins are involved in inhibitory synapse development. They cultured immature hippocampal neurons with non-neural cells expressing each of the six Slitrk proteins. They found that Slitrk3, but not the other Slitrk proteins, induced clustering of VGAT, a GABA transporter protein found only at inhibitory synapses.

The researchers also examined the localization of Slitrk3 by tagging it with yellow fluorescent protein and introducing it into cultured hippocampal cells. This revealed that Slitrk3 co-localizes in the dendrites of neurons with gephyrin, a scaffold protein found only in inhibitory synapses. They then blocked Slitrk3 synthesis, and found that it led to a significant reduction in the number of inhibitory synapses.

To confirm these findings, the researchers generated a strain of genetically engineered mice lacking the Slitrk3 gene. These animals had significantly fewer inhibitory synapses than normal animals (Fig. 1), and therefore impaired GABA neurotransmission. They were also susceptible to epileptic seizures. From a screen for proteins that bind to Slitrk3, Aruga, Craig and colleagues identified the protein PTPδ as its only binding partner. Introducing PTPδ fused to yellow fluorescent protein into cultured hippocampal neurons showed that it is expressed in neuronal dendrites and cell bodies, but not in axons. Blocking PTPδ synthesis prevented the induction of inhibitory synapses by the Slitrk3 protein.

These results demonstrated that the interaction between Slitrk3 on dendrites and PTPδ on axons of adjacent cells is required for the proper development of inhibitory synapses and for inhibitory neurotransmission in the brain.

“We are now examining whether the balance of excitatory and inhibitory synapses is affected by other members of the Slitrk protein family,” says Aruga. “It is possible that Slitrk3 and other Slitrk proteins are acting synergistically or antagonistically. We are also clarifying whether Slitrk3 is involved in any neurological disorders.”

Provided by RIKEN

Source: medicalxpress.com

Jun 23, 2012 · 29 notes
#science #neuroscience #brain #psychology #synapses
Preventing or Better Managing Diabetes May Prevent Cognitive Decline

ScienceDaily (June 21, 2012) — Preventing diabetes or delaying its onset has been thought to stave off cognitive decline — a connection strongly supported by the results of a 9-year study led by researchers at the University of California, San Francisco (UCSF) and the San Francisco VA Medical Center.

Earlier studies have looked at cognitive decline in people who already had diabetes. The new study is the first to demonstrate that the greater risk of cognitive decline is also present among people who develop diabetes later in life. It is also the first study to link the risk of cognitive decline to the severity of diabetes.

The result is the latest finding to emerge from the Health, Aging, and Body Composition (Health ABC) Study, which enrolled 3,069 adults over 70 at two community clinics in Memphis, TN and Pittsburgh, PA beginning in 1997. All the patients provided periodic blood samples and took regular cognitive tests over time.

When the study began, hundreds of those patients already had diabetes. A decade later, many more of them had developed diabetes, and many also suffered cognitive decline. As described this week in Archives of Neurology, those two health outcomes were closely linked.

People who had diabetes at the beginning of the study showed a faster cognitive decline than people who developed it during the course of the study — and these people, in turn, tended to be worse off than people who never developed diabetes at all. The study also showed that patients with more severe diabetes who did not control their blood sugar levels as well suffered faster cognitive declines.

"Both the duration and the severity of diabetes are very important factors," said Kristine Yaffe, MD, the lead author of the study. "It’s another piece of the puzzle in terms of linking diabetes to accelerated cognitive aging."

An important question for future studies, she added, would be to ask if interventions that would effectively prevent, delay or better control diabetes would also lower people’s risk of cognitive impairment later in life.

Yaffe is the Roy and Marie Scola Endowed Chair of Psychiatry; professor in the UCSF departments of Psychiatry, Neurology and Epidemiology and Biostatistics; and Chief of Geriatric Psychiatry and Director of the Memory Disorders Clinic at the San Francisco VA Medical Center.

Diabetes and Cognitive Decline

Diabetes is a chronic and complex disease marked by high levels of sugar in the blood that arise due to problems with the hormone insulin, which regulates blood sugar levels. It is caused by an inability to produce insulin (type 1) or an inability to respond correctly to insulin (type 2).

A major health concern in the United States, diabetes of all types affects an estimated 8.3 percent of the U.S. population — some 25.8 million Americans — and costs U.S. taxpayers more than $200 billion annually. In California alone, an estimated 4 million people (one out of every seven adults) has type 2 diabetes and millions more are at risk of developing it. These numbers are poised to explode in the next half century if more is not done to prevent the disease.

Over the last several decades, scientists have come to appreciate that diabetes affects many tissues and organs of the body, including the brain and central nervous system — particularly because diabetes places people at risk of cognitive decline later in life.

In their study the scientists looked at a blood marker known as “glycosylated hemoglobin,” a standard measure of the severity of diabetes and the ability to control it over time. The marker shows evidence of high blood sugar because these sugar molecules become permanently attached to hemoglobin proteins in the blood. Yaffe and her colleagues found that greater levels of this biomarker were associated with more severe cognitive dysfunction.

While the underlying mechanism that accounts for the link between diabetes and risk of cognitive decline is not completely understood, Yaffe said, it may be related to a human protein known as insulin degrading enzyme, which plays an important role in regulating insulin, the key hormone linked to diabetes. This same enzyme also degrades a protein in the brain known as beta-amyloid, a brain protein linked to Alzheimer’s disease.

Source: Science Daily

Jun 22, 2012 · 13 notes
#science #neuroscience #diabetes #brain #dementia
New Candidate Drug Stops Cancer Cells, Regenerates Nerve Cells

ScienceDaily (June 21, 2012) — Scientists have developed a small-molecule-inhibiting drug that in early laboratory cell tests stopped breast cancer cells from spreading and also promoted the growth of early nerve cells called neurites.

Researchers from Cincinnati Children’s Hospital Medical Center report their findings online June 21 in Chemistry & Biology. The scientists named their lead drug candidate “Rhosin” and hope future testing shows it to be promising for the treatment of various cancers or nervous system damage.

The inhibitor overcomes a number of previous scientific challenges by precisely targeting a single component of a cell signaling protein complex called Rho GTPases. This complex regulates cell movement and growth throughout the body. Miscues in Rho GTPase processes are also widely implicated in human diseases, including various cancers and neurologic disorders.

"Although still years from clinical development, in principle Rhosin could be useful in therapy for many kinds of cancer or possibly neuron and spinal cord regeneration," said Yi Zheng, PhD, lead investigator and director of Experimental Hematology and Cancer Biology at Cincinnati Children’s. "We’ve performed in silico (computerized) rational drug design, pharmacological characterization and cell tests in the laboratory, and we are now starting to work with mouse models."

Because the role of Rho GTPases in cellular processes and cancer formation is well established, researchers have spent years trying to identify safe and effective therapeutic targets for specific parts of the protein complex. In particular, scientists have focused on the center protein in the complex called RhoA, which is essential for the signaling function of the complex. In breast cancer for example, increased RhoA activity makes the cancer cells more invasive and causes them to spread, while a deficiency of RhoA suppresses cancer growth and progression.

Despite this knowledge, past efforts to develop an effective small-molecule inhibitor for RhoA have failed, explained Zheng, who has studied Rho GTPases for over two decades. Most roadblocks stem from a lack of specificity in how researchers have been able to target RhoA, a resulting lack of efficiency in affecting molecular processes, problems with toxicity, and the inability to find a workable drug design.

For the current study, Zheng and his colleagues started with the extensive body of research from Cincinnati Children’s and other institutions describing the processes and functions of Rho GTPases. They then used high-throughput computerized molecular screening and computerized drug design to reveal a druggable target site. This also provided a preliminary virtual simulation on the potential effectiveness of candidate drugs.

A key challenge to binding a small-molecule inhibitor to RhoA is the protein’s globular structure and lack of surface pocket areas suitable for easy binding, Zheng said. The unique chemical structure of the lead compound identified by the researchers, Rhosin, allows it to effectively bind to two shallow surface grooves on RhoA. This enables the candidate drug to take root and begin affecting cells. The two-legged configuration of Rhosin also suggests a useful drug design strategy for more effectively targeting difficult molecular sites like RhoA.

The researchers also wanted to make sure Rhosin effectively blocked what are known as guanine nucleotide exchange factors (GEFs). Guanine nucleotide is a critical energy source and signaling component of cells. Activation of GEFs is required to set off the regulatory signaling of GTPases (GTP stands for guanosine triphosphate).

After conducting a series of laboratory cell tests to verify the targeting and binding capabilities of Rhosin to RhoA, the researchers then tested the candidate drug’s impact on cultured breast cancer cells and nerve cells.

In tests on human breast cancer cells, Rhosin inhibited cell growth and the formation of mammary spheres in a dose-dependent manner, acting specifically on RhoA molecular targets without disrupting other critical cellular processes. Rhosin did not affect non-cancerous breast cells. This, along with other tests the scientists performed, indicated Rhosin’s effectiveness in targeting RhoA-mediated breast cancer proliferation, according to the researchers.

Researchers also treated an extensively tested line of neuronal cells with Rhosin, along with nerve growth factor, a protein that is important to the growth and survival of neurons. Rhosin worked with nerve growth factor in a dose-dependent way to promote the proliferation of branching neurites from the neuronal cells. Neurites are young or early stage extensions from neurons required for neuronal communications.

Source: Science Daily

Jun 22, 2012 · 115 notes
#science #neuroscience
Eating Disorder Behaviors and Weight Concerns Are Common in Women Over 50

ScienceDaily (June 21, 2012) — Eating disorders are commonly seen as an issue faced by teenagers and young women, but a new study reveals that age is no barrier to disordered eating. In women aged 50 and over, 3.5% report binge eating, nearly 8% report purging, and more than 70% are trying to lose weight. The study published in the International Journal of Eating Disorders revealed that 62% of women claimed that their weight or shape negatively impacted on their life.

The researchers, led by Dr Cynthia Bulik, Director of the University of North Carolina Eating Disorders Program, reached 1,849 women from across the USA participating in the Gender and Body Image Study (GABI) with a survey titled, ‘Body Image in Women 50 and Over — Tell Us What You Think and Feel.’

"We know very little about how women aged 50 and above feel about their bodies," said Bulik. "An unfortunate assumption is that they ‘grow out of’ body dissatisfaction and eating disorders, but no one has really bothered to ask. Since most research focuses on younger women, our goal was to capture the concerns of women in this age range to inform future research and service planning."

The average age of the participants was 59, while 92% were white. More than a quarter, 27%, were obese, 29% were overweight, 42% were normal weight and 2% were underweight.

Results revealed that eating disorder symptoms were common. About 8% of women reported purging in the last five years and 3.5% reported binge eating in the last month. These behaviors were most prevalent in women in their early 50s, but also occurred in women over 75.

When it came to weight issues, 36% of the women reported spending at least half their time in the last five years dieting, 41% checked their body daily and 40% weighed themselves a couple of times a week or more.

62% of women claimed that their weight or shape negatively impacted their life, 79% said that it affected their self-perception and 64% said that they thought about it daily.

The women reported resorting to a variety of unhealthy methods to change their body, including diet pills (7.5%), excessive exercise (7%), diuretics (2.5%), laxatives (2%) and vomiting (1%).

Two-thirds, 66%, were unhappy with their overall appearance and this was highest when it came to their stomach, 84%, and shape, 73%.

"The bottom line is that eating disorders and weight and shape concerns don’t discriminate on the basis of age," concluded Bulik. "Healthcare providers should remain alert for eating disorder symptoms and weight and shape concerns that may adversely influence women’s physical and psychological wellbeing as they mature."

Source: Science Daily

Jun 22, 2012 · 21 notes
#science #neuroscience #psychology #eating disorders
Functional Links Between Autism and Genes Explained

ScienceDaily (June 21, 2012) — A pioneering report of genome-wide gene expression in autism spectrum disorders (ASDs) finds genetic changes that help explain why one person has an ASD and another does not. The study, published by Cell Press on June 21 in The American Journal of Human Genetics, pinpoints ASD risk factors by comparing changes in gene expression with DNA mutation data in the same individuals. This innovative approach is likely to pave the way for future personalized medicine, not just for ASD but also for any disease with a genetic component.

ASDs are a heterogeneous group of developmental conditions characterized by social deficits, difficulty communicating, and repetitive behaviors. ASDs are thought to be highly heritable, meaning that they run in families. However, the genetics of autism are complex.

Researchers have found rare changes in the number of copies of defined genetic regions that associate with ASD. Although there are some hot-spot regions containing these alterations, very few genetic changes are exactly alike. Similarly, no two autistic people share the exact same symptoms. To discover how these genetic changes might affect gene transcription and, thus, the presentation of the disorder, Rui Luo, a graduate student in the Geschwind lab at UCLA, studied 244 families in which one child (the proband) was affected with an ASD and one was not.

In addition to identifying several potential new regions where copy-number variants (CNVs) are associated with ASDs, Geschwind’s team found genes within these regions to be significantly misregulated in ASD children compared with their unaffected siblings. “Strikingly, we observed a higher incidence of haploinsufficient genes in the rare CNVs in probands than in those of siblings, strongly indicating a functional impact of these CNVs on expression,” says Geschwind. Haploinsufficiency occurs when only one copy of a gene is functional; the result is that the body cannot produce a normal amount of protein. The researchers also found a significant enrichment of misexpressed genes in neural-related pathways in ASD children. Previous research has found that these pathways include other genetic variants associated with autism, which Geschwind explains further legitimizes the present findings.

Source: Science Daily

Jun 22, 2012 · 30 notes
#science #neuroscience #psychology #autism #genetics
Where is the Love?

June 21, 2012 By Janice Wood

Thanks to science, we know that love lives in the brain, not the heart.


Now a new international study has mapped out where love and sexual desire are in the brain.

“No one has ever put these two together to see the patterns of activation,” says Dr. Jim Pfaus, professor of psychology at Concordia University.

“We didn’t know what to expect – the two could have ended up being completely separate. It turns out that love and desire activate specific but related areas in the brain.”

Working with colleagues in the United States and Switzerland, Pfaus analyzed the results of 20 separate studies that examined brain activity while subjects engaged in tasks such as viewing erotic pictures or looking at photographs of their significant others. Pooling this data enabled the scientists to form a map of love and desire in the brain.

They found that two brain structures, the insula and the striatum, are responsible for tracking the progression from sexual desire to love.

The insula is a portion of the cerebral cortex folded deep within an area between the temporal lobe and the frontal lobe, while the striatum is located nearby, inside the forebrain.

According to the researchers, love and sexual desire activate different areas of the striatum. The area activated by sexual desire is usually turned on by things that are inherently pleasurable, such as sex or food.

The area activated by love is involved in the process of conditioning in which things paired with reward or pleasure are given inherent value. That is, as feelings of sexual desire develop into love, they are processed in a different place in the striatum, the researchers explain.

This area of the striatum is also the part of the brain associated with drug addiction. Pfaus says there is good reason for this.

“Love is actually a habit that is formed from sexual desire as desire is rewarded,” he explains. “It works the same way in the brain as when people become addicted to drugs.”

However, the habit is not a bad one, he said, noting that love activates different pathways in the brain that are involved in monogamy and pair bonding. Some areas in the brain are actually less active when a person feels love than when they feel desire, he added.

“While sexual desire has a very specific goal, love is more abstract and complex, so it’s less dependent on the physical presence of someone else,” says Pfaus.

Source: PsychCentral

Jun 22, 2012 · 136 notes
#science #neuroscience #brain #psychology
Mind games: Mental exercises are key to better brain function

June 20, 2012 By Robin Erb

Go ahead - do it: Grab a pencil. Right now. Write your name backward. And upside down. Awkward, right?

But if researchers and neurologists are correct, doing exercises like these just might buy you a bit more time with a healthy brain.

Some research suggests that certain types of mental exercises - whether they are memory games on your mobile device or jotting down letters backward - might help our gray matter maintain concentration, memory and visual and spatial skills over the years.

"There is some evidence of a use-it-or-lose-it phenomenon," says Dr. Michael Maddens, chief of medicine at Beaumont Hospital, Royal Oak, Mich.

Makers of computer brain games, in fact, are tapping into a market of consumers who have turned to home treadmills and gym memberships to maintain their bodies, and now worry that aging might take its toll on their mental muscle as well.

But tweaking everyday routines can help.

Like brushing your teeth with your non-dominant hand. Or crossing your arms the opposite way you’re used to, says Cheryl Deep, who leads “Brain Neurobics” sessions on behalf of the Wayne State Institute of Gerontology.

At a recent session in Novi, Mich., Deep encouraged several dozen senior citizens to flip the pictures in their homes upside-down. It might baffle houseguests, but the exercise crowbars the brain out of familiar grooves cut deep by years of mindless habit.

"Every time you walk past and look, your brain has to rotate that image," Deep says. "Brain neurobics is about getting us out of those ruts, those pathways, and shaking things up."

Participants were asked to call out the color of ink that flashed on a screen in front of them. The challenge: the words spelled out the names of other colors. Blue ink spelled o-r-a-n-g-e, for example.

Several in the crowd at Waltonwood Senior Living hesitated - a few scrunching up faces in concentration. The first instinct is to say “orange.”
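The color-word interference exercise described above is the classic Stroop task. As a rough sketch (the color list and trial structure here are illustrative assumptions, not the session's actual materials), a trial generator might look like this:

```python
import random

COLORS = ["red", "blue", "green", "orange"]

def make_trial(rng):
    """Pick a color word and an ink color; the trial is 'congruent'
    only when the word spells the same color it is printed in."""
    word = rng.choice(COLORS)
    ink = rng.choice(COLORS)
    return word, ink, word == ink

def correct_response(word, ink):
    # The instruction: name the ink color, ignoring what the word spells.
    return ink

rng = random.Random(0)
for word, ink, congruent in (make_trial(rng) for _ in range(5)):
    kind = "congruent" if congruent else "incongruent"
    print(f"'{word}' printed in {ink} -> answer: {ink} ({kind})")
```

Incongruent trials, like blue ink spelling o-r-a-n-g-e, are the ones that produce the hesitation the participants showed.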

In another exercise, participants had to try to name as many red foods as possible. Apple? Sure that’s an easy one. It took a while, but the crowd eventually made its way to pomegranate and pimento.

Elissa and Hal Leider chuckled with friends as they tested their recall.

Hal Leider, 82, a retired carpenter, was diagnosed with early-stage Alzheimer’s, and he tries to challenge himself mentally and physically - bowling and shooting pool and playing poker: “I think anything we can do might be helpful,” says Elissa Leider, 74.

The idea of mental workouts marks a dramatic shift in how we understand the brain.

"We want to stretch and flex and push" the brain, says Moriah Thomason, assistant professor in Wayne State University School of Medicine’s pediatrics department and in the Merrill Palmer Skillman Institute for Child and Family Development

Thomason also is a scientific adviser to http://www.Lumosity.com, one of the fastest-growing brain game websites.

"We used to think that what you’re born with is what you have through life. But now we understand that the brain is a lot more plastic and flexible than we ever appreciated," she says.

Still, as with the rest of the body, aging takes its toll on the brain, she says.

The protective covering of the neural cells - white matter - begins to shrink first. Neural and glial cells, often called the gray matter, begin to shrink as well, but more slowly. Neurotransmitters, or chemical messengers, decrease.

But challenging the brain stimulates neural pathways - those tentacles that look like tree branches in a cluster of brain cells. It boosts the brain’s chemistry and connectivity, refueling the entire engine.

"Certain activities will lay more neural pathways that can be more readily re-engaged," Thomason says. "The hope is that there are ways to train and strengthen these pathways."

Maddens explains it this way: Consider the neurons of your brain like electrical wires and the white matter like the insulation. When the insulation breaks down over time, things can misfire.

In lab studies, those who engage in mentally challenging games do, in fact, show improvement in cognitive functioning. They get faster at speed games and stronger in memory games, for example.

What’s less clear is whether this improvement transfers to everyday tasks, like remembering where you parked the car or the name of your child’s teacher, both Thomason and Maddens say.

But when it comes to the link between physical exercise and the brain, researchers and clinicians agree: physical exercise is good for the brain and has been linked to lower rates of chronic disease. Good nutrition is essential too.

Oxygen, itself, is essential, Deep said: “Your brain is an oxygen hog.”

Diet, exercise and mental maneuvers all may boost brain health in ways science still doesn’t understand. In the best cases, the right mix might stave off the effects of Alzheimer’s and other age-related disease too, Maddens says.

All this is good news for an aging, stressed out, and too-busy society, he says.

Reading a book, engaging with friends or going out for a walk and paying attention to what’s around you - that’s not really about goofing off. Rather, it’s critical time that stimulates neural pathways and boosts the odds of long-time brain health.

"It’s talking to friends. It’s getting out socially. It’s engaging in life. The question is ‘How do I force myself to learn?’" Thomason says.

The same might be true when it comes to mentally challenging computer games.

Says Maddens: “Would I have patients playing computer games eight hours a day in hopes that they can delay Alzheimer’s by two months? No. But you can enjoy (playing such games) and possibly get a benefit from it, too.”


Jun 22, 2012 · 66 notes
#science #neuroscience #brain #psychology
Confusion Can Be Beneficial for Learning

ScienceDaily (June 20, 2012) — Most of us assume that confidence and certainty are preferred over uncertainty and bewilderment when it comes to learning complex information. But a new study led by Sidney D’Mello of the University of Notre Dame shows that confusion when learning can be beneficial if it is properly induced, effectively regulated and ultimately resolved.


(Image credit: © Ana Blazic Pavlovic / Fotolia)

The study will be published in a forthcoming issue of the journal Learning and Instruction.

Notre Dame psychologist and computer scientist D’Mello, whose research areas include artificial intelligence, human-computer interaction and the learning sciences, together with Art Graesser of the University of Memphis, collaborated on the study, which was funded by the National Science Foundation.

They found that by strategically inducing confusion in a learning session on difficult conceptual topics, people actually learned more effectively and were able to apply their knowledge to new problems.

In a series of experiments, subjects learned scientific reasoning concepts through interactions with computer-animated agents playing the roles of a tutor and a peer learner. The animated agents and the subject engaged in interactive conversations where they collaboratively discussed the merits of sample research studies that were flawed in one critical aspect. For example, one hypothetical case study touted the merits of a diet pill, but was flawed because it did not include an appropriate control group. Confusion was induced by manipulating the information the subjects received so that the animated agents sometimes disagreed with each other and expressed contradictory or incorrect information. The agents then asked subjects to decide which opinion had more scientific merit, thereby putting the subject in the hot spot of having to make a decision with incomplete and sometimes contradictory information.

In addition to the confusion and uncertainty triggered by the contradictions, subjects who were confused scored higher on a difficult post-test and could more successfully identify flaws in new case studies.

"We have been investigating links between emotions and learning for almost a decade, and find that confusion can be beneficial to learning if appropriately regulated because it can cause learners to process the material more deeply in order to resolve their confusion," D’Mello says.

According to D’Mello, it is not advisable to intentionally confuse students who are struggling or induce confusion during high-stakes learning activities. Confusion interventions are best for higher-level learners who want to be challenged with difficult tasks, are willing to risk failure, and who manage negative emotions when they occur.

"It is also important that the students are productively instead of hopelessly confused. By productive confusion, we mean that the source of the confusion is closely linked to the content of the learning session, the student attempts to resolve their confusion, and the learning environment provides help when the student struggles. Furthermore, any misleading information in the form of confusion-induction techniques should be corrected over the course of the learning session, as was done in the present experiments."

According to D’Mello, the next step in this body of research is to apply these methods to some of the more traditional domains such as physics, where misconceptions are common.

Source: Science Daily

Jun 21, 2012 · 218 notes
#science #neuroscience #brain #psychology #learning
Understanding of Spinal Muscular Atrophy Improved With Use of Stem Cells

ScienceDaily (June 20, 2012) — Cedars-Sinai’s Regenerative Medicine Institute has pioneered research on how motor-neuron cell-death occurs in patients with spinal muscular atrophy, offering an important clue in identifying potential medicines to treat this leading genetic cause of death in infants and toddlers.

The study, published in the June 19 online issue of PLoS ONE, extends the institute’s work to employ pluripotent stem cells to find a pharmaceutical treatment for spinal muscular atrophy or SMA, a genetic neuromuscular disease characterized by muscle atrophy and weakness.

"With this new understanding of how motor neurons die in spinal muscular atrophy patients, we are an important step closer to identifying drugs that may reverse or prevent that process," said Clive Svendsen, PhD, director of the Cedars-Sinai Regenerative Medicine Institute.

Svendsen and his team have investigated this disease for some time now. In 2009, Nature published a study by Svendsen and his colleagues detailing how skin cells taken from a patient with the disorder were used to generate neurons of the same genetic makeup and characteristics of those affected in the disorder; this created a “disease-in-a-dish” that could serve as a model for discovering new drugs.

As the disease is unique to humans, previous methods to employ this approach had been unreliable in predicting how it occurs in humans. In the research published in PLoS ONE, the team reproduced this model with skin cells from multiple patients, taking them back in time to a pluripotent stem cell state (iPS cells), and then driving them forward to study the diseased patient-specific motor neurons.

Children born with this disorder have a genetic mutation that doesn’t allow their motor neurons to manufacture a critical protein necessary for them to survive. The study found these cells die through apoptosis — the same form of cell death that occurs when the body eliminates old, unnecessary or unhealthy cells. As motor neuron cell death progresses, children with the disease experience increasing paralysis and eventually death. There is currently no effective treatment for the disease. An estimated one in 35 to one in 60 people are carriers, and about one in 100,000 newborns have the condition.

"Now we are taking these motor neurons (from multiple children with the disease and in their pluripotent state) and screening compounds that can rescue these cells and create the protein necessary for them to survive," said Dhruv Sareen, director of Cedars-Sinai’s Induced Pluripotent Stem Cell Core Facility and a primary author on the study. "This study is an important stepping stone to guide us toward the right kinds of compounds that we hope will be effective in the model — and then be reproduced in clinical trials."

Source: Science Daily

Jun 21, 2012
#science #neuroscience #brain #psychology #neuron
What's Your Name Again? Lack of Interest, Not Brain's Ability, May Be Why We Forget

ScienceDaily (June 20, 2012) — Most of us have experienced it. You are introduced to someone, only to forget his or her name within seconds. You rack your brain trying to remember, but can’t seem to even come up with the first letter. Then you get frustrated and think, “Why is it so hard for me to remember names?”

You may think it’s just how you were born, but that’s not the case, according to Kansas State University’s Richard Harris, professor of psychology. He says it’s not necessarily your brain’s ability that determines how well you can remember names, but rather your level of interest.

"Some people, perhaps those who are more socially aware, are just more interested in people, more interested in relationships," Harris said. "They would be more motivated to remember somebody’s name."

This goes for people in professions like politics or teaching where knowing names is beneficial. But just because someone can’t remember names doesn’t mean they have a bad memory.

"Almost everybody has a very good memory for something," Harris said.

The key to a good memory is your level of interest, he said. The more interest you show in a topic, the more likely it will imprint itself on your brain. If it is a topic you enjoy, then it will not seem like you are using your memory.

For example, Harris said a few years ago some students were playing a geography game in his office. He started to join in naming countries and their capitals. Soon, the students were amazed by his knowledge, although Harris didn’t understand why. Then it dawned on him that his vast knowledge of capitals didn’t come from memorizing them from a map, but rather from his love of stamps and learning their whereabouts.

"I learned a lot of geographical knowledge without really studying," he said.

Harris said this also explains why some things seem so hard to remember — they may be hard to understand or not of interest to some people, such as remembering names.

Harris said there are strategies for training your memory, including using a mnemonic device.

"If somebody’s last name is Hefty and you notice they’re left-handed, you could remember lefty Hefty," he said.

Another strategy is to use the person’s name while you talk to them — although the best strategy is simply to show more interest in the people you meet, he said.

Source: Science Daily

Jun 21, 2012 · 54 notes
#science #neuroscience #brain #psychology
'Brain pacemaker' effective for years against Parkinson's disease

June 20, 2012

A “brain pacemaker” called deep brain stimulation (DBS) remains an effective treatment for Parkinson’s disease for at least three years, according to a study in the June 2012 online issue of Neurology, the medical journal of the American Academy of Neurology.

But while improvements in motor function remained stable, there were gradual declines in health-related quality of life and cognitive abilities.

First author of the study is Frances M. Weaver, PhD, who has joint appointments at Edward Hines Jr. VA Hospital and Loyola University Chicago Stritch School of Medicine.

Weaver was one of the lead investigators of a 2010 paper in the New England Journal of Medicine that found that motor functions remained stable for two years in DBS patients. The new additional analysis extended the follow-up period to 36 months.

DBS is a treatment for Parkinson’s patients who no longer benefit from medication, or who experience unacceptable side effects. DBS is not a cure, and it does not stop the disease from progressing. But in the right patients, DBS can significantly improve symptoms, especially tremors. DBS also can relieve muscle rigidity that causes decreased range of motion.

In the DBS procedure, a neurosurgeon drills a dime-size hole in the skull and inserts an electrode about 4 inches into the brain. A connecting wire from the electrode runs under the skin to a battery implanted near the collarbone. The electrode delivers mild electrical signals that effectively reorganize the brain’s electrical impulses. The procedure can be done on one or both sides of the brain.

Researchers evaluated 89 patients who were stimulated in a part of the brain called the globus pallidus interna and 70 patients who were stimulated in a different part of the brain called the subthalamic nucleus. (Patients received DBS surgery at seven VA and six affiliated university medical centers.) Patients were assessed at baseline (before DBS surgery) and at 3, 6, 12, 18, 24 and 36 months. Patients were rated on a Parkinson’s disease scale that includes motor functions such as speech, facial expression, tremors, rigidity, finger taps, hand movements, posture, gait, bradykinesia (slow movement) etc. The lower the rating, the better the function.

Improvements in motor function were similar in both groups of patients, and stable over time. Among patients stimulated in the globus pallidus interna, the score improved from 41.1 at baseline to 27.1 at 36 months. Among patients stimulated in the subthalamic nucleus, the score improved from 42.5 at baseline to 29.7 at 36 months.
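A quick back-of-envelope check, using only the scores reported above (where lower is better), confirms that the two groups improved by similar relative amounts:

```python
def pct_improvement(baseline, followup):
    """Relative improvement on a rating scale where lower scores are better."""
    return 100 * (baseline - followup) / baseline

# Motor-function scores reported in the study summary above
gpi = pct_improvement(41.1, 27.1)  # globus pallidus interna group
stn = pct_improvement(42.5, 29.7)  # subthalamic nucleus group
print(f"GPi: {gpi:.1f}% improvement; STN: {stn:.1f}% improvement")
# prints GPi: 34.1% improvement; STN: 30.1% improvement
```

Both stimulation targets yield roughly a 30-34% relative improvement in motor score, consistent with the finding that the groups were similar and stable over time.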

By contrast, some early gains in quality of life and the abilities to do the activities of daily living were gradually lost, and there was a decline in neurocognitive function. This likely reflects the progression of the disease, and the emergence of symptoms that are resistant to DBS and medications.

Researchers concluded that both the globus pallidus interna and the subthalamic nucleus areas of the brain “are viable DBS targets for treatment of motor symptoms, but highlight the importance of nonmotor symptoms as determinants of quality of life in people with Parkinson’s disease.”

Source: medicalxpress.com

Jun 21, 2012 · 11 notes
#science #neuroscience #brain #psychology #parkinson
Proposed drug may reverse Huntington's disease symptoms

June 20, 2012

With a single drug treatment, researchers at the Ludwig Institute for Cancer Research at the University of California, San Diego School of Medicine can silence the mutated gene responsible for Huntington’s disease, slowing and partially reversing progression of the fatal neurodegenerative disorder in animal models.


This image shows stained mouse neurons. Credit: Image courtesy of Taylor Bayouth

The findings are published in the June 21, 2012 online issue of the journal Neuron.

Researchers suggest the drug therapy, tested in mouse and non-human primate models, could produce sustained motor and neurological benefits in human adults with moderate and severe forms of the disorder. Currently, there is no effective treatment.

Huntington’s disease afflicts approximately 30,000 Americans, whose symptoms include uncontrolled movements and progressive cognitive and psychiatric problems. The disease is caused by the mutation of a single gene, which results in the production and accumulation of toxic proteins throughout the brain.

Don W. Cleveland, PhD, professor and chair of the UC San Diego Department of Cellular and Molecular Medicine and head of the Laboratory of Cell Biology at the Ludwig Institute for Cancer Research, and colleagues infused mouse and primate models of Huntington’s disease with one-time injections of an identified DNA drug based on antisense oligonucleotides (ASOs). These ASOs selectively bind to and destroy the mutant gene’s molecular instructions for making the toxic huntingtin protein.

The single treatment produced rapid results. Treated animals began moving better within one month and achieved normal motor function within two. More remarkably, the benefits persisted for nine months, well after the drug had disappeared and production of the toxic proteins had resumed.

"For diseases like Huntington’s, where a mutant protein product is tolerated for decades prior to disease onset, these findings open up the provocative possibility that transient treatment can lead to a prolonged benefit to patients,” said Cleveland. “This finding raises the prospect of a ‘huntingtin holiday,’ which may allow for clearance of disease-causing species that might take weeks or months to re-form. If so, then a single application of a drug to reduce expression of a target gene could ‘reset the disease clock,’ providing a benefit long after huntingtin suppression has ended.”

Beyond improving motor and cognitive function, researchers said the ASO treatment also blocked brain atrophy and increased lifespan in mouse models with a severe form of the disease. The therapy was equally effective whether one or both huntingtin genes were mutated, a positive indicator for human therapy.

Cleveland noted that the approach was particularly promising because antisense therapies have already been proven safe in clinical trials and are the focus of much drug development. Moreover, the findings may have broader implications, he said, for other “age-dependent neurodegenerative diseases that develop from exposure to a mutant protein product” and perhaps for nervous system cancers, such as glioblastomas.

Provided by University of California - San Diego

Source: medicalxpress.com

Jun 21, 2012 · 31 notes
#science #neuroscience #brain #psychology #huntington
Study shows role of cellular protein in regulation of binge eating

June 20, 2012

Researchers from Boston University School of Medicine (BUSM) have demonstrated in experimental models that blocking the Sigma-1 receptor, a cellular protein, reduced binge eating and caused binge eaters to eat more slowly. The research, which is published online in Neuropsychopharmacology, was led by Pietro Cottone, PhD, and Valentina Sabino, PhD, both assistant professors in the pharmacology and psychiatry departments at BUSM.

Binge eating disorder, which affects approximately 15 million Americans, is believed to be the eating disorder that most closely resembles substance dependence. In binge eating subjects, normal regulatory mechanisms that control hunger do not function properly. Binge eaters typically gorge on “junk” foods excessively and compulsively despite knowing the adverse consequences, which are physical, emotional and social in nature. In addition, binge eaters typically experience distress and withdrawal when they abstain from junk food.

The researchers developed an experimental model of compulsive binge eating by providing a sugary, chocolate diet only for one hour a day while the control group was given a standard laboratory diet. Within two weeks, the group exposed to the sugary diet exhibited binge eating behavior and ate four times as much as the controls. In addition, the experimental binge eaters exhibited compulsive behavior by putting themselves in a potentially risky situation in order to get to the sugary food while the control group avoided the risk.

The researchers then tested whether a drug that blocks the Sigma-1 receptor could reduce binge eating of the sugary diet. The experimental data showed the drug successfully reduced binge eating by 40 percent, caused the binge eaters to eat more slowly and blocked the risky behavior.

The abnormal, risky behavior exhibited by the binge eating experimental group suggested to the researchers that there could be something wrong with how decisions were made. Because evaluation of risks and decision making are functions executed in the prefronto-cortical regions of the brain, the researchers tested whether the abundance of Sigma-1 receptors in those regions was abnormal in the binge eaters. They found that Sigma-1 receptor expression was unusually high in those areas, which could explain why blocking its function could decrease both compulsive binge eating and risky behavior.

"These findings suggest that the Sigma-1 receptor may contribute to the neurobiological adaptations that cause compulsive-like eating, opening up a new potential therapeutic treatment target for binge eating disorder,” said Cottone, who also co-directs the Laboratory of Addictive Disorders at BUSM with Sabino.

Provided by Boston University Medical Center

Source: medicalxpress.com

Jun 21, 2012 · 16 notes
#neuroscience #psychology #science
Scientists Identify Protein Required to Regrow Injured Nerves in Limbs

ScienceDaily (June 20, 2012) — A protein required to regrow injured peripheral nerves has been identified by researchers at Washington University School of Medicine in St. Louis.


These are images of axon regeneration in mice two weeks after injury to the hind leg’s sciatic nerve. On the left, axons (green) of a normal mouse have regrown to their targets (red) in the muscle. On the right, a mouse lacking DLK shows no axons have regenerated, even after two weeks. (Credit: Jung Eun Shin)

The finding, in mice, has implications for improving recovery after nerve injury in the extremities. It also opens new avenues of investigation toward triggering nerve regeneration in the central nervous system, notorious for its inability to heal.

Peripheral nerves provide the sense of touch and drive the muscles that move arms and legs, hands and feet. Unlike nerves of the central nervous system, peripheral nerves can regenerate after they are cut or crushed. But the mechanisms behind the regeneration are not well understood.

In the new study, published online June 20 in Neuron, the scientists show that a protein called dual leucine zipper kinase (DLK) regulates signals that tell the nerve cell it has been injured — often communicating over distances of several feet. The protein governs whether the neuron turns on its regeneration program.

"DLK is a key molecule linking an injury to the nerve’s response to that injury, allowing the nerve to regenerate," says Aaron DiAntonio, MD, PhD, professor of developmental biology. "How does an injured nerve know that it is injured? How does it take that information and turn on a regenerative program and regrow connections? And why does only the peripheral nervous system respond this way, while the central nervous system does not? We think DLK is part of the answer."

The nerve cell body containing the nucleus or “brain” of a peripheral nerve resides in the spinal cord. During early development, these nerves send long, thin, branching wires, called axons, out to the tips of the fingers and toes. Once the axons reach their targets (a muscle, for example), they stop extending and remain mostly unchanged for the life of the organism. Unless they’re damaged.

If an axon is severed somewhere between the cell body in the spinal cord and the muscle, the piece of axon that is no longer connected to the cell body begins to disintegrate. Earlier work showed that DLK helps regulate this axonal degeneration. And in worms and flies, DLK also is known to govern the formation of an axon’s growth cone, the structure responsible for extending the tip of a growing axon whether after injury or during development.

The formation of the growth cone is an important part of the early, local response of a nerve to injury. But a later response, traveling over greater distances, proves vital for relaying the signals that activate genes promoting regeneration. This late response can happen hours or even days after injury.

But in mice, unlike worms and flies, DiAntonio and his colleagues found that DLK is not involved in an axon’s early response to injury. Even without DLK, the growth cone forms. But a lack of DLK means the nerve cell body, nestled in the spinal cord far from the injury, doesn’t get the message that it’s injured. Without the signals relaying the injury message, the cell body doesn’t turn on its regeneration program and the growth cone’s progress in extending the axon stalls.

In addition, it was shown many years ago that axons regrow faster after a second injury than axons injured only once. In other words, injury itself increases an axon’s ability to regenerate. Furthering this work, first author Jung Eun Shin, graduate research assistant, and her colleagues found that DLK is required to promote this accelerated growth.

"A neuron that has seen a previous injury now has a different regenerative program than one that has never been damaged," Shin says. "We hope to be able to identify what is different between these two neurons — specifically what factors lead to the improved regeneration after a second injury. We have found that activated DLK is one such factor. We would like to activate DLK in a newly injured neuron to see if it has improved regeneration."

In addition to speeding peripheral nerve recovery, DiAntonio and Shin see possible implications in the central nervous system. It is known, for example, that some of the important factors regulated and ramped up by DLK are not activated in the central nervous system.

"Since this sort of signaling doesn’t appear to happen in the central nervous system, it’s possible these nerves don’t ‘know’ when they are injured," DiAntonio says. "It’s an exciting idea — but not at all proven — that activating DLK in the central nervous system could promote its regeneration."

Source: Science Daily

Jun 21, 2012
How Humans Predict Others' Decisions

ScienceDaily (June 20, 2012) — Researchers at the RIKEN Brain Science Institute (BSI) in Japan have uncovered two brain signals in the human prefrontal cortex involved in how humans predict the decisions of other people. Their results suggest that the two signals, each located in distinct prefrontal circuits, strike a balance between expected and observed rewards and choices, enabling humans to predict the actions of people with different values than their own.


Figure one shows the neural activity for the simulation of another person: Reward Signal (red) and Action Signal (green). The action signal shown in this figure (green) is in the dorsomedial prefrontal cortex. The activity of the reward signal (red) largely overlaps with that of the signal for self-valuation (blue) in the ventromedial prefrontal cortex. (Credit: RIKEN)

Every day, humans are faced with situations in which they must predict what decisions other people will make. These predictions are essential to the social interactions that make up our personal and professional lives. The neural mechanism underlying these predictions, however, by which humans learn to understand the values of others and use this information to predict their decision-making behavior, has long remained a mystery.

Researchers at the RIKEN Brain Science Institute (BSI) in Japan have now shed light on this mystery with a paper to appear in the June 21st issue of Neuron. The researchers describe for the first time the process governing how humans learn to predict the decisions of another person using mental simulation of their mind.

Learning another person’s values and mental processes is often assumed to require simulation of the other’s mind: using one’s own familiar mental processes to simulate unfamiliar processes in the mind of the other. While simple and intuitive, this explanation is hard to prove due to the difficulty in disentangling one’s own brain signals from those of the simulated other.

Research scientists Shinsuke Suzuki and Hiroyuki Nakahara, a Principal Investigator of the Laboratory for Integrated Theoretical Neuroscience at RIKEN BSI, together with their collaborators, set out to disentangle these signals using functional Magnetic Resonance Imaging (fMRI) on humans. First, they studied the behavior of subjects as they played a game in which they had to predict another player's choices based on that player's observed decisions and rewards. Then they generated a computer model of the simulation process to examine the brain signals underlying the prediction of the other's behavior.

The authors found that humans simulate the decisions of other people using two brain signals encoded in the prefrontal cortex, an area responsible for higher cognition. The first, called the reward signal, tracks the difference between the reward one simulates the other person expecting and the reward that person actually received. The second, called the action signal, tracks the difference between the action one's simulation predicts the other person will take and the action that person actually took. They found that the reward signal is processed in a part of the brain called the ventromedial prefrontal cortex; the action signal, on the other hand, was found in a separate brain area called the dorsomedial prefrontal cortex.
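The two-signal scheme can be sketched as a toy learner. To be clear, this is a hypothetical illustration: the two-option setup, the learning rates, and the softmax choice rule are assumptions made for the sketch, not the model fitted in the paper. It shows how a reward signal (simulated reward minus actual reward) and an action signal (predicted choice minus actual choice) can jointly refine a running simulation of another person:

```python
import math

class SimulatedOther:
    """Toy simulation of another agent choosing between two options.

    Hypothetical sketch: the learning rates and the softmax rule are
    illustrative assumptions, not parameters from the RIKEN study.
    """

    def __init__(self, lr_reward=0.3, lr_action=0.3, temperature=1.0):
        self.values = [0.0, 0.0]    # simulated values of the other's options
        self.lr_reward = lr_reward
        self.lr_action = lr_action
        self.temperature = temperature

    def predict(self):
        """Probability that the other person picks option 0 (softmax)."""
        a, b = (v / self.temperature for v in self.values)
        return math.exp(a) / (math.exp(a) + math.exp(b))

    def observe(self, choice, reward):
        """Update the simulation from one observed decision.

        Reward signal: simulated value of the chosen option vs. the
        reward the other actually received.
        Action signal: our predicted choice probability vs. the choice
        the other actually made.
        """
        reward_error = reward - self.values[choice]
        self.values[choice] += self.lr_reward * reward_error

        action_error = (1.0 if choice == 0 else 0.0) - self.predict()
        # Nudge the value gap so future predictions track observed actions.
        self.values[0] += self.lr_action * action_error
        self.values[1] -= self.lr_action * action_error
        return reward_error, action_error
```

After a few observations of the other person repeatedly picking, and being rewarded by, one option, `predict()` shifts toward that option; both error terms shrink as the simulation comes into line with the observed behavior.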

"Every day, we interact with a variety of other individuals," Suzuki said. "Some may share similar values with us and for those interactions simulation using the reward signal alone may suffice. However, other people with different values may be quite different and then the action signal may become quite important."

Nakahara believes that their approach, using mathematical models based on human behavior with brain imaging, will be useful to answer a wide range of questions about the social functions employed by the brain. “Perhaps we may one day better understand how and why humans have the ability to predict others’ behavior, even those with different characteristics. Ultimately, this knowledge could help improve political, educational, and social systems in human societies.”

Source: Science Daily

Jun 21, 2012
All Things Big and Small: The Brain's Discerning Taste for Size

ScienceDaily (June 20, 2012) — The human brain can recognize thousands of different objects, but neuroscientists have long grappled with how the brain organizes object representation; in other words, how the brain perceives and identifies different objects. Now researchers at the MIT Computer Science and Artificial Intelligence Lab (CSAIL) and the MIT Department of Brain and Cognitive Sciences have discovered that the brain organizes objects based on their physical size, with a specific region of the brain reserved for recognizing large objects and another reserved for small objects.


This figure shows brain activations while participants view pictures of large and small objects. (Credit: Image courtesy of Massachusetts Institute of Technology, CSAIL)

Their findings, to be published in the June 21 issue of Neuron, could have major implications for fields like robotics, and could lead to a greater understanding of how the brain organizes and maps information.

"Prior to this study, nobody had looked at whether the size of an object was an important factor in the brain’s ability to recognize it," said Aude Oliva, an associate professor in the MIT Department of Brain and Cognitive Sciences and senior author of the study.

"It’s almost obvious that all objects in the world have a physical size, but the importance of this factor is surprisingly easy to miss when you study objects by looking at pictures of them on a computer screen," said Dr. Talia Konkle, lead author of the paper. "We pick up small things with our fingers, we use big objects to support our bodies. How we interact with objects in the world is deeply and intrinsically tied to their real-world size, and this matters for how our brain’s visual system organizes object information."

As part of their study, Konkle and Oliva took 3D scans of brain activity during experiments in which participants were asked to look at images of big and small objects or visualize items of differing size. By evaluating the scans, the researchers found that there are distinct regions of the brain that respond to big objects (for example, a chair or a table), and small objects (for example, a paperclip or a strawberry).

By looking at the arrangement of the responses, they found a systematic organization of big to small object responses across the brain’s cerebral cortex. Large objects, they learned, are processed in the parahippocampal region of the brain, an area located by the hippocampus, which is also responsible for navigating through spaces and for processing the location of different places, like the beach or a building. Small objects are handled in the inferior temporal region of the brain, near regions that are active when the brain has to manipulate tools like a hammer or a screwdriver.

The work could have major implications for the field of robotics, in particular in developing techniques for how robots deal with different objects, from grasping a pen to sitting in a chair.

"Our findings shed light on the geography of the human brain, and could provide insight into developing better machine interfaces for robots," said Oliva.

Many computer vision techniques currently focus on identifying what an object is without using information about its size, even though size could be a useful cue in recognition. “Paying attention to the physical size of objects may dramatically constrain the number of objects a robot has to consider when trying to identify what it is seeing,” said Oliva.

The study’s findings are also important for understanding how the organization of the brain may have evolved. The work of Konkle and Oliva suggests that the human visual system’s method for organizing thousands of objects may also be tied to human interactions with the world. “If experience in the world has shaped our brain organization over time, and our behavior depends on how big objects are, it makes sense that the brain may have established different processing channels for different actions, and at the center of these may be size,” said Konkle.

Oliva, a cognitive neuroscientist by training, has focused much of her research on how the brain tackles scene and object recognition, as well as visual memory. Her ultimate goal is to gain a better understanding of the brain’s visual processes, paving the way for the development of machines and interfaces that can see and understand the visual world like humans do.

"Ultimately, we want to focus on how active observers move in the natural world. We think this not only matters for large-scale brain organization of the visual system, but it also matters for making machines that can see like us," said Konkle and Oliva.

Source: Science Daily

Jun 21, 2012
Simple mathematical pattern describes shape of neuron 'jungle'

June 20, 2012

Neurons come in an astounding assortment of shapes and sizes, forming a thick inter-connected jungle of cells. Now, UCL neuroscientists have found that there is a simple pattern that describes the tree-like shape of all neurons.

Neurons look remarkably like trees, and connect to other cells with many branches that effectively act like wires in an electrical circuit, carrying impulses that represent sensation, emotion, thought and action.

Over 100 years ago, Santiago Ramón y Cajal, the father of modern neuroscience, sought to systematically describe the shapes of neurons, and was convinced that there must be a unifying principle underlying their diversity.

Cajal proposed that neurons spread out their branches so as to use as little wiring as possible to reach other cells in the network. Reducing the amount of wiring between cells provides additional space to pack more neurons into the brain, and therefore increases its processing power.

New work by UCL neuroscientists, published today in Proceedings of the National Academy of Sciences, has revisited this century-old hypothesis using modern computational methods. They show that a simple computer program which connects points with as little wiring as possible can produce tree-like shapes which are indistinguishable from real neurons - and also happen to be very beautiful. They also show that the shape of neurons follows a simple mathematical relationship called a power law.
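The "as little wiring as possible" idea can be illustrated with a bare-bones greedy tree-builder. This is only a sketch of the principle: the published model also penalises long conduction paths back to the root, whereas the version below is plain Prim's algorithm, always attaching the unconnected target point that costs the least extra wire:

```python
import math

def minimal_wiring_tree(points):
    """Connect target points into a tree using little total wire
    (Prim's greedy rule). points[0] plays the role of the root
    (the cell body). Returns (edges, total_wire_length)."""
    connected = [0]
    remaining = set(range(1, len(points)))
    edges, total = [], 0.0
    while remaining:
        # Find the cheapest attachment of any unconnected point to the tree.
        best = None
        for i in connected:
            for j in remaining:
                d = math.dist(points[i], points[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        d, i, j = best
        edges.append((i, j))
        total += d
        connected.append(j)
        remaining.remove(j)
    return edges, total
```

Run on a cloud of random target points, the result already looks tree-like; the power-law relationship reported in the paper concerns how such trees scale, for example how total wiring grows with the number and spread of target points.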

Power laws have been shown to be common across the natural world, and often point to simple rules underlying complex structures. Dr Hermann Cuntz (UCL Wolfson Institute for Biomedical Research) and colleagues find that the power law holds true for many types of neurons gathered from across the animal kingdom, providing strong evidence for Ramón y Cajal’s general principle.

The UCL team further tested the theory by examining neurons in the olfactory bulb, a part of the brain where new brain cells are constantly being formed. These neurons grow and form new connections even in the adult brain, and therefore provide a unique window into the rules behind the development of neural trees in a mature neural circuit.

The team analysed the change in shape of the newborn olfactory neurons over several days, and found that the growth of these neurons also follows the power law, providing further evidence to support the theory.

Dr Hermann Cuntz said: “The ultimate goal of neuroscience is to understand how the impenetrable neural jungle can give rise to the complexity of behaviour.

"Our findings confirm Cajal’s original far-reaching insight that there is a simple pattern behind the circuitry, and provides hope that neuroscientists will someday be able to see the forest for the trees."

Provided by University College London

Source: medicalxpress.com

Jun 21, 2012
Fishing for Answers to Autism Puzzle

ScienceDaily (June 19, 2012) — Fish cannot display symptoms of autism, schizophrenia, or other human brain disorders. However, a team of Whitehead Institute and MIT scientists has shown that zebrafish can be a useful tool for studying the genes that contribute to such disorders.


Zebrafish with certain genes turned off during embryonic development (center and right images) showed abnormalities of brain formation (top row) and axon wiring (bottom row). At left is a normally developing zebrafish embryo. (Credit: Sive Lab)

Led by Whitehead Member Hazel Sive, the researchers set out to explore a group of about two dozen genes known to be either missing or duplicated in about 1 percent of autistic patients. Most of the genes’ functions were unknown, but a new study by Sive and Whitehead postdocs Alicia Blaker-Lee, Sunny Gupta, and Jasmine McCammon revealed that nearly all of them produced brain abnormalities when deleted in zebrafish embryos.

The findings, published online recently in the journal Disease Models & Mechanisms, should help researchers pinpoint genes for further study in mammals, says Sive, who is also professor of biology and associate dean of MIT’s School of Science. Autism is thought to arise from a variety of genetic defects; this research is part of a broad effort to identify culprit genes and develop treatments that target them.

"That’s really the goal — to go from an animal that shares molecular pathways, but doesn’t get autistic behaviors, into humans who have the same pathways and do show these behaviors," Sive says.

Sive recalls that some of her colleagues chuckled when she first proposed studying human brain disorders in fish, but it is actually a logical starting point, she says. Brain disorders are difficult to study because most of the symptoms are behavioral, and the biological mechanisms behind those behaviors are not well understood, she says.

"We thought that since we really know so little, that a good place to start would be with the genes that confer risk in humans to various mental health disorders, and to study these various genes in a system where they can readily be studied," she says.

Those genes tend to be the same across species — conserved throughout evolution, from fish to mice to humans — though they may control somewhat different outcomes in each species.

In the latest study, Sive and her colleagues focused on a genetic region known as 16p11.2, first identified by Mark Daly, a former Whitehead Fellow who discovered a type of genetic defect known as a copy number variant. A typical genome includes two copies of every gene, one from each parent; copy number variants occur when one of those copies is deleted or duplicated, and this can be associated with pathology.

The central “core” 16p11.2 region includes 25 genes. Both deletions and duplications in this region have been associated with autism, but it was unclear which of the genes might actually produce symptoms of the disease. “At the time, there was an inkling about some of them, but very few,” Sive says.

Sive and her postdocs began by identifying zebrafish genes analogous to the human genes found in this region. (In zebrafish, these genes are not clustered in a single genetic chunk, but are scattered across many chromosomes.) The researchers studied one gene at a time, silencing each with short strands of nucleic acids that target a particular gene and prevent its protein from being produced.

For 21 of the genes, silencing led to abnormal development. Most produced brain deficits, including improper development of the brain or eyes, thinning of the brain, or inflation of the brain ventricles, cavities that contain cerebrospinal fluid. The researchers also found abnormalities in the wiring of axons, the long neural projections that carry messages to other neurons, and in simple behaviors of the fish. The results show that the 16p11.2 genes are very important during brain development, helping to explain the connection between this region and brain disorders.

Furthermore, the researchers were able to restore normal development by treating the fish with the human equivalents of the genes that had been repressed. “That allows you to deduce that what you’re learning in fish corresponds to what that gene is doing in humans. The human gene and the fish gene are very similar,” Sive says.

To figure out which of these genes might have a strong effect in autism or other disorders, the researchers set out to identify genes that produce abnormal development when their activity is reduced by 50 percent, which would happen in someone who is missing one copy of the gene. (This correlation is not seen for most genes, because there are many other checks and balances that regulate how much of a particular protein is made.)

The researchers identified two such genes in the 16p11.2 region. One, called kif22, codes for a protein involved in the separation of chromosomes during cell division; the other, aldolase a, is involved in glycolysis — the process of breaking down sugar to generate energy for the cell.

In work that has just begun, Sive’s lab is working with Stanford University researchers to explore in mice predictions made from the zebrafish study. They are also conducting molecular studies in zebrafish of the pathways affected by these genes, to get a better idea of how defects in these might bring about neurological disorders.

Source: Science Daily

Jun 20, 2012
Study Finds High Brain Integration in Top Performers

June 19, 2012 By Janice Wood

Why do some people excel in sports, music and managing companies? New research points to uniquely high mind-brain development in those who excel.


“What we have found is an astonishing integration of brain functioning in high performers compared to average-performing controls,” said Fred Travis, Ph.D., director of the Center for Brain, Consciousness, and Cognition at Maharishi University of Management in Fairfield, Iowa.

He claims this research is the “first in the world to show that there is a brain measure of effective leadership.”

In the study, published in the journal Cognitive Processing, researchers found that 20 top-level managers scored higher on three measures — the Brain Integration Scale, Gibbs’s Socio-moral Reasoning questionnaire, and an inventory of peak experiences — compared to 20 low-level managers who served as controls.

“The current understanding of high performance is fragmented,” said co-researcher Harald Harung, Ph.D., of the Oslo and Akershus University College of Applied Sciences in Norway.

“What we have done in our research is to use quantitative and neurophysiological research methods on topics that so far have been dominated by psychology.”

The researchers carried out four studies comparing world-class performers to average performers. This recent study and two others examined top performers in management, sports and classical music. A number of years ago Harung and his colleagues published a study on a variety of professions, such as public administration, management, sports, arts, and education.

The studies used electroencephalography (EEG) to assess the extent of integration and development of several brain processes.


Jun 20, 2012
Infants Can't Distinguish Between Large and Small Groups

ScienceDaily (June 19, 2012) — Human brains process large and small numbers of objects using two different mechanisms, but infants have not yet developed the ability to make those two processes work together, according to new research from the University of Missouri.

"This research was the first to show the inability of infants in a single age group to discriminate large and small sets in a single task," said Kristy vanMarle, assistant professor of psychological sciences in the College of Arts and Science. "Understanding how infants develop the ability to represent and compare numbers could be used to improve early education programs."

The MU study found that infants consistently chose the larger of two groups of food items when both sets were larger or smaller than four, just as an adult would. Unlike adults, the infants showed no preference for the larger group when choosing between one large and one small set. The results suggest that at age one, infants have not yet integrated the two mental functions: the ability to estimate numbers of items at a glance and the ability to visually track small sets of objects.

In vanMarle’s study, 10- to 12-month-old infants were presented with two opaque cups. Different numbers of pieces of breakfast cereal were hidden in each cup, while the infants observed, and then the infants were allowed to choose a cup. Four comparisons were tested between different combinations of large and small sets. Infants consistently chose two food items over one and eight items over four, but chose randomly when asked to compare two versus four and two versus eight.
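The proposed two-system account can be written down as a toy decision rule. The tracking limit and the ratio threshold below are illustrative assumptions, not values measured in the MU study, but the rule reproduces the cereal-cup pattern: reliable choices within either system, and no preference when a comparison crosses between them:

```python
def judged_larger(a, b, tracking_limit=3, min_ratio=1.5):
    """Toy model of infant set comparison with two unlinked systems.

    Returns the set size judged larger, or None for "no preference".
    tracking_limit and min_ratio are illustrative assumptions.
    """
    small_a, small_b = a <= tracking_limit, b <= tracking_limit
    if small_a and small_b:
        # Object-tracking system: exact comparison of small sets.
        return max(a, b) if a != b else None
    if not small_a and not small_b:
        # Approximate number system: needs a large enough ratio.
        if max(a, b) / min(a, b) >= min_ratio:
            return max(a, b)
        return None
    # One small set, one large set: the systems are not yet integrated.
    return None
```

Here `judged_larger(2, 1)` and `judged_larger(8, 4)` pick the larger set, while `judged_larger(2, 4)` and `judged_larger(2, 8)` return no preference — the pattern the 10- to 12-month-olds showed.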

"Being unable to determine that eight is larger than two would put an organism at a serious disadvantage," vanMarle said. "However, ongoing studies in my lab suggest that the capacity to compare small and large sets seems to develop before age two."

The ability to make judgments about the relative number of objects in a group has old evolutionary roots. Dozens of species, including some fish, monkeys and birds, have shown the ability to recognize numerical differences in laboratory studies. VanMarle speculated that being unable to compare large and small sets early in infancy may not have been problematic during human evolution because young children probably received most of their food and protection from caregivers. Infants’ survival didn’t depend on determining which bush had the most berries or how many predators they just saw, she said.

"In the modern world there are educational programs that claim to give children an advantage by teaching them arithmetic at an early age," said vanMarle. "This research suggests that such programs may be ineffective simply because infants are unable to compare some numbers with others."

Source: Science Daily

Jun 20, 2012
Detector of DNA Damage: Structure of a Repair Factor Revealed

ScienceDaily (June 19, 2012) — Double-stranded breaks in cellular DNA can trigger tumorigenesis. LMU researchers have now determined the structure of a protein involved in the repair and signaling of DNA double-strand breaks. The work throws new light on the origins of neurodegenerative diseases and certain tumor types.

Agents such as radiation or environmental toxins can cause double-stranded breaks in genomic DNA, which facilitate the development of tumors or the neurodegenerative disorders ataxia telangiectasia (AT) and AT-like disease (ATLD). Hence efficient repair mechanisms are essential for cell survival and function. The so-called MRN complex is an important component of one such system, and its structure has just been elucidated by a team led by Professor Karl-Peter Hopfner of LMU’s Gene Center.

Malignant mutations

The MRN complex consists of the nuclease Mre11, the ATPase Rad50 and the protein Nbs1. Nbs1 is responsible for recruiting the protein ATM, which plays a central role in early stages of the cellular response to DNA damage, to the site of damage. “How the MRN complex actually recognizes double-stranded breaks is still not clear,” says Hopfner. He and his colleagues therefore set out to clarify the issue by analyzing the structures of mutant, functionally defective versions of the complex.

"We found that pairs of Mre11 molecules form a flexible dimer, which is stabilized by Nbs1." Mutations in different subunits of the complex are associated with distinct syndromes, marked by a predisposition to certain cancers, sensitivity to radiation or neurodegeneration. Hopfner’s results help to explain these differences. For instance, the mutation linked to ATLD lies within the zone of contact between Mre11 and Nbs1, and may inhibit activation of ATM by weakening their interaction.

Source: Science Daily

Jun 20, 2012
Hulk smash? Maybe not anymore: scientists block excess aggression in mice

June 19, 2012

Pathological rage can be blocked in mice, researchers have found, suggesting potential new treatments for severe aggression, a widespread trait characterized by sudden violence, explosive outbursts and hostile overreactions to stress.

In a study appearing today in the Journal of Neuroscience, researchers from the University of Southern California and Italy identify a critical neurological factor in aggression: a brain receptor that malfunctions in overly hostile mice. When the researchers shut down the brain receptor, which also exists in humans, the excess aggression completely disappeared.

The findings are a significant breakthrough in developing drug targets for pathological aggression, a component of many common neurological and psychiatric disorders, including Alzheimer’s disease, autism, bipolar disorder and schizophrenia.

"From a clinical and social point of view, reactive aggression is absolutely a major problem," said Marco Bortolato, lead author of the study and research assistant professor of pharmacology and pharmaceutical sciences at the USC School of Pharmacy. “We want to find the tools that might reduce impulsive violence.”

A large body of independent research, including past work by Bortolato and senior author Jean Shih, USC University Professor and Boyd & Elsie Welin Professor in Pharmacology and Pharmaceutical Sciences at USC, has identified a specific genetic predisposition to pathological aggression: low levels of the enzyme monoamine oxidase A (MAO A). Both male humans and mice with congenital deficiency of the enzyme respond violently in response to stress.

"The same type of mutation that we study in mice is associated with criminal, very violent behavior in humans. But we really didn’t understand why that it is," Bortolato said.

Bortolato and Shih worked backwards to replicate elements of human pathological aggression in mice, including not just low enzyme levels but also the interaction of genetics with early stressful events such as trauma and neglect during childhood.

"Low levels of MAO A are one basis of the predisposition to aggression in humans. The other is an encounter with maltreatment, and the combination of the two factors appears to be deadly: it results consistently in violence in adults," Bortolato said.

The researchers show that in excessively aggressive rodents that lack MAO A, high levels of electrical stimulation are required to activate a specific brain receptor in the pre-frontal cortex. Even when this brain receptor does work, it stays active only for a short period of time.

"The fact that blocking this receptor moderates aggression is why this discovery has so much potential. It may have important applications in therapy," Bortolato said. "Whatever the ways environment can persistently affect behavior — and even personality over the long term — behavior is ultimately supported by biological mechanisms."

Importantly, the aggression receptor, known as NMDA, is also thought to play a key role in helping us make sense of multiple, coinciding streams of sensory information, according to Bortolato.

The researchers are now studying the potential side effects of drugs that reduce the activity of this receptor.

"Aggressive behaviors have a profound socio-economic impact, yet current strategies to reduce these staggering behaviors are extremely unsatisfactory," Bortolato said. "Our challenge now is to understand what pharmacological tools and what therapeutic regimens should be administered to stabilize the deficits of this receptor. If we can manage that, this could truly be an important finding."

Provided by University of Southern California

Source: medicalxpress.com

Jun 20, 2012
Front-most part of the cortex involved in making short-term predictions about what will happen next

June 19, 2012

Researchers at the University of Iowa, together with colleagues from the California Institute of Technology and New York University, have discovered how a part of the brain helps predict future events from past experiences. The work sheds light on the function of the front-most part of the frontal lobe, known as the frontopolar cortex, an area of the cortex uniquely well developed in humans in comparison with apes and other primates.


The image shows the overlap of lesions for eight subjects superimposed on a template brain — red indicates maximum overlap (seven subjects) and dark blue is minimum overlap (one subject). The patient group was selected for lesions that include frontopolar cortex, but the lesions almost invariably extended outside to other parts of anterior prefrontal cortex. Credit: Christopher Kovach, University of Iowa

Making the best possible decisions in a changing and unpredictable environment is an enormous challenge. Not only does it require learning from past experience, but it also demands anticipating what might happen under previously unencountered circumstances. Past research from the UI Department of Neurology was among the first to show that damage to certain parts of the frontal lobe can cause severe deficits in decision making in rapidly changing environments. The new study from the same department on a rare group of patients with damage to the very frontal part of their brains reveals a critical aspect of how this area contributes to decision making. The findings were published June 19 in the Journal of Neuroscience.

"We gave the patients four slot machines from which to pick in order to win money. Unbeknownst to the patients, the probability of getting money from a particular slot machine gradually and unpredictably changed during the experiment. Finding the strategy that pays the most in the long run is a surprisingly difficult problem to solve, and one we hypothesized would require the frontopolar cortex,” explains Christopher Kovach, Ph.D., a UI post-doctoral fellow in neurosurgery and first author of the study.

Contrary to the authors’ initial expectation, the patients actually did quite well on the task, winning as much money, on average, as healthy control participants.

"But when we compared their behavior to that of subjects with intact frontal lobe, we found they used a different set of assumptions about how the payoffs changed over time,” Kovach says. “Both groups based their decisions on how much they had recently won from each slot machine, but healthy comparison subjects pursued a more elaborate strategy, which involved predicting the direction that payoffs were moving based on recent trends. This points towards a specific role for the frontopolar cortex in extrapolating recent trends.”
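The contrast between the two strategies can be sketched in a toy simulation of a drifting four-armed bandit. All parameters, names, and mechanics here are invented for illustration and are not taken from the study:

```python
import random

def simulate(strategy, n_trials=2000, drift=0.03, seed=1):
    """Play a 4-armed 'restless' bandit whose payoff probabilities drift
    as random walks, using either a recency-weighted strategy or one
    that also extrapolates recent trends (illustrative toy model)."""
    rng = random.Random(seed)
    probs = [0.5] * 4          # true (hidden) payoff probabilities
    est = [0.5] * 4            # recency-weighted payoff estimates
    prev = [0.5] * 4           # previous estimates, for trend extrapolation
    alpha = 0.3                # learning rate for the recency weighting
    wins = 0
    for _ in range(n_trials):
        if strategy == "trend":
            # extrapolate the direction each estimate has been moving
            values = [e + (e - p) for e, p in zip(est, prev)]
        else:
            values = est
        arm = max(range(4), key=lambda a: values[a])
        reward = 1 if rng.random() < probs[arm] else 0
        wins += reward
        prev[arm] = est[arm]
        est[arm] += alpha * (reward - est[arm])
        # the payoffs drift unpredictably: recent trends carry no signal
        probs = [min(1.0, max(0.0, p + rng.gauss(0, drift))) for p in probs]
    return wins / n_trials
```

Because the drift is a pure random walk, the trend term adds no real information — consistent with the study's observation that extrapolating trends conferred no advantage on this kind of task.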

Kovach’s colleague and study author Ralph Adolphs, Ph.D., professor of neuroscience and psychology at the California Institute of Technology, adds that the study results “argue that the frontopolar cortex helps us to make short-term predictions about what will happen next, a strategy particularly useful in environments that change rapidly — such as the stock market or most social settings.”

Adolphs also holds an adjunct appointment in the UI Department of Neurology.

The study’s innovative approach to understanding the function of this part of the brain uses model-based analyses of behavior of patients with specific and precisely characterized areas of brain damage. These patients are members of the UI’s world-renowned Iowa Neurological Patient Registry, which was established in 1982 and has more than 500 active members with selective forms of damage, or lesions, to one or two defined regions in the brain.

"The University of Iowa is one of the few places in the world where you could carry out this kind of study, since it requires carefully assessed patients with damage to specific parts of their brain," says study author Daniel Tranel, Ph.D., UI professor of neurology and psychology and director of the UI Division of Behavioral Neurology and Cognitive Neuroscience.

In a final twist, the strategy taken by the lesion patients was actually slightly better than the one used by comparison subjects: the task was designed so that the trends in the payoffs were, in fact, random and uninformative.

"The healthy comparison subjects seemed to perceive trends in what was just random noise," Kovach says.

This implies that the functions of the frontopolar cortex, which support more complex and detailed models of the environment, at times come with a downside: setting up mistaken assumptions.

"To the best of my knowledge this is the first study which links a normal tendency to see a nonexistent pattern in random noise, a type of cognitive bias, to a particular brain region," Kovach notes.

The researchers next want to investigate other parts of the frontal cortex, and have also begun to record activity directly from the brains of neurosurgical patients to see how single cells respond during decision making. The work is also important for understanding the difficulties in decision making seen in disorders such as addiction.

Provided by University of Iowa

Source: medicalxpress.com

Jun 20, 2012 · 14 notes
#science #neuroscience #brain #psychology
First example of a heritable abnormality affecting semantic cognition found

June 19, 2012

Four generations of a single family have been found to possess an abnormality within a specific brain region which appears to affect their ability to recall verbal material, a new study by researchers at the University of Bristol and University College London has found.

This is the first suggestion of such a heritable abnormality in otherwise healthy humans, and it has important implications for our understanding of the genetic basis of cognition.

Dr Josie Briscoe of Bristol’s School of Experimental Psychology and colleagues at the Institute of Child Health in London studied eight members of a single family (aged 8 years) who, despite all having high levels of intelligence, have since childhood experienced profound difficulties in recalling sentences and prose, along with language difficulties in listening comprehension and in naming less common objects.

While their conversation is articulate and engaging, they can experience the inability to ‘find’ a particular word or topic – a phenomenon similar to the ‘tip-of-the-tongue’ problem experienced by many people. They also report associated problems such as struggling to follow a narrative thread while reading or watching television drama.

Dr Briscoe said: “With their consent, we conducted a number of standard memory and language tests on the affected members of the family. These showed they had difficulty repeating longer sentences correctly and learning words in lists and pairs. This suggests their difficulties lie in semantic cognition: the way people construct and generate meaning from words, objects and ideas.”

"Given the very wide variation in age, the coherence of their difficulties in semantic cognition was remarkable."

The researchers also used Magnetic Resonance Imaging (MRI) to study the brains of the affected family members and found they had reduced grey matter in the posterior inferior portion of the temporal lobe, a brain area known to be involved in semantic cognition.

Dr Briscoe said: “These brain abnormalities were surprising to find in healthy people, particularly in the same family, although similar brain regions have been implicated in research with older adults with neurological problems that are linked to semantic cognition.”

"Our findings have uncovered a potential causal link between anomalous neuroanatomy and semantic cognition in a single family. Importantly, the pattern of inheritance appears as a potentially dominant trait. This may well prove to be the first example of a heritable, highly specific abnormality affecting semantic cognition in humans.”

Provided by University of Bristol

Source: medicalxpress.com

Jun 20, 2012 · 10 notes
#science #neuroscience #brain #psychology
'Hallucinating' robots arrange objects for human use

June 18, 2012 By Bill Steele

(Phys.org) — If you hire a robot to help you move into your new apartment, you won’t have to send out for pizza. But you will have to give the robot a system for figuring out where things go. The best approach, according to Cornell researchers, is to ask “How will humans use this?”


A robot populates a room with imaginary human stick figures in order to decide where objects should go to suit the needs of humans.

Researchers in the Personal Robotics Lab of Ashutosh Saxena, assistant professor of computer science, have already taught robots to identify common objects, pick them up and place them stably in appropriate locations. Now they’ve added the human element by teaching robots to “hallucinate” where and how humans might stand, sit or work in a room, and place objects in their usual relationship to those imaginary people.

Their work will be reported at the International Symposium on Experimental Robotics, June 21 in Quebec, and the International Conference on Machine Learning, June 29 in Edinburgh, Scotland.

Previous work on robotic placement, the researchers note, has relied on modeling relationships between objects. A keyboard goes in front of a monitor, and a mouse goes next to the keyboard. But that doesn’t help if the robot puts the monitor, keyboard and mouse at the back of the desk, facing the wall.


Above left, random placing of objects in a scene puts food on the floor, shoes on the desk and a laptop teetering on the top of the fridge. Considering the relationships between objects (upper right) is better, but the laptop is facing away from a potential user and the food is higher than most humans would like. Adding human context (lower left) makes things more accessible. Lower right: how an actual robot carried it out. (Personal Robotics Lab)

Relating objects to humans not only avoids such mistakes but also makes computation easier, the researchers said, because each object is described in terms of its relationship to a small set of human poses, rather than to the long list of other objects in a scene. A computer learns these relationships by observing 3-D images of rooms with objects in them, into which it imagines human figures, placing them in practical relationships with objects and furniture. You don’t put a sitting person where there is no chair. You can put a sitting person on top of a bookcase, but there are no objects there for the person to use, so that pose is ignored. The computer calculates the distance of objects from various parts of the imagined human figures, and notes the orientation of the objects.

Eventually it learns commonalities: There are lots of imaginary people sitting on the sofa facing the TV, and the TV is always facing them. The remote is usually near a human’s reaching arm, seldom near a standing person’s feet. “It is more important for a robot to figure out how an object is to be used by humans, rather than what the object is. One key achievement in this work is using unlabeled data to figure out how humans use a space,” Saxena said.

In a new situation, the robot places human figures in a 3-D image of a room, locating them in relation to objects and furniture already there. “It puts a sample of human poses in the environment, then figures out which ones are relevant and ignores the others,” Saxena explained. It decides where new objects should be placed in relation to the human figures, and carries out the action.
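As a rough illustration of the idea — a minimal sketch with invented names and numbers, not the lab's actual model — a candidate placement can be scored by how close it sits to a comfortable reaching distance from any of the sampled human poses:

```python
import math

def placement_score(candidate, poses, preferred_dist=0.6):
    """Score a candidate (x, y) object location against sampled human
    poses: highest when the object sits near some pose's comfortable
    reaching distance. All constants are illustrative assumptions."""
    best = 0.0
    for (px, py) in poses:
        d = math.hypot(candidate[0] - px, candidate[1] - py)
        # Gaussian preference centered on the preferred reaching distance
        best = max(best, math.exp(-((d - preferred_dist) ** 2) / 0.1))
    return best

def choose_placement(candidates, poses):
    """Pick the candidate location that best suits the imagined humans."""
    return max(candidates, key=lambda c: placement_score(c, poses))
```

For example, with imagined people at (0, 0) and (3, 3), a spot 0.6 units from a person outscores one far from everyone, so the nearby spot is chosen.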

The researchers tested their method using images of living rooms, kitchens and offices from the Google 3-D Warehouse, and later, images of local offices and apartments. Finally, they programmed a robot to carry out the predicted placements in local settings. Volunteers who were not associated with the project rated the placement of each object for correctness on a scale of 1 to 5.

Comparing various algorithms, the researchers found that placements based on human context were more accurate than those based solely on relationships between objects, but the best results of all came from combining human context with object-to-object relationships, with an average score of 4.3. Some tests were done in rooms with furniture and some objects, others in rooms where only a major piece of furniture was present. The object-only method performed significantly worse in the latter case because there was no context to use. “The difference between previous works and our [human to object] method was significantly higher in the case of empty rooms,” Saxena reported.

Provided by Cornell University

Source: phys.org

Jun 19, 2012 · 11 notes
#science #neuroscience #robotics
Robots Get a Feel for the World

June 18th, 2012

Robots equipped with a tactile sensor are able to identify materials through touch, paving the way for more useful prostheses.

What does a robot feel when it touches something? Little or nothing until now. But with the right sensors, actuators and software, robots can be given the sense of feel, or at least the ability to identify different materials by touch.

Researchers at the University of Southern California’s Viterbi School of Engineering published a study today in Frontiers in Neurorobotics showing that a specially designed robot can outperform humans in identifying a wide range of natural materials according to their textures, paving the way for advancements in prostheses, personal assistive robots and consumer product testing.

The robot was equipped with a new type of tactile sensor built to mimic the human fingertip, and used a newly designed algorithm to decide how to explore the outside world by imitating human strategies. The sensor can also tell where and in which direction forces are applied to the fingertip, and even sense the thermal properties of an object being touched.

Like the human finger, the group’s BioTac® sensor has a soft, flexible skin over a liquid filling. The skin even has fingerprints on its surface, greatly enhancing its sensitivity to vibration. As the finger slides over a textured surface, the skin vibrates in characteristic ways. These vibrations are detected by a hydrophone inside the bone-like core of the finger. The human finger uses similar vibrations to identify textures, but the robot finger is even more sensitive.

[Video: Robots Get a Feel for the World]
What does a robot feel when it touches something? Little or nothing until now. Researchers at the USC Viterbi School of Engineering publish a study in Frontiers in Neurorobotics showing that specially designed robots can be taught to feel even more than humans. Vimeo video by USC Viterbi.

When humans try to identify an object by touch, they use a wide range of exploratory movements based on their prior experience with similar objects. A famous theorem by 18th century mathematician Thomas Bayes describes how decisions might be made from the information obtained during these movements. Until now, however, there was no way to decide which exploratory movement to make next. The article, authored by Professor of Biomedical Engineering Gerald Loeb and recently graduated doctoral student Jeremy Fishel, describes their new theorem for solving this general problem as “Bayesian Exploration.”

Built by Fishel, the specialized robot was trained on 117 common materials gathered from fabric, stationery and hardware stores. When confronted with one material at random, the robot could correctly identify the material 95% of the time, after intelligently selecting and making an average of five exploratory movements. It was only rarely confused by pairs of similar textures that human subjects making their own exploratory movements could not distinguish at all.
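A minimal sketch of the Bayesian Exploration idea: maintain a belief over candidate materials, and pick the next exploratory movement expected to shrink that belief's entropy the most. This tiny discrete model is an assumption for illustration, not the authors' published implementation:

```python
import math

def entropy(dist):
    """Shannon entropy (in nats) of a discrete distribution."""
    return -sum(p * math.log(p) for p in dist if p > 0)

def bayes_update(prior, likelihoods, obs):
    """Posterior over materials after observing `obs`, via Bayes' rule."""
    post = [p * lik[obs] for p, lik in zip(prior, likelihoods)]
    z = sum(post)
    return [p / z for p in post]

def best_movement(prior, movements):
    """Pick the exploratory movement with the lowest expected posterior
    entropy, i.e. the most informative next touch. `movements` maps a
    movement name to per-material observation likelihoods (a toy
    stand-in for the robot's learned sensor models)."""
    def expected_entropy(name):
        liks = movements[name]
        n_obs = len(liks[0])
        total = 0.0
        for obs in range(n_obs):
            p_obs = sum(p * lik[obs] for p, lik in zip(prior, liks))
            if p_obs > 0:
                total += p_obs * entropy(bayes_update(prior, liks, obs))
        return total
    return min(movements, key=expected_entropy)
```

Here a discriminative movement (say, a slide that excites texture-specific vibrations) is preferred over an uninformative one, and each observation sharpens the posterior by Bayes' rule.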


Tactile sensors that mimic fingertips enable robots to identify materials through touch better than humans. Image from press release by USC Viterbi School of Engineering.

So, is touch another task that humans will outsource to robots? Fishel and Loeb point out that while their robot is very good at identifying which textures are similar to each other, it has no way to tell what textures people will prefer. Instead, they say this robot touch technology could be used in human prostheses or to assist companies who employ experts to assess the feel of consumer products and even human skin.

Source: Neuroscience News

Jun 19, 2012 · 13 notes
#science #neuroscience #robotics
Children, Brain Development and the Criminal Law

ScienceDaily (June 18, 2012) — The legal system needs to take greater account of new discoveries in neuroscience showing how a difficult childhood can affect the development of a young person’s brain, which can increase the risk of adolescent crime, according to researchers.

The research will be presented as part of an Economic and Social Research Council seminar series in conjunction with the Parliamentary Office of Science and Technology.

Neuroscientists have recently shown that early adversity — such as a very chaotic and frightening home life — can result in a young child becoming hypervigilant to potential threats in their environment. This appears to influence the development of brain connectivity and functions.

Such children may come to adolescence with brain systems that are set differently, and this may increase their likelihood of taking impulsive risks. For many young offenders such early adversity is a common experience, and it may increase both their vulnerability to mental health problems and also their risk of problem behaviours.

These insights, from a team led by Dr Eamon McCrory, University College London, are part of a wave of neuroscientific research questions that have potential implications for the legal system.

Other research by Dr Seena Fazel of Oxford University has shown that while social disadvantage is a major risk factor for offending, a Traumatic Brain Injury (TBI) — from an accident or assault — significantly increases the risk of involvement in violent crime. Professor Huw Williams, at University of Exeter, has similarly shown that around 45 per cent of young offenders have TBI histories, and more injuries are associated with greater violence.

Professor Williams said: “The latest message from neuroscience is that young people who suffer troubled childhoods may experience a kind of ‘triple whammy’. A difficult social background may put them at greater risk of offending and influence their brain development early on in childhood in a way that increases risky behaviour. This can then increase their chances of experiencing an injury to their brains that would compromise their ability to stay in school or contribute to society still further.”

Professor Williams wants to see better communication between neuroscientists, clinicians and lawyers so that research findings like these lead to changes in the legal system. “There is a big gap between research conducted by neuroscientists and the realities of the day to day work of the justice system,” he said. “Although criminal behaviour results from a complex interplay of a host of factors, neuroscientists and clinicians are identifying key risk factors that — if addressed — could reduce crime. Investment in earlier, focussed interventions may offset the costs of years of custody and social violence.”

Dr Eileen Vizard, a prominent adolescent forensic psychiatrist, will talk at the event Neuroscience, Children and the Law, about how the criminal justice system needs to be changed to age appropriate sentencing for children as young as ten years old, whilst also providing for the welfare needs of these deprived children. Laura Hoyano — a leading expert on vulnerable people in criminal courts — will discuss the problems children face when testifying in criminal courts.

Source: Science Daily

Jun 19, 2012 · 11 notes
#science #neuroscience #psychology #brain
Clues to Nervous System Evolution Found in Nerve-Less Sponge

ScienceDaily (June 18, 2012) — UC Santa Barbara scientists turned to the simple sponge to find clues about the evolution of the complex nervous system and found that, but for a mechanism that coordinates the expression of genes that lead to the formation of neural synapses, sponges and the rest of the animal world may not be so distant after all. Their findings, titled “Functionalization of a protosynaptic gene expression network,” are published in the Proceedings of the National Academy of Sciences.


The genes of Amphimedon queenslandica, a marine sponge native to the Great Barrier Reef, Australia, have been fully sequenced, allowing the researchers to monitor gene expression for signs of neural development. (Credit: UCSB)

"If you’re interested in finding the truly ancient origins of the nervous system itself, we know where to look," said Kenneth Kosik, Harriman Professor of Neuroscience Research in the Department of Molecular, Cellular & Developmental Biology, and co-director of UCSB’s Neuroscience Research Institute.

That place, said Kosik, is the evolutionary period when virtually the rest of the animal kingdom branched off from a common ancestor it shared with sponges, the oldest known animal group with living representatives. Something must have happened to spur the evolution of the nervous system, a characteristic shared by creatures ranging from simple jellyfish and hydra to complex humans, according to Kosik.

A previous sequencing of the genome of Amphimedon queenslandica — a sponge that lives in Australia’s Great Barrier Reef — showed that it contains the same genes that lead to the formation of synapses, the highly specialized characteristic component of the nervous system that sends chemical and electrical signals between cells. Synapses are like microprocessors, said Kosik, explaining that they carry out many sophisticated functions: They send and receive signals, and they also change behaviors with interaction — a property called “plasticity.”

"Specifically, we were hoping to understand why the marine sponge, despite having almost all the genes necessary to build a neuronal synapse, does not have any neurons at all," said the paper’s first author, UCSB postdoctoral researcher Cecilia Conaco, from the UCSB Department of Molecular, Cellular, and Developmental Biology (MCDB) and Neuroscience Research Institute (NRI). "In the bigger scheme of things, we were hoping to gain an understanding of the various factors that contribute to the evolution of these complex cellular machines."

This time the scientists, including Danielle Bassett, from the Department of Physics and the Sage Center for the Study of the Mind, and Hongjun Zhou and Mary Luz Arcila, from NRI and MCDB, examined the sponge’s RNA (ribonucleic acid), a macromolecule that controls gene expression. They followed the activity of the genes that encode for the proteins in a synapse throughout the different stages of the sponge’s development.

"We found a lot of them turning on and off, as if they were doing something," said Kosik. However, compared to the same genes in other animals, which are expressed in unison, suggesting a coordinated effort to make a synapse, the ones in sponges were not coordinated.

"It was as if the synapse gene network was not wired together yet," said Kosik. The critical step in the evolution of the nervous system as we know it, he said, was not the invention of a gene that created the synapse, but the regulation of preexisting genes that were somehow coordinated to express simultaneously, a mechanism that took hold in the rest of the animal kingdom.
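One simple way to picture the "wired together" test is to correlate the genes' expression time courses: a coordinated network shows high average pairwise correlation, while genes that flicker on and off independently do not. This is a toy measure for illustration, not the paper's actual analysis:

```python
def pearson(x, y):
    """Pearson correlation of two equal-length expression time courses."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def mean_coexpression(profiles):
    """Average pairwise correlation across genes: close to 1 when the
    network is expressed in unison, near 0 (or negative) when genes
    switch on and off independently, as in the sponge."""
    genes = list(profiles)
    pairs = [(a, b) for i, a in enumerate(genes) for b in genes[i + 1:]]
    return sum(pearson(profiles[a], profiles[b]) for a, b in pairs) / len(pairs)
```

On made-up profiles, three genes rising in lockstep across developmental stages score 1.0, while genes moving in opposite directions score negatively.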

The work isn’t over, said Kosik. Plans for future research include a deeper look at some of the steps that lead to the formation of the synapse; and a study of the changes in nervous systems after they began to evolve.

"Is the human brain just a lot more of the same stuff, or has it changed in a qualitative way?" he asked.

Source: Science Daily

Jun 19, 2012 · 13 notes
#science #neuroscience #evolution #psychology #nervous system
Diabetes, poor glucose control associated with greater cognitive decline in older adults

June 18, 2012

Among well-functioning older adults without dementia, diabetes mellitus (DM) and poor glucose control among those with DM are associated with worse cognitive function and greater cognitive decline, according to a report published Online First by Archives of Neurology, a JAMA Network publication.

Findings from previous studies have suggested an association between diabetes mellitus and an increased risk of cognitive impairment and dementia, including Alzheimer disease, but this association continues to be debated and less is known regarding incident DM in late life and cognitive function over time, the authors write as background in the study.

Kristine Yaffe, M.D., of the University of California, San Francisco and the San Francisco VA Medical Center, and colleagues evaluated 3,069 patients (mean age, 74.2 years; 42 percent black; 52 percent female) who completed the Modified Mini-Mental State Examination (3MS) and Digit Symbol Substitution Test (DSST) at baseline and selected intervals over 10 years.

At study baseline, 717 patients (23.4 percent) had prevalent DM and 2,352 (76.6 percent) were without DM, 159 of whom developed DM during follow-up. Patients who had prevalent DM at baseline had lower 3MS and DSST test scores than patients without DM, and the analysis showed similar patterns for 9-year decline, with participants with prevalent DM showing significantly greater decline on both the 3MS and DSST compared with those without DM.

Also, among participants with prevalent DM at baseline, higher levels of hemoglobin A1c (HbA1c) were associated with lower 3MS and DSST scores. However, after adjusting for age, sex, race and education, scores remained significantly lower for those with mid (7 percent to 8 percent) and high (greater than or equal to 8 percent) HbA1c levels on the 3MS but were no longer significant for the DSST.

"This study supports the hypothesis that older adults with DM have reduced cognitive function and that poor glycemic control may contribute to this association,” the authors conclude. “Future studies should determine if early diagnosis and treatment of DM lessen the risk of developing cognitive impairment and if maintaining optimal glucose control helps mitigate the effect of DM on cognition.”

Provided by JAMA and Archives Journals

Source: medicalxpress.com

Jun 19, 2012 · 2 notes
#science #neuroscience #brain #alzheimer
Highways of the brain: High-cost and high-capacity

June 18, 2012

A new study proposes a communication routing strategy for the brain that mimics the American highway system, with the bulk of the traffic leaving the local and feeder neural pathways to spend as much time as possible on the longer, higher-capacity passages through an influential network of hubs, the so-called rich club.


The study, published this week online in the Early Edition of the Proceedings of the National Academy of Sciences, involves researchers from Indiana University and the University Medical Center Utrecht in the Netherlands and advances their earlier findings that showed how select hubs in the brain not only are powerful in their own right but have numerous and strong connections between each other.

The current study characterizes the influential network within the rich club as the “backbone” for global brain communication. It is a costly network in terms of the energy and space consumed, said Olaf Sporns, professor in the Department of Psychological and Brain Sciences at IU Bloomington, but one with a big payoff: providing quick and effective communication among billions of brain cells.

"Until now, no one knew how central the brain’s rich club really was," Sporns said. "It turns out the rich club is always right in the middle when it comes to how brain regions talk to each other. It absorbs, transforms and disseminates information. This underscores its importance for brain communication.”

In earlier work, using diffusion imaging, the researchers found a group of 12 strongly interconnected bihemispheric hub regions, comprising the precuneus, superior frontal and superior parietal cortex, as well as the subcortical hippocampus, putamen and thalamus. Together, these regions form the brain’s “rich club.” Most of these areas are engaged in a wide range of complex behavioral and cognitive tasks, rather than more specialized processing such as vision and motor control.

For the current study, Martijn van den Heuvel, a professor at the Rudolf Magnus Institute of Neuroscience at University Medical Center Utrecht, used diffusion tensor imaging data for two sets of 40 healthy subjects to map the large-scale connectivity structure of the brain. The cortical sheet was divided into 1,170 regions, and then pathways between the regions were reconstructed and measured. As in the previous study, the rich club nodes were widely distributed and had up to 40 percent more connectivity compared to other areas.

The connections measured — almost 700,000 in total — were classified in one of three ways: as rich club connections if they connected nodes within the rich club; as feeder connections if they connected a non-rich club node to a rich club node; and as local connections if they connected non-rich club nodes. Rich club connections made up the majority of all long-distance neural pathways. The study also found that connections classified as rich club connections were used more heavily for communication than other feeder and local connections. A path analysis showed that when a minimally short path is traced from one area of the brain to another, it travels through the rich club network 69 percent of the time, even though the network accounts for only 10 percent of the brain.

A common pattern in communication paths spanning long distances, Sporns said, was that such paths involved sequences of steps leading across local, feeder, rich club, feeder and back to local connections. In other words, he said, many communication paths first traveled toward the rich club before reaching their destinations.
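The edge classification and path tracing can be sketched on a toy graph (node names invented for illustration): the shortest path from one local node to another crosses local, feeder, rich-club, and feeder edges, matching the pattern described above.

```python
from collections import deque

def edge_class(u, v, rich):
    """Classify an edge by the study's scheme: rich club, feeder, or local."""
    in_u, in_v = u in rich, v in rich
    if in_u and in_v:
        return "rich club"
    if in_u or in_v:
        return "feeder"
    return "local"

def shortest_path(adj, src, dst):
    """Breadth-first search for a shortest path in an unweighted graph."""
    prev = {src: None}
    queue = deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            path = []
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for nb in adj[node]:
            if nb not in prev:
                prev[nb] = node
                queue.append(nb)
    return None

# Toy network: two local nodes ("a", "c") joined through two hubs ("R1", "R2").
adj = {
    "a": ["b"], "b": ["a", "R1"],
    "R1": ["b", "R2"], "R2": ["R1", "c"],
    "c": ["R2"],
}
rich = {"R1", "R2"}
path = shortest_path(adj, "a", "c")
classes = [edge_class(u, v, rich) for u, v in zip(path, path[1:])]
```

Repeating this over all node pairs, and counting how often the shortest path touches a rich-club node, is the spirit of the 69 percent figure reported in the study.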

"It is as if the rich club acts as an attractor for signal traffic in the brain," Sporns said. "It soaks up information which is then integrated and sent back out to the rest of the brain."

Van den Heuvel agreed.

"It’s like a big ‘neuronal magnet’ for communication and information integration in our brains," he said. "Seeking out the rich club may offer a strategy for neurons and brain regions to find short communication paths across the brain, and might provide insight into how our brain manages to be so highly efficient."

From an evolutionary standpoint, it was important for the brain to minimize energy consumption and wiring volume, but if these were the only factors, there would be no rich club because of the extra resources it requires, Sporns said. The rich club is expensive, at least in terms of wiring volume, and perhaps also in terms of metabolic cost. The trade-off for higher cost, Sporns said, is higher performance — the integration of diverse signals and the ability to select short paths across the network.

“Brain neurons don’t have maps; how do they find paths to get in touch? Perhaps the rich club helps with this, offering the brain’s neurons and regions a way to communicate efficiently based on a routing strategy that involves the rich club.”

People use related strategies to navigate social networks.

"Strangely, neurons may solve their communication problems just like the people to whom they belong," Sporns said.

Provided by Indiana University

Source: medicalxpress.com

Jun 19, 2012 · 13 notes
#science #neuroscience #brain #psychology
Coenzyme Q10 study indicates promise in Huntington's treatment

June 18, 2012

A new study shows that the compound Coenzyme Q10 (CoQ) reduces oxidative damage, a key finding that hints at its potential to slow the progression of Huntington’s disease. The discovery, which appears in the inaugural issue of the Journal of Huntington’s Disease, also points to a new biomarker that could be used to screen experimental treatments for this and other neurological disorders.

"This study supports the hypothesis that CoQ exerts antioxidant effects in patients with Huntington’s disease and therefore is a treatment that warrants further study," says University of Rochester Medical Center neurologist Kevin M. Biglan, M.D., M.P.H., lead author of the study. “As importantly, it has provided us with a new method to evaluate the efficacy of potential new treatments.”

Huntington’s disease (HD) is a genetic, progressive neurodegenerative disorder that affects movement, behavior, and cognition, and generally results in death within 20 years of onset. While the precise causes and mechanisms of the disease are not completely understood, scientists believe that one of its important triggers is a genetic “stutter” that produces abnormal protein deposits in brain cells. It is believed that these deposits – through a chain of molecular events – inhibit the cell’s ability to meet its energy demands, resulting in oxidative stress and, ultimately, cellular death.

Scientists had previously identified the correlation between a product of DNA oxidation, called 8-hydroxy-2’-deoxyguanosine (8OHdG), and the presence of oxidative stress in brain cells. 8OHdG can be detected in a person’s blood, meaning that it could serve as a convenient and accessible biomarker for the disease. Researchers have also been evaluating the compound Coenzyme Q10 as a possible treatment for HD because of its ability to support the function of mitochondria – the tiny power plants that provide cells with energy – and counter oxidative stress.

The study’s authors evaluated a series of blood samples from 20 individuals with HD who had previously undergone treatment with CoQ in a clinical trial titled Pre-2Care. While earlier studies showed that CoQ alleviated some symptoms of the disease, it was not known what impact – if any – the treatment had at the molecular level in the brain. Upon analysis, the authors found that 8OHdG levels dropped by 20 percent in individuals who had been treated with CoQ.

CoQ is currently being evaluated in a Phase 3 clinical trial, the largest therapeutic clinical study to date for HD. The trial – called 2Care – is being run by the Huntington Study Group, an international network of investigators.

"Identifying treatments that slow the progression or delay the onset of Huntington’s disease is a major focus of the medical community," said Biglan. "This study demonstrates that 8OHdG could be an ideal marker to identify the presence of oxidative injury and whether or not treatment is having an impact."

Provided by University of Rochester Medical Center

Source: medicalxpress.com

#science #neuroscience #brain #huntington #psychology
Device implanted in brain has therapeutic potential for Huntington's disease

June 18, 2012

Studies suggest that neurotrophic factors, which play a role in the development and survival of neurons, have significant therapeutic and restorative potential for neurologic diseases such as Huntington’s disease. However, clinical applications are limited because these proteins cannot easily cross the blood-brain barrier, have a short half-life, and cause serious side effects. Now, a group of scientists has successfully treated neurological symptoms in laboratory rats by implanting a device to deliver a genetically engineered neurotrophic factor directly to the brain. They report on their results in the latest issue of Restorative Neurology and Neuroscience.

image

The tip of the EC biodelivery system, a straw-like device that is implanted in the brain of patients, contains living cells which are genetically modified to produce a therapeutic factor. The membrane enclosing the cells allows the factor to flow out of the device and into the patient’s brain tissue. This way, areas deep within the brain affected by Huntington’s disease can be treated to delay or prevent the disease. Credit: Jens Tornøe, NsGene A/S, Ballerup, Denmark

Researchers used Encapsulated Cell (EC) biodelivery, a platform which can be applied using conventional minimally invasive neurosurgical procedures to target deep brain structures with therapeutic proteins. “Our study adds to the continually increasing body of preclinical and clinical data positioning EC biodelivery as a promising therapeutic delivery method for larger biomolecules. It combines the therapeutic advantages of gene therapy with the well-established safety of a retrievable implant,” says lead investigator Jens Tornøe, NsGene A/S, Ballerup, Denmark.

Investigators made a catheter-like device consisting of a hollow fiber membrane encapsulating a polymeric “scaffold,” which provides a surface area to which neurotrophic factor-producing cells can attach. When implanted in the brain, the membrane allows the neurotrophic factor to flow out of the device, as well as allowing nutrients in. Dr. Tornøe and his colleagues used the neurotrophic factor Meteorin, which plays a role in the development of striatal projection neurons, whose degeneration is a hallmark of Huntington’s disease. The scientists engineered ARPE-19 cells to produce Meteorin and used those that produced high levels of Meteorin in their experiment.

The EC biodelivery devices were implanted in the brains of rats, which were then injected with quinolinic acid (QA), a potent neurotoxin that causes excitotoxicity, a component of Huntington’s disease. The researchers tested three different implant types: devices filled with the high-producing ARPE-19 cells (EC-Meteorin), devices with unmodified ARPE-19 cells (ARPE-19), and devices without cells. Motor function was tested immediately prior to injection with QA and at two and four weeks after injection.

The research team found that the EC-Meteorin devices significantly protected against QA-induced toxicity. Rats with EC-Meteorin devices manifested near normal neurological performance and significantly reduced loss of brain cells from the QA injection compared to controls. Analysis of the Meteorin-treated brains showed a markedly reduced striatal lesion size. The EC biodelivery devices were found to produce stable or even increasing levels of Meteorin throughout the study. Meteorin diffused readily from the biodelivery device to the striatal tissue.

"Huntington’s disease can be diagnosed with high accuracy by genetic testing. Pre-symptomatic administration of a safe therapeutic treatment providing sustained delay or prevention of disease would be of great benefit to patients," says Dr. Tornøe. "With additional functional and safety data, tests in animals larger than the rat to study distribution, and more accurate disease models to evaluate the therapeutic potential of Meteorin, we anticipate that EC biodelivery can be developed as a platform technology for targeted therapy in patients with Huntington’s disease."

Provided by IOS Press

Source: medicalxpress.com

#science #neuroscience #brain #psychology #huntington
MRI images show what the brain looks like when you lose self-control

June 18, 2012

New pictures from the University of Iowa show what it looks like when a person runs out of patience and loses self-control.

image

This image shows brain activity when people exert self-control. Credit: University of Iowa

A study by University of Iowa neuroscientist and neuro-marketing expert William Hedgcock confirms previous studies that show self-control is a finite commodity that is depleted by use. Once the pool has dried up, we’re less likely to keep our cool the next time we’re faced with a situation that requires self-control.

But Hedgcock’s study is the first to actually show it happening in the brain using fMRI images that scan people as they perform self-control tasks. The images show the anterior cingulate cortex (ACC)—the part of the brain that recognizes a situation in which self-control is needed and says, “Heads up, there are multiple responses to this situation and some might not be good”—fires with equal intensity throughout the task.

However, the dorsolateral prefrontal cortex (DLPFC)—the part of the brain that manages self-control and says, “I really want to do the dumb thing, but I should overcome that impulse and do the smart thing”—fires with less intensity after prior exertion of self-control.

image

This image shows brain activity after people have been engaged in self-control tasks long enough that self-control resources have been depleted. Credit: University of Iowa

He said that loss of activity in the DLPFC might be the person’s self-control draining away. The stable activity in the ACC suggests people have no problem recognizing a temptation. Although they keep fighting, they have a harder and harder time not giving in.

That would explain why someone who works very hard not to take seconds of lasagna at dinner winds up taking two pieces of cake at dessert. The study could also modify previous thinking that considered self-control to be like a muscle. Hedgcock says his images seem to suggest that it’s like a pool that can be drained by use and then replenished through time in a lower-conflict environment, away from temptations that require its use.

The researchers gathered their images by placing subjects in an MRI scanner and then having them perform two self-control tasks—the first involved ignoring words that flashed on a computer screen, while the second involved choosing preferred options. The study found the subjects had a harder time exerting self-control on the second task, a phenomenon called “regulatory depletion.” Hedgcock says that the subjects’ DLPFCs were less active during the second self-control task, suggesting it was harder for the subjects to overcome their initial response.

Hedgcock says the study is an important step toward a clearer definition of self-control and toward figuring out why people do things they know aren’t good for them. One possible implication is crafting better programs to help people who are trying to break addictions to things like food, shopping, drugs, or alcohol. Some therapies now help people break addictions by focusing on the conflict-recognition stage and encouraging the person to avoid situations where that conflict arises. For instance, an alcoholic should stay away from places where alcohol is served.

But Hedgcock says his study suggests new therapies might be designed by focusing on the implementation stage instead. For instance, he says dieters sometimes offer to pay a friend if they fail to implement control by eating too much food, or the wrong kind of food. That penalty adds a real consequence to their failure to implement control and increases their odds of choosing a healthier alternative.

The study might also help people who suffer from a loss of self-control due to a birth defect or brain injury.

"If we know why people are losing self-control, it helps us design better interventions to help them maintain control," says Hedgcock, an assistant professor in the Tippie College of Business marketing department and the UI Graduate College’s Interdisciplinary Graduate Program in Neuroscience.

Provided by University of Iowa

Source: medicalxpress.com

#science #neuroscience #brain #psychology
The neurological basis for fear and memory

June 18, 2012

Fear conditioning with sound and conditioned taste aversion, as applied to mice, have revealed interesting information about the basis of memory allocation.

image

Credit: Thinkstock

The European project ‘Cellular mechanisms underlying formation of the fear memory trace in the mouse amygdala’ (FEAR Memory TRACE) is investigating memory allocation and the recruitment of certain neurons to encode a memory. By studying conditioned fear memory in response to an auditory stimulus, the researchers have delved into pathological emotional states and the neural mechanisms involved in memory allocation, retrieval and extinction.

Prior research has revealed that the conditioned fear response in mice is located in a specific bundle of neurons in the amygdala. Memory allocation is modulated by expression of the transcription factor cyclic adenosine 3’,5’-monophosphate response element binding protein (CREB) and possibly by neuronal excitability.

FEAR Memory TRACE focused on the electrophysiological properties of neurons encoding the same memory. The project also aimed to ascertain the biophysical mechanisms in the plasticity changes recorded in the specific set of neurons in the fear memory trace.

To record information on auditory fear conditioning and conditioned taste aversion, the scientists performed intra-amygdala surgery with viral vectors and carried out electrophysiological experiments to detect neuronal excitability.

In neural control experiments, neurons were transfected with a virus carrying CREB tagged with green fluorescent protein together with the gene for channelrhodopsin-2. Combined, these two elements allowed firing to be driven in specific nerve cells. Molecular techniques included western blotting for protein detection, genotyping and viral DNA preparation.

Behavioural tests on long- and short-term memory in mice, involving fear conditioning and taste aversion, showed increased memory performance at the three-hour time point but not at the five-hour point. The intrinsic excitability of mice that received both the shock and the tone was increased at three hours, but not at five, compared to mice that received only the tone.

As the project continues toward its close in two years, the aim is to identify the biophysical mechanisms involved in recruiting neurons that compete with each other for a specific memory. FEAR Memory TRACE will also develop computational models to assess the role of these mechanisms in memory performance.

Information on the biochemical processes underlying these neural mechanisms has wide application in many clinical situations, including for patients suffering memory loss, such as stroke victims. Manipulation of the fear response could be applied in the treatment of neuroses and phobias.

Provided by CORDIS

Source: medicalxpress.com

#science #neuroscience #brain #psychology #memory #emotion