Neuroscience

August 2013

Language can reveal the invisible

It is natural to imagine that the sense of sight takes in the world as it is — simply passing on what the eyes collect from light reflected by the objects around us.

But the eyes do not work alone. What we see is a function not only of incoming visual information, but also how that information is interpreted in light of other visual experiences, and may even be influenced by language.

Words can play a powerful role in what we see, according to a study published this month by UW-Madison cognitive scientist and psychology professor Gary Lupyan, and Emily Ward, a Yale University graduate student, in the journal Proceedings of the National Academy of Sciences.

"Perceptual systems do the best they can with inherently ambiguous inputs by putting them in context of what we know, what we expect," Lupyan says. "Studies like this are helping us show that language is a powerful tool for shaping perceptual systems, acting as a top-down signal to perceptual processes. In the case of vision, what we consciously perceive seems to be deeply shaped by our knowledge and expectations."

And those expectations can be altered with a single word.

To show how deeply words can influence perception, Lupyan and Ward used a technique called continuous flash suppression to render a series of objects invisible for a group of volunteers.

Each person was shown a picture of a familiar object — such as a chair, a pumpkin or a kangaroo — in one eye. At the same time, their other eye saw a series of flashing, “squiggly” lines.

"Essentially, it’s visual noise," Lupyan says. "Because the noise patterns are high-contrast and constantly moving, they dominate, and the input from the other eye is suppressed."

Immediately before looking at the combination of the flashing lines and suppressed object, the study participants heard one of three things: the word for the suppressed object (“pumpkin,” when the object was a pumpkin), the word for a different object (“kangaroo,” when the object was actually a pumpkin), or just static.

Then researchers asked the participants to indicate whether they saw something or not. When the word they heard matched the object that was being wiped out by the visual noise, the subjects were more likely to report that they did indeed see something than in cases where the wrong word or no word at all was paired with the image.

"Hearing the word for the object that was being suppressed boosted that object into their vision," Lupyan says.

And hearing an unmatched word actually hurt study subjects’ chances of seeing an object.

"With the label, you’re expecting pumpkin-shaped things," Lupyan says. "When you get a visual input consistent with that expectation, it boosts it into perception. When you get an incorrect label, it further suppresses that."

Experiments have shown that continuous flash suppression interrupts sight so thoroughly that there are no signals in the brain to suggest the invisible objects are perceived, even implicitly.

"Unless they can tell us they saw it, there’s nothing to suggest the brain was taking it in at all," Lupyan says. "If language affects performance on a test like this, it indicates that language is influencing vision at a pretty early stage. It’s getting really deep into the visual system."

The study demonstrates a deeper connection between language and simple sensory perception than previously thought, and one that makes Lupyan wonder about the extent of language’s power. The influence of language may extend to other senses as well.

"A lot of previous work has focused on vision, and we have neglected to examine the role of knowledge and expectations on other modalities, especially smell and taste," Lupyan says. "What I want to see is whether we can really alter threshold abilities," he says. "Does expecting a particular taste for example, allow you to detect a substance at a lower concentration?"

If you’re drinking a glass of milk, but thinking about orange juice, he says, that may change the way you experience the milk.

"There’s no point in figuring out what some objective taste is," Lupyan says. "What’s important is whether the milk is spoiled or not. If you expect it to be orange juice, and it tastes like orange juice, it’s fine. But if you expected it to be milk, you’d think something was wrong."

Aug 27, 2013 · 178 notes
#language #visual representations #perception #continuous flash suppression #neuroscience #science
Combination of Two Imaging Techniques Allows New Insights into Brain Function

The ability to measure brain functions non-invasively is important both for clinical diagnoses and for research in neurology and psychology. Two main imaging techniques are used: positron emission tomography (PET), which reveals metabolic processes in the brain, and magnetic resonance imaging (MRI), which measures the activity of different brain regions on the basis of the cells’ oxygen consumption. A direct comparison of PET and MRI measurements was previously difficult because each had to be performed in a separate machine.

Researchers from the Werner Siemens Imaging Center at the University of Tübingen under the direction of Professor Bernd J. Pichler in collaboration with the Department of Diagnostic and Interventional Radiology, University Hospital Tübingen, and the Tübingen Max Planck Institute for Intelligent Systems have now successfully combined both methods. The researchers are able to explore functional processes in the brain in detail and can better assess what course of action to take. These results were achieved by the use of a PET insert enabling complementary, simultaneous PET/MRI scans. It was developed and built at the University of Tübingen.

The researchers could identify, in certain regions, a mismatch between glucose-metabolism-related brain activation measured with PET and oxygenation-related signals measured with MRI. Furthermore, information about functional connectivity in the brain could be derived both from MRI and from dynamic PET data. These results help to further decipher the nature of brain function and are ultimately useful for basic research as well as clinical practice. The study, by lead author Dr. Hans Wehrl of Professor Bernd J. Pichler’s research team, is soon to be published in the journal Nature Medicine.

In PET imaging the distribution of a weakly radioactive substance is shown in cross sections of the body, enabling doctors to see many different metabolic and physiological functions at work. Functional MRI (fMRI) allows researchers to depict changes in blood oxygenation that are associated with brain function. This measurement of functional active brain regions is also important for the planning of brain surgeries, where particular care must be taken in certain areas. The ability to collect different kinds of data from different scans simultaneously represents a major step forward in the fields using these technologies.

Aug 26, 2013 · 48 notes
#PET #MRI #brain function #glucose metabolism #oxygenation #neuroscience #science
Study in mice links cocaine use to new brain structures

Mice given cocaine showed rapid growth in new brain structures associated with learning and memory, according to a research team from the Ernest Gallo Clinic and Research Center at UC San Francisco. The findings suggest a way in which drug use may lead to drug-seeking behavior that fosters continued drug use, according to the scientists.

The researchers used a microscope that allowed them to peer directly into nerve cells within the brains of living mice, and within two hours of giving a drug they found significant increases in the density of dendritic spines – structures that bear synapses required for signaling – in the animals’ frontal cortex. In contrast, mice given saline solution showed no such increase.

The researchers also found a relationship between the growth of new dendritic spines and drug-associated learning. Specifically, mice that grew the most new spines were those that developed the strongest preference for being in the enclosure where they received cocaine rather than in the enclosure where they received saline. The team published its findings online in Nature Neuroscience on August 25, 2013.

"This gives us a possible mechanism for how drug use fuels further drug-seeking behavior," said principal investigator Linda Wilbrecht, PhD, a Gallo investigator now at UC Berkeley, but who led the research while she was on the UCSF faculty.

"It’s been observed that long-term drug users show decreased function in the frontal cortex in connection with mundane cues or tasks, and increased function in response to drug-related activity or information," Wilbrecht said. "This research suggests how the brains of drug users might shift toward those drug-related associations."

In all living brains there is a baseline level of creation of new spines in response to, or in anticipation of, day-to-day learning, Wilbrecht said. By enhancing this growth, cocaine might be a super-learning stimulus that reinforces learning about the cocaine experience, she said.

The frontal cortex, which Wilbrecht called the “steering wheel” of the brain, controls functions such as long-term planning, decision-making and other behaviors involving higher reasoning and discipline.

The brain cells in the frontal cortex that Wilbrecht and her team studied regulate the output of this brain region, and may play a key role in decision-making. “These neurons, which are directly affected by cocaine use, have the potential to bias decision-making,” she said.

Wilbrecht said the findings could potentially advance research in human addiction “by helping us identify what is going awry in the frontal cortexes of drug-addicted humans, and by explaining how drug-related cues come to dominate the brain’s decision-making processes.”

In the first of a series of experiments, the scientists gave cocaine injections to one group of mice and saline injections to another. The next day, they observed the animals’ brain cells using a 2-photon laser scanning microscope. They were surprised to discover that even after the first dose, the mice treated with cocaine grew more new dendritic spines than the saline-treated mice.

In another experiment, they observed the mice before cocaine or saline treatment and then two hours afterward, and discovered that the animals that received cocaine were developing new dendritic spines within two hours after receiving the drug. Furthermore, the next morning, cocaine-induced spines accounted for almost four times more connections among nerve cells than was observed in saline-treated animals.
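As a back-of-the-envelope sketch of the kind of quantity being compared, new-spine formation can be expressed as a density gain per length of imaged dendrite. The counts below are invented; only the roughly fourfold cocaine-to-saline ratio echoes the reported result:

```python
# Hypothetical spine counts per imaged dendrite segment (invented,
# not the study's data); only the arithmetic is illustrated here.
def new_spine_density(baseline_spines, later_spines, dendrite_length_um):
    """New spines gained per 10 micrometres of dendrite."""
    return (later_spines - baseline_spines) / dendrite_length_um * 10

cocaine = new_spine_density(baseline_spines=40, later_spines=52, dendrite_length_um=100)
saline = new_spine_density(baseline_spines=40, later_spines=43, dendrite_length_um=100)
print(f"cocaine/saline ratio: {cocaine / saline:.1f}")  # 4.0
```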

In a third experiment, the researchers gave the mice cocaine in one distinctive chamber and saline in another for a week, using identical procedures. Each chamber had its own characteristic visual design, texture and smell to distinguish it from the other chamber. They then let the mice choose which chamber to go to.

"The animals that showed the highest quantity of robust dendritic spines – the spines with the greatest likelihood of developing into synapses – showed the greatest change in preference toward the chamber where they received the cocaine," said Wilbrecht. "This suggests that the new spines might be material for the association that these mice have learned to make between the chamber and the drug."

Wilbrecht noted that the research would not have been possible without live brain imaging via the 2-photon laser scanning microscope, which was developed in 2002. “I grew up at the time of the famous public service campaign that showed a pan of frying eggs with the message, ‘this is your brain on drugs,’” recalled Wilbrecht. “Now, with this microscope, we can actually say, ‘this is a brain cell on drugs.’”

Aug 26, 2013 · 106 notes
#cocaine #frontal cortex #dendritic spines #learning #animal model #neuroscience #science
Brain Atrophy Seen in Patients With Diabetes

Brain atrophy rather than cerebrovascular lesions may explain the relationship between type 2 diabetes mellitus (T2DM) and cognitive impairment, according to a study published online Aug. 12 in Diabetes Care.


Chris Moran, M.B., B.Ch., from Monash University in Melbourne, Australia, and colleagues analyzed magnetic resonance imaging scans and cognitive tests in 350 participants with T2DM and 363 participants without T2DM. In a blinded fashion, cerebrovascular lesions (infarcts, microbleeds, and white matter hyperintensity [WMH] volume) and atrophy (gray matter, white matter, and hippocampal volumes) were evaluated.

The researchers found that T2DM was associated with significantly more cerebral infarcts and significantly lower total gray, white, and hippocampal volumes, but not with microbleeds or WMH. Gray matter loss was distributed mainly in medial temporal, anterior cingulate, and medial frontal lobe locations in patients with T2DM, while white matter loss was distributed in frontal and temporal regions. Independent of age, sex, education, and vascular risk factors, T2DM was associated with significantly poorer visuospatial construction, planning, visual memory, and speed. When adjusting for hippocampal and total gray volumes, the strength of these associations was cut by almost one-half, but was unchanged with adjustments for cerebrovascular lesions or white matter volume.

"Cortical atrophy in T2DM resembles patterns seen in preclinical Alzheimer’s disease," the authors write. "Neurodegeneration rather than cerebrovascular lesions may play a key role in T2DM-related cognitive impairment."

Aug 24, 2013 · 102 notes
#diabetes #brain atrophy #gray matter #white matter #hippocampal volumes #neuroscience #science
Depressed people have a more accurate perception of time

People with mild depression underestimate their talents. However, new research carried out by researchers at the University of Limerick and the University of Hertfordshire shows that depressed people are more accurate when it comes to time estimation than their happier peers.

Depressed people often appear to distort the facts and view their lives more negatively than non-depressed people. Feelings of helplessness, hopelessness and worthlessness and of being out of control are some of the main symptoms of depression. For these people time seems to pass slowly and they will often use phrases such as “time seems to drag” to describe their experiences and their life. However, depressed people sometimes have a more accurate perception of reality than their happier friends and family who often look at life through rose-tinted glasses and hope for the best.

Dr Rachel Msetfi, senior lecturer in psychology at the University of Limerick and one of the study’s authors, said: “We found that depressed people tended to be more accurate when estimating time, whereas non-depressed people tended to be less accurate. This finding, along with some of our other work, suggests that depression leads to more attention paid to time passing. Sometimes this might lead to a phenomenon known as ‘depressive realism’, though on other occasions time might seem to be moving more slowly than usual.”

In the study, volunteers classified as mildly depressed or non-depressed estimated the length of different time intervals of between two and sixty-five seconds. Overall, the mildly depressed volunteers were more accurate in their time estimations.
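Accuracy here is just the gap between the true interval length and the estimate. A minimal sketch of the comparison, using made-up estimates rather than the study's data:

```python
# Compare mean absolute error of interval estimates for two groups.
# Durations span the 2-65 s range used in the study; the estimates
# themselves are invented for illustration.
true_intervals = [2, 10, 30, 65]  # seconds

estimates = {
    "mildly_depressed": [2.1, 9.8, 29.0, 63.0],
    "non_depressed":    [3.0, 13.0, 38.0, 80.0],
}

def mean_abs_error(truths, guesses):
    """Average absolute gap between true durations and estimates."""
    return sum(abs(t - g) for t, g in zip(truths, guesses)) / len(truths)

for group, guesses in estimates.items():
    print(f"{group}: {mean_abs_error(true_intervals, guesses):.2f} s")
```

With numbers shaped like the reported finding, the mildly depressed group's error comes out smaller.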

Dr Msetfi noted that: “Time is a very important part of everyday experience, it flies when we are having fun or enjoying ourselves. One of the commonest experiences of depression is that people feel that time passes slowly and sometimes painfully. Our findings may help to shed a little light on how people with depression can be treated. People with depression are often encouraged to check themselves against reality, but maybe this timing skill can be harnessed to help in the treatment of mildly-depressed people. These findings may also link to successful mindfulness based treatments for depression which focus on encouraging present moment awareness.”

The paper, “Time perception and depressive realism: Judgement type, psychophysical functions and bias”, is published in PLOS ONE.

Aug 24, 2013 · 1,368 notes
#time perception #depression #time estimation #psychology #neuroscience #science
Omega-3 removes ADHD symptoms

A new multidisciplinary study shows a clear connection between the intake of omega-3 fatty acids and a decline in ADHD symptoms in rats.


Researchers at the University of Oslo have observed the behaviour of rats and have analyzed biochemical processes in their brains. The results show a clear improvement in ADHD-related behaviour from supplements of omega-3 fatty acids, as well as a faster turnover of the signal substances dopamine, serotonin and glutamate in the nervous system. There are, however, clear sex differences: a better effect from omega-3 fatty acids is achieved in male rats than in female.

Unknown biology behind ADHD

Currently the psychiatric diagnosis ADHD (Attention Deficit/Hyperactivity Disorder) is purely based on behavioural criteria, while the molecular genetic background for the illness is largely unknown. The new findings indicate that ADHD has a biological component and that the intake of omega-3 may influence ADHD symptoms.

“In some research environments it is controversial to suggest that ADHD has something to do with biology. But we have without a doubt found molecular changes in the brain after rats with ADHD were given omega-3,” says Ivar Walaas, Professor of Biochemistry.

The fact that omega-3 can reduce ADHD behaviour in rats has also been indicated in previous international studies. What is unique about this study is its multidisciplinary approach, not previously seen in this field, with contributions from behavioural science in medicine as well as from psychology, nutritional science and biochemistry.

Hyperactive rats

The rats used in the study are called SHR rats – spontaneously hypertensive rats. Although the strain derives from an ordinary laboratory rat, random mutations in its genes produce high blood pressure. It is therefore first and foremost blood-pressure researchers who have so far been interested in these rats.

However, the rats do not suffer from high blood pressure until they have reached puberty. Before that age they present totally different symptoms – namely hyperactivity, poor ability to concentrate and impulsiveness. It is exactly these three criteria that form the basis for making the ADHD diagnosis in humans. The animals also react to Ritalin, the central nervous system stimulant, in the same way as humans with ADHD: the hyperactive responses are stabilized. SHR rats are therefore increasingly used in research as a model for ADHD.

Supplements as early as the foetal stage

Researchers believe that omega-3 can have an effect from the very beginning of life. Omega-3 was therefore added to the food given to mother rats before they were impregnated, and this continued throughout their entire pregnancy and while they fed their young. The baby rats were also given omega-3 in their own food after they were separated from their mother at the age of 20 days. Another group of mother rats were given food that did not have omega-3 added, thus creating a control group of SHR offspring that had not been given these fatty acids at the foetal stage or later.

The researchers started to analyze the behaviour of the offspring some days after they were separated from the mother. They studied behaviour driven by reward as well as spontaneous behaviour. Substantial differences were noted for both types of behaviour between the rats that had been given the omega-3 supplement as foetuses and as baby rats and those that had not.

Rewards made male rats more concentrated

In the reward-driven task, the rats were given access to a drop of water each time they pressed an illuminated button. The ADHD rats that had not been given omega-3 could not concentrate on pressing the button, whereas the rats that had been brought up on omega-3 easily managed to hold their concentration for the few seconds this takes and were able to enjoy a drop of water as a reward.

Surprisingly enough, it was only male rats that showed an improvement in reward-driven behaviour. However, with regard to the rats’ spontaneous behavior, the same type of reduction in hyperactivity and attention difficulties was noted in both male and female rats that had been given the omega-3 supplement.

Changes in brain chemistry

Professor Walaas and his research group became involved in the study at this point in order to analyze the molecular processes in the rats’ brains.

The group analyzed the level of the chemical connections in the brain, the so-called neurotransmitters that transfer nerve impulses from one nerve cell to another. The researchers measured how much of the neurotransmitters such as dopamine, serotonin and glutamate was released and broken down within the nerve fibres. A key player in this work was Kine S. Dervola, PhD candidate, who reports clear sex differences in the turnover of the neurotransmitters – just as there had been in the reward-driven behaviour.

“We saw that the turnover of dopamine and serotonin took place much faster among the male rats that had been given omega-3 than among those that had not. For serotonin the turnover ratio was three times higher, and for dopamine it was just over two and a half times higher. These effects were not observed among the female rats. When we measured the turnover of glutamate, however, we saw that both sexes showed a small increase in turnover,” Ms Dervola tells us.

Transferrable to humans?

The researchers are cautious about drawing conclusions as to whether the results can be transferred to humans.

“In the first place there is of course a difference between rats and humans, and secondly the rats are sick at the outset. Thirdly the causes of ADHD in humans are in no way mapped sufficiently well. But the end result of what takes place in the brains of both rats and humans with ADHD is hyperactivity, poor ability to concentrate and impulsiveness,” says Professor Walaas, and concludes:

“Giving priority to basic research like this will greatly increase our detailed knowledge of ADHD.”

Reference:

Dervola, Kine-Susann Noren; Roberg, Bjørg Åse; Wøien, Grete; Bogen, Inger Lise; Sandvik, Torbjørn; Sagvolden, Terje; Drevon, Christian A.; Johansen, Espen B.; Walaas, Sven Ivar (2012). Marine omega-3 polyunsaturated fatty acids induce sex-specific changes in reinforcer-controlled behavior and neurotransmitter metabolism in a spontaneously hypertensive rat model of ADHD. Behavioral and Brain Functions, 8(56). ISSN 1744-9081.

Aug 24, 2013 · 257 notes
#omega-3 #animal model #ADHD #blood pressure #neurotransmitters #neuroscience #science
Receptor may aid spread of Alzheimer’s and Parkinson’s in brain

Scientists at Washington University School of Medicine in St. Louis have found a way that corrupted, disease-causing proteins spread in the brain, potentially contributing to Alzheimer’s disease, Parkinson’s disease and other brain-damaging disorders.


Image: An electron micrograph shows clumps of corrupted tau protein outside a nerve cell. Scientists have identified a receptor that lets these clumps into the cell, where the corruption can spread. Blocking this receptor with drugs may help treat Alzheimer’s, Parkinson’s and other disorders.

The research identifies a specific type of receptor and suggests that blocking it may aid treatment of these illnesses. The receptors are called heparan sulfate proteoglycans (HSPGs).

“Many of the enzymes that create HSPGs or otherwise help them function are good targets for drug treatments,” said senior author Marc I. Diamond, MD, the David Clayson Professor of Neurology. “We ultimately should be able to hit these enzymes with drugs and potentially disrupt several neurodegenerative conditions.”

The study is available online in the Proceedings of the National Academy of Sciences.

Over the last decade, Diamond has gathered evidence that Alzheimer’s disease and other neurodegenerative diseases spread through the brain in a fashion similar to conditions such as mad cow disease, which are caused by misfolded proteins known as prions.

Proteins are long chains of amino acids that perform many basic biological functions. A protein’s abilities are partially determined by the way it folds into a 3-D shape. Prions are proteins that have become folded in a fashion that makes them harmful.

Prions spread across the brain by causing other copies of the same protein to misfold.

Among the most infamous prion diseases are mad cow disease, which rapidly destroys the brain in cows, and a similar, inherited condition in humans called Creutzfeldt-Jakob disease.

Diamond and his colleagues have shown that a part of nerve cells’ inner structure known as tau protein can misfold into a configuration called an amyloid. These corrupted versions of tau stick to each other in clumps within the cells. Like prions, the clumps spread from one cell to another, seeding further spread by causing copies of tau protein in the new cell to become amyloids.

In the new study, first author Brandon Holmes, an MD/PhD student, showed that HSPGs are essential for binding, internalizing and spreading clumps of tau. When he genetically disabled or chemically modified the HSPGs in cell cultures and in a mouse model, clumps of tau could not enter cells, thus inhibiting the spread of misfolded tau from cell to cell.

Holmes also found that HSPGs are essential for the cell-to-cell spread of corrupted forms of alpha-synuclein, a protein linked to Parkinson’s disease.

“This suggests that it may one day be possible to unify our understanding and treatment of two or more broad classes of neurodegenerative disease,” Diamond said. 

“We’re now sorting through about 15 genes to determine which are the most essential for HSPGs’ interaction with tau,” Holmes said. “That will tell us which proteins to target with new drug treatments.”

Aug 23, 2013 · 92 notes
#heparan sulfate proteoglycans #receptors #neurodegenerative diseases #prions #nerve cells #neuroscience #science
First to measure the concerted activity of a neuronal circuit

Neurobiologists from the Friedrich Miescher Institute for Biomedical Research have been the first to measure the concerted activity of a neuronal circuit in the retina as it extracts information about a moving object. With their novel and powerful approach they can now not only visualize networks of neurons but can also measure functional aspects. These insights are urgently needed for a better understanding of the processes in the brain in health and disease.


For many decades, electrophysiology and genetics have been the main tools for studying individual neurons in the central nervous system in order to understand perception and behavior. In the last five years, however, neurobiology has been riding a wave of technological advances that have brought unprecedented insights: optogenetics and genetically encoded activity sensors have allowed scientists to control and measure the activity of clearly defined neurons, and the application of rabies viruses has enabled the visualization of networks of interconnected nerve cells. What was still missing was the link between the neural circuit and the monitoring of its activity.

Scientists from the Friedrich Miescher Institute for Biomedical Research have now been the first to measure the concerted activity of a neuronal circuit in the retina as it extracts information about the movement of an object.

In a world defined through eyesight, it is crucial to be able to discern whether something moves towards us, moves away or moves next to us. It comes as no surprise then that in the retina several parallel neuronal circuits are reserved for the extraction of information about movement and that most of them are dedicated to the analysis of the direction of motion.

As they report online in Neuron, Keisuke Yonehara and Karl Farrow, two Postdoctoral Fellows in Botond Roska’s team at the FMI, have now been able to monitor the activity of all circuit elements in a motion sensitive retinal circuit at once, and pinpoint the site, at a subcellular level, where the information about the direction of the movement becomes encoded. To achieve this, they used genetically altered rabies viruses expressing calcium sensors developed by the laboratory of Klaus Conzelmann in Munich. The special property of rabies viruses is that they move across connected neurons and therefore are able to deliver the sensors to all circuit elements within a defined neuronal circuit. Simultaneous two-photon imaging allowed them then to monitor activity in every part of the neuronal circuit at once, even in subcellular compartments, such as axons, synapses and dendrites.

"We are extremely thrilled that with this new method, which combines the power of genetically altered rabies viruses with very powerful two-photon microscopy, we are now able to link circuit architecture with activity and ultimately function," comments Yonehara. "We have illustrated the power of the method for a better understanding of the perception of movement and are convinced that the method will allow us to reach a better understanding of many processes in the retina and in other parts of the brain."

Aug 23, 2013 · 65 notes
#optogenetics #neural activity #retina #retinal circuit #nerve cells #neuroscience #science
Brain size may signal risk of developing an eating disorder

New research indicates that teens with anorexia nervosa have bigger brains than teens who do not have the eating disorder. That is according to a study by researchers at the University of Colorado School of Medicine that examined a group of adolescents with anorexia nervosa and a group without. They found that girls with anorexia nervosa had a larger insula, a part of the brain that is active when we taste food, and a larger orbitofrontal cortex, a part of the brain that tells a person when to stop eating.

Guido Frank, MD, assistant professor of psychiatry and neuroscience at CU School of Medicine, and his colleagues report that the bigger brain may be the reason people with anorexia are able to starve themselves. Similar results in children with anorexia nervosa and in adults who had recovered from the disease raise the possibility that insula and orbitofrontal cortex brain size could predispose a person to develop eating disorders.

"While eating disorders are often triggered by the environment, there are most likely biological mechanisms that have to come together for an individual to develop an eating disorder such as anorexia nervosa," Frank says.

The researchers recruited 19 adolescent girls with anorexia nervosa and 22 in a control group and used magnetic resonance imaging (MRI) to study brain volumes. Individuals with anorexia nervosa showed greater left orbitofrontal, right insular, and bilateral temporal cortex gray matter compared to the control group. In individuals with anorexia nervosa, orbitofrontal gray matter volume related negatively with sweet tastes. An additional comparison of this study group with adults with anorexia nervosa and a healthy control group supported greater orbitofrontal cortex and insula volumes in the disorder across this age group as well.
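The core comparison is a between-group difference in regional brain volumes. A minimal sketch of that comparison, using hypothetical volumes rather than the study's MRI values:

```python
import statistics

# Hypothetical right-insula gray matter volumes in millilitres,
# invented for illustration; the study's MRI values are not reproduced here.
volumes_ml = {
    "anorexia": [7.9, 8.2, 8.0, 8.4],
    "control":  [7.1, 7.3, 7.0, 7.4],
}

for group, vols in volumes_ml.items():
    print(f"{group}: mean {statistics.mean(vols):.2f} ml, "
          f"sd {statistics.stdev(vols):.2f} ml")
```

With numbers shaped like the reported finding, the anorexia group's mean regional volume comes out larger than the control group's.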

The medial orbitofrontal cortex has been associated with signaling when we feel satiated by a certain type of food (so-called “sensory-specific satiety”). This study suggests that larger volume in this brain area could be a trait across eating disorders that prompts these individuals to stop eating sooner than healthy individuals do, before they have eaten enough.

The right insula processes taste and also integrates body perception, which could contribute to the perception of being fat despite being underweight.

This study complements another, previously published in the American Journal of Psychiatry, which found that adults with anorexia and individuals who had recovered from the illness also had differences in brain size.

Aug 23, 2013 · 127 notes
#eating disorders #anorexia nervosa #brain size #orbitofrontal cortex #adolescents
Mood is Influenced by Immune Cells Called to the Brain in Response to Stress

New research shows that, in a dynamic mind-body interaction, cells from the immune system are recruited to the brain during prolonged stress and promote symptoms of anxiety.

The findings, in a mouse model, offer a new explanation of how stress can lead to mood disorders and identify a subset of immune cells, called monocytes, that could be targeted by drugs for treatment of mood disorders.

The Ohio State University research also reveals new ways of thinking about the cellular mechanisms behind the effects of stress, identifying two-way communication from the central nervous system to the periphery – the rest of the body – and back to the central nervous system that ultimately influences behavior.

Unlike an infection, trauma or other problems that attract immune cells to the site of trouble in the body, this recruitment of monocytes that can promote inflammation doesn’t damage the brain’s tissue – but it does lead to symptoms of anxiety.

The research showed that the brain under prolonged stress sends signals out to the bone marrow, calling up monocytes. The cells travel to specific regions of the brain and generate inflammation that causes anxiety-like behavior.

In experiments conducted in mice, the research showed that repeated stress exposure caused the highest concentration of monocytes migrating to the brain. The cells surrounded blood vessels and penetrated brain tissue in several areas linked to fear and anxiety, including the prefrontal cortex, amygdala and hippocampus, and their presence led to anxiety-like behavior in the mice.

“In the absence of tissue damage, we have cells migrating to the brain in response to the region of the brain that is activated by the stressor,” said John Sheridan, senior author of the study, professor of oral biology and associate director of Ohio State’s Institute for Behavioral Medicine Research (IBMR).

“In this case, the cells are recruited to the brain by signals generated by the animal’s interpretation of social defeat as stressful.”

The research appears in the Aug. 21, 2013, issue of The Journal of Neuroscience.

Mice in this study were subjected to stress that might resemble a person’s response to persistent life stressors. In this model of stress, male mice living together are given time to establish a hierarchy, and then an aggressive male is added to the group for two hours. This elicits a “fight or flight” response in the resident mice as they are repeatedly defeated. The experience of social defeat leads to submissive behaviors and the development of anxiety-like behavior.

Mice subjected to zero, one, three or six cycles of this social defeat were then tested for anxiety symptoms. The more cycles of social defeat, the higher the anxiety symptoms; mice took longer to enter an open space and opted for darkness rather than light when given the choice. Anxiety symptoms corresponded to higher levels of monocytes that had traveled to the animals’ brains from the blood.

Additional experiments showed that these cells did not originate in the brain, but traveled there from the bone marrow. In previous studies, this same research group showed that cells in the brain called microglia, the brain’s first line of immune defense, are activated by prolonged stress and are partly responsible for the signals that call up monocytes from the bone marrow.

“There are different moving parts from the central and peripheral components, and what’s novel is them coming together to influence behavior,” said Jonathan Godbout, a senior co-author of the paper and an associate professor of neuroscience at Ohio State.

Exactly what happens at this point in the brain remains unknown, but the research offers clues. The monocytes that travel to the brain don’t respond to natural anti-inflammatory steroids in the body and have characteristics signifying they are in a more inflammatory state. These results indicate that inflammatory gene expression occurs in the brain in response to the stressor.

“The monocytes are coming out of the bone marrow and they are not responsive to steroid regulation, so they overproduce proinflammatory signals when they’re stimulated. We think this is the key to the prolonged anxiety-like disorders that we see in these animals,” Sheridan said.

These findings do not apply to all forms of anxiety, the scientists noted, but they are a game-changer in research on stress-related mood disorders.

“Our data alter the idea of the neurobiology of mood disorders,” said Eric Wohleb, first author of the study and a predoctoral fellow in Ohio State’s Neuroscience Graduate Studies Program. “These findings indicate that a bidirectional system rather than traditional neurotransmitter pathways may regulate some forms of anxiety responses. We’re saying something outside the central nervous system – something from the immune system – is having a profound effect on behavior.”

Aug 22, 2013 · 173 notes
#stress #anxiety #immune system #animal model #neuroscience #science
Playing video games can boost brain power

Certain types of video games can help to train the brain to become more agile and improve strategic thinking, according to scientists from Queen Mary University of London and University College London (UCL).

The researchers recruited 72 volunteers and measured their ‘cognitive flexibility’, described as a person’s ability to adapt and switch between tasks, and to think about multiple ideas at a given time to solve problems.

Two groups of volunteers were trained to play different versions of a real-time strategy game called StarCraft, a fast-paced game where players have to construct and organise armies to battle an enemy. A third group played a life simulation video game called The Sims, which does not require much memory or many tactics.

All the volunteers played the video games for 40 hours over six to eight weeks, and were subjected to a variety of psychological tests before and after. All the participants happened to be female as the study was unable to recruit a sufficient number of male volunteers who played video games for less than two hours a week.

The researchers discovered that those who played StarCraft were quicker and more accurate in performing cognitive flexibility tasks than those who played The Sims.

Dr Brian Glass from Queen Mary’s School of Biological and Chemical Sciences said: “Previous research has demonstrated that action video games, such as Halo, can speed up decision making but the current work finds that real-time strategy games can promote our ability to think on the fly and learn from past mistakes.

“Our paper shows that cognitive flexibility, a cornerstone of human intelligence, is not a static trait but can be trained and improved using fun learning tools like gaming.”

Professor Brad Love from UCL said: “Cognitive flexibility varies across people and at different ages. For example, a fictional character like Sherlock Holmes has the ability to simultaneously engage in multiple aspects of thought and mentally shift in response to changing goals and environmental conditions.

“Creative problem solving and ‘thinking outside the box’ require cognitive flexibility. Perhaps in contrast to the repetitive nature of work in past centuries, the modern knowledge economy places a premium on cognitive flexibility.”

Dr Glass added: “The volunteers who played the most complex version of the video game performed the best in the post-game psychological tests. We need to understand now what exactly about these games is leading to these changes, and whether these cognitive boosts are permanent or if they dwindle over time. Once we have that understanding, it could become possible to develop clinical interventions for symptoms related to attention deficit hyperactivity disorder or traumatic brain injuries, for example.”

Aug 22, 2013 · 222 notes
#video games #cognition #technology #neuroscience #science
Researchers Identify Conditions Most Likely to Kill Encephalitis Patients

People with severe encephalitis — inflammation of the brain — are much more likely to die if they develop severe swelling in the brain, intractable seizures or low blood platelet counts, regardless of the cause of their illness, according to new Johns Hopkins research.

The Johns Hopkins investigators say the findings suggest that if physicians are on the lookout for these potentially reversible conditions and treat them aggressively at the first sign of trouble, patients are more likely to survive.

“The factors most associated with death in these patients are things that we know how to treat,” says Arun Venkatesan, M.D., Ph.D., an assistant professor of neurology at the Johns Hopkins University School of Medicine and leader of the study published in the Aug. 27 issue of the journal Neurology.

Experts consider encephalitis something of a mystery, and its origins and progress unpredictable. While encephalitis may be caused by a virus, bacteria or autoimmune disease, a precise cause remains unknown in 50 percent of cases. Symptoms range from fever, headache and confusion in some, to seizures, severe weakness or language disability in others. The most complex cases can land patients in intensive care units, on ventilators, for months. Drugs like the antiviral acyclovir are available for herpes encephalitis, which occurs in up to 15 percent of cases, but for most cases, doctors have only steroids and immunosuppressant drugs, which carry serious side effects.

“Encephalitis is really a syndrome with many potential causes, rather than a single disease, making it difficult to study,” says Venkatesan, director of the Johns Hopkins Encephalitis Center.

In an effort to better predict outcomes for his patients, Venkatesan and his colleagues reviewed records of all 487 patients with acute encephalitis admitted to The Johns Hopkins Hospital and Johns Hopkins Bayview Medical Center between January 1997 and July 2011. They focused further attention on patients who were over the age of 16 and spent at least 48 hours in the ICU during their hospital stays. Of those 103 patients, 19 died. Patients who had severe swelling in the brain were 18 times more likely to die, while those with continuous seizures were eight times more likely to die. Those with low counts of blood platelets, the cells responsible for clotting, were more than six times more likely to die than those without this condition.
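To make those risk multipliers concrete: figures like “18 times more likely to die” typically come from comparing the odds of death between patients with and without a given condition. A minimal sketch of that calculation, using hypothetical counts rather than the study’s actual data:

```python
# Odds ratio from a hypothetical 2x2 table (NOT the study's actual numbers):
# rows = risk factor present/absent, columns = died/survived.
def odds_ratio(died_with, survived_with, died_without, survived_without):
    """Odds of death with the condition divided by odds of death without it."""
    return (died_with / survived_with) / (died_without / survived_without)

# Illustration: suppose 12 of 20 patients with severe brain swelling died,
# versus 7 of 83 patients without it (made-up counts for illustration only).
or_swelling = odds_ratio(12, 8, 7, 76)
print(round(or_swelling, 1))  # 16.3
```

An odds ratio of this size is what statements like “18 times more likely to die” summarize, though the paper’s exact figures would come from its own patient counts and statistical model.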

The findings can help physicians know which conditions should be closely monitored and when the most aggressive treatments — some of which can come with serious side effects — should be tried, the researchers say. For example, it may be wise to more frequently image the brains of these patients to check for increased brain swelling and the pressure buildup that accompanies it.

Venkatesan says patients with cerebral edema may do better if intracranial pressure is monitored continuously and treated aggressively. He cautioned that although his research suggests such a course, further studies are needed to determine if it leads to better outcomes for patients.

Similarly, he says research has yet to determine whether aggressively treating seizures and low platelet counts also decreases mortality.

Venkatesan and his colleagues are also developing better guidelines for diagnosing encephalitis more quickly so as to minimize brain damage. Depending on where in the brain the inflammation is, he says, the illness can mimic other diseases, making diagnosis more difficult.

Another of the study’s co-authors, Romergryko G. Geocadin, M.D., an associate professor of neurology who co-directs the encephalitis center and specializes in neurocritical care, says encephalitis patients in the ICU are “the sickest of the sick,” and he fears that sometimes doctors give up on the possibility of them getting better.

“This research should give families — and physicians — hope that, despite how bad it is, it may be reversible,” he says.

Aug 21, 2013 · 42 notes
#brain #encephalitis #cerebral edema #neurology #neuroscience #science
How brain microcircuits integrate information from different senses

A new publication in the top-ranked journal Neuron sheds new light on how the brain integrates inputs from the different senses within the complex circuits formed by molecularly distinct types of nerve cells. The work was led by Paolo Medini, a new associate professor at Umeå University.

One of the biggest challenges in neuroscience is to understand how the cerebral cortex processes and integrates inputs from the different senses (such as vision, hearing and touch) so that, for example, we can respond to an event in the environment with precise movements of our body.

The brain cortex is composed of morphologically and functionally different types of nerve cells (e.g., excitatory and inhibitory) that connect in very precise ways. Paolo Medini and co-workers show that the integration of inputs from different senses occurs differently in excitatory and inhibitory cells, as well as in the superficial and deep layers of the cortex, the latter being the layers that send electrical signals out from the cortex to other brain structures.

“The relevance and the innovation of this work is that by combining advanced techniques to visualize the functional activity of many nerve cells in the brain with new molecular genetic techniques that allow us to change the electrical activity of different cell types, we can for the first time understand how the different nerve cells composing brain circuits communicate with each other”, says Paolo Medini.

The new knowledge is essential for designing much-needed future strategies to stimulate brain repair. It is not enough to transplant nerve cells into a lesion site; the biggest challenge is to re-create or re-activate the precise circuits those nerve cells form.

Paolo Medini has a medical background and worked in Germany at the Max Planck Institute for Medical Research in Heidelberg, as well as serving as a team leader at the Italian Institute of Technology in Genova, Italy. He recently took up the associate professor position in Cellular and Molecular Physiology at the Department of Molecular Biology.

He is now leading a brand-new Brain Circuits Lab with state-of-the-art techniques such as two-photon microscopy, optogenetics and electrophysiology to investigate circuit functioning and repair in the brain cortex. This investment has been made possible by a generous contribution from the Kempe Foundation and by the combined effort of Umeå University.

“By combining cell physiology knowledge in the intact brain with molecular biology expertise, we plan to pave the way for this kind of innovative research that is new to Umeå University and nationally”, says Paolo Medini.

Aug 21, 2013 · 65 notes
#multisensory integration #cerebral cortex #nerve cells #neuroscience #science
A new role for sodium in the brain

Researchers at McGill University have found that sodium – the main chemical component in table salt – is a unique “on/off” switch for a major neurotransmitter receptor in the brain. This receptor, known as the kainate receptor, is fundamental for normal brain function and is implicated in numerous diseases, such as epilepsy and neuropathic pain.

Prof. Derek Bowie and his laboratory in McGill’s Department of Pharmacology and Therapeutics worked with University of Oxford researchers to make the discovery. By offering a different view of how the brain transmits information, their research highlights a new target for drug development. The findings are published in the journal Nature Structural & Molecular Biology.

Balancing kainate receptor activity is the key to maintaining normal brain function. For example, in epilepsy, kainate activity is thought to be excessive. Thus, drugs which would shut down this activity are expected to be beneficial.

“It has been assumed for decades that the ‘on/off’ switch for all brain receptors lies where the neurotransmitter binds,” says Prof. Bowie, who also holds a Canada Research Chair in Receptor Pharmacology. “However, we found a completely separate site that binds individual atoms of sodium and controls when kainate receptors get turned on and off.”

The sodium switch is unique to kainate receptors, which means that drugs designed to stimulate this switch should not act elsewhere in the brain. This would be a major step forward, since drugs often affect many locations in addition to those they were intended to act on, producing negative side effects as a result. These so-called “off-target effects” represent one of the greatest challenges facing modern medicine.

“Now that we know how to stimulate kainate receptors, we should be able to design drugs to essentially switch them off,” says Dr. Bowie.

Dr. Philip Biggin’s lab at Oxford University used computer simulations to predict how the presence or absence of sodium would affect the kainate receptor.

Aug 21, 2013 · 106 notes
#sodium #kainate receptor #brain function #drug development #neuroscience #science
Study suggests iron is at core of Alzheimer's disease

Alzheimer’s disease has proven to be a difficult enemy to defeat. After all, aging is the No. 1 risk factor for the disorder, and there’s no stopping that.

Most researchers believe the disease is caused by one of two proteins, one called tau, the other beta-amyloid. As we age, most scientists say, these proteins either disrupt signaling between neurons or simply kill them.

Now, a new UCLA study suggests a third possible cause: iron accumulation.

Dr. George Bartzokis, a professor of psychiatry at the Semel Institute for Neuroscience and Human Behavior at UCLA and senior author of the study, and his colleagues looked at two areas of the brain in patients with Alzheimer’s. They compared the hippocampus, which is known to be damaged early in the disease, and the thalamus, an area that is generally not affected until the late stages. Using sophisticated brain-imaging techniques, they found that iron is increased in the hippocampus and is associated with tissue damage in that area. But increased iron was not found in the thalamus.

The research appears in the August edition of the Journal of Alzheimer’s Disease.

While most Alzheimer’s researchers focus on the buildup of tau or beta-amyloid that results in the signature plaques associated with the disease, Bartzokis has long argued that the breakdown begins much further “upstream.” The destruction of myelin, the fatty tissue that coats nerve fibers in the brain, he says, disrupts communication between neurons and promotes the buildup of the plaques. These amyloid plaques in turn destroy more and more myelin, disrupting brain signaling and leading to cell death and the classic clinical signs of Alzheimer’s.

Myelin is produced by cells called oligodendrocytes. These cells, along with myelin, have the highest levels of iron of any cells in the brain, Bartzokis says, and circumstantial evidence has long supported the possibility that brain iron levels might be a risk factor for age-related diseases like Alzheimer’s. Although iron is essential for cell function, too much of it can promote oxidative damage, to which the brain is especially vulnerable.

In the current study, Bartzokis and his colleagues tested their hypothesis that elevated tissue iron caused the tissue breakdown associated with Alzheimer’s disease. They targeted the vulnerable hippocampus, a key area of the brain involved in the formation of memories, and compared it to the thalamus, which is relatively spared by Alzheimer’s until the very late stages of disease.

The researchers used an MRI technique that can measure the amount of brain iron in ferritin, a protein that stores iron, in 31 patients with Alzheimer’s and 68 healthy control subjects.

In the presence of diseases like Alzheimer’s, as the structure of cells breaks down, the amount of water increases in the brain, which can mask the detection of iron, according to Bartzokis.

"It is difficult to measure iron in tissue when the tissue is already damaged," he said. "But the MRI technology we used in this study allowed us to determine that the increase in iron is occurring together with the tissue damage. We found that the amount of iron is increased in the hippocampus and is associated with tissue damage in patients with Alzheimer’s but not in the healthy older individuals — or in the thalamus. So the results suggest that iron accumulation may indeed contribute to the cause of Alzheimer’s disease."

But it’s not all bad news from this study, Bartzokis noted.

"The accumulation of iron in the brain may be influenced by modifying environmental factors, such as how much red meat and iron dietary supplements we consume and, in women, having hysterectomies before menopause," he said.

In addition, he noted, medications that chelate and remove iron from tissue are being developed by several pharmaceutical companies as treatments for the disorder. This MRI technology may allow doctors to determine who is most in need of such treatments.

Aug 21, 2013 · 110 notes
#alzheimer's disease #dementia #iron accumulation #aging #hippocampus #oligodendrocytes #neuroscience #science
First Pre-Clinical Gene Therapy Study to Reverse Rett Symptoms

The concept behind gene therapy is simple: deliver a healthy gene to compensate for one that is mutated. New research published today in the Journal of Neuroscience suggests this approach may eventually be a feasible option to treat Rett Syndrome, the most disabling of the autism spectrum disorders. Gail Mandel, Ph.D., a Howard Hughes Medical Institute Investigator at Oregon Health & Science University, led the study. The Rett Syndrome Research Trust, with generous support from the Rett Syndrome Research Trust UK and Rett Syndrome Research & Treatment Foundation, funded this work through the MECP2 Consortium.

In 2007, co-author Adrian Bird, Ph.D., at the University of Edinburgh astonished the scientific community by reversing symptoms in adult mice, providing proof of concept that Rett is curable. His unexpected results catalyzed labs around the world to pursue a multitude of strategies to extend the pre-clinical findings to people.

Today’s study is the first to show reversal of symptoms in fully symptomatic mice using techniques of gene therapy that have potential for clinical application.

Rett Syndrome is an X-linked neurological disorder primarily affecting girls; in the US, about 1 in 10,000 children a year are born with Rett.  In most cases symptoms begin to manifest between 6 and 18 months of age, as developmental milestones are missed or lost. The regression that follows is characterized by loss of speech, mobility, and functional hand use, which is often replaced by Rett’s signature gesture: hand-wringing, sometimes so intense that it is a constant during every waking hour. Other symptoms include seizures, tremors, orthopedic and digestive problems, disordered breathing and other autonomic impairments, sensory issues and anxiety. Most children live into adulthood and require round-the-clock care.

The cause of Rett Syndrome’s terrible constellation of symptoms lies in mutations of an X-linked gene called MECP2 (methyl CpG-binding protein 2). MECP2 is a master gene that regulates the activity of many other genes, switching them on or off.

“Gene therapy is well suited for this disorder,” Dr. Mandel explains. “Because MECP2 binds to DNA throughout the genome, there is no single gene currently that we can point to and target with a drug. Therefore the best chance of having a major impact on the disorder is to correct the underlying defect in as many cells throughout the body as possible. Gene therapy allows us to do that.”

Healthy genes can be delivered into cells aboard a virus, which acts as a Trojan horse. Many different types of these Trojan horses exist. Dr. Mandel used adeno-associated virus serotype 9 (AAV9), which has the unusual and attractive ability to cross the blood-brain barrier. This allows the virus and its cargo to be administered intravenously, instead of employing more invasive direct brain delivery systems that require drilling burr holes into the skull.

Because the virus has limited cargo space, it cannot carry the entire MECP2 gene. Co-author Brian Kaspar of Nationwide Children’s Hospital collaborated with the Mandel lab to package only the gene’s most critical segments. After being injected into the Rett mice, the virus made its way to cells throughout the body and brain, distributing the modified gene, which then started to produce the MeCP2 protein.

As in human females with Rett Syndrome, only approximately 50% of the mouse cells have a healthy copy of MECP2. After the gene therapy treatment, 65% of cells had a functioning MECP2 gene.
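A back-of-the-envelope way to read those percentages (an illustrative model, not from the paper): roughly half of cells already express healthy MECP2 because of random X-inactivation, and the virus adds a working copy to some fraction of the rest. Under the simplifying assumption that transduction hits cells at random:

```python
def fraction_functional(baseline, transduced_fraction):
    """Fraction of cells with a working MECP2 copy, assuming the virus
    transduces a random fraction of the remaining deficient cells
    (a toy model, not the paper's analysis)."""
    return baseline + (1 - baseline) * transduced_fraction

# Going from 50% to 65% of cells functional implies the virus reached
# about 30% of the previously deficient cells under this toy model.
print(fraction_functional(0.5, 0.3))  # 0.65
```

This kind of arithmetic is why a seemingly modest 15-percentage-point gain can correspond to transducing a substantial share of the affected cells.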

The treated mice showed profound improvements in motor function, tremors, seizures and hind limb clasping. At the cellular level the smaller body size of neurons seen in mutant cells was restored to normal. Biochemical experiments proved that the gene had found its way into the nuclei of cells and was functioning as expected, binding to DNA.

One Rett symptom that was not ameliorated was abnormal respiration. Researchers hypothesize that correcting this may require targeting a greater number of cells than the 15% that had been achieved in the brainstem.

“We learned a critical and encouraging point with these experiments – that we don’t have to correct every cell in order to reverse symptoms. Going from 50% to 65% of the cells having a functioning gene resulted in significant improvements,” said co-author Saurabh Garg.

One of the potential challenges of gene therapy in Rett is the possibility of delivering multiple copies of the gene to a cell. We know from the MECP2 Duplication Syndrome that too much of this protein is detrimental. “Our results show that after gene therapy treatment the correct amount of MeCP2 protein was being expressed. At least in our hands, with these methods, overexpression of MeCP2 was not an issue,” said co-author Daniel Lioy.

Dr. Mandel cautioned that key steps remain before clinical trials can begin. “Our study is an important first step in highlighting the potential for AAV9 to treat the neurological symptoms in Rett. We are now working on improving the packaging of MeCP2 in the virus to see if we can target a larger percentage of cells and therefore improve symptoms even further,” said Mandel. Collaborators Hélène Cheval and Adrian Bird see this as a promising follow-up to the 2007 work showing symptom reversal in Rett mice. “That study used genetic tricks that could not be directly applicable to humans, but the AAV9 vector used here could in principle deliver a gene therapeutically. This is an important step forward, but there is a way to go yet.”

“Gene therapy has had a tumultuous road in the past few decades but is undergoing a renaissance due to recent technological advances. Europe and Asia have gene therapy treatments already in the clinic and it’s likely that the US will follow suit. Our goal now is to prioritize the next key experiments and facilitate their execution as quickly as possible. Gene therapy, especially to the brain, is a tricky undertaking but I’m cautiously optimistic that with the right team we can lay out a plan for clinical development. I congratulate the Mandel and Bird labs on today’s publication, which is the third to be generated from the MECP2 Consortium in a short period of time,” said Monica Coenraads, Executive Director of the Rett Syndrome Research Trust and mother of a teenaged daughter with the disorder.

Aug 21, 2013 · 63 notes
#rett syndrome #gene therapy #neurological disorders #MECP2 #neuroscience #science
Brain network decay detected in early Alzheimer’s

In patients with early Alzheimer’s disease, disruptions in brain networks emerge about the same time as chemical markers of the disease appear in the spinal fluid, researchers at Washington University School of Medicine in St. Louis have shown.

While two chemical markers in the spinal fluid are regarded as reliable indicators of early disease, the new study, published in JAMA Neurology, is among the first to show that scans of brain networks may be an equally effective and less invasive way to detect early disease.

“Tracking damage to these brain networks may also help us formulate a more detailed understanding of what happens to the brain before the onset of dementia,” said senior author Beau Ances, MD, PhD, associate professor of neurology and of biomedical engineering.

Diagnosing Alzheimer’s early is a top priority for physicians, many of whom believe that treating patients long before dementia starts greatly improves the chances of success.

Ances and his colleagues studied 207 older but cognitively normal research volunteers at the Charles F. and Joanne Knight Alzheimer’s Disease Research Center at Washington University. Over several years, spinal fluid from the volunteers was sampled multiple times and analyzed for two markers of early Alzheimer’s: changes in amyloid beta, the principal ingredient of Alzheimer’s brain plaques, and in tau protein, a structural component of nerve cells.

The volunteers were also scanned repeatedly using a technique called resting state functional magnetic resonance imaging (fMRI). This scan tracks the rise and fall of blood flow in different brain regions as patients rest in the scanner. Scientists use the resulting data to assess the integrity of the default mode network, a set of connections between different brain regions that becomes active when the mind is at rest.
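In broad strokes, resting-state analyses of this kind score a connection by correlating the fMRI signal time courses of two regions: strongly correlated regions are treated as functionally connected, and weakening correlation suggests network damage. A minimal sketch of that idea on synthetic data (this is not the study's actual pipeline):

```python
import math
import random

def pearson(x, y):
    """Pearson correlation between two equal-length time series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Synthetic "BOLD signals": two regions driven by a shared slow
# fluctuation plus independent noise, mimicking a connected network.
random.seed(0)
shared = [math.sin(t / 10) for t in range(200)]
region_a = [s + random.gauss(0, 0.3) for s in shared]
region_b = [s + random.gauss(0, 0.3) for s in shared]
print(pearson(region_a, region_b) > 0.5)  # True: the regions co-fluctuate
```

In a real analysis the time series come from averaging fMRI voxels within anatomically defined regions, and the resulting correlation matrix is what declines as a network such as the default mode network degrades.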

Earlier studies by Ances and other researchers have shown that Alzheimer’s damages connections in the default mode network and other brain networks.

The new study revealed that this damage became detectable at about the same time that amyloid beta levels began to fall and tau levels started to rise in spinal fluid. The part of the default mode network most harmed by the onset of Alzheimer’s disease was the connection between two brain areas associated with memory, the posterior cingulate and medial temporal regions.

The researchers are continuing to study the connections between brain network damage and the progress of early Alzheimer’s disease in normal volunteers and in patients in the early stages of Alzheimer’s-associated dementia.

Aug 20, 2013 · 54 notes
#alzheimer's disease #dementia #neuroimaging #beta amyloid #neuroscience #science
Copper Identified as Culprit in Alzheimer’s Disease

Copper appears to be one of the main environmental factors that trigger the onset and enhance the progression of Alzheimer’s disease by preventing the clearance and accelerating the accumulation of toxic proteins in the brain. That is the conclusion of a study appearing today in the journal Proceedings of the National Academy of Sciences.

“It is clear that, over time, copper’s cumulative effect is to impair the systems by which amyloid beta is removed from the brain,” said Rashid Deane, Ph.D., a research professor in the University of Rochester Medical Center (URMC) Department of Neurosurgery, member of the Center for Translational Neuromedicine, and the lead author of the study. “This impairment is one of the key factors that cause the protein to accumulate in the brain and form the plaques that are the hallmark of Alzheimer’s disease.” 

Copper’s presence in the food supply is ubiquitous. It is found in drinking water carried by copper pipes, nutritional supplements, and in certain foods such as red meats, shellfish, nuts, and many fruits and vegetables. The mineral plays an important and beneficial role in nerve conduction, bone growth, the formation of connective tissue, and hormone secretion. 

However, the new study shows that copper can also accumulate in the brain and cause the blood brain barrier – the system that controls what enters and exits the brain – to break down, resulting in the toxic accumulation of the protein amyloid beta, a by-product of cellular activity. Using both mouse and human brain cells, Deane and his colleagues conducted a series of experiments that pinpointed the molecular mechanisms by which copper accelerates the pathology of Alzheimer’s disease.

Under normal circumstances, amyloid beta is removed from the brain by a protein called lipoprotein receptor-related protein 1 (LRP1). These proteins – which line the capillaries that supply the brain with blood – bind with the amyloid beta found in brain tissue and escort it into the blood vessels, where it is removed from the brain.

The research team “dosed” normal mice with copper over a three-month period. The exposure consisted of trace amounts of the metal in drinking water and was one-tenth of the water quality standard for copper established by the Environmental Protection Agency.

“These are very low levels of copper, equivalent to what people would consume in a normal diet,” said Deane.

The researchers found that the copper made its way into the bloodstream and accumulated in the vessels that feed blood to the brain, specifically in the cellular “walls” of the capillaries. These cells are a critical part of the brain’s defense system and help regulate the passage of molecules to and from brain tissue. At first, the capillary cells prevent the copper from entering the brain. However, over time the metal can accumulate in these cells with toxic effect.

The researchers observed that the copper disrupted the function of LRP1 through a process called oxidation, which, in turn, inhibited the removal of amyloid beta from the brain. They observed this phenomenon in both mouse and human brain cells.

The researchers then looked at the impact of copper exposure on mouse models of Alzheimer’s disease. In these mice, the cells that form the blood brain barrier have broken down and become “leaky” – a likely combination of aging and the cumulative effect of toxic assaults – allowing elements such as copper to pass unimpeded into the brain tissue. They observed that the copper stimulated activity in neurons that increased the production of amyloid beta. The copper also interacted with amyloid beta in a manner that caused the proteins to bind together in larger complexes creating logjams of the protein that the brain’s waste disposal system cannot clear. 

This one-two punch, inhibiting the clearance and stimulating the production of amyloid beta, provides strong evidence that copper is a key player in Alzheimer’s disease. In addition, the researchers observed that copper provoked inflammation of brain tissue which may further promote the breakdown of the blood brain barrier and the accumulation of Alzheimer’s-related toxins.  

However, because the metal is essential to so many other functions in the body, the researchers say these results must be interpreted with caution.

“Copper is an essential metal and it is clear that these effects are due to exposure over a long period of time,” said Deane. “The key will be striking the right balance between too little and too much copper consumption. Right now we cannot say what the right level will be, but diet may ultimately play an important role in regulating this process.”

Aug 20, 2013 · 263 notes
#science #alzheimer's disease #dementia #copper #amyloid plaques #blood brain barrier #neurology #neuroscience
Why One Cream Cake Leads to Another

Continuously eating fatty foods perturbs communication between the gut and brain, which in turn perpetuates a bad diet.

A chronic high-fat diet is thought to desensitize the brain to the feeling of satisfaction that one normally gets from a meal, causing a person to overeat in order to achieve the same high again. New research published today (August 15) in Science, however, suggests that this desensitization actually begins in the gut itself, where production of a satiety factor, which normally tells the brain to stop eating, becomes dialed down by the repeated intake of high-fat food.

“It’s really fantastic work,” said Paul Kenny, a professor of molecular therapeutics at The Scripps Research Institute in Jupiter, Florida, who was not involved in the study. “It could be a so-called missing link between gut and brain signaling, which has been something of a mystery.”

While pork belly, ice cream, and other high-fat foods produce an endorphin response in the brain when they hit the taste buds, according to Kenny, the gut also sends signals directly to the brain to control our feeding behavior. Indeed, mice nourished via gastric feeding tubes, which bypass the mouth, exhibit a surge in dopamine—a neurotransmitter promoting reinforcement in the brain’s reward circuitry—similar to that experienced by those eating normally.

This dopamine surge occurs in response to feeding in both mice and humans. But evidence suggests that dopamine signaling in the brain is deficient in obese people. Ivan de Araujo, a professor of psychiatry at the Yale School of Medicine, has now discovered that obese mice on a chronic high-fat diet also have a muted dopamine response when receiving fatty food via a direct tube to their stomachs.

To determine the nature of the dopamine-regulating signal emanating from the gut, Araujo and his team searched for possible candidates. “When you look at animals chronically exposed to high-fat foods, you see high levels of almost every circulating factor—leptin, insulin, triglycerides, glucose, et cetera,” he said. But one class of signaling molecule is suppressed. Of these, Araujo’s primary candidate was oleoylethanolamide. Not only is the factor produced by intestinal cells in response to food, he said, but during chronic high-fat exposure, “the suppression levels seemed to somehow match the suppression that we saw in dopamine release.”

Araujo confirmed oleoylethanolamide’s dopamine-regulating ability in mice by administering the factor via a catheter to the tissues surrounding their guts. “We discovered that by restoring the baseline level of [oleoylethanolamide] in the gut … the high-fat fed animals started having dopamine responses that were indistinguishable from their lean counterparts.”

The team also found that oleoylethanolamide’s effect on dopamine was transmitted via the vagus nerve, which runs between the brain and abdomen, and was dependent on its interaction with a transcription factor called PPAR-a.

Oleoylethanolamide levels are also reduced in fasting animals and increase in response to eating, communicating with the brain to stop further consumption once the belly is full. Indeed, oleoylethanolamide is a known satiety factor. Therefore, when chronic consumption of high-fat food diminishes its production, the satisfaction signal is not achieved, and the brain is essentially “blind to the presence of calories in the gut,” said Araujo, and thus demands more food.

It is not clear why a chronic high-fat diet suppresses the production of oleoylethanolamide. But once the vicious cycle starts, it is hard to break because the brain is receiving its information subconsciously, said Daniele Piomelli, a professor at the University of California, Irvine, and director of drug discovery and development at the Italian Institute of Technology in Genoa.

“We eat what we like, and we think we are conscious of what we like, but I think what this [paper] and others are indicating is that there is a deeper, darker side to liking—a side that we’re not aware of,” Piomelli said. “Because it is an innate drive, you cannot control it.” Put another way, even if you could trick your taste buds into enjoying low-fat yogurt, you’re unlikely to trick your gut.

The good news, however, is that “there is no permanent impairment in the [animals’] dopamine levels,” Araujo said. This suggests that if drugs could be designed to regulate the oleoylethanolamide-to-PPAR-a pathway in the gut, Kenny added, it could have “a huge impact on people’s ability to control their appetite.”

Aug 18, 2013 · 164 notes
#dopamine #dopamine deficiency #obesity #diet #appetite #neuroscience #science
Head hurts? Zap the wonder nerve in your neck

"It was like red-hot pokers needling one side of my face," says Catherine, recalling the cluster headaches she experienced for six years. "I just wanted it to stop." But it wouldn’t – none of the drugs she tried had any effect.

Thinking she had nothing to lose, last year she enrolled in a pilot study to test a handheld device that applies a bolt of electricity to the neck, stimulating the vagus nerve – the superhighway that connects the brain to many of the body’s organs, including the heart.

The results of the trial were presented last month at the International Headache Congress in Boston, and while the trial is small, the findings are positive. Of the 21 volunteers, 18 reported a reduction in the severity and frequency of their headaches, rating them, on average, 50 per cent less painful after using the device daily and whenever they felt a headache coming on.

This isn’t the first time vagal nerve stimulation has been used as a treatment – but it is one of the first that hasn’t required surgery. Some people with epilepsy have had a small generator implanted into their chest that sends regular electrical signals to the vagus nerve. Implanted devices have also been approved to treat depression. What’s more, there is increasing evidence that such stimulation could treat many more disorders from headaches to stroke and possibly Alzheimer’s disease.

The latest study suggests it is possible to stimulate the nerve through the skin, rather than resorting to surgery. “What we’ve done is figured out a way to stimulate the vagus nerve with a very similar signal, but non-invasively through the neck,” says Bruce Simon, vice-president of research at New Jersey-based ElectroCore, makers of the handheld device. “It’s a simpler, less invasive way to stimulate the nerve.”

Cluster headaches are thought to be triggered by the overactivation of brain cells involved in pain processing. The neurotransmitter glutamate, which excites brain cells, is a prime suspect. ElectroCore turned to the vagus nerve as previous studies had shown that stimulating it in people with epilepsy releases neurotransmitters that dampen brain activity.

When the firm used a smaller version of ElectroCore’s device on rats, it found it reduced glutamate levels and excitability in these pain centres. Other studies have shown that vagus nerve stimulation causes the release of inhibitory neurotransmitters which counter the effects of glutamate.

The big question is whether a non-implantable device can really trigger changes in brain chemistry in humans, or whether people are simply experiencing a placebo effect. “The vagus nerve is buried deep in the neck, and something that’s delivering currents through the skin can only go so deep,” says Mike Kilgard of the University of Texas at Dallas. As you turn up the voltage, there’s a risk of it activating muscle fibres that trigger painful cramps, he adds.

Simon says that volunteers using the device haven’t reported any serious side effects. He adds that ElectroCore will soon publish data showing changes in brain activity in humans after using the device. Placebo-controlled trials are also about to start.

Catherine has been using it for a year without ill effect. “I can now function properly as a human being again,” she says.

The many uses of the wonder nerve

Coma, irritable bowel syndrome, asthma and obesity are just some of the disparate conditions that vagus nerve stimulation may benefit and for which human trials are under way.

It might also help people with tinnitus. Although people with tinnitus complain of ringing in their ears, the problem actually arises because too many neurons fire in the auditory part of the brain when certain frequencies are heard.

Mike Kilgard of the University of Texas at Dallas reasoned that if people were played tones that didn’t trigger tinnitus while the vagus nerve was stimulated, this might coax the rogue neurons into firing in response to these frequencies instead. “By activating this nerve we can enhance the brain’s ability to rewire itself,” he says.

He has so far tested the method in rats and in 10 people with tinnitus, using an implanted device to stimulate the nerve. Not everyone noticed an improvement, but even so Kilgard is planning a larger trial. The work was presented at a meeting of the International Union of Physiological Sciences in Birmingham, UK, last month. The technique is also being tested in people who have had a stroke.

"If these studies stand up it could be worth changing the name of the vagus nerve to the wonder nerve," says Sunny Ogbonnaya at Cork University Hospital in Ireland.

Aug 18, 2013 · 121 notes
#vagus nerve #vagal nerve stimulation #glutamate #headaches #brain activity #neuroscience #science
Device Could Spot Seizures by Reading Brainwaves through the Ear

Neuroscientists often use electroencephalography (EEG) as an inexpensive way to record electrical signals in the brain. Though it would be useful to run these recordings for long periods of time, that usually isn’t practical: EEG recording traditionally involves attaching many electrodes and cables to a patient’s scalp.

Now engineers at Imperial College London have developed an EEG device that can be worn inside the ear, like a hearing aid. They say the device will allow scientists to record EEGs for several days at a time; this would allow doctors to monitor patients who have regularly recurring problems like seizures or microsleep.

“The ideal is to have a very stable recording system, and recordings which are repeatable,” explains co-creator Danilo Mandic. “It’s not interfering with your normal life, because there are acoustic vents so people can hear. After a while, they forget they’re having an EEG.”

By nestling the EEG inside the ear, the engineers avoid a lot of signal noise usually introduced by body movement. They can also ensure that the electrodes are always placed in exactly the same spot, which, they say, will make repeated readings more reliable.

Since the device attaches to just one area, it can record only from the temporal region. This limits its potential applications to events that involve local activity. Tzyy-Ping Jung, co-director of the University of California, San Diego’s Center for Advanced Neurological Engineering, says that this does not mean the device will not be valuable.

“Different modalities will have different applications. I would not rule out the usefulness of any modalities,” says Jung. “I think it’s a very good idea with very promising results.”

Aug 18, 2013 · 154 notes
#EEG device #brain imaging #seizures #brainwaves #neuroscience #science
Female frogs prefer males who can multitask

From frogs to humans, selecting a mate is complicated. Females of many species judge suitors based on many indicators of health or parenting potential. But it can be difficult for males to produce multiple signals that demonstrate these qualities simultaneously.

In a study of gray tree frogs, a team of University of Minnesota researchers discovered that females prefer males whose calls reflect the ability to multitask effectively. In this species (Hyla chrysoscelis) males produce “trilled” mating calls that consist of a string of pulses.

Typical calls range in duration from 20 to 40 pulses per call and are produced at rates of 5 to 15 calls per minute. Males face a trade-off between call duration and call rate, but females prefer calls that are both longer and more frequent – no simple feat for the males producing them.

The findings were published in the August issue of Animal Behaviour.

"It’s kind of like singing and dancing at the same time," says Jessica Ward, a postdoctoral researcher who is lead author for the study. Ward works in the laboratory of Mark Bee, a professor in the College of Biological Sciences’ Department of Ecology, Evolution and Behavior.

The study supports the multitasking hypothesis, which suggests that females prefer males who can do two or more hard-to-do things at the same time because these are especially good quality males, Ward says. The hypothesis, which explores how multiple signals produced by males influence female behavior, is a new area of interest in animal behavior research.

By listening to recordings of 1,000 calls, Ward and colleagues learned that males are indeed forced to trade off call duration and call rate. That is, males that produce relatively longer calls only do so at relatively slower rates.
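A trade-off like this is typically quantified as a negative correlation between the two call properties. The sketch below is purely illustrative – the numbers are synthetic, not the study’s recordings – and assumes a hypothetical chorus of 1,000 calls in which rate declines with duration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical calls: duration (pulses per call) drawn from the 20-40 range,
# with call rate (calls per minute) declining as duration grows, plus noise.
duration = rng.uniform(20, 40, size=1000)
rate = 15 - 0.33 * (duration - 20) + rng.normal(0, 1, size=1000)

# A negative Pearson correlation captures the duration-rate trade-off:
# males producing longer calls do so at slower rates.
r = float(np.corrcoef(duration, rate)[0, 1])
print(f"duration-rate correlation: {r:.2f}")
```

In the real data, a male sitting above the trade-off line on both axes – longer calls at a faster rate – would be the kind of “multitasker” females preferred.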

"It’s easy to imagine that we humans might also prefer multitasking partners, such as someone who can successfully earn a good income, cook dinner, manage the finances and get the kids to soccer practice on time."

The study was carried out in connection with Bee’s research goal, which is understanding how female frogs are able to distinguish individual mating calls from a large chorus of males. By comparison, humans, especially as we age, lose the ability to distinguish individual voices in a crowd. This phenomenon, called the “cocktail party” problem, is often the first sign of a diminishing ability to hear. Understanding how frogs hear could lead to improved hearing aids.

Aug 17, 2013 · 56 notes
#multitasking #mating #frogs #animal behavior #psychology #neuroscience #science
Making the Brain Take Notice of Faces in Autism

A new study in Biological Psychiatry explores the influence of oxytocin

Difficulty in registering and responding to the facial expressions of other people is a hallmark of autism spectrum disorder (ASD). Relatedly, functional imaging studies have shown that individuals with ASD display altered brain activations when processing facial images.

The hormone oxytocin plays a vital role in the social interactions of both animals and humans. In fact, multiple studies conducted with healthy volunteers have provided evidence for beneficial effects of oxytocin in terms of increased trust, improved emotion recognition, and preference for social stimuli.

This combination of scientific work led German researchers to hypothesize about the influence of oxytocin in ASD. Dr. Gregor Domes, from the University of Freiburg and first author of the new study, explained: “In the present study, we were interested in the question of whether a single dose of oxytocin would change brain responses to social compared to non-social stimuli in individuals with autism spectrum disorder.”

They found that oxytocin did show an effect on social processing in the individuals with ASD, “suggesting that oxytocin may help to treat a basic brain function that goes awry in autism spectrum disorders,” commented Dr. John Krystal, Editor of Biological Psychiatry.

To conduct this study, they recruited fourteen individuals with ASD and fourteen control volunteers, all of whom completed a face- and house-matching task while undergoing imaging scans. Each participant completed this task and scanning procedure twice, once after receiving a nasal spray containing oxytocin and once after receiving a nasal spray containing placebo. The order of the sprays was randomized, and the tests were administered one week apart.

Using two sets of stimuli in the matching task, one of faces and one of houses, allowed the researchers not only to compare the effects of oxytocin and placebo, but also to distinguish effects specific to social stimuli from non-specific effects on more general visual processing.

What they found was intriguing. The data indicate that oxytocin specifically increases responses of the amygdala to social stimuli in individuals with ASD. The amygdala, the authors explain, “has been associated with processing of emotional stimuli, threat-related stimuli, face processing, and vigilance for salient stimuli”.

This finding suggests oxytocin might promote the salience of social stimuli in ASD. Increased salience of social stimuli might support behavioral training of social skills in ASD.

These data support the idea that oxytocin may be a promising approach in the treatment of ASD and could stimulate further research, even clinical trials, on the exploration of oxytocin as an add-on treatment for individuals with autism spectrum disorder.

Aug 16, 2013 · 67 notes
#oxytocin #autism #ASD #amygdala #face processing #social cognition #neuroscience #science
Cell memory mechanism discovered

The cells in our bodies can divide as often as once every 24 hours, creating a new, identical copy. DNA binding proteins called transcription factors are required for maintaining cell identity. They ensure that daughter cells have the same function as their mother cell, so that for example muscle cells can contract or pancreatic cells can produce insulin. However, each time a cell divides the specific binding pattern of the transcription factors is erased and has to be restored in both mother and daughter cells. Previously it was unknown how this process works, but now scientists at Karolinska Institutet have discovered the importance of particular protein rings encircling the DNA and how these function as the cell’s memory.

The DNA in human cells is translated into a multitude of proteins required for a cell to function. When, where and how proteins are expressed is determined by regulatory DNA sequences and a group of proteins, known as transcription factors, that bind to these DNA sequences. Each cell type can be distinguished based on its transcription factors, and a cell can in certain cases be directly converted from one type to another, simply by changing the expression of one or more transcription factors. It is critical that the pattern of transcription factor binding in the genome be maintained. During each cell division, the transcription factors are removed from DNA and must find their way back to the right spot after the cell has divided. Despite many years of intense research, no general mechanism has been discovered which would explain how this is achieved.

"The problem is that there is so much DNA in a cell that it would be impossible for the transcription factors to find their way back within a reasonable time frame. But now we have found a possible mechanism for how this cellular memory works, and how it helps the cell remember the order that existed before the cell divided, helping the transcription factors find their correct places", explains Jussi Taipale, professor at Karolinska Institutet and the University of Helsinki, and head of the research team behind the discovery.

The results are now being published in the scientific journal Cell. The research group has produced the most complete map yet of transcription factors in a cell. They found that a large protein complex called cohesin is positioned as a ring around the two DNA strands that are formed when a cell divides, marking virtually all the places on the DNA where transcription factors were bound. Cohesin encircles the DNA strand as a ring does around a piece of string, and the protein complexes that replicate DNA can pass through the ring without displacing it. Since the two new DNA strands are caught in the ring, only one cohesin is needed to mark the two, thereby helping the transcription factors to find their original binding region on both DNA strands.

"More research is needed before we can be sure, but so far all experiments support our model," says Martin Enge, assistant professor at Karolinska Institutet.

Transcription factors play a pivotal role in many illnesses, including cancer as well as many hereditary diseases. The discovery that virtually all regulatory DNA sequences bind to cohesin may also end up having more direct consequences for patients with cancer or hereditary diseases. Cohesin would function as an indicator of which DNA sequences might contain disease-causing mutations.

"Currently we analyse DNA sequences that are directly located in genes, which constitute about three per cent of the genome. However, most mutations that have been shown to cause cancer are located outside of genes. We cannot analyse these in a reliable manner - the genome is simply too large. Analysing only the DNA sequences that bind to cohesin, roughly one per cent of the genome, would allow us to examine an individual’s mutations and make it much easier to conduct studies to identify novel harmful mutations," Martin Enge concludes.

Aug 16, 2013 · 113 notes
#transcription factors #DNA sequence #hereditary diseases #cohesin #genetics #neuroscience #science
Sympathetic Neurons Engage in “Cross Talk” With Cells in the Pancreas During Early Development

The human body is a complicated system of blood vessels, nerves, organs, tissue and cells each with a specific job to do. When all are working together, it’s a symphony of form and function as each instrument plays its intended roles.

Biologist Rejji Kuruvilla and her fellow researchers uncovered what happens when one instrument is not playing its part.

Kuruvilla, along with graduate students Philip Borden and Jessica Houtz, both from the Biology Department at Johns Hopkins University’s Krieger School of Arts and Sciences, and Dr. Steven Leach from the McKusick-Nathans Institute of Genetic Medicine at the Johns Hopkins School of Medicine, recently published a paper in the journal Cell Reports exploring whether “cross-talk,” or reciprocal signaling, takes place between neurons in the sympathetic nervous system and the tissues those nerves connect to. In this case, the target tissues, called islets, were in the pancreas.

“We knew that sympathetic neurons need molecular signals from the tissues that they connect with in order to grow and survive,” said Kuruvilla. “What we did not know was whether the neurons would reciprocally signal to the target tissues to instruct them to grow and mature. It made sense to focus on the pancreas because of previous studies done in diabetic animal models where sympathetic nerves within the pancreas were found to retract early on in the disease, suggesting that dysfunction of the nerves could be an early trigger for pancreatic defects.”

The researchers spent approximately three years working with lab mice to test the various scenarios in which signaling between sympathetic neurons and islet cells might take place. The experiments focused on what effects removing the sympathetic nerves would have on pancreas development in newborn mice.

Previous studies had shown that pancreatic cells release a signal of their own, a nerve growth protein, that directs the sympathetic nerves toward the pancreas and provides necessary nutrition to sustain the nerves.

In turn, Kuruvilla’s team found that in mutant mice, the removal of the sympathetic neurons resulted in deformities in the architecture of the pancreatic islet cells and defects in insulin secretion and glucose metabolism.

Pancreatic islets are highly organized functional micro-organs with a defined size, shape and distinctive arrangement of endocrine cells. It’s this marriage of form and function – cells clustered close together – that creates greater, more efficient islet function.

However, the mutant mice, with their sympathetic neurons removed, had islet formations that were misshapen, sported lesions and developed in a patchy, uneven manner. Because of their dysfunctional islet cell development, postnatal mice did not secrete enough insulin when confronted with high glucose, and had high blood glucose levels as a result. Elevated blood glucose in humans is a hallmark of diabetes.

It’s known in neuroscience that the neurons in question from the sympathetic nervous system control the body’s “fight or flight” response and communicate with connected tissues by releasing a chemical messenger called norepinephrine. The release of norepinephrine also plays an important role in the development and maturation of islets, said Kuruvilla.

Using sympathetic neurons and islet cells grown together in a culture dish, the researchers observed that islet cells move toward the nerves and identified norepinephrine as the nerve signal that causes the movement of the islet cells.

“Seeing how these islet cells were responding to sympathetic neurons both in a dish and the effects of removing the nerves in a whole animal on islet shape and functions were pretty remarkable,” said Borden, lead author of the paper. “It was clear to us that sympathetic neurons were key to how islets were developing, something no one else had shown.”

Kuruvilla said these studies, identifying sympathetic nerves as a critical player in organizing pancreatic cells during development and influencing their later function, could add to a better understanding of treating diabetes in the future. The research also lends support to the value in considering the importance of external factors such as nerves and blood vessels when transplanting islet cells for the treatment of diabetes in patients.

“This study reveals interactions between two co-developing systems, sympathetic neurons and pancreatic islet cells, that have important implications for peripheral organ development, and for regeneration of these tissues following injury or disease,” said Kuruvilla.

Aug 16, 2013 · 53 notes
#sympathetic nervous system #sympathetic neurons #pancreatic cells #norepinephrine #neuroscience #science