Neuroscience

August 2013

Brain network decay detected in early Alzheimer’s

In patients with early Alzheimer’s disease, disruptions in brain networks emerge about the same time as chemical markers of the disease appear in the spinal fluid, researchers at Washington University School of Medicine in St. Louis have shown.

While two chemical markers in the spinal fluid are regarded as reliable indicators of early disease, the new study, published in JAMA Neurology, is among the first to show that scans of brain networks may be an equally effective and less invasive way to detect early disease.

“Tracking damage to these brain networks may also help us formulate a more detailed understanding of what happens to the brain before the onset of dementia,” said senior author Beau Ances, MD, PhD, associate professor of neurology and of biomedical engineering.

Diagnosing Alzheimer’s early is a top priority for physicians, many of whom believe that treating patients long before dementia starts greatly improves the chances of success.

Ances and his colleagues studied 207 older but cognitively normal research volunteers at the Charles F. and Joanne Knight Alzheimer’s Disease Research Center at Washington University. Over several years, spinal fluid from the volunteers was sampled multiple times and analyzed for two markers of early Alzheimer’s: changes in amyloid beta, the principal ingredient of Alzheimer’s brain plaques, and in tau protein, a structural component of nerve cells.

The volunteers were also scanned repeatedly using a technique called resting state functional magnetic resonance imaging (fMRI). This scan tracks the rise and fall of blood flow in different brain regions as patients rest in the scanner. Scientists use the resulting data to assess the integrity of the default mode network, a set of connections between different brain regions that becomes active when the mind is at rest.
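In resting-state analyses like this, network "integrity" is typically quantified as the correlation between the BOLD signal time series of the network's regions. A minimal sketch with synthetic signals (illustrative only, not the study's actual pipeline; region names and numbers are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic BOLD time series for two default-mode regions
# (say, posterior cingulate and medial temporal), 200 volumes.
shared = rng.standard_normal(200)           # common network fluctuation
pcc = shared + 0.5 * rng.standard_normal(200)
mtl = shared + 0.5 * rng.standard_normal(200)

def connectivity(a, b):
    """Functional connectivity as the Pearson correlation of two series."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

print(connectivity(pcc, mtl))  # high when the shared fluctuation is intact
```

Weakened coupling between regions, as reported in the study, would show up as a drop in this correlation.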

Earlier studies by Ances and other researchers have shown that Alzheimer’s damages connections in the default mode network and other brain networks.

The new study revealed that this damage became detectable at about the same time that amyloid beta levels began to fall and tau levels started to rise in spinal fluid. The part of the default mode network most harmed by the onset of Alzheimer’s disease was the connection between two brain areas associated with memory, the posterior cingulate and medial temporal regions.

The researchers are continuing to study the connections between brain network damage and the progress of early Alzheimer’s disease in normal volunteers and in patients in the early stages of Alzheimer’s-associated dementia.

Aug 20, 2013 · 54 notes
#alzheimer's disease #dementia #neuroimaging #beta amyloid #neuroscience #science
Copper Identified as Culprit in Alzheimer’s Disease

Copper appears to be one of the main environmental factors that trigger the onset and enhance the progression of Alzheimer’s disease by preventing the clearance and accelerating the accumulation of toxic proteins in the brain. That is the conclusion of a study appearing today in the journal Proceedings of the National Academy of Sciences.

“It is clear that, over time, copper’s cumulative effect is to impair the systems by which amyloid beta is removed from the brain,” said Rashid Deane, Ph.D., a research professor in the University of Rochester Medical Center (URMC) Department of Neurosurgery, member of the Center for Translational Neuromedicine, and the lead author of the study. “This impairment is one of the key factors that cause the protein to accumulate in the brain and form the plaques that are the hallmark of Alzheimer’s disease.” 

Copper’s presence in the food supply is ubiquitous. It is found in drinking water carried by copper pipes, nutritional supplements, and in certain foods such as red meats, shellfish, nuts, and many fruits and vegetables. The mineral plays an important and beneficial role in nerve conduction, bone growth, the formation of connective tissue, and hormone secretion. 

However, the new study shows that copper can also accumulate in the brain and cause the blood-brain barrier – the system that controls what enters and exits the brain – to break down, resulting in the toxic accumulation of the protein amyloid beta, a by-product of cellular activity. Using both mice and human brain cells, Deane and his colleagues conducted a series of experiments that pinpointed the molecular mechanisms by which copper accelerates the pathology of Alzheimer’s disease.

Under normal circumstances, amyloid beta is removed from the brain by a protein called lipoprotein receptor-related protein 1 (LRP1). These proteins – which line the capillaries that supply the brain with blood – bind with the amyloid beta found in brain tissue and escort it into the blood vessels, where it is removed from the brain.

The research team “dosed” normal mice with copper over a three-month period. The exposure consisted of trace amounts of the metal in drinking water, at one-tenth of the water quality standard for copper established by the Environmental Protection Agency.

“These are very low levels of copper, equivalent to what people would consume in a normal diet,” said Deane.

The researchers found that the copper made its way into the blood system and accumulated in the vessels that feed blood to the brain, specifically in the cellular “walls” of the capillaries. These cells are a critical part of the brain’s defense system and help regulate the passage of molecules to and from brain tissue. In this instance, the capillary cells prevent the copper from entering the brain. However, over time the metal can accumulate in these cells with toxic effect. 

The researchers observed that the copper disrupted the function of LRP1 through a process called oxidation which, in turn, inhibited the removal of amyloid beta from the brain. They observed this phenomenon in both mouse and human brain cells.

The researchers then looked at the impact of copper exposure on mouse models of Alzheimer’s disease. In these mice, the cells that form the blood-brain barrier have broken down and become “leaky” – a likely combination of aging and the cumulative effect of toxic assaults – allowing elements such as copper to pass unimpeded into the brain tissue. They observed that the copper stimulated activity in neurons that increased the production of amyloid beta. The copper also interacted with amyloid beta in a manner that caused the proteins to bind together in larger complexes, creating logjams of the protein that the brain’s waste disposal system cannot clear.

This one-two punch, inhibiting the clearance and stimulating the production of amyloid beta, provides strong evidence that copper is a key player in Alzheimer’s disease. In addition, the researchers observed that copper provoked inflammation of brain tissue, which may further promote the breakdown of the blood-brain barrier and the accumulation of Alzheimer’s-related toxins.

However, because the metal is essential to so many other functions in the body, the researchers say that these results must be interpreted with caution.

“Copper is an essential metal and it is clear that these effects are due to exposure over a long period of time,” said Deane. “The key will be striking the right balance between too little and too much copper consumption. Right now we cannot say what the right level will be, but diet may ultimately play an important role in regulating this process.”

Aug 20, 2013 · 263 notes
#science #alzheimer's disease #dementia #copper #amyloid plaques #blood brain barrier #neurology #neuroscience
Why One Cream Cake Leads to Another

Continuously eating fatty foods perturbs communication between the gut and brain, which in turn perpetuates a bad diet.

A chronic high-fat diet is thought to desensitize the brain to the feeling of satisfaction that one normally gets from a meal, causing a person to overeat in order to achieve the same high again. New research published today (August 15) in Science, however, suggests that this desensitization actually begins in the gut itself, where production of a satiety factor, which normally tells the brain to stop eating, becomes dialed down by the repeated intake of high-fat food.

“It’s really fantastic work,” said Paul Kenny, a professor of molecular therapeutics at The Scripps Research Institute in Jupiter, Florida, who was not involved in the study. “It could be a so-called missing link between gut and brain signaling, which has been something of a mystery.”

While pork belly, ice cream, and other high-fat foods produce an endorphin response in the brain when they hit the taste buds, according to Kenny, the gut also sends signals directly to the brain to control our feeding behavior. Indeed, mice nourished via gastric feeding tubes, which bypass the mouth, exhibit a surge in dopamine—a neurotransmitter promoting reinforcement in the brain’s reward circuitry—similar to that experienced by those eating normally.

This dopamine surge occurs in response to feeding in both mice and humans. But evidence suggests that dopamine signaling in the brain is deficient in obese people. Ivan de Araujo, a professor of psychiatry at the Yale School of Medicine, has now discovered that obese mice on a chronic high-fat diet also have a muted dopamine response when receiving fatty food via a direct tube to their stomachs.

To determine the nature of the dopamine-regulating signal emanating from the gut, Araujo and his team searched for possible candidates. “When you look at animals chronically exposed to high-fat foods, you see high levels of almost every circulating factor—leptin, insulin, triglycerides, glucose, et cetera,” he said. But one class of signaling molecule is suppressed. Of these, Araujo’s primary candidate was oleoylethanolamide. Not only is the factor produced by intestinal cells in response to food, he said, but during chronic high-fat exposure, “the suppression levels seemed to somehow match the suppression that we saw in dopamine release.”

Araujo confirmed oleoylethanolamide’s dopamine-regulating ability in mice by administering the factor via a catheter to the tissues surrounding their guts. “We discovered that by restoring the baseline level of [oleoylethanolamide] in the gut … the high-fat fed animals started having dopamine responses that were indistinguishable from their lean counterparts.”

The team also found that oleoylethanolamide’s effect on dopamine was transmitted via the vagus nerve, which runs between the brain and abdomen, and was dependent on its interaction with a transcription factor called PPAR-α.

Oleoylethanolamide levels are also reduced in fasting animals and increase in response to eating, communicating with the brain to stop further consumption once the belly is full. Indeed, oleoylethanolamide is a known satiety factor. Therefore, when chronic consumption of high-fat food diminishes its production, the satisfaction signal is not achieved, and the brain is essentially “blind to the presence of calories in the gut,” said Araujo, and thus demands more food.

It is not clear why a chronic high-fat diet suppresses the production of oleoylethanolamide. But once the vicious cycle starts, it is hard to break because the brain is receiving its information subconsciously, said Daniele Piomelli, a professor at the University of California, Irvine, and director of drug discovery and development at the Italian Institute of Technology in Genoa.

“We eat what we like, and we think we are conscious of what we like, but I think what this [paper] and others are indicating is that there is a deeper, darker side to liking—a side that we’re not aware of,” Piomelli said. “Because it is an innate drive, you cannot control it.” Put another way, even if you could trick your taste buds into enjoying low-fat yogurt, you’re unlikely to trick your gut.

The good news, however, is that “there is no permanent impairment in the [animals’] dopamine levels,” Araujo said. This suggests that if drugs could be designed to regulate the oleoylethanolamide-to-PPAR-α pathway in the gut, Kenny added, it could have “a huge impact on people’s ability to control their appetite.”

Aug 18, 2013 · 164 notes
#dopamine #dopamine deficiency #obesity #diet #appetite #neuroscience #science
Head hurts? Zap the wonder nerve in your neck

"It was like red-hot pokers needling one side of my face," says Catherine, recalling the cluster headaches she experienced for six years. "I just wanted it to stop." But it wouldn’t – none of the drugs she tried had any effect.

Thinking she had nothing to lose, last year she enrolled in a pilot study to test a handheld device that applies a bolt of electricity to the neck, stimulating the vagus nerve – the superhighway that connects the brain to many of the body’s organs, including the heart.

The results of the trial were presented last month at the International Headache Congress in Boston, and while the trial is small, the findings are positive. Of the 21 volunteers, 18 reported a reduction in the severity and frequency of their headaches, rating them, on average, 50 per cent less painful after using the device daily and whenever they felt a headache coming on.

This isn’t the first time vagal nerve stimulation has been used as a treatment – but it is one of the first that hasn’t required surgery. Some people with epilepsy have had a small generator implanted into their chest that sends regular electrical signals to the vagus nerve. Implanted devices have also been approved to treat depression. What’s more, there is increasing evidence that such stimulation could treat many more disorders, from headaches to stroke and possibly Alzheimer’s disease.

The latest study suggests it is possible to stimulate the nerve through the skin, rather than resorting to surgery. “What we’ve done is figured out a way to stimulate the vagus nerve with a very similar signal, but non-invasively through the neck,” says Bruce Simon, vice-president of research at New Jersey-based ElectroCore, makers of the handheld device. “It’s a simpler, less invasive way to stimulate the nerve.”

Cluster headaches are thought to be triggered by the overactivation of brain cells involved in pain processing. The neurotransmitter glutamate, which excites brain cells, is a prime suspect. ElectroCore turned to the vagus nerve as previous studies had shown that stimulating it in people with epilepsy releases neurotransmitters that dampen brain activity.

When the firm used a smaller version of ElectroCore’s device on rats, it found it reduced glutamate levels and excitability in these pain centres. Other studies have shown that vagus nerve stimulation causes the release of inhibitory neurotransmitters which counter the effects of glutamate.

The big question is whether a non-implantable device can really trigger changes in brain chemistry in humans, or whether people are simply experiencing a placebo effect. “The vagus nerve is buried deep in the neck, and something that’s delivering currents through the skin can only go so deep,” says Mike Kilgard of the University of Texas at Dallas. As you turn up the voltage, there’s a risk of it activating muscle fibres that trigger painful cramps, he adds.

Simon says that volunteers using the device haven’t reported any serious side effects. He adds that ElectroCore will soon publish data showing changes in brain activity in humans after using the device. Placebo-controlled trials are also about to start.

Catherine has been using it for a year without ill effect. “I can now function properly as a human being again,” she says.

The many uses of the wonder nerve

Coma, irritable bowel syndrome, asthma and obesity are just some of the disparate conditions that vagus nerve stimulation may benefit and for which human trials are under way.

It might also help people with tinnitus. Although people with tinnitus complain of ringing in their ears, the problem actually arises because too many neurons fire in the auditory part of the brain when certain frequencies are heard.

Mike Kilgard of the University of Texas at Dallas reasoned that if people were played tones that didn’t trigger tinnitus while the vagus nerve was stimulated, this might coax the rogue neurons into firing in response to these frequencies instead. “By activating this nerve we can enhance the brain’s ability to rewire itself,” he says.

He has so far tested the method in rats and in 10 people with tinnitus, using an implanted device to stimulate the nerve. Not everyone noticed an improvement, but even so Kilgard is planning a larger trial. The work was presented at a meeting of the International Union of Physiological Sciences in Birmingham, UK, last month. The technique is also being tested in people who have had a stroke.

"If these studies stand up it could be worth changing the name of the vagus nerve to the wonder nerve," says Sunny Ogbonnaya at Cork University Hospital in Ireland.

Aug 18, 2013 · 121 notes
#vagus nerve #vagal nerve stimulation #glutamate #headaches #brain activity #neuroscience #science
Device Could Spot Seizures by Reading Brainwaves through the Ear

Neuroscientists often use electroencephalography (EEG) as an inexpensive way to record electrical signals in the brain. Though it would be useful to run these recordings for long periods of time, that usually isn’t practical: EEG recording traditionally involves attaching many electrodes and cables to a patient’s scalp.

Now engineers at Imperial College London have developed an EEG device that can be worn inside the ear, like a hearing aid. They say the device will allow scientists to record EEGs for several days at a time; this would allow doctors to monitor patients who have regularly recurring problems like seizures or microsleep.

“The ideal is to have a very stable recording system, and recordings which are repeatable,” explains co-creator Danilo Mandic. “It’s not interfering with your normal life, because there are acoustic vents so people can hear. After a while, they forget they’re having an EEG.”

By nestling the EEG inside the ear, the engineers avoid a lot of signal noise usually introduced by body movement. They can also ensure that the electrodes are always placed in exactly the same spot, which, they say, will make repeated readings more reliable.

Since the device attaches to just one area, it can record only from the temporal region. This limits its potential applications to events that involve local activity. Tzyy-Ping Jung, co-director of the University of California, San Diego’s Center for Advanced Neurological Engineering, says that this does not mean the device will not be valuable.

“Different modalities will have different applications. I would not rule out the usefulness of any modalities,” says Jung. “I think it’s a very good idea with very promising results.”

Aug 18, 2013 · 154 notes
#EEG device #brain imaging #seizures #brainwaves #neuroscience #science
Female frogs prefer males who can multitask

From frogs to humans, selecting a mate is complicated. Females of many species judge suitors based on many indicators of health or parenting potential. But it can be difficult for males to produce multiple signals that demonstrate these qualities simultaneously.

In a study of gray tree frogs, a team of University of Minnesota researchers discovered that females prefer males whose calls reflect the ability to multitask effectively. In this species (Hyla chrysoscelis) males produce “trilled” mating calls that consist of a string of pulses.

Typical calls range from 20 to 40 pulses per call and occur at rates of 5 to 15 calls per minute. Males face a trade-off between call duration and call rate, yet females prefer calls that are both longer and more frequent, which is no simple feat for the males to produce.
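The trade-off can be sketched with the figures above, under the simplifying (hypothetical) assumption that a male can sustain only a fixed number of pulses per minute:

```python
# Hypothetical "pulse budget" a male can sustain each minute; the 300
# figure is an illustrative assumption, not from the study.
PULSE_BUDGET = 300

def max_call_rate(pulses_per_call):
    """Calls per minute achievable at a given call duration."""
    return PULSE_BUDGET / pulses_per_call

for duration in (20, 30, 40):  # pulses per call, the typical range
    print(duration, max_call_rate(duration))
# Longer calls force slower rates: 20 -> 15.0, 30 -> 10.0, 40 -> 7.5
```

Under any such budget, a male who both lengthens his calls and speeds up his rate is exceeding what an average male can sustain, which is why the combination can signal quality.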

The findings were published in the August issue of Animal Behaviour.

"It’s kind of like singing and dancing at the same time," says Jessica Ward, a postdoctoral researcher who is lead author for the study. Ward works in the laboratory of Mark Bee, a professor in the College of Biological Sciences’ Department of Ecology, Evolution and Behavior.

The study supports the multitasking hypothesis, which suggests that females prefer males who can do two or more hard-to-do things at the same time because these are especially good quality males, Ward says. The hypothesis, which explores how multiple signals produced by males influence female behavior, is a new area of interest in animal behavior research.

By listening to recordings of 1,000 calls, Ward and colleagues learned that males are indeed forced to trade off call duration and call rate. That is, males that produce relatively longer calls only do so at relatively slower rates.

"It’s easy to imagine that we humans might also prefer multitasking partners, such as someone who can successfully earn a good income, cook dinner, manage the finances and get the kids to soccer practice on time."

The study was carried out in connection with Bee’s research goal, which is understanding how female frogs are able to distinguish individual mating calls from a large chorus of males. By comparison, humans, especially as we age, lose the ability to distinguish individual voices in a crowd. This phenomenon, called the “cocktail party” problem, is often the first sign of a diminishing ability to hear. Understanding how frogs hear could lead to improved hearing aids.

Aug 17, 2013 · 56 notes
#multitasking #mating #frogs #animal behavior #psychology #neuroscience #science
Making the Brain Take Notice of Faces in Autism

A new study in Biological Psychiatry explores the influence of oxytocin

Difficulty in registering and responding to the facial expressions of other people is a hallmark of autism spectrum disorder (ASD). Relatedly, functional imaging studies have shown that individuals with ASD display altered brain activations when processing facial images.

The hormone oxytocin plays a vital role in the social interactions of both animals and humans. In fact, multiple studies conducted with healthy volunteers have provided evidence for beneficial effects of oxytocin in terms of increased trust, improved emotion recognition, and preference for social stimuli.

This combination of scientific work led German researchers to hypothesize about the influence of oxytocin in ASD. Dr. Gregor Domes, from the University of Freiburg and first author of the new study, explained: “In the present study, we were interested in the question of whether a single dose of oxytocin would change brain responses to social compared to non-social stimuli in individuals with autism spectrum disorder.”

They found that oxytocin did show an effect on social processing in the individuals with ASD, “suggesting that oxytocin may help to treat a basic brain function that goes awry in autism spectrum disorders,” commented Dr. John Krystal, Editor of Biological Psychiatry.

To conduct this study, they recruited fourteen individuals with ASD and fourteen control volunteers, all of whom completed a face- and house-matching task while undergoing imaging scans. Each participant completed this task and scanning procedure twice, once after receiving a nasal spray containing oxytocin and once after receiving a nasal spray containing placebo. The order of the sprays was randomized, and the tests were administered one week apart.

Using two sets of stimuli in the matching task, one of faces and one of houses, allowed the researchers not only to compare the effects of oxytocin and placebo, but also to distinguish effects specific to social stimuli from non-specific effects on more general brain processing.
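The logic of this two-by-two comparison is an interaction contrast: a drug effect that appears for faces but not houses is social-specific. A sketch with entirely hypothetical numbers (not the study's data):

```python
# Mean amygdala responses, arbitrary units; all values are invented
# purely to illustrate the drug-by-stimulus design.
amygdala = {
    ("oxytocin", "faces"):  1.8,
    ("placebo",  "faces"):  1.1,
    ("oxytocin", "houses"): 1.0,
    ("placebo",  "houses"): 1.0,
}

drug_effect_faces  = amygdala[("oxytocin", "faces")]  - amygdala[("placebo", "faces")]
drug_effect_houses = amygdala[("oxytocin", "houses")] - amygdala[("placebo", "houses")]

# Interaction contrast: positive means the drug effect is specific
# to the social (face) stimuli rather than general brain processing.
interaction = drug_effect_faces - drug_effect_houses
print(round(interaction, 2))  # 0.7 here: the effect is face-specific
```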

What they found was intriguing. The data indicate that oxytocin specifically increases responses of the amygdala to social stimuli in individuals with ASD. The amygdala, the authors explain, “has been associated with processing of emotional stimuli, threat-related stimuli, face processing, and vigilance for salient stimuli”.

This finding suggests oxytocin might promote the salience of social stimuli in ASD. Increased salience of social stimuli might support behavioral training of social skills in ASD.

These data support the idea that oxytocin may be a promising approach in the treatment of ASD and could stimulate further research, even clinical trials, on the exploration of oxytocin as an add-on treatment for individuals with autism spectrum disorder.

Aug 16, 2013 · 67 notes
#oxytocin #autism #ASD #amygdala #face processing #social cognition #neuroscience #science
Cell memory mechanism discovered

The cells in our bodies can divide as often as once every 24 hours, creating a new, identical copy. DNA-binding proteins called transcription factors are required for maintaining cell identity. They ensure that daughter cells have the same function as their mother cell, so that, for example, muscle cells can contract or pancreatic cells can produce insulin. However, each time a cell divides, the specific binding pattern of the transcription factors is erased and has to be restored in both mother and daughter cells. Previously it was unknown how this process worked, but now scientists at Karolinska Institutet have discovered the importance of particular protein rings encircling the DNA and how these function as the cell’s memory.

The DNA in human cells is translated into a multitude of proteins required for a cell to function. When, where and how proteins are expressed is determined by regulatory DNA sequences and a group of proteins, known as transcription factors, that bind to these DNA sequences. Each cell type can be distinguished based on its transcription factors, and a cell can in certain cases be directly converted from one type to another, simply by changing the expression of one or more transcription factors. It is critical that the pattern of transcription factor binding in the genome be maintained. During each cell division, the transcription factors are removed from DNA and must find their way back to the right spot after the cell has divided. Despite many years of intense research, no general mechanism has been discovered which would explain how this is achieved.

"The problem is that there is so much DNA in a cell that it would be impossible for the transcription factors to find their way back within a reasonable time frame. But now we have found a possible mechanism for how this cellular memory works, and how it helps the cell remember the order that existed before the cell divided, helping the transcription factors find their correct places", explains Jussi Taipale, professor at Karolinska Institutet and the University of Helsinki, and head of the research team behind the discovery.

The results are now being published in the scientific journal Cell. The research group has produced the most complete map yet of transcription factors in a cell. They found that a large protein complex called cohesin is positioned as a ring around the two DNA strands that are formed when a cell divides, marking virtually all the places on the DNA where transcription factors were bound. Cohesin encircles the DNA strand as a ring does around a piece of string, and the protein complexes that replicate DNA can pass through the ring without displacing it. Since the two new DNA strands are caught in the ring, only one cohesin is needed to mark the two, thereby helping the transcription factors to find their original binding region on both DNA strands.

"More research is needed before we can be sure, but so far all experiments support our model," says Martin Enge, assistant professor at Karolinska Institutet.

Transcription factors play a pivotal role in many illnesses, including cancer as well as many hereditary diseases. The discovery that virtually all regulatory DNA sequences bind to cohesin may also end up having more direct consequences for patients with cancer or hereditary diseases. Cohesin would function as an indicator of which DNA sequences might contain disease-causing mutations.

"Currently we analyse DNA sequences that are directly located in genes, which constitute about three per cent of the genome. However, most mutations that have been shown to cause cancer are located outside of genes. We cannot analyse these in a reliable manner – the genome is simply too large. Analysing only the DNA sequences that bind to cohesin, roughly one per cent of the genome, would allow us to analyse an individual’s mutations and make it much easier to conduct studies to identify novel harmful mutations," Martin Enge concludes.
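The rough arithmetic behind this search-space argument, taking the human genome as about 3.2 billion base pairs (a standard figure not stated in the article) and the percentages from the quote:

```python
GENOME_BP = 3_200_000_000  # approximate human genome size in base pairs

# Integer arithmetic keeps the round numbers exact.
genes_bp   = GENOME_BP * 3 // 100  # DNA directly in genes (~3%)
cohesin_bp = GENOME_BP * 1 // 100  # cohesin-bound regulatory DNA (~1%)

print(genes_bp)    # ~96 million bp covered by gene-focused analysis
print(cohesin_bp)  # ~32 million bp if restricted to cohesin-bound sites
```

Even the smaller cohesin-bound fraction is tens of millions of bases, but it is a tractable, targeted slice of the regulatory genome.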

Aug 16, 2013 · 113 notes
#transcription factors #DNA sequence #hereditary diseases #cohesin #genetics #neuroscience #science
Sympathetic Neurons Engage in “Cross Talk” With Cells in the Pancreas During Early Development

The human body is a complicated system of blood vessels, nerves, organs, tissue and cells, each with a specific job to do. When all are working together, it’s a symphony of form and function as each instrument plays its intended role.

Biologist Rejji Kuruvilla and her fellow researchers uncovered what happens when one instrument is not playing its part.

Kuruvilla, along with graduate students Philip Borden and Jessica Houtz, both from the Biology Department at Johns Hopkins University’s Krieger School of Arts and Sciences, and Dr. Steven Leach from the McKusick-Nathans Institute of Genetic Medicine at the Johns Hopkins School of Medicine, recently published a paper in the journal Cell Reports exploring whether “cross-talk,” or reciprocal signaling, takes place between neurons in the sympathetic nervous system and the tissues that the nerves connect to. In this case the target tissues, called islets, were in the pancreas.

“We knew that sympathetic neurons need molecular signals from the tissues that they connect with in order to grow and survive,” said Kuruvilla. “What we did not know was whether the neurons would reciprocally signal to the target tissues to instruct them to grow and mature. It made sense to focus on the pancreas because of previous studies done in diabetic animal models where sympathetic nerves within the pancreas were found to retract early on in the disease, suggesting that dysfunction of the nerves could be an early trigger for pancreatic defects.”

The researchers spent approximately three years working with lab mice to test the various scenarios in which signaling between sympathetic neurons and islet cells might take place. The experiments focused on what effects removing the sympathetic nerves would have on pancreas development in newborn mice.

Previous studies had shown that pancreatic cells release a signal of their own, a nerve growth protein, that directs the sympathetic nerves toward the pancreas and provides necessary nutrition to sustain the nerves.

In turn, Kuruvilla’s team found that in mutant mice, the removal of the sympathetic neurons resulted in deformities in the architecture of the pancreatic islet cells and defects in insulin secretion and glucose metabolism.

Pancreatic islets are highly organized functional micro-organs with a defined size, shape and distinctive arrangement of endocrine cells. It is this marriage of form and function, with cells clustered close together, that creates greater, more efficient islet function.

However, the mutant mice, with their sympathetic neurons removed, had islet formations that were misshapen, sported lesions and developed in a patchy, uneven manner. Because of their dysfunctional islet cell development, postnatal mice did not secrete enough insulin when confronted with high glucose, and had high blood glucose levels as a result. Elevated blood glucose in humans is a hallmark of diabetes.

It’s known in neuroscience that the neurons in question from the sympathetic nervous system control the body’s “fight or flight” response and communicate with connected tissues by releasing a chemical messenger called norepinephrine. The release of norepinephrine also plays an important role in the development and maturation of islets, said Kuruvilla.

Using sympathetic neurons and islet cells grown together in a culture dish, the researchers observed that islet cells move toward the nerves and identified norepinephrine as the nerve signal that causes the movement of the islet cells.

“Seeing how these islet cells were responding to sympathetic neurons both in a dish and the effects of removing the nerves in a whole animal on islet shape and functions were pretty remarkable,” said Borden, lead author of the paper. “It was clear to us that sympathetic neurons were key to how islets were developing, something no one else had shown.”

Kuruvilla said these studies, identifying sympathetic nerves as a critical player in organizing pancreatic cells during development and influencing their later function, could contribute to a better understanding of how to treat diabetes in the future. The research also supports considering external factors, such as nerves and blood vessels, when transplanting islet cells to treat diabetes in patients.

“This study reveals interactions between two co-developing systems, sympathetic neurons and pancreatic islet cells, that has important implications for peripheral organ development, and for regeneration of these tissues following injury or disease,” said Kuruvilla.

Aug 16, 201353 notes
#sympathetic nervous system #sympathetic neurons #pancreatic cells #norepinephrine #neuroscience #science
Aug 16, 201395 notes
#depression #biomarker #bipolar disorder #neuroimaging #psychology #neuroscience #science
Worms May Shed Light on Human Ability to Handle Chronic Stress

New research at Rutgers University may help shed light on how and why nervous system changes occur and what causes some people to suffer from life-threatening anxiety disorders while others are better able to cope.


Maureen Barr, a professor in the Department of Genetics, and a team of researchers found that the architecture of the six sensory brain cells in the roundworm responsible for receiving information undergoes major changes, becoming much more elaborate when the worm is put into a high-stress environment.

Scientists have known for some time that, under extreme stress, changes can occur in the tree-like dendrite structures that connect neurons in the human brain and enable our thought processes to work properly; such changes can alter brain cell development and result in anxiety disorders like depression and post-traumatic stress disorder, which affect millions of Americans each year.

What scientists don’t yet understand for sure, Barr says, is what causes these molecular changes in the brain.

“This type of research provides us necessary clues that ultimately could lead to the development of drugs to help those suffering with severe anxiety disorders,” Barr says.

In the study, published today in Current Biology, scientists at Rutgers identified six sensory nerve cells in the tiny, transparent roundworm known as C. elegans, along with an enzyme called KPC-1/furin, which in humans triggers a chemical reaction needed for essential life functions like blood clotting.

While in humans the enzyme also appears to play a role in the growth of tumors and the activation of several viruses and diseases, in the roundworm it enables simple neurons to morph into new, elaborately branched shapes when the animal is placed under adverse conditions.

Normally, this one-millimeter long worm develops from an embryo through four larval stages before molting into a reproductive adult. Put it under the stressful conditions of overcrowding, starvation and high temperature, and the worm transforms into an alternative larval stage known as the dauer, which is so stress-resistant it can survive almost anything – including the 2003 Space Shuttle Columbia disaster, which these worms were the only living things aboard to survive.

“These worms that normally have a short life cycle turn into super worms when they go into the dauer stage and can live for months, although they are no longer able to reproduce,” Barr says.

What is so interesting to Barr is that when a perceived threat is over, these tiny creatures and their IL2 neurons transform back to a normal lifespan and reproductive state, as if nothing had ever happened. Under a microscope, the complicated-looking tree-like connectors that receive information are pruned back and the worm appears as it did before the trauma occurred.

This type of neural reaction differs in humans who can suffer from extreme anxiety months or even years after the traumatic event even though they are no longer in a threatening situation.   

The ultimate goal, Barr says, is to determine how and why the nervous system responds to stress. By identifying molecular pathways that regulate neuronal remodeling, scientists may apply this knowledge to develop future therapeutics.

Aug 16, 201386 notes
#chronic stress #PTSD #anxiety #C. elegans #KPC-1/furin #neuroscience #science
Human eye movements for vision are remarkably adaptable

When something gets in the way of our ability to see, we quickly pick up a new way to look, in much the same way that we would learn to ride a bike, according to a new study published in the Cell Press journal Current Biology on August 15.


Our eyes are constantly on the move, darting this way and that four to five times per second. Now researchers have found that the precise manner of those eye movements can change within a matter of hours. This discovery by researchers from the University of Southern California might suggest a way to help those with macular degeneration better cope with vision loss.

"The system that controls how the eyes move is far more malleable than the literature has suggested," says Bosco Tjan of the University of Southern California. "We showed that people with normal vision can quickly adjust to a temporary occlusion of their foveal vision by adapting a consistent point in their peripheral vision as their new point of gaze."

The fovea refers to the small, center-most portion of the retina, which is responsible for our high-resolution vision. We move our eyes to direct the fovea to different parts of a scene, constructing a picture of the world around us. In those with age-related macular degeneration, progressive loss of foveal vision leads to visual impairment and blindness.

In the new study, MiYoung Kwon, Anirvan Nandy, and Tjan simulated a loss of foveal vision in six normally sighted young adults by blocking part of a visual scene with a gray disc that followed the individuals’ eye gaze. Those individuals were then asked to complete demanding object-following and visual-search tasks. Within three hours of working on those tasks, people showed a remarkably fast and spontaneous adjustment of eye movements. Once developed, that change in their “point of gaze” was retained over a period of weeks and was reengaged whenever their foveal vision was blocked.
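The manipulation itself is conceptually simple: on every video frame, a gray disc is redrawn wherever the participant is currently looking, so the fovea never receives useful input. A minimal numpy sketch of that gaze-contingent occlusion is below; the function name and parameters are illustrative, and in the actual experiment the mask is driven by an eye tracker in real time rather than a fixed coordinate.

```python
import numpy as np

def occlude_fovea(frame, gaze_xy, radius, gray=0.5):
    """Gray out a disc centered on the current point of gaze,
    hiding whatever the fovea would otherwise see."""
    h, w = frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    scotoma = (xs - gaze_xy[0]) ** 2 + (ys - gaze_xy[1]) ** 2 <= radius ** 2
    out = frame.copy()
    out[scotoma] = gray          # simulated loss of central vision
    return out

# One step of the display loop: redraw the mask at the latest gaze sample.
frame = np.random.rand(240, 320)
masked = occlude_fovea(frame, gaze_xy=(160, 120), radius=30)
```

Because the mask follows the gaze, any attempt to foveate a target keeps it hidden, which is what pushes the oculomotor system toward using a peripheral location as the new point of gaze.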

Tjan and his team say they were surprised by the rate of this adjustment. They note that patients with macular degeneration frequently do adapt their point of gaze, but in a process that takes months, not days or hours. They suggest that practice with a visible gray disc like the one used in the study might help speed that process of visual rehabilitation along. The discovery also reveals that the oculomotor (eye movement) system prefers control simplicity over optimality.

"Gaze control by the oculomotor system, although highly automatic, is malleable in the same sense that motor control of the limbs is malleable," Tjan says. "This finding is potentially very good news for people who lose their foveal vision due to macular diseases. It may be possible to create the right conditions for the oculomotor system to quickly adjust," Kwon adds.

Aug 16, 201376 notes
#eye movements #vision loss #macular degeneration #fovea #foveal vision #neuroscience #science
Aug 16, 2013131 notes
#science #visual processing #vision #neural circuitry #robotics #neuroscience
A New Wrinkle in Parkinson’s Disease Research

The active ingredient in an over-the-counter skin cream might do more than prevent wrinkles. Scientists have discovered that the drug, called kinetin, also slows or stops the effects of Parkinson’s disease on brain cells.


Scientists identified the link through biochemical and cellular studies, but the research team is now testing the drug in animal models of Parkinson’s. The research is published in the August 15, 2013 issue of the journal Cell.

“Kinetin is a great molecule to pursue because it’s already sold in drugstores as a topical anti-wrinkle cream,” says HHMI investigator Kevan Shokat of the University of California, San Francisco. “So it’s a drug we know has been in people and is safe.”

Parkinson’s disease is a degenerative disease that causes the death of neurons in the brain. Initially, the disease affects one’s movement and causes tremors, difficulty walking, and slurred speech. Later stages of the disease can cause dementia and broader health problems. In 2004, researchers studying an Italian family with a high prevalence of early-onset Parkinson’s disease discovered mutations in a protein called PINK1 associated with the inherited form of the disease.

Since then, studies have shown that PINK1 normally wedges into the membrane of damaged mitochondria, the organelles responsible for energy generation, where it causes another protein, Parkin, to be recruited to the mitochondria. Neurons require high levels of energy production, so mitochondrial damage can lead to neuronal death. When Parkin is present on damaged mitochondria, however, studding the mitochondrial surface, the cell is able to survive the damage. In people who inherit mutations in PINK1, Parkin is never recruited to the organelles, leading to more frequent neuronal death than usual.

Shokat and his colleagues wanted to develop a way to turn on or crank up PINK1 activity, thereby preventing an excess of cell death, in those with inherited Parkinson’s disease. But turning on the activity of a mutant enzyme is typically more difficult than blocking the activity of an overactive version.

“When we started this project, we really thought that there would be no conceivable way to make something that directly turns on the enzyme,” says Shokat. “For any enzyme we know that causes a disease, we have ways to make inhibitors but no real ways to turn up activity.”

His team expected it would have to find a less direct way to mimic the activity of PINK1 and recruit Parkin. In the hopes of more fully understanding how PINK1 works, they began investigating how PINK1 binds to ATP, the energy molecule that normally turns it on. In one test, instead of adding ATP to the enzymes, they added different ATP analogs, versions of ATP with altered chemical groups that slightly change its shape. Scientists typically must engineer new versions of proteins to be able to accept these analogs, since they don’t fit into the typical ATP binding site. But to Shokat’s surprise, one of the analogs—kinetin triphosphate, or KTP—turned on the activity of not only normal PINK1, but also the mutated version, which doesn’t bind ATP.

“This drug does something that chemically we just never thought was possible,” says Shokat. “But it goes to show that if you find the right key for the right lock, you’ll be able to open the door.”

To test whether the binding of KTP to PINK1 led to the same consequences as the usual ATP binding, Shokat’s group measured the activity of PINK1 directly, as well as the downstream consequences of this activity, including the amount of Parkin recruited to the mitochondrial surface, and the levels of cell death. Adding the precursor of KTP, kinetin, to cells—both those with PINK1 mutations and those with normal physiology—amplified the activity of PINK1, increased the level of Parkin on damaged mitochondria, and decreased levels of neuron death, they found.

“What we have here is a case where the molecular target has been shown to be important to Parkinson’s in human genetic studies,” says Shokat. “And now we have a drug that specifically acts on this target and reverses the cellular causes of the disease.”

The similar results in cells with and without PINK1 mutations suggest that kinetin, which is a precursor to KTP, could be used to treat not only Parkinson’s patients with a known PINK1 mutation, but to slow progression of the disease in those without a family history by decreasing cell death.

Shokat is now performing experiments on the effects of kinetin in mice with various forms of Parkinson’s disease. However, the usefulness of animal models in Parkinson’s research has been debated, and the positive results from the cellular data, he says, are therefore as good an indicator as results in animals that this drug has potential to treat Parkinson’s in humans. Initial human studies will likely focus on the small population of patients with PINK1 mutations, and if successful in that group the drug could later be tested in a wider array of Parkinson’s patients.

Aug 16, 201374 notes
#parkinson's disease #kinetin #animal model #PINK1 mutations #genetics #neuroscience #science
Aug 15, 201340 notes
#alzheimer's disease #dementia #genetics #mRNA #neurology #neuroscience #science
Study debunks controversial MS theory

There is no evidence that impaired blood flow or blockage in the veins of the neck or head is involved in multiple sclerosis, says a McMaster University study.

The research, published online by PLOS ONE Wednesday, found no evidence of abnormalities in the internal jugular or vertebral veins or in the deep cerebral veins of any of 100 patients with multiple sclerosis (MS) compared with 100 people who had no history of any neurological condition.

The study contradicts a controversial theory that MS, a chronic, neurodegenerative and inflammatory disease of the central nervous system, is associated with abnormalities in the drainage of venous blood from the brain. In 2008, Italian researcher Paolo Zamboni proposed that angioplasty, a blockage-clearing procedure, would help MS patients with a condition he called chronic cerebrospinal venous insufficiency (CCSVI). This caused a flood of public response in Canada and elsewhere, with many concerned individuals lobbying for support of the ‘Liberation Treatment’ to clear the veins, as advocated by Zamboni.

“This is the first Canadian study to provide compelling evidence against the involvement of CCSVI in MS,” said principal investigator Ian Rodger, a professor emeritus of medicine in the Michael G. DeGroote School of Medicine. “Our findings bring a much needed perspective to the debate surrounding venous angioplasty for MS patients.”

In the study, all participants received an ultrasound of the deep cerebral and neck veins, as well as magnetic resonance imaging (MRI) of the neck veins and brain. Each participant had both examinations performed on the same day. The McMaster research team included a radiologist and two ultrasound technicians who had trained in the Zamboni technique at the Department of Vascular Surgery of the University of Ferrara.

Aug 15, 201350 notes
#MS #neuroimaging #cerebral veins #vertebral veins #neurology #neuroscience #science
Brain scans could predict response to antipsychotic medication

Researchers from King’s College London and the University of Nottingham have identified neuroimaging markers in the brain which could help predict whether people with psychosis respond to antipsychotic medications or not.


In approximately half of young people experiencing their first episode of a psychosis (FEP), the symptoms do not improve considerably with the initial medication prescribed, increasing the risk of subsequent episodes and worse outcome. Identifying individuals at greatest risk of not responding to existing medications could help in the search for improved medications, and may eventually help clinicians personalize treatment plans.

In a study published today in JAMA Psychiatry, researchers used structural Magnetic Resonance Imaging (MRI) to scan the brains of 126 individuals – 80 presenting with FEP, and 46 healthy controls. Participants had an MRI scan shortly after their FEP, and another assessment 12 weeks later, to establish whether symptoms had improved following the first treatment with antipsychotic medications.

The researchers examined a particular feature of the brain called “cortical gyrification” - the extent of folding of the cerebral cortex and a marker of how it has developed. They found that the individuals who did not respond to treatment already had a significant reduction in gyrification across multiple brain regions, compared to patients who did respond and to individuals without psychosis. This reduced gyrification was particularly present in brain areas considered important in psychosis, such as the temporal and frontal lobes. Those who responded to treatment were virtually indistinguishable from the healthy controls.
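Gyrification is typically quantified as a ratio of the folded cortical surface to a smooth outer envelope, so "reduced gyrification" means the cortex is less folded than expected. The toy 2-D sketch below illustrates the classic contour-ratio form of the index; it is an assumption-laden simplification, not the surface-based pipeline the authors used, and it requires numpy and scipy.

```python
import numpy as np
from scipy.spatial import ConvexHull

def contour_length(points):
    # Perimeter of a closed 2-D contour given ordered vertices.
    diffs = np.diff(np.vstack([points, points[:1]]), axis=0)
    return np.hypot(diffs[:, 0], diffs[:, 1]).sum()

def gyrification_index(points):
    # Folded contour length divided by the smooth outer envelope
    # (convex hull); 1.0 means no folding at all.
    hull = ConvexHull(points)  # 2-D hull vertices come back in CCW order
    return contour_length(points) / contour_length(points[hull.vertices])

theta = np.linspace(0, 2 * np.pi, 2000, endpoint=False)
smooth = np.c_[np.cos(theta), np.sin(theta)]   # unfolded "cortex"
r = 1 + 0.15 * np.sin(24 * theta)              # add sinusoidal folds
folded = np.c_[r * np.cos(theta), r * np.sin(theta)]

gi_smooth = gyrification_index(smooth)   # ~1.0, no folding
gi_folded = gyrification_index(folded)   # well above 1
```

In this toy version, a lower index for the non-responder group would correspond to a folded contour closer in length to its outer envelope, i.e. a flatter, less convoluted surface.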

The researchers also investigated whether the differences could be explained by the type of diagnosis of psychosis (e.g. with or without affective symptoms, such as depression or elated mood). They found that reduced gyrification predicted non-response to treatment independently of the diagnosis.

Dr Paola Dazzan from the Department of Psychosis Studies at King’s College London’s Institute of Psychiatry, and senior author of the paper, says: “Our study provides crucial evidence of a neuroimaging marker that, if validated, could be used early in psychosis to help identify those people less likely to respond to medications. It is possible that the alterations we observed are due to differences in the way the brain has developed early on in people who do not respond to medication compared to those who do.”

She continues: “There have been few advances in developing novel antipsychotic drugs over the past 50 years and we still face the same problems with a sub-group of people who do not respond to the drugs we currently use. We could envisage using a marker like this one to identify people who are least likely to respond to existing medications and focus our efforts on developing new medication specifically adapted to this group. In the longer term, if we were able to identify poor responders at the outset, we may be able to formulate personalized treatment plans for that individual patient.”

Dr Lena Palaniyappan from the University of Nottingham adds: “All of us have complex and varying patterns of folding in our brains. For the first time we are showing that the measurement of these variations could potentially guide us in treating psychosis. It is possible that people with specific patterns of brain structure respond better to treatments other than antipsychotics that are currently in use. Clearly, the time is ripe for us to focus on utilising neuroimaging to guide treatment decisions.”

Psychosis is a term used to indicate mental health disorders that present with symptoms like hallucinations (such as hearing voices) or delusions (unshakeable beliefs based on the person’s altered perception of reality, which may not correspond to the way others see the world). Psychotic episodes are present in conditions such as schizophrenia and bipolar disorder.

Approximately 1 in 100 people in England have at least one episode of psychosis throughout their lives. In most cases, psychosis develops during late adolescence (15 or above) or adulthood. Treatment involves a combination of antipsychotic medication, psychological therapies and social support. Many people with psychosis go on to lead ordinary lives and for about 60% of people, the symptoms disappear within 12 months from onset. However, for others, treatment is less straightforward and many do not respond to the initial antipsychotic treatment prescribed by their doctor. Early response to antipsychotic medication is known to be associated with better outcome and fewer subsequent episodes, and intervening early with effective treatments is therefore important.

Aug 15, 2013115 notes
#brain scans #antipsychotic medications #neuroimaging #psychosis #cortical gyrification #neuroscience #science
Aug 15, 2013130 notes
#science #brain activity #EEG #loss of balance #sensorimotor cortex #neuroscience
Aug 15, 2013149 notes
#AI #computer chips #memristor devices #neural networks #neuroscience #science
Aug 15, 2013166 notes
#axons #dendrites #nerve damage #neurons #neuronal circuit #neurodegenerative diseases #neuroscience #science
Aug 15, 2013672 notes
#brain function #right-brained #left-brained #neuroimaging #personality traits #psychology #neuroscience #science
Newly Discovered ‘Switch’ Plays Dual Role In Memory Formation

Researchers at Johns Hopkins have uncovered a protein switch that can either increase or decrease memory-building activity in brain cells, depending on the signals it detects. Its dual role means the protein is key to understanding the complex network of signals that shapes our brain’s circuitry, the researchers say. A description of their discovery appears in the July 31 issue of the Journal of Neuroscience.

“What’s interesting about this protein, AGAP3, is that it is effectively double-sided: One side beefs up synapses in response to brain activity, while the other side helps bring synapse-building back down to the brain’s resting state,” says Richard Huganir, Ph.D., a professor and director of the Solomon H. Snyder Department of Neuroscience at the Johns Hopkins University School of Medicine and co-director of the Brain Science Institute at Johns Hopkins. “The fact that it links these two opposing activities indicates AGAP3 may turn out to be central to controlling the strength of synapses.”

Huganir has long studied how connections between brain cells, known as synapses, are strengthened and weakened to form or erase memories. The new discovery came about when he and postdoctoral fellow Yuko Oku, Ph.D., investigated the chain reaction of signals involved in one type of synaptic strengthening.

In a study of the proteins that interact with one of the known proteins from that chain reaction, the previously unknown AGAP3 turned up. It contained not only a site designed to bind another protein involved in the chain reaction that leads from brain stimulation to learning, but also a second site involved in bringing synapse-building activity down to normal levels after a burst of activity.

Although it might seem the two different functions are behaving at cross-purposes, Oku says, it also could be that nature’s bundling of these functions together in a single protein is an elegant way of enabling learning and memory while preventing dangerous overstimulation. More research is needed, Oku says, to figure out whether AGAP3’s two sites coordinate by affecting each other’s activity, or are effectively free agents.

Aug 14, 201372 notes
#memory #synapses #AGAP3 #AMPA receptors #NMDA receptors #LTP #neuroscience #science
Aug 14, 2013168 notes
#hypnotic suggestions #consciousness #color perception #brain activity #visual hallucinations #neuroscience #science
Study identifies new culprit that may make aging brains susceptible to neurodegenerative diseases

The steady accumulation of a protein in healthy, aging brains may explain seniors’ vulnerability to neurodegenerative disorders, a new study by researchers at the Stanford University School of Medicine reports.

The study’s unexpected findings could fundamentally change the way scientists think about neurodegenerative disease.

The pharmaceutical industry has spent billions of dollars on futile clinical trials directed at treating Alzheimer’s disease by ridding brains of a substance called amyloid plaque. But the new findings have identified another mechanism, involving an entirely different substance, that may lie at the root not only of Alzheimer’s but of many other neurodegenerative disorders — and, perhaps, even the more subtle decline that accompanies normal aging.

The study, published Aug. 14 in the Journal of Neuroscience, reveals that with advancing age, a protein called C1q, well-known as a key initiator of immune response, increasingly lodges at contact points connecting nerve cells in the brain to one another. Elevated C1q concentrations at these contact points, or synapses, may render them prone to catastrophic destruction by brain-dwelling immune cells, triggered when a catalytic event such as brain injury, systemic infection or a series of small strokes unleashes a second set of substances on the synapses.

“No other protein has ever been shown to increase nearly so profoundly with normal brain aging,” said Ben Barres, MD, PhD, professor and chair of neurobiology and senior author of the study. Examinations of mouse and human brain tissue showed as much as a 300-fold age-related buildup of C1q.

The finding was made possible by the diligence and ingenuity of the study’s lead author, Alexander Stephan, PhD, a postdoctoral scholar in Barres’ lab. Stephan screened about 1,000 antibodies before finding one that binds to C1q and nothing else. (Antibodies are proteins, generated by the immune system, that adhere to specific “biochemical shapes,” such as surface features of invading pathogens.)

Comparing brain tissue from mice of varying ages, as well as postmortem samples from a 2-month-old infant and an older person, the researchers showed that these C1q deposits weren’t randomly distributed along nerve cells but, rather, were heavily concentrated at synapses. Analyses of brain slices from mice across a range of ages showed that as the animals age, the deposits spread throughout the brain.

“The first regions of the brain to show a dramatic increase in C1q are places like the hippocampus and substantia nigra, the precise brain regions most vulnerable to neurodegenerative diseases like Alzheimer’s and Parkinson’s disease, respectively,” said Barres. Another region affected early on, the piriform cortex, is associated with the sense of smell, whose loss often heralds the onset of neurodegenerative disease.

Other scientists have observed moderate, age-associated increases (on the order of three- or four-fold) in brain levels of the messenger-RNA molecule responsible for transmitting the genetic instructions for manufacturing C1q to the protein-making machinery in cells. Testing for messenger-RNA levels — typically considered reasonable proxies for how much of a particular protein is being produced — is fast, easy and cheap compared with analyzing proteins.

But in this study, Barres and his colleagues used biochemical measures of the protein itself. “The 300-fold rise in C1q levels we saw in 2-year-old mice — equivalent to 70- or 80-year-old humans — knocked my socks off,” Barres said. “I was not expecting that at all.”

C1q is the first batter on a 20-member team of immune-response-triggering proteins, collectively called the complement system. C1q is capable of clinging to the surface of foreign bodies such as bacteria or to bits of our own dead or dying cells. This initiates a molecular chain reaction known as the complement cascade. One by one, the system’s other proteins glom on, coating the offending cell or piece of debris. This in turn draws the attention of omnivorous immune cells that gobble up the target.

The brain has its own set of immune cells, called microglia, which can secrete C1q. Still other brain cells, called astrocytes, secrete all of C1q’s complement-system “teammates.” The two cell types work analogously to the two tubes of an epoxy kit, in which one tube contains the resin, the other a catalyst.

Previous work in Barres’ lab has shown that the complement cascade plays a critical role in the developing brain. A young brain generates an excess of synapses, creating a huge range of options for the potential formation of new neural circuits. These synapses strengthen or weaken over time, in response to their heavy use or neglect. The presence of feckless connections contributes noise to the system, so the efficiency of the maturing brain’s architecture is improved if these underused synapses are pruned away.

In a 2007 paper in Cell, Barres’ group reported that the complement system is essential to synaptic pruning in normal, developing brains. Then in 2012, in Neuron, in a collaboration with the lab of Harvard neuroscientist Beth Stevens, PhD, they showed that it is specifically microglia — the brain’s in-house immune cells — that attack and ingest complement-coated synapses.

Barres now believes something similar is happening in the normal, aging brain. C1q, but not the other protein components of the complement system, gradually becomes highly prevalent at synapses. By itself, this C1q buildup doesn’t trigger wholesale synapse loss, the researchers found — although it does seem to impair their performance. Old mice whose capacity to produce C1q had been eliminated performed subtly better on memory and learning tests than normal older mice did.

Still, this leaves the aging brain’s synapses precariously perched on the brink of catastrophe. A subsequent event such as brain trauma, a bad case of pneumonia or perhaps a series of tiny strokes that some older people experience could incite astrocytes — the second tube in the epoxy kit — to start secreting the other complement-system proteins required for synapse destruction.

Most cells in the body have their own complement-inhibiting agents. This prevents the wholesale loss of healthy tissue during an immune attack on invading pathogens or debris from dead tissue during wound healing. But nerve cells lack their own supply of complement inhibitors. So, when astrocytes get activated, their ensuing release of C1q’s teammates may set off a synapse-destroying rampage that spreads “like a fire burning through the brain,” Barres said.

“Our findings may well explain the long-mysterious vulnerability specifically of the aging brain to neurodegenerative disease,” he said. “Kids don’t get Alzheimer’s or Parkinson’s. Profound activation of the complement cascade, associated with massive synapse loss, is the cardinal feature of Alzheimer’s disease and many other neurodegenerative disorders. People have thought this was because synapse loss triggers inflammation. But our findings here suggest that activation of the complement cascade is driving synapse loss, not the other way around.”

Aug 14, 201368 notes
#neurodegenerative diseases #aging #alzheimer's disease #immune cells #microglia #neuroscience #science
Aug 14, 201382 notes
#dyslexia #language processing #arcuate fasciculus #neuroimaging #neuroscience #science
Aug 13, 2013109 notes
#dementia #aphasia #primary progressive aphasia #cognitive impairment #neuroimaging #neuroscience #science
New clue on the origin of Huntington’s disease

The synapses in the brain act as key communication points between approximately one hundred billion neurons. They form a complex network connecting various centres in the brain through electrical impulses.

New research from Lund University suggests that it is precisely here, in the synapses, that Huntington’s disease might begin.

The researchers looked into the brains of mice using real-time imaging methods, following some of the very first stages of the disease through advanced microscopes. What they discovered was a previously unobserved breakdown of synaptic activity. Long before the well-documented nerve cell death, synapses that are important for communication between brain centres controlling memory and learning begin to wither. This process had never been mapped before and could be an important step towards understanding the serious non-motor symptoms that affect Huntington patients long before the movement disorders start to show.

“With the naked eye, we have now been able to follow the step-by-step events when these synapses start to break down. If we are to halt or reverse this process in the future, it is necessary to understand exactly what happens in the initial phase of the disease. Now we know more,” says Professor Jia-Yi Li, the research group leader.

Huntington’s disease has long been characterized by the involuntary writhing movements patients experience. But in fact, Huntington’s has a very broad and highly individual symptomatology. Depression, memory loss and sleep disorders are all common early in the disease.

“Many patients testify that these symptoms affect quality of life significantly more than the involuntary jerky movements. Therefore, it is extremely important that we achieve progress in this field of research. Our goal now is to find new therapies that can increase the lifespan of these synapses and maintain their vital function,” explains postdoc Reena, who led the imaging experiments.

Aug 13, 2013 · 74 notes
#huntington's disease #synapses #synaptic activity #memory #learning #neuroscience #science
Aug 13, 2013 · 88 notes
#stroke #retina #retinal imaging #blood vessels #hypertensive retinopathy #medicine #science
Aug 13, 2013 · 53 notes
#olfactory bulb #olfactory retentivity #odor memory #memory #channelrhodopsin #neuroscience #science
Aug 13, 2013 · 183 notes
#consciousness #near-death experience #brain activity #dying brain #animal model #neuroscience #science
There's Life After Radiation for Brain Cells

Johns Hopkins researchers suggest neural stem cells may regenerate after anti-cancer treatment

image

Scientists have long believed that healthy brain cells, once damaged by radiation designed to kill brain tumors, cannot regenerate. But new Johns Hopkins research in mice suggests that neural stem cells, the body’s source of new brain cells, are resistant to radiation, and can be roused from a hibernation-like state to reproduce and generate new cells able to migrate, replace injured cells and potentially restore lost function.

“Despite being hit hard by radiation, it turns out that neural stem cells are like the special forces, on standby waiting to be activated,” says Alfredo Quiñones-Hinojosa, M.D., a professor of neurosurgery at the Johns Hopkins University School of Medicine and leader of a study described online today in the journal Stem Cells. “Now we might figure out how to unleash the potential of these stem cells to repair human brain damage.”

The findings, Quiñones-Hinojosa adds, may have implications not only for brain cancer patients, but also for people with progressive neurological diseases such as multiple sclerosis (MS) and Parkinson’s disease (PD), in which cognitive functions worsen as the brain suffers permanent damage over time.

In Quiñones-Hinojosa’s laboratory, the researchers examined the impact of radiation on mouse neural stem cells by testing the rodents’ responses to a subsequent brain injury. To do the experiment, the researchers used a device invented and used only at Johns Hopkins that accurately simulates localized radiation used in human cancer therapy. Other techniques, the researchers say, use too much radiation to precisely mimic the clinical experience of brain cancer patients.

In the weeks after radiation, the researchers injected the mice with lysolecithin, a substance that causes brain damage by inducing a demyelinating brain lesion, much like that present in MS. They found that neural stem cells within the irradiated subventricular zone of the brain generated new cells, which rushed to the damaged site to rescue newly injured cells. A month later, the new cells had incorporated into the demyelinated area, where new myelin, the fatty insulation that protects nerve fibers, was being produced.

“These mice have brain damage, but that doesn’t mean it’s irreparable,” Quiñones-Hinojosa says. “This research is like detective work. We’re putting a lot of different clues together. This is another tiny piece of the puzzle. The brain has some innate capabilities to regenerate and we hope there is a way to take advantage of them. If we can let loose this potential in humans, we may be able to help them recover from radiation therapy, strokes, brain trauma, you name it.”

His findings may not be all good news, however. Neural stem cells have been linked to brain tumor development, Quiñones-Hinojosa cautions. The radiation resistance his experiments uncovered, he says, could explain why glioblastoma, the deadliest and most aggressive form of brain cancer, is so hard to treat with radiation.

Aug 13, 2013 · 110 notes
#brain cancer #glioblastoma #stem cells #radiation #demyelination #neurology #neuroscience #science
Scientists develop ‘molecular flashlight’ that illuminates brain tumors in mice

In a breakthrough that could have wide-ranging applications in molecular medicine, Stanford University researchers have created a bioengineered peptide that enables imaging of medulloblastomas, among the most devastating of malignant childhood brain tumors, in lab mice.

image

The researchers altered the amino acid sequence of a cystine knot peptide — or knottin — derived from the seeds of the squirting cucumber, a plant native to Europe, North Africa and parts of Asia. Peptides are short chains of amino acids that are integral to cellular processes; knottin peptides are notable for their stability and resistance to breakdown.

The team used their invention as a “molecular flashlight” to distinguish tumors from surrounding healthy tissue. After injecting their bioengineered knottin into the bloodstreams of mice with medulloblastomas, the researchers found that the peptide stuck tightly to the tumors and could be detected using a high-sensitivity digital camera.

The findings are described in a study published online Aug. 12 in the Proceedings of the National Academy of Sciences.

“Researchers have been interested in this class of peptides for some time,” said Jennifer Cochran, PhD, an associate professor of bioengineering and a senior author of the study. “They’re extremely stable. For example, you can boil some of these peptides or expose them to harsh chemicals, and they’ll remain intact.”

That makes them potentially valuable in molecular medicine. Knottins could be used to deliver drugs to specific sites in the body or, as Cochran and her colleagues have demonstrated, as a means of illuminating tumors.

For treatment purposes, it’s critical to obtain accurate images of medulloblastomas. In conjunction with chemotherapy and radiation therapy, the tumors are often treated by surgical resection, and it can be difficult to remove them while leaving healthy tissue intact because their margins are often indistinct.

“With brain tumors, you really need to get the entire tumor and leave as much unaffected tissue as possible,” Cochran said. “These tumors can come back very aggressively if not completely removed, and their location makes cognitive impairment a possibility if healthy tissue is taken.”

The researchers’ molecular flashlight works by recognizing a biomarker on human tumors. The bioengineered knottin is conjugated to a near-infrared imaging dye. When injected into the bloodstreams of a strain of mice that develop tumors similar to human medulloblastomas, the peptide attaches to the brain tumors’ integrin receptors — sticky molecules that aid in adhesion to other cells.

But while the knottins stuck like glue to tumors, they were rapidly expelled from healthy tissue. “So the mouse brain tumors are readily apparent,” Cochran said. “They differentiate beautifully from the surrounding brain tissue.”

The new peptide represents a major advance in tumor-imaging technology, said Melanie Hayden Gephart, MD, neurosurgery chief resident at the Stanford Brain Tumor Center and a lead author of the paper.

"The most common technique to identify brain tumors relies on preoperative, intravenous injection of a contrast agent, enabling most tumors to be visualized on a magnetic resonance imaging scan," Gephart said. These MRI scans are used like in a computer program much like an intraoperative GPS system to locate and resect the tumors.

“But that has limitations,” she added. “When you’re using the contrast in an MRI scan to define the tumor margins, you’re basically working off a preoperative snapshot. The brain can sometimes shift during an operation, so there’s always the possibility you may not be as precise or accurate as you want to be. The great potential advantage of this new approach would be to illuminate the tumor in real time — you could see it directly under your microscope instead of relying on an image that was taken before surgery.”

Though the team’s research focused on medulloblastomas, Gephart said it’s likely the new knottins could prove useful in addressing other cancers.

“We know that integrins exist on many types of tumors,” she said. “The blood vessels that tumors develop to sustain themselves also contain integrins. So this has the potential for providing very detailed, real-time imaging for a wide variety of tumors.”

And imaging may not be the only application for the team’s engineered peptide.

“We’re very interested in related opportunities,” Cochran said. “We envision options we didn’t have before for getting molecules into the brain.” In other words, by substituting drugs for dye, the knottins might allow the delivery of therapeutic compounds directly to cranial tumors — something that has proved extremely difficult to date because of the blood/brain barrier, the mechanism that makes it difficult for pathogens, as well as medicines, to traverse from the bloodstream to the brain.

“We’re looking into it now,” Cochran said.

A little serendipity was involved in the peptide’s development, said Sarah Moore, a recently graduated bioengineering PhD student and another lead author of the study. Indeed, the propinquity of Cochran’s laboratory to co-author Matthew Scott’s lab at Stanford’s James H. Clark Center catalyzed the project. “Our labs are next to each other,” Moore said. “We had the peptide, and Matt had ideal models of pediatric brain tumors — mice that develop tumors in a similar manner to human medulloblastomas. Our partnership grew out of that.”

Scott, PhD, professor of bioengineering and of developmental biology, credits the design of the Clark Center as a contributor to the project. The building is home to Stanford’s Bioengineering Department, a collaboration between the School of Engineering and the School of Medicine, and Stanford Bio-X, an initiative that encourages communication among researchers in diverse scientific disciplines.

“So in a very real sense, our project wasn’t an accident,” Scott said. “In fact, it’s exactly the kind of work the Clark Center was meant to foster. The lab spaces are wide and open, with very few walls and lots of glass. We have a restaurant that only has large tables — no tables for two, so people have to sit together. Everything is designed to increase the odds that people will meet and talk. It’s a form of social engineering that really works.”

Scott said he is gratified by the collaboration that led to the team’s breakthrough, and observed that the peptide has proved a direct boon to his own work. About 15 percent of Scott’s mice develop the tumors requisite for medulloblastoma research. The problem, he said, is that the cancers are cryptic in their early stages.

“By the time you know the mice have them, many of the things you want to study — the genesis and development of the tumors — are past,” Scott said. “We needed ways to detect these tumors early, and we needed methods for following the steps of tumor genesis.”

Ultimately, Scott concluded, the development of the new peptide can be attributed to Stanford’s long-established traditions of openness and relentless inquiry.

“You find not just a willingness, but an eagerness to exchange ideas and information here,” Scott said. “It transcends any competitive instinct, any impulse toward proprietary thinking. It is what makes Stanford — well, Stanford.”

Aug 13, 2013 · 89 notes
#medulloblastomas #brain tumors #integrins #peptide #medicine #science
Aug 13, 2013 · 134 notes
#brain clots #intracerebral hemorrhage #technology #neurology #neuroscience #science
Aug 13, 2013 · 137 notes
#brain mapping #lateral prefrontal cortex #posterior parietal cortex #cognitive processing #neural networks #neuroscience #science
Neuroscientists identify protein linked to Alzheimer's-like afflictions

A team of neuroscientists has identified a modification to a protein in laboratory mice linked to conditions associated with Alzheimer’s Disease. Their findings, which appear in the journal Nature Neuroscience, also point to a potential therapeutic intervention for alleviating memory-related disorders.

The research centered on eukaryotic initiation factor 2 alpha (eIF2alpha) and two enzymes that modify it with a phosphate group; this type of modification is termed phosphorylation. The phosphorylation of eIF2alpha, which decreases protein synthesis, was previously found at elevated levels in both humans diagnosed with Alzheimer’s and in Alzheimer’s Disease (AD) model mice.

"These results implicate the improper regulation of this protein in Alzheimer’s-like afflictions and offer new guidance in developing remedies to address the disease," said Eric Klann, a professor in New York University’s Center for Neural Science and the study’s senior author.

The study’s co-authors also included: Douglas Cavener, a professor of biology at Pennsylvania State University; Clarisse Bourbon, Evelina Gatti, and Philippe Pierre of Université de la Méditerranée in Marseille, France; and NYU researchers Tao Ma, Mimi A. Trinh, and Alyse J. Wexler.

It has been known for decades that triggering new protein synthesis is vital to the formation of long-term memories as well as for long-lasting synaptic plasticity — the ability of the neurons to change the collective strength of their connections with other neurons. Learning and memory are widely believed to result from changes in synaptic strength.

In recent years, researchers have found that both humans with Alzheimer’s Disease and AD model mice have relatively high levels of eIF2alpha phosphorylation. But the relationship between this characteristic and AD-related afflictions was unknown.

Klann and his colleagues hypothesized that abnormally high levels of eIF2alpha phosphorylation could become detrimental because, ultimately, protein synthesis would diminish, thereby undermining the ability to form long-term memories.

To explore this question, the researchers examined the neurological impact of two enzymes that phosphorylate eIF2alpha, kinases termed PERK and GCN2, in different populations of AD model mice — all of which expressed genetic mutations akin to those carried by humans with AD. These were: AD model mice; AD model mice that lacked PERK; and AD model mice that lacked GCN2.

Specifically, they looked at eIF2alpha phosphorylation and the regulation of protein synthesis in the mice’s hippocampus region — the part of the brain responsible for the retrieval of old memories and the encoding of new ones. They then compared these levels with those of postmortem human AD patients.

Here, they found increased levels of phosphorylated eIF2alpha in the hippocampus of both AD patients and the AD model mice. Moreover, in conjunction with these results, they found decreased protein synthesis, known to be required for long-term potentiation — a form of long-lasting synaptic plasticity — and for long-term memory.

To test potential remedies, the researchers examined phosphorylation of eIF2alpha in mice lacking PERK, hypothesizing that removal of this kinase would return protein synthesis to normal levels. As predicted, mice lacking PERK had levels of phosphorylated eIF2alpha and protein synthesis similar to those of normal mice.

They then conducted spatial memory tests in which the mice needed to navigate a series of mazes. Here, the AD model mice lacking PERK were able to successfully maneuver through the mazes at rates achieved by normal mice. By contrast, the other AD model mice lagged significantly in performing these tasks.

The researchers replicated these procedures in AD model mice lacking GCN2. The results were consistent with those of the AD model mice lacking PERK, demonstrating that removing either kinase diminished the memory deficits associated with Alzheimer’s Disease.

Aug 12, 2013 · 65 notes
#alzheimer's disease #protein synthesis #eIF2alpha #hippocampus #synaptic plasticity #neuroscience #science
Aug 12, 2013 · 145 notes
#psychiatric disorders #mental illness #genetics #calcium channel #neuroscience #science
Why the #$%! Do We Swear? For Pain Relief

Bad language could be good for you, a new study shows. For the first time, psychologists have found that swearing may serve an important function in relieving pain.

image

The study, published in the journal NeuroReport, measured how long college students could keep their hands immersed in cold water. During the chilly exercise, they could repeat an expletive of their choice or chant a neutral word. When swearing, the 67 student volunteers reported less pain and on average endured about 40 seconds longer.

Although cursing is notoriously decried in the public debate, researchers are now beginning to question the idea that the phenomenon is all bad. “Swearing is such a common response to pain that there has to be an underlying reason why we do it,” says psychologist Richard Stephens of Keele University in England, who led the study. And indeed, the findings point to one possible benefit: “I would advise people, if they hurt themselves, to swear,” he adds.

How swearing achieves its physical effects is unclear, but the researchers speculate that brain circuitry linked to emotion is involved. Earlier studies have shown that unlike normal language, which relies on the outer few millimeters in the left hemisphere of the brain, expletives hinge on evolutionarily ancient structures buried deep inside the right half.

One such structure is the amygdala, an almond-shaped group of neurons that can trigger a fight-or-flight response in which our heart rate climbs and we become less sensitive to pain. Indeed, the students’ heart rates rose when they swore, a fact the researchers say suggests that the amygdala was activated.

That explanation is backed by other experts in the field. Psychologist Steven Pinker of Harvard University, whose book The Stuff of Thought (Viking Adult, 2007) includes a detailed analysis of swearing, compared the situation with what happens in the brain of a cat that somebody accidentally sits on. “I suspect that swearing taps into a defensive reflex in which an animal that is suddenly injured or confined erupts in a furious struggle, accompanied by an angry vocalization, to startle and intimidate an attacker,” he says.

But cursing is more than just aggression, explains Timothy Jay, a psychologist at the Massachusetts College of Liberal Arts who has studied our use of profanities for the past 35 years. “It allows us to vent or express anger, joy, surprise, happiness,” he remarks. “It’s like the horn on your car, you can do a lot of things with that, it’s built into you.”

In extreme cases, the hotline to the brain’s emotional system can make swearing harmful, as when road rage escalates into physical violence. But when the hammer slips, some well-chosen swearwords might help dull the pain.

There is a catch, though: The more we swear, the less emotionally potent the words become, Stephens cautions. And without emotion, all that is left of a swearword is the word itself, unlikely to soothe anyone’s pain.

Aug 11, 2013 · 342 notes
#swearing #pain #pain tolerance #fight-or-flight response #psychology #neuroscience #science
Aug 11, 2013 · 73 notes
#Caffeine Orange #fluorescent caffeine sensor #caffeine detection #technology #science
Aug 11, 2013 · 1,600 notes
#science #brain #caffeine #addiction #blood-brain barrier #adenosine #dopamine #psychology #neuroscience
Aug 11, 2013 · 171 notes
#deep brain stimulation #brain activity #Activa PC+S system #parkinson's disease #neuroscience #science
Aug 11, 2013 · 640 notes
#3d printing #artificial ears #implants #medicine #science
Aug 11, 2013 · 3,614 notes