Neuroscience

Playing video games can boost brain power

Certain types of video games can help to train the brain to become more agile and improve strategic thinking, according to scientists from Queen Mary University of London and University College London (UCL).

The researchers recruited 72 volunteers and measured their ‘cognitive flexibility’, described as a person’s ability to adapt, switch between tasks, and think about multiple ideas at a given time to solve problems.

Two groups of volunteers were trained to play different versions of a real-time strategy game called StarCraft, a fast-paced game where players have to construct and organise armies to battle an enemy. A third group of volunteers played a life simulation video game called The Sims, which does not require much memory or tactical skill.

All the volunteers played the video games for 40 hours over six to eight weeks, and were subjected to a variety of psychological tests before and after. All the participants happened to be female, as the study was unable to recruit a sufficient number of male volunteers who played video games for less than two hours a week.

The researchers discovered that those who played StarCraft were quicker and more accurate in performing cognitive flexibility tasks than those who played The Sims.

Dr Brian Glass from Queen Mary’s School of Biological and Chemical Sciences, said: “Previous research has demonstrated that action video games, such as Halo, can speed up decision making but the current work finds that real-time strategy games can promote our ability to think on the fly and learn from past mistakes.

“Our paper shows that cognitive flexibility, a cornerstone of human intelligence, is not a static trait but can be trained and improved using fun learning tools like gaming.”

Professor Brad Love from UCL, said: “Cognitive flexibility varies across people and at different ages. For example, a fictional character like Sherlock Holmes has the ability to simultaneously engage in multiple aspects of thought and mentally shift in response to changing goals and environmental conditions.

“Creative problem solving and ‘thinking outside the box’ require cognitive flexibility. Perhaps in contrast to the repetitive nature of work in past centuries, the modern knowledge economy places a premium on cognitive flexibility.”

Dr Glass added: “The volunteers who played the most complex version of the video game performed the best in the post-game psychological tests. We need to understand now what exactly about these games is leading to these changes, and whether these cognitive boosts are permanent or if they dwindle over time. Once we have that understanding, it could become possible to develop clinical interventions for symptoms related to attention deficit hyperactivity disorder or traumatic brain injuries, for example.”

Aug 22, 2013 · 222 notes
#video games #cognition #technology #neuroscience #science
Researchers Identify Conditions Most Likely to Kill Encephalitis Patients

People with severe encephalitis — inflammation of the brain — are much more likely to die if they develop severe swelling in the brain, intractable seizures or low blood platelet counts, regardless of the cause of their illness, according to new Johns Hopkins research.

The Johns Hopkins investigators say the findings suggest that if physicians are on the lookout for these potentially reversible conditions and treat them aggressively at the first sign of trouble, patients are more likely to survive.

“The factors most associated with death in these patients are things that we know how to treat,” says Arun Venkatesan, M.D., Ph.D., an assistant professor of neurology at the Johns Hopkins University School of Medicine and leader of the study published in the Aug. 27 issue of the journal Neurology.

Experts consider encephalitis something of a mystery, and its origins and progress unpredictable. While encephalitis may be caused by a virus, bacteria or autoimmune disease, a precise cause remains unknown in 50 percent of cases. Symptoms range from fever, headache and confusion in some, to seizures, severe weakness or language disability in others. The most complex cases can land patients in intensive care units, on ventilators, for months. Drugs like the antiviral acyclovir are available for herpes encephalitis, which occurs in up to 15 percent of cases, but for most cases, doctors have only steroids and immunosuppressant drugs, which carry serious side effects.

“Encephalitis is really a syndrome with many potential causes, rather than a single disease, making it difficult to study,” says Venkatesan, director of the Johns Hopkins Encephalitis Center.

In an effort to better predict outcomes for his patients, Venkatesan and his colleagues reviewed records of all 487 patients with acute encephalitis admitted to The Johns Hopkins Hospital and Johns Hopkins Bayview Medical Center between January 1997 and July 2011. They focused further attention on patients who spent at least 48 hours in the ICU during their hospital stays and who were over the age of 16. Of those 103 patients, 19 died. Patients who had severe swelling in the brain were 18 times more likely to die, while those with continuous seizures were eight times more likely to die. Those with low counts in blood platelets, the cells responsible for clotting, were more than six times more likely to die than those without this condition.
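
Figures like “18 times more likely to die” are typically reported as odds ratios computed from a 2×2 table of outcomes. As a quick illustration of the arithmetic only (the counts below are invented for demonstration, chosen only to be consistent with the reported totals of 103 patients and 19 deaths, not taken from the study), the calculation looks like this:

```python
# Illustration of an odds-ratio calculation like those behind the study's
# risk figures. The counts are invented; only the 103-patient / 19-death
# totals match the source.

def odds_ratio(died_exposed, survived_exposed, died_unexposed, survived_unexposed):
    """Odds of death with the risk factor divided by odds without it."""
    return (died_exposed / survived_exposed) / (died_unexposed / survived_unexposed)

# Invented example: 12 of 20 patients with severe cerebral edema died,
# versus 7 of 83 without it.
print(round(odds_ratio(12, 8, 7, 76), 1))  # → 16.3
```

Note that the published figures are likely adjusted for other variables, so a raw 2×2 odds ratio like this one would not match them exactly.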

The findings can help physicians know which conditions should be closely monitored and when the most aggressive treatments — some of which can come with serious side effects — should be tried, the researchers say. For example, it may be wise to more frequently image the brains of these patients to check for increased brain swelling and the pressure buildup that accompanies it.

Venkatesan says patients with cerebral edema may do better if intracranial pressure is monitored continuously and treated aggressively. He cautioned that although his research suggests such a course, further studies are needed to determine if it leads to better outcomes for patients.

Similarly, he says research has yet to determine whether aggressively treating seizures and low platelet counts also decreases mortality.

Venkatesan and his colleagues are also developing better guidelines for diagnosing encephalitis more quickly so as to minimize brain damage. Depending on where in the brain the inflammation is, he says, the illness can mimic other diseases, making diagnosis more difficult.

Another of the study’s co-authors, Romergryko G. Geocadin, M.D., an associate professor of neurology who co-directs the encephalitis center and specializes in neurocritical care, says encephalitis patients in the ICU are “the sickest of the sick,” and he fears that sometimes doctors give up on the possibility of them getting better.

“This research should give families — and physicians — hope that, despite how bad it is, it may be reversible,” he says.

Aug 21, 2013 · 42 notes
#brain #encephalitis #cerebral edema #neurology #neuroscience #science
How brain microcircuits integrate information from different senses

A new publication in the top-ranked journal Neuron sheds new light on how the brain integrates inputs from the different senses within the complex circuits formed by molecularly distinct types of nerve cells. The work was led by Paolo Medini, a newly appointed associate professor at Umeå University.

One of the biggest challenges in neuroscience is to understand how the cerebral cortex processes and integrates inputs from the different senses (such as vision, hearing and touch) so that, for example, we can respond to an event in the environment with precise movements of our body.

The cerebral cortex is composed of morphologically and functionally different types of nerve cells – for example excitatory and inhibitory cells – that connect in very precise ways. Paolo Medini and co-workers show that the integration of inputs from different senses occurs differently in excitatory and inhibitory cells, as well as in the superficial and deep layers of the cortex, the deep layers being those that send electrical signals out from the cortex to other brain structures.

“The relevance and the innovation of this work is that by combining advanced techniques to visualize the functional activity of many nerve cells in the brain with new molecular genetic techniques that allow us to change the electrical activity of different cell types, we can for the first time understand how the different nerve cells composing brain circuits communicate with each other,” says Paolo Medini.

The new knowledge is essential for designing much-needed future strategies to stimulate brain repair. It is not enough to transplant nerve cells into the lesion site; the biggest challenge is to re-create or re-activate the precise circuits those nerve cells form.

Paolo Medini has a medical background and worked in Germany at the Max Planck Institute for Medical Research in Heidelberg, as well as serving as a team leader at the Italian Institute of Technology in Genoa, Italy. He recently took up the position of associate professor in cellular and molecular physiology at the Department of Molecular Biology.

He is now leading a brand-new Brain Circuits Lab equipped with state-of-the-art techniques such as two-photon microscopy, optogenetics and electrophysiology to investigate circuit function and repair in the cerebral cortex. This investment has been made possible by a generous contribution from the Kempe Foundation and by the combined efforts of Umeå University.

“By combining cell physiology knowledge in the intact brain with molecular biology expertise, we plan to pave the way for this kind of innovative research that is new to Umeå University and nationally”, says Paolo Medini.

Aug 21, 2013 · 65 notes
#multisensory integration #cerebral cortex #nerve cells #neuroscience #science
A new role for sodium in the brain

Researchers at McGill University have found that sodium – the main chemical component in table salt – is a unique “on/off” switch for a major neurotransmitter receptor in the brain. This receptor, known as the kainate receptor, is fundamental for normal brain function and is implicated in numerous diseases, such as epilepsy and neuropathic pain.

Prof. Derek Bowie and his laboratory in McGill’s Department of Pharmacology and Therapeutics worked with University of Oxford researchers to make the discovery. By offering a different view of how the brain transmits information, their research highlights a new target for drug development. The findings are published in the journal Nature Structural & Molecular Biology.

Balancing kainate receptor activity is the key to maintaining normal brain function. For example, in epilepsy, kainate activity is thought to be excessive. Thus, drugs which would shut down this activity are expected to be beneficial.

“It has been assumed for decades that the ‘on/off’ switch for all brain receptors lies where the neurotransmitter binds,” says Prof. Bowie, who also holds a Canada Research Chair in Receptor Pharmacology. “However, we found a completely separate site that binds individual atoms of sodium and controls when kainate receptors get turned on and off.”

The sodium switch is unique to kainate receptors, which means that drugs designed to stimulate this switch should not act elsewhere in the brain. This would be a major step forward, since drugs often affect many locations in addition to those they were intended to act on, producing negative side effects as a result. These so-called “off-target effects” represent one of the greatest challenges facing modern medicine.

“Now that we know how to stimulate kainate receptors, we should be able to design drugs to essentially switch them off,” says Dr. Bowie.

Dr. Philip Biggin’s lab at Oxford University used computer simulations to predict how the presence or absence of sodium would affect the kainate receptor.

Aug 21, 2013 · 106 notes
#sodium #kainate receptor #brain function #drug development #neuroscience #science
Study suggests iron is at core of Alzheimer's disease

Alzheimer’s disease has proven to be a difficult enemy to defeat. After all, aging is the No. 1 risk factor for the disorder, and there’s no stopping that.

Most researchers believe the disease is caused by one of two proteins, one called tau, the other beta-amyloid. As we age, most scientists say, these proteins either disrupt signaling between neurons or simply kill them.

Now, a new UCLA study suggests a third possible cause: iron accumulation.

Dr. George Bartzokis, a professor of psychiatry at the Semel Institute for Neuroscience and Human Behavior at UCLA and senior author of the study, and his colleagues looked at two areas of the brain in patients with Alzheimer’s. They compared the hippocampus, which is known to be damaged early in the disease, and the thalamus, an area that is generally not affected until the late stages. Using sophisticated brain-imaging techniques, they found that iron is increased in the hippocampus and is associated with tissue damage in that area. But increased iron was not found in the thalamus.

The research appears in the August edition of the Journal of Alzheimer’s Disease.

While most Alzheimer’s researchers focus on the buildup of tau or beta-amyloid that results in the signature plaques associated with the disease, Bartzokis has long argued that the breakdown begins much further “upstream.” The destruction of myelin, the fatty tissue that coats nerve fibers in the brain, he says, disrupts communication between neurons and promotes the buildup of the plaques. These amyloid plaques in turn destroy more and more myelin, disrupting brain signaling and leading to cell death and the classic clinical signs of Alzheimer’s.

Myelin is produced by cells called oligodendrocytes. These cells, along with myelin, have the highest levels of iron of any cells in the brain, Bartzokis says, and circumstantial evidence has long supported the possibility that brain iron levels might be a risk factor for age-related diseases like Alzheimer’s. Although iron is essential for cell function, too much of it can promote oxidative damage, to which the brain is especially vulnerable.

In the current study, Bartzokis and his colleagues tested their hypothesis that elevated tissue iron caused the tissue breakdown associated with Alzheimer’s disease. They targeted the vulnerable hippocampus, a key area of the brain involved in the formation of memories, and compared it to the thalamus, which is relatively spared by Alzheimer’s until the very late stages of disease.

The researchers used an MRI technique that can measure the amount of brain iron in ferritin, a protein that stores iron, in 31 patients with Alzheimer’s and 68 healthy control subjects.

In the presence of diseases like Alzheimer’s, as the structure of cells breaks down, the amount of water increases in the brain, which can mask the detection of iron, according to Bartzokis.

"It is difficult to measure iron in tissue when the tissue is already damaged," he said. "But the MRI technology we used in this study allowed us to determine that the increase in iron is occurring together with the tissue damage. We found that the amount of iron is increased in the hippocampus and is associated with tissue damage in patients with Alzheimer’s but not in the healthy older individuals — or in the thalamus. So the results suggest that iron accumulation may indeed contribute to the cause of Alzheimer’s disease."

But it’s not all bad news from this study, Bartzokis noted.

"The accumulation of iron in the brain may be influenced by modifying environmental factors, such as how much red meat and iron dietary supplements we consume and, in women, having hysterectomies before menopause," he said.

In addition, he noted, medications that chelate and remove iron from tissue are being developed by several pharmaceutical companies as treatments for the disorder. This MRI technology may allow doctors to determine who is most in need of such treatments.

Aug 21, 2013 · 110 notes
#alzheimer's disease #dementia #iron accumulation #aging #hippocampus #oligodendrocytes #neuroscience #science
First Pre-Clinical Gene Therapy Study to Reverse Rett Symptoms

The concept behind gene therapy is simple: deliver a healthy gene to compensate for one that is mutated. New research published today in the Journal of Neuroscience suggests this approach may eventually be a feasible option to treat Rett Syndrome, the most disabling of the autism spectrum disorders. Gail Mandel, Ph.D., a Howard Hughes Investigator at Oregon Health & Science University, led the study. The Rett Syndrome Research Trust, with generous support from the Rett Syndrome Research Trust UK and Rett Syndrome Research & Treatment Foundation, funded this work through the MECP2 Consortium.

In 2007, co-author Adrian Bird, Ph.D., at the University of Edinburgh astonished the scientific community with proof-of-concept that Rett is curable, by reversing symptoms in adult mice. His unexpected results catalyzed labs around the world to pursue a multitude of strategies to extend the pre-clinical findings to people.

Today’s study is the first to show reversal of symptoms in fully symptomatic mice using techniques of gene therapy that have potential for clinical application.

Rett Syndrome is an X-linked neurological disorder primarily affecting girls; in the US, about 1 in 10,000 children a year are born with Rett.  In most cases symptoms begin to manifest between 6 and 18 months of age, as developmental milestones are missed or lost. The regression that follows is characterized by loss of speech, mobility, and functional hand use, which is often replaced by Rett’s signature gesture: hand-wringing, sometimes so intense that it is a constant during every waking hour. Other symptoms include seizures, tremors, orthopedic and digestive problems, disordered breathing and other autonomic impairments, sensory issues and anxiety. Most children live into adulthood and require round-the-clock care.

The cause of Rett Syndrome’s terrible constellation of symptoms lies in mutations of an X-linked gene called MECP2 (methyl-CpG-binding protein 2). MECP2 is a master gene that regulates the activity of many other genes, switching them on or off.

“Gene therapy is well suited for this disorder,” Dr. Mandel explains. “Because MECP2 binds to DNA throughout the genome, there is no single gene currently that we can point to and target with a drug. Therefore the best chance of having a major impact on the disorder is to correct the underlying defect in as many cells throughout the body as possible. Gene therapy allows us to do that.”

Healthy genes can be delivered into cells aboard a virus, which acts as a Trojan horse. Many different types of these Trojan horses exist. Dr. Mandel used adeno-associated virus serotype 9 (AAV9), which has the unusual and attractive ability to cross the blood-brain barrier. This allows the virus and its cargo to be administered intravenously, instead of employing more invasive direct brain delivery systems that require drilling burr holes into the skull.

Because the virus has limited cargo space, it cannot carry the entire MECP2 gene. Co-author Brian Kaspar of Nationwide Children’s Hospital collaborated with the Mandel lab to package only the gene’s most critical segments. After being injected into the Rett mice, the virus made its way to cells throughout the body and brain, distributing the modified gene, which then started to produce the MeCP2 protein.

As in human females with Rett Syndrome, only approximately 50% of the mouse cells have a healthy copy of MECP2. After the gene therapy treatment, 65% of cells had a functioning MECP2 gene.

The treated mice showed profound improvements in motor function, tremors, seizures and hind limb clasping. At the cellular level the smaller body size of neurons seen in mutant cells was restored to normal. Biochemical experiments proved that the gene had found its way into the nuclei of cells and was functioning as expected, binding to DNA.

One Rett symptom that was not ameliorated was abnormal respiration. Researchers hypothesize that correcting this may require targeting a greater number of cells than the 15% that had been achieved in the brainstem.

“We learned a critical and encouraging point with these experiments – that we don’t have to correct every cell in order to reverse symptoms. Going from 50% to 65% of the cells having a functioning gene resulted in significant improvements,” said co-author Saurabh Garg.
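
The 50-to-65-percent arithmetic can be made concrete with a toy simulation. This is our sketch, not the paper's model: the 50% baseline reflects random X-inactivation, and the 30% transduction rate is an assumption chosen so the result lands near the reported 65%:

```python
# Toy model (assumptions ours, not the paper's) of why the therapy need not
# reach every cell: random X-inactivation leaves about half of cells with a
# working MECP2 copy, and transducing a fraction of all cells at random
# raises the overall proportion of functioning cells.
import random

random.seed(1)

def fraction_functional(n_cells, transduction_rate):
    """Fraction of cells with a working MECP2 copy, from either source."""
    functional = 0
    for _ in range(n_cells):
        healthy_x_active = random.random() < 0.5          # random X-inactivation
        transduced = random.random() < transduction_rate  # viral delivery
        if healthy_x_active or transduced:
            functional += 1
    return functional / n_cells

# Transducing ~30% of all cells lifts the functional fraction from ~0.50
# toward the ~0.65 reported in the study.
print(round(fraction_functional(100_000, 0.3), 2))
```

The simulation prints a value very close to 0.65, matching the simple expectation 0.5 + 0.5 × 0.3.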

One of the potential challenges of gene therapy in Rett is the possibility of delivering multiple copies of the gene to a cell. We know from the MECP2 Duplication Syndrome that too much of this protein is detrimental. “Our results show that after gene therapy treatment the correct amount of MeCP2 protein was being expressed. At least in our hands, with these methods, overexpression of MeCP2 was not an issue,” said co-author Daniel Lioy.

Dr. Mandel cautioned that key steps remain before clinical trials can begin. “Our study is an important first step in highlighting the potential for AAV9 to treat the neurological symptoms in Rett. We are now working on improving the packaging of MeCP2 in the virus to see if we can target a larger percentage of cells and therefore improve symptoms even further,” said Mandel. Collaborators Hélène Cheval and Adrian Bird see this as a promising follow up to the 2007 work showing symptom reversal in Rett mice. “That study used genetic tricks that could not be directly applicable to humans, but the AAV9 vector used here could in principle deliver a gene therapeutically. This is an important step forward, but there is a way to go yet.”

“Gene therapy has had a tumultuous road in the past few decades but is undergoing a renaissance due to recent technological advances. Europe and Asia have gene therapy treatments already in the clinic and it’s likely that the US will follow suit. Our goal now is to prioritize the next key experiments and facilitate their execution as quickly as possible. Gene therapy, especially to the brain, is a tricky undertaking but I’m cautiously optimistic that with the right team we can lay out a plan for clinical development. I congratulate the Mandel and Bird labs on today’s publication, which is the third to be generated from the MECP2 Consortium in a short period of time,” said Monica Coenraads, Executive Director of the Rett Syndrome Research Trust and mother of a teenaged daughter with the disorder.

Aug 21, 2013 · 63 notes
#rett syndrome #gene therapy #neurological disorders #MECP2 #neuroscience #science
Brain network decay detected in early Alzheimer’s

In patients with early Alzheimer’s disease, disruptions in brain networks emerge about the same time as chemical markers of the disease appear in the spinal fluid, researchers at Washington University School of Medicine in St. Louis have shown.

While two chemical markers in the spinal fluid are regarded as reliable indicators of early disease, the new study, published in JAMA Neurology, is among the first to show that scans of brain networks may be an equally effective and less invasive way to detect early disease.

“Tracking damage to these brain networks may also help us formulate a more detailed understanding of what happens to the brain before the onset of dementia,” said senior author Beau Ances, MD, PhD, associate professor of neurology and of biomedical engineering.

Diagnosing Alzheimer’s early is a top priority for physicians, many of whom believe that treating patients long before dementia starts greatly improves the chances of success.

Ances and his colleagues studied 207 older but cognitively normal research volunteers at the Charles F. and Joanne Knight Alzheimer’s Disease Research Center at Washington University. Over several years, spinal fluid from the volunteers was sampled multiple times and analyzed for two markers of early Alzheimer’s: changes in amyloid beta, the principal ingredient of Alzheimer’s brain plaques, and in tau protein, a structural component of nerve cells.

The volunteers were also scanned repeatedly using a technique called resting state functional magnetic resonance imaging (fMRI). This scan tracks the rise and fall of blood flow in different brain regions as patients rest in the scanner. Scientists use the resulting data to assess the integrity of the default mode network, a set of connections between different brain regions that becomes active when the mind is at rest.
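
The core idea of a resting-state connectivity measure can be sketched in a few lines. This is a minimal illustration of the general approach, not the study's actual pipeline; the signal parameters and region labels are stand-ins:

```python
# Minimal sketch of resting-state functional connectivity: network integrity
# is assessed by correlating slow BOLD-signal fluctuations between regions.
import math
import random

random.seed(0)

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Two regions sharing a slow fluctuation plus independent noise, standing in
# for, say, posterior cingulate and medial temporal time series.
shared = [math.sin(2 * math.pi * 0.01 * t) for t in range(300)]
region_a = [s + random.gauss(0, 0.5) for s in shared]
region_b = [s + random.gauss(0, 0.5) for s in shared]

# A strong correlation indicates an intact connection; network damage of the
# kind described below would show up as a weaker correlation.
print(round(pearson(region_a, region_b), 2))
```

With these parameters the correlation comes out well above zero; adding more independent noise to either series (simulating a degraded connection) drives it toward zero.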

Earlier studies by Ances and other researchers have shown that Alzheimer’s damages connections in the default mode network and other brain networks.

The new study revealed that this damage became detectable at about the same time that amyloid beta levels began to fall and tau levels started to rise in spinal fluid. The part of the default mode network most harmed by the onset of Alzheimer’s disease was the connection between two brain areas associated with memory, the posterior cingulate and medial temporal regions.

The researchers are continuing to study the connections between brain network damage and the progress of early Alzheimer’s disease in normal volunteers and in patients in the early stages of Alzheimer’s-associated dementia.

Aug 20, 2013 · 54 notes
#alzheimer's disease #dementia #neuroimaging #beta amyloid #neuroscience #science
Copper Identified as Culprit in Alzheimer’s Disease

Copper appears to be one of the main environmental factors that trigger the onset and enhance the progression of Alzheimer’s disease by preventing the clearance and accelerating the accumulation of toxic proteins in the brain. That is the conclusion of a study appearing today in the journal Proceedings of the National Academy of Sciences.

“It is clear that, over time, copper’s cumulative effect is to impair the systems by which amyloid beta is removed from the brain,” said Rashid Deane, Ph.D., a research professor in the University of Rochester Medical Center (URMC) Department of Neurosurgery, member of the Center for Translational Neuromedicine, and the lead author of the study. “This impairment is one of the key factors that cause the protein to accumulate in the brain and form the plaques that are the hallmark of Alzheimer’s disease.” 

Copper’s presence in the food supply is ubiquitous. It is found in drinking water carried by copper pipes, in nutritional supplements, and in certain foods such as red meats, shellfish, nuts, and many fruits and vegetables. The mineral plays an important and beneficial role in nerve conduction, bone growth, the formation of connective tissue, and hormone secretion.

However, the new study shows that copper can also accumulate in the brain and cause the blood brain barrier – the system that controls what enters and exits the brain – to break down, resulting in the toxic accumulation of the protein amyloid beta, a by-product of cellular activity. Using both mice and human brain cells, Deane and his colleagues conducted a series of experiments that pinpointed the molecular mechanisms by which copper accelerates the pathology of Alzheimer’s disease.

Under normal circumstances, amyloid beta is removed from the brain by a protein called lipoprotein receptor-related protein 1 (LRP1). These proteins – which line the capillaries that supply the brain with blood – bind with the amyloid beta found in brain tissue and escort it into the blood vessels, where it is removed from the brain.

The research team “dosed” normal mice with copper over a three-month period. The exposure consisted of trace amounts of the metal in drinking water, at one-tenth of the water quality standard for copper established by the Environmental Protection Agency.
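
As a back-of-envelope check (the 1.3 mg/L EPA action level for copper and the 2 L/day water intake are our assumptions, not figures from the paper), the daily copper contribution from water at one-tenth of the standard works out as follows:

```python
# Back-of-envelope arithmetic with assumed inputs: the EPA action level for
# copper in drinking water is 1.3 mg/L, so one-tenth of it is 0.13 mg/L.
# At a typical ~2 L/day water intake, this is a small addition on top of
# ordinary dietary copper (roughly 1 mg/day).
epa_action_level_mg_per_l = 1.3
dose_mg_per_l = epa_action_level_mg_per_l / 10
daily_water_l = 2.0

daily_copper_from_water_mg = dose_mg_per_l * daily_water_l
print(round(daily_copper_from_water_mg, 2))  # → 0.26 mg/day from water alone
```

This is consistent with Deane's point below that the exposure is comparable to what people consume in a normal diet.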

“These are very low levels of copper, equivalent to what people would consume in a normal diet,” said Deane.

The researchers found that the copper made its way into the blood system and accumulated in the vessels that feed blood to the brain, specifically in the cellular “walls” of the capillaries. These cells are a critical part of the brain’s defense system and help regulate the passage of molecules to and from brain tissue. In this instance, the capillary cells prevent the copper from entering the brain. However, over time the metal can accumulate in these cells with toxic effect. 

The researchers observed that the copper disrupted the function of LRP1 through a process called oxidation which, in turn, inhibited the removal of amyloid beta from the brain. They observed this phenomenon in both mouse and human brain cells.

The researchers then looked at the impact of copper exposure on mouse models of Alzheimer’s disease. In these mice, the cells that form the blood brain barrier have broken down and become “leaky” – a likely combination of aging and the cumulative effect of toxic assaults – allowing elements such as copper to pass unimpeded into the brain tissue. They observed that the copper stimulated activity in neurons that increased the production of amyloid beta. The copper also interacted with amyloid beta in a manner that caused the proteins to bind together in larger complexes creating logjams of the protein that the brain’s waste disposal system cannot clear. 

This one-two punch, inhibiting the clearance and stimulating the production of amyloid beta, provides strong evidence that copper is a key player in Alzheimer’s disease. In addition, the researchers observed that copper provoked inflammation of brain tissue which may further promote the breakdown of the blood brain barrier and the accumulation of Alzheimer’s-related toxins.  

However, because the metal is essential to so many other functions in the body, the researchers say that these results must be interpreted with caution.

“Copper is an essential metal and it is clear that these effects are due to exposure over a long period of time,” said Deane. “The key will be striking the right balance between too little and too much copper consumption. Right now we cannot say what the right level will be, but diet may ultimately play an important role in regulating this process.”

Aug 20, 2013 · 263 notes
#science #alzheimer's disease #dementia #copper #amyloid plaques #blood brain barrier #neurology #neuroscience
Why One Cream Cake Leads to Another

Continuously eating fatty foods perturbs communication between the gut and brain, which in turn perpetuates a bad diet.

A chronic high-fat diet is thought to desensitize the brain to the feeling of satisfaction that one normally gets from a meal, causing a person to overeat in order to achieve the same high again. New research published today (August 15) in Science, however, suggests that this desensitization actually begins in the gut itself, where production of a satiety factor, which normally tells the brain to stop eating, becomes dialed down by the repeated intake of high-fat food.

“It’s really fantastic work,” said Paul Kenny, a professor of molecular therapeutics at The Scripps Research Institute in Jupiter, Florida, who was not involved in the study. “It could be a so-called missing link between gut and brain signaling, which has been something of a mystery.”

While pork belly, ice cream, and other high-fat foods produce an endorphin response in the brain when they hit the taste buds, according to Kenny, the gut also sends signals directly to the brain to control our feeding behavior. Indeed, mice nourished via gastric feeding tubes, which bypass the mouth, exhibit a surge in dopamine—a neurotransmitter promoting reinforcement in the brain’s reward circuitry—similar to that experienced by those eating normally.

This dopamine surge occurs in response to feeding in both mice and humans. But evidence suggests that dopamine signaling in the brain is deficient in obese people. Ivan de Araujo, a professor of psychiatry at the Yale School of Medicine, has now discovered that obese mice on a chronic high-fat diet also have a muted dopamine response when receiving fatty food via a direct tube to their stomachs.

To determine the nature of the dopamine-regulating signal emanating from the gut, Araujo and his team searched for possible candidates. “When you look at animals chronically exposed to high-fat foods, you see high levels of almost every circulating factor—leptin, insulin, triglycerides, glucose, et cetera,” he said. But one class of signaling molecule is suppressed. Of these, Araujo’s primary candidate was oleoylethanolamide. Not only is the factor produced by intestinal cells in response to food, he said, but during chronic high-fat exposure, “the suppression levels seemed to somehow match the suppression that we saw in dopamine release.”

Araujo confirmed oleoylethanolamide’s dopamine-regulating ability in mice by administering the factor via a catheter to the tissues surrounding their guts. “We discovered that by restoring the baseline level of [oleoylethanolamide] in the gut … the high-fat fed animals started having dopamine responses that were indistinguishable from their lean counterparts.”

The team also found that oleoylethanolamide’s effect on dopamine was transmitted via the vagus nerve, which runs between the brain and abdomen, and was dependent on its interaction with a transcription factor called PPAR-α.

Oleoylethanolamide levels are also reduced in fasting animals and increase in response to eating, communicating with the brain to stop further consumption once the belly is full. Indeed, oleoylethanolamide is a known satiety factor. Therefore, when chronic consumption of high-fat food diminishes its production, the satisfaction signal is not achieved, and the brain is essentially “blind to the presence of calories in the gut,” said Araujo, and thus demands more food.

It is not clear why a chronic high-fat diet suppresses the production of oleoylethanolamide. But once the vicious cycle starts, it is hard to break because the brain is receiving its information subconsciously, said Daniele Piomelli, a professor at the University of California, Irvine, and director of drug discovery and development at the Italian Institute of Technology in Genoa.

“We eat what we like, and we think we are conscious of what we like, but I think what this [paper] and others are indicating is that there is a deeper, darker side to liking—a side that we’re not aware of,” Piomelli said. “Because it is an innate drive, you cannot control it.” Put another way, even if you could trick your taste buds into enjoying low-fat yogurt, you’re unlikely to trick your gut.

The good news, however, is that “there is no permanent impairment in the [animals’] dopamine levels,” Araujo said. This suggests that if drugs could be designed to regulate the oleoylethanolamide-to-PPAR-α pathway in the gut, Kenny added, they could have “a huge impact on people’s ability to control their appetite.”

Aug 18, 2013 · 164 notes
#dopamine #dopamine deficiency #obesity #diet #appetite #neuroscience #science
Head hurts? Zap the wonder nerve in your neck

"It was like red-hot pokers needling one side of my face," says Catherine, recalling the cluster headaches she experienced for six years. "I just wanted it to stop." But it wouldn’t – none of the drugs she tried had any effect.

Thinking she had nothing to lose, last year she enrolled in a pilot study to test a handheld device that applies a bolt of electricity to the neck, stimulating the vagus nerve – the superhighway that connects the brain to many of the body’s organs, including the heart.

The results of the trial were presented last month at the International Headache Congress in Boston, and while the trial is small, the findings are positive. Of the 21 volunteers, 18 reported a reduction in the severity and frequency of their headaches, rating them, on average, 50 per cent less painful after using the device daily and whenever they felt a headache coming on.

This isn’t the first time vagal nerve stimulation has been used as a treatment – but it is one of the first that hasn’t required surgery. Some people with epilepsy have had a small generator implanted into their chest that sends regular electrical signals to the vagus nerve. Implanted devices have also been approved to treat depression. What’s more, there is increasing evidence that such stimulation could treat many more disorders, from headaches to stroke and possibly Alzheimer’s disease.

The latest study suggests it is possible to stimulate the nerve through the skin, rather than resorting to surgery. “What we’ve done is figured out a way to stimulate the vagus nerve with a very similar signal, but non-invasively through the neck,” says Bruce Simon, vice-president of research at New Jersey-based ElectroCore, makers of the handheld device. “It’s a simpler, less invasive way to stimulate the nerve.”

Cluster headaches are thought to be triggered by the overactivation of brain cells involved in pain processing. The neurotransmitter glutamate, which excites brain cells, is a prime suspect. ElectroCore turned to the vagus nerve as previous studies had shown that stimulating it in people with epilepsy releases neurotransmitters that dampen brain activity.

When the firm used a smaller version of ElectroCore’s device on rats, it found it reduced glutamate levels and excitability in these pain centres. Other studies have shown that vagus nerve stimulation causes the release of inhibitory neurotransmitters which counter the effects of glutamate.

The big question is whether a non-implantable device can really trigger changes in brain chemistry in humans, or whether people are simply experiencing a placebo effect. “The vagus nerve is buried deep in the neck, and something that’s delivering currents through the skin can only go so deep,” says Mike Kilgard of the University of Texas at Dallas. As you turn up the voltage, there’s a risk of it activating muscle fibres that trigger painful cramps, he adds.

Simon says that volunteers using the device haven’t reported any serious side effects. He adds that ElectroCore will soon publish data showing changes in brain activity in humans after using the device. Placebo-controlled trials are also about to start.

Catherine has been using it for a year without ill effect. “I can now function properly as a human being again,” she says.

The many uses of the wonder nerve

Coma, irritable bowel syndrome, asthma and obesity are just some of the disparate conditions that vagus nerve stimulation may benefit and for which human trials are under way.

It might also help people with tinnitus. Although people with tinnitus complain of ringing in their ears, the problem actually arises because too many neurons fire in the auditory part of the brain when certain frequencies are heard.

Mike Kilgard of the University of Texas at Dallas reasoned that if people were played tones that didn’t trigger tinnitus while the vagus nerve was stimulated, this might coax the rogue neurons into firing in response to these frequencies instead. “By activating this nerve we can enhance the brain’s ability to rewire itself,” he says.

He has so far tested the method in rats and in 10 people with tinnitus, using an implanted device to stimulate the nerve. Not everyone noticed an improvement, but even so Kilgard is planning a larger trial. The work was presented at a meeting of the International Union of Physiological Sciences in Birmingham, UK, last month. The technique is also being tested in people who have had a stroke.

"If these studies stand up it could be worth changing the name of the vagus nerve to the wonder nerve," says Sunny Ogbonnaya at Cork University Hospital in Ireland.

Aug 18, 2013 · 121 notes
#vagus nerve #vagal nerve stimulation #glutamate #headaches #brain activity #neuroscience #science
Device Could Spot Seizures by Reading Brainwaves through the Ear

Neuroscientists often use electroencephalography (EEG) as an inexpensive way to record electrical signals in the brain. Though it would be useful to run these recordings for long periods of time, that usually isn’t practical: EEG recording traditionally involves attaching many electrodes and cables to a patient’s scalp.

Now engineers at Imperial College London have developed an EEG device that can be worn inside the ear, like a hearing aid. They say the device will allow scientists to record EEGs for several days at a time; this would allow doctors to monitor patients who have regularly recurring problems like seizures or microsleep.

“The ideal is to have a very stable recording system, and recordings which are repeatable,” explains co-creator Danilo Mandic. “It’s not interfering with your normal life, because there are acoustic vents so people can hear. After a while, they forget they’re having an EEG.”

By nestling the EEG inside the ear, the engineers avoid a lot of signal noise usually introduced by body movement. They can also ensure that the electrodes are always placed in exactly the same spot, which, they say, will make repeated readings more reliable.

Since the device attaches to just one area, it can record only from the temporal region. This limits its potential applications to events that involve local activity. Tzyy-Ping Jung, co-director of the University of California, San Diego’s Center for Advanced Neurological Engineering, says that this does not mean the device will not be valuable.

“Different modalities will have different applications. I would not rule out the usefulness of any modalities,” says Jung. “I think it’s a very good idea with very promising results.”

Aug 18, 2013 · 154 notes
#EEG device #brain imaging #seizures #brainwaves #neuroscience #science
Female frogs prefer males who can multitask

From frogs to humans, selecting a mate is complicated. Females of many species judge suitors based on many indicators of health or parenting potential. But it can be difficult for males to produce multiple signals that demonstrate these qualities simultaneously.

In a study of gray tree frogs, a team of University of Minnesota researchers discovered that females prefer males whose calls reflect the ability to multitask effectively. In this species (Hyla chrysoscelis) males produce “trilled” mating calls that consist of a string of pulses.

Typical calls range in duration from 20 to 40 pulses per call and occur at rates of 5 to 15 calls per minute. Males face a trade-off between call duration and call rate, yet females preferred calls that were both longer and more frequent, which is no simple feat for a male to produce.

The findings were published in the August issue of Animal Behaviour.

"It’s kind of like singing and dancing at the same time," says Jessica Ward, a postdoctoral researcher who is lead author for the study. Ward works in the laboratory of Mark Bee, a professor in the College of Biological Sciences’ Department of Ecology, Evolution and Behavior.

The study supports the multitasking hypothesis, which suggests that females prefer males who can do two or more hard-to-do things at the same time because these are especially good quality males, Ward says. The hypothesis, which explores how multiple signals produced by males influence female behavior, is a new area of interest in animal behavior research.

By listening to recordings of 1,000 calls, Ward and colleagues learned that males are indeed forced to trade off call duration and call rate. That is, males that produce relatively longer calls only do so at relatively slower rates.
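
The duration/rate trade-off can be made concrete with a toy calculation. This is an illustrative sketch only, not the study’s actual analysis: the male profiles and the scoring function below are hypothetical, using only the pulse and rate ranges quoted above. Total pulses per minute, the product of call length and call rate, is one crude score of how well a male “multitasks”:

```python
# Illustrative sketch (not the study's analysis): each male is described by
# (pulses_per_call, calls_per_minute). The reported trade-off means males
# producing longer calls tend to call at slower rates, so scoring high on
# the product of the two is hard -- and, per the multitasking hypothesis,
# attractive.

def pulses_per_minute(pulses_per_call, calls_per_minute):
    """Total pulse output per minute: one crude measure of calling effort."""
    return pulses_per_call * calls_per_minute

# Hypothetical males spanning the ranges quoted in the article
# (20-40 pulses per call, 5-15 calls per minute).
males = {
    "long_slow":   (40, 5),   # long calls, but delivered at a low rate
    "short_fast":  (20, 15),  # short calls, delivered at a high rate
    "multitasker": (35, 12),  # long AND frequent: rare, hence preferred
}

scores = {name: pulses_per_minute(p, r) for name, (p, r) in males.items()}
best = max(scores, key=scores.get)
print(scores)  # {'long_slow': 200, 'short_fast': 300, 'multitasker': 420}
print(best)    # multitasker
```

Under the observed trade-off, most real males would sit near the long-slow or short-fast corners; a male scoring high on the product is doing two hard things at once, which is the intuition behind the multitasking hypothesis.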

"It’s easy to imagine that we humans might also prefer multitasking partners, such as someone who can successfully earn a good income, cook dinner, manage the finances and get the kids to soccer practice on time."

The study was carried out in connection with Bee’s research goal, which is understanding how female frogs are able to distinguish individual mating calls from a large chorus of males. By comparison, humans, especially as we age, lose the ability to distinguish individual voices in a crowd. This phenomenon, called the “cocktail party” problem, is often the first sign of a diminishing ability to hear. Understanding how frogs hear could lead to improved hearing aids.

Aug 17, 2013 · 56 notes
#multitasking #mating #frogs #animal behavior #psychology #neuroscience #science
Aug 17, 2013 · 181 notes
#autism #ASD #mathematical skills #brain differences #brain activity #neuroimaging #neuroscience #psychology #science
Aug 16, 2013 · 162 notes
#prospective memory #fMRI #brain activity #prefrontal cortex #memory #psychology #neuroscience #science
Making the Brain Take Notice of Faces in Autism

A new study in Biological Psychiatry explores the influence of oxytocin

Difficulty in registering and responding to the facial expressions of other people is a hallmark of autism spectrum disorder (ASD). Relatedly, functional imaging studies have shown that individuals with ASD display altered brain activations when processing facial images.

The hormone oxytocin plays a vital role in the social interactions of both animals and humans. In fact, multiple studies conducted with healthy volunteers have provided evidence for beneficial effects of oxytocin in terms of increased trust, improved emotion recognition, and preference for social stimuli.

This combination of scientific work led German researchers to hypothesize about the influence of oxytocin in ASD. Dr. Gregor Domes, from the University of Freiburg and first author of the new study, explained: “In the present study, we were interested in the question of whether a single dose of oxytocin would change brain responses to social compared to non-social stimuli in individuals with autism spectrum disorder.”

They found that oxytocin did show an effect on social processing in the individuals with ASD, “suggesting that oxytocin may help to treat a basic brain function that goes awry in autism spectrum disorders,” commented Dr. John Krystal, Editor of Biological Psychiatry.

To conduct this study, they recruited fourteen individuals with ASD and fourteen control volunteers, all of whom completed a face- and house-matching task while undergoing imaging scans. Each participant completed this task and scanning procedure twice, once after receiving a nasal spray containing oxytocin and once after receiving a nasal spray containing placebo. The order of the sprays was randomized, and the tests were administered one week apart.

Using two sets of stimuli in the matching task, one of faces and one of houses, allowed the researchers not only to compare the effects of oxytocin and placebo administration, but also to distinguish specific effects on social stimuli from non-specific effects on more general brain processing.

What they found was intriguing. The data indicate that oxytocin specifically increases responses of the amygdala to social stimuli in individuals with ASD. The amygdala, the authors explain, “has been associated with processing of emotional stimuli, threat-related stimuli, face processing, and vigilance for salient stimuli”.

This finding suggests oxytocin might promote the salience of social stimuli in ASD. Increased salience of social stimuli might support behavioral training of social skills in ASD.

These data support the idea that oxytocin may be a promising approach in the treatment of ASD and could stimulate further research, even clinical trials, on the exploration of oxytocin as an add-on treatment for individuals with autism spectrum disorder.

Aug 16, 2013 · 67 notes
#oxytocin #autism #ASD #amygdala #face processing #social cognition #neuroscience #science
Cell memory mechanism discovered

The cells in our bodies can divide as often as once every 24 hours, creating a new, identical copy. DNA binding proteins called transcription factors are required for maintaining cell identity. They ensure that daughter cells have the same function as their mother cell, so that for example muscle cells can contract or pancreatic cells can produce insulin. However, each time a cell divides the specific binding pattern of the transcription factors is erased and has to be restored in both mother and daughter cells. Previously it was unknown how this process works, but now scientists at Karolinska Institutet have discovered the importance of particular protein rings encircling the DNA and how these function as the cell’s memory.

The DNA in human cells is translated into a multitude of proteins required for a cell to function. When, where and how proteins are expressed is determined by regulatory DNA sequences and a group of proteins, known as transcription factors, that bind to these DNA sequences. Each cell type can be distinguished based on its transcription factors, and a cell can in certain cases be directly converted from one type to another, simply by changing the expression of one or more transcription factors. It is critical that the pattern of transcription factor binding in the genome be maintained. During each cell division, the transcription factors are removed from DNA and must find their way back to the right spot after the cell has divided. Despite many years of intense research, no general mechanism has been discovered which would explain how this is achieved.

"The problem is that there is so much DNA in a cell that it would be impossible for the transcription factors to find their way back within a reasonable time frame. But now we have found a possible mechanism for how this cellular memory works, and how it helps the cell remember the order that existed before the cell divided, helping the transcription factors find their correct places", explains Jussi Taipale, professor at Karolinska Institutet and the University of Helsinki, and head of the research team behind the discovery.

The results are now being published in the scientific journal Cell. The research group has produced the most complete map yet of transcription factors in a cell. They found that a large protein complex called cohesin is positioned as a ring around the two DNA strands that are formed when a cell divides, marking virtually all the places on the DNA where transcription factors were bound. Cohesin encircles the DNA strand as a ring does around a piece of string, and the protein complexes that replicate DNA can pass through the ring without displacing it. Since the two new DNA strands are caught in the ring, only one cohesin is needed to mark the two, thereby helping the transcription factors to find their original binding region on both DNA strands.

"More research is needed before we can be sure, but so far all experiments support our model," says Martin Enge, assistant professor at Karolinska Institutet.

Transcription factors play a pivotal role in many illnesses, including cancer as well as many hereditary diseases. The discovery that virtually all regulatory DNA sequences bind to cohesin may also end up having more direct consequences for patients with cancer or hereditary diseases. Cohesin would function as an indicator of which DNA sequences might contain disease-causing mutations.

"Currently we analyse DNA sequences that are directly located in genes, which constitute about three per cent of the genome. However, most mutations that have been shown to cause cancer are located outside of genes. We cannot analyse these in a reliable manner - the genome is simply too large. By only analysing DNA sequences that bind to cohesin, roughly one per cent of the genome, it would allow us to analyse an individual’s mutations and make it much easier to conduct studies to identify novel harmful mutations," Martin Enge concludes.

Aug 16, 2013 · 113 notes
#transcription factors #DNA sequence #hereditary diseases #cohesin #genetics #neuroscience #science
Sympathetic Neurons Engage in “Cross Talk” With Cells in the Pancreas During Early Development

The human body is a complicated system of blood vessels, nerves, organs, tissue and cells, each with a specific job to do. When all are working together, it’s a symphony of form and function, with each instrument playing its intended role.

Biologist Rejji Kuruvilla and her fellow researchers uncovered what happens when one instrument is not playing its part.

Kuruvilla, along with graduate students Philip Borden and Jessica Houtz, both from the Biology Department at Johns Hopkins University’s Krieger School of Arts and Sciences, and Dr. Steven Leach from the McKusick-Nathans Institute of Genetic Medicine at the Johns Hopkins School of Medicine, recently published a paper in the journal Cell Reports exploring whether “cross-talk”, or reciprocal signaling, takes place between neurons in the sympathetic nervous system and the tissues those nerves connect to. In this case, the target tissues, clusters of cells called islets, were in the pancreas.

“We knew that sympathetic neurons need molecular signals from the tissues that they connect with, to grow and survive,” said Kuruvilla. “What we did not know was whether the neurons would reciprocally signal to the target tissues to instruct them to grow and mature. It made sense to focus on the pancreas because of previous studies done in diabetic animal models where sympathetic nerves within the pancreas were found to retract early on in the disease, suggesting that dysfunction of the nerves could be an early trigger for pancreatic defects.”

The researchers spent approximately three years working with lab mice to test the various scenarios in which signaling between sympathetic neurons and islet cells might take place. The experiments focused on what effects removing the sympathetic nerves would have on pancreas development in newborn mice.

Previous studies had shown that pancreatic cells release a signal of their own, a nerve growth protein, that directs the sympathetic nerves toward the pancreas and provides necessary nutrition to sustain the nerves.

In turn, Kuruvilla’s team found that in mutant mice, the removal of the sympathetic neurons resulted in deformities in the architecture of the pancreatic islet cells and defects in insulin secretion and glucose metabolism.

Pancreatic islets are highly organized functional micro-organs with a defined size, shape and distinctive arrangement of endocrine cells. It is this marriage of form and function, with cells clustered close together, that creates greater, more efficient islet function.

However, the mutant mice, with their sympathetic neurons removed, had islet formations that were misshapen, sported lesions and developed in a patchy, uneven manner. Because of this dysfunctional islet development, the postnatal mice did not secrete enough insulin when confronted with high glucose, and had high blood glucose levels as a result. Elevated blood glucose is a hallmark of diabetes in humans.

It’s known in neuroscience that the neurons in question from the sympathetic nervous system control the body’s “fight or flight” response and communicate with connected tissues by releasing a chemical messenger called norepinephrine. The release of norepinephrine also plays an important role in the development and maturation of islets, said Kuruvilla.

Using sympathetic neurons and islet cells grown together in a culture dish, the researchers observed that islet cells move toward the nerves and identified norepinephrine as the nerve signal that causes the movement of the islet cells.

“Seeing how these islet cells were responding to sympathetic neurons both in a dish and the effects of removing the nerves in a whole animal on islet shape and functions were pretty remarkable,” said Borden, lead author of the paper. “It was clear to us that sympathetic neurons were key to how islets were developing, something no one else had shown.”

Kuruvilla said these studies, which identify sympathetic nerves as a critical player in organizing pancreatic cells during development and in influencing their later function, could contribute to a better understanding of how to treat diabetes in the future. The research also lends support to considering external factors, such as nerves and blood vessels, when transplanting islet cells for the treatment of diabetes in patients.

“This study reveals interactions between two co-developing systems, sympathetic neurons and pancreatic islet cells, that has important implications for peripheral organ development, and for regeneration of these tissues following injury or disease,” said Kuruvilla.

Aug 16, 2013 · 53 notes
#sympathetic nervous system #sympathetic neurons #pancreatic cells #norepinephrine #neuroscience #science
Aug 16, 2013 · 95 notes
#depression #biomarker #bipolar disorder #neuroimaging #psychology #neuroscience #science
Worms May Shed Light on Human Ability to Handle Chronic Stress

New research at Rutgers University may help shed light on how and why nervous system changes occur and what causes some people to suffer from life-threatening anxiety disorders while others are better able to cope.

Maureen Barr, a professor in the Department of Genetics, and a team of researchers found that the architecture of the six sensory brain cells in the roundworm responsible for receiving information undergoes major changes, becoming much more elaborate when the worm is put into a high-stress environment.

Scientists have known for some time that extreme stress can alter the tree-like dendrite structures that connect neurons in the human brain and enable our thought processes to work properly, disrupting brain cell development and resulting in disorders like depression and post-traumatic stress disorder, which affect millions of Americans each year.

What scientists don’t understand for sure, Barr says, is the cause behind these molecular changes in the brain.

“This type of research provides us necessary clues that ultimately could lead to the development of drugs to help those suffering with severe anxiety disorders,” Barr says.

In the study, published today in Current Biology, the Rutgers scientists identified six sensory nerve cells in the tiny, transparent roundworm known as C. elegans, along with an enzyme called KPC-1/furin; in humans, furin triggers chemical reactions needed for essential life functions like blood clotting.

While the enzyme also appears to play a role in the growth of tumors and the activation of several types of virus and diseases in humans, in the roundworm the enzyme enables its simple neurons to morph into new elaborately branched shapes when placed under adverse conditions.

Normally, this one-millimeter-long worm develops from an embryo through four larval stages before molting into a reproductive adult. Put it under stressful conditions of overcrowding, starvation and high temperature, however, and the worm transforms into an alternative larval stage known as the dauer, which is so stress-resistant it can survive almost anything – including the Space Shuttle Columbia disaster in 2003, which dauer worms aboard were the only living things to survive.

“These worms that normally have a short life cycle turn into super worms when they go into the dauer stage and can live for months, although they are no longer able to reproduce,” Barr says.

What is so interesting to Barr is that when a perceived threat is over, these tiny creatures and their IL2 neurons transform back to a normal lifespan and reproductive state like nothing had ever happened. Under a microscope, the complicated looking tree-like connectors that receive information are pruned back and the worm appears as it did before the trauma occurred.

This type of neural reaction differs from that of humans, who can suffer from extreme anxiety months or even years after a traumatic event, even though they are no longer in a threatening situation.

The ultimate goal, Barr says, is to determine how and why the nervous system responds to stress. By identifying molecular pathways that regulate neuronal remodeling, scientists may apply this knowledge to develop future therapeutics.

Aug 16, 2013 · 86 notes
#chronic stress #PTSD #anxiety #C. elegans #KPC-1/furin #neuroscience #science
Human eye movements for vision are remarkably adaptable

When something gets in the way of our ability to see, we quickly pick up a new way to look, in much the same way that we would learn to ride a bike, according to a new study published in the Cell Press journal Current Biology on August 15.

Our eyes are constantly on the move, darting this way and that four to five times per second. Now researchers have found that the precise manner of those eye movements can change within a matter of hours. This discovery by researchers from the University of Southern California might suggest a way to help those with macular degeneration better cope with vision loss.

"The system that controls how the eyes move is far more malleable than the literature has suggested," says Bosco Tjan of the University of Southern California. "We showed that people with normal vision can quickly adjust to a temporary occlusion of their foveal vision by adapting a consistent point in their peripheral vision as their new point of gaze."

The fovea refers to the small, center-most portion of the retina, which is responsible for our high-resolution vision. We move our eyes to direct the fovea to different parts of a scene, constructing a picture of the world around us. In those with age-related macular degeneration, progressive loss of foveal vision leads to visual impairment and blindness.

In the new study, MiYoung Kwon, Anirvan Nandy, and Tjan simulated a loss of foveal vision in six normally sighted young adults by blocking part of a visual scene with a gray disc that followed the individuals’ eye gaze. Those individuals were then asked to complete demanding object-following and visual-search tasks. Within three hours of working on those tasks, people showed a remarkably fast and spontaneous adjustment of eye movements. Once developed, that change in their “point of gaze” was retained over a period of weeks and was reengaged whenever their foveal vision was blocked.
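
The gaze-contingent occlusion can be sketched in a few lines. This is a hypothetical illustration: the actual study used an eye tracker and a real-time display, and the grid, disc radius and function name here are invented for clarity. Each time the eyes move, the gray disc is redrawn at the new gaze position, so foveal input is always blocked:

```python
# Minimal sketch of gaze-contingent occlusion (assumed structure, not the
# study's code). The scene is a toy 2D grid; each "frame", cells within
# disc_radius of the current gaze point are replaced with a gray marker,
# simulating an artificial central scotoma.

def occlude_fovea(gaze_xy, scene, disc_radius=2):
    """Return a copy of `scene` (list of lists) with every cell within
    `disc_radius` of the gaze point masked out as "gray"."""
    gx, gy = gaze_xy
    out = [row[:] for row in scene]  # copy so the original scene is untouched
    for y, row in enumerate(out):
        for x, _ in enumerate(row):
            if (x - gx) ** 2 + (y - gy) ** 2 <= disc_radius ** 2:
                out[y][x] = "gray"
    return out

# One simulated frame: gaze lands at the center of a 7x7 scene.
scene = [["." for _ in range(7)] for _ in range(7)]
masked = occlude_fovea((3, 3), scene, disc_radius=2)

assert masked[3][3] == "gray"  # the gaze point (fovea) is always blocked
assert masked[0][0] == "."     # the periphery remains visible
```

In the real paradigm this update runs at display refresh rates, tracking every saccade, which is what forces participants to adopt a peripheral locus as their new point of gaze.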

Tjan and his team say they were surprised by the rate of this adjustment. They note that patients with macular degeneration frequently do adapt their point of gaze, but in a process that takes months, not days or hours. They suggest that practice with a visible gray disc like the one used in the study might help speed that process of visual rehabilitation along. The discovery also reveals that the oculomotor (eye movement) system prefers control simplicity over optimality.

"Gaze control by the oculomotor system, although highly automatic, is malleable in the same sense that motor control of the limbs is malleable," Tjan says. "This finding is potentially very good news for people who lose their foveal vision due to macular diseases. It may be possible to create the right conditions for the oculomotor system to quickly adjust," Kwon adds.

Aug 16, 2013 · 76 notes
#eye movements #vision loss #macular degeneration #fovea #foveal vision #neuroscience #science
A New Wrinkle in Parkinson’s Disease Research

The active ingredient in an over-the-counter skin cream might do more than prevent wrinkles. Scientists have discovered that the drug, called kinetin, also slows or stops the effects of Parkinson’s disease on brain cells.

Scientists identified the link through biochemical and cellular studies, but the research team is now testing the drug in animal models of Parkinson’s. The research is published in the August 15, 2013 issue of the journal Cell.

“Kinetin is a great molecule to pursue because it’s already sold in drugstores as a topical anti-wrinkle cream,” says HHMI investigator Kevan Shokat of the University of California, San Francisco. “So it’s a drug we know has been in people and is safe.”

Parkinson’s disease is a degenerative disease that causes the death of neurons in the brain. Initially, the disease affects one’s movement and causes tremors, difficulty walking, and slurred speech. Later stages of the disease can cause dementia and broader health problems. In 2004, researchers studying an Italian family with a high prevalence of early-onset Parkinson’s disease discovered mutations in a protein called PINK1 associated with the inherited form of the disease.

Since then, studies have shown that PINK1 normally wedges into the membrane of damaged mitochondria, the organelles responsible for the cell’s energy generation, where it causes another protein, Parkin, to be recruited to the mitochondrial surface. Because neurons require high levels of energy production, mitochondrial damage can lead to neuronal death. When Parkin studs the surface of damaged mitochondria, however, the cell is able to survive the damage. In people who inherit mutations in PINK1, Parkin is never recruited to the organelles, leading to more frequent neuronal death than usual.

Shokat and his colleagues wanted to develop a way to turn on or crank up PINK1 activity, thereby preventing excess cell death in people with inherited Parkinson’s disease. But turning on the activity of a mutant enzyme is typically far more difficult than blocking the activity of an overactive one.

“When we started this project, we really thought that there would be no conceivable way to make something that directly turns on the enzyme,” says Shokat. “For any enzyme we know that causes a disease, we have ways to make inhibitors but no real ways to turn up activity.”

His team expected it would have to find a less direct way to mimic the activity of PINK1 and recruit Parkin. Hoping to understand more fully how PINK1 works, they began investigating how PINK1 binds to ATP, the energy molecule that normally turns it on. In one test, instead of adding ATP to the enzymes, they added different ATP analogs, versions of ATP with altered chemical groups that slightly change the molecule’s shape. Scientists typically must engineer new versions of proteins to accept these analogs, since they don’t fit into the typical ATP binding site. But to Shokat’s surprise, one of the analogs—kinetin triphosphate, or KTP—turned on the activity of not only normal PINK1, but also the mutated version, which doesn’t bind ATP.

“This drug does something that chemically we just never thought was possible,” says Shokat. “But it goes to show that if you find the right key for the right lock, you’ll be able to open the door.”

To test whether the binding of KTP to PINK1 led to the same consequences as the usual ATP binding, Shokat’s group measured the activity of PINK1 directly, as well as the downstream consequences of this activity, including the amount of Parkin recruited to the mitochondrial surface, and the levels of cell death. Adding the precursor of KTP, kinetin, to cells—both those with PINK1 mutations and those with normal physiology—amplified the activity of PINK1, increased the level of Parkin on damaged mitochondria, and decreased levels of neuron death, they found.
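
The PINK1/Parkin logic described above can be caricatured as a threshold rule: damaged mitochondria recruit Parkin only if PINK1 activity is high enough. The toy model below and all of its activity values are invented for illustration; it is not the authors' assay.

```python
# Toy threshold model of Parkin recruitment. Mutant PINK1 cannot bind ATP,
# so its activity stays low; KTP restores activity even in the mutant.
# All numbers are illustrative placeholders, not measured values.

def parkin_recruited(pink1_activity, threshold=1.0):
    """True if PINK1 activity suffices to recruit Parkin to the
    surface of a damaged mitochondrion (toy rule, arbitrary units)."""
    return pink1_activity >= threshold

wild_type_atp = 2.0  # normal enzyme with its usual substrate
mutant_atp = 0.1     # mutant PINK1 fails to bind ATP
mutant_ktp = 1.5     # KTP turns on even the mutant enzyme

for activity in (wild_type_atp, mutant_atp, mutant_ktp):
    print(parkin_recruited(activity))
```

The point of the caricature is only that KTP moves the mutant enzyme back above the recruitment threshold; the real readouts in the study were biochemical measurements, not a binary rule.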

“What we have here is a case where the molecular target has been shown to be important to Parkinson’s in human genetic studies,” says Shokat. “And now we have a drug that specifically acts on this target and reverses the cellular causes of the disease.”

The similar results in cells with and without PINK1 mutations suggest that kinetin, which is a precursor to KTP, could be used to treat not only Parkinson’s patients with a known PINK1 mutation, but to slow progression of the disease in those without a family history by decreasing cell death.

Shokat is now testing the effects of kinetin in mice with various forms of Parkinson’s disease. However, the usefulness of animal models in Parkinson’s research has been debated, and the positive cellular results, he says, are therefore as good an indicator as animal results that this drug has potential to treat Parkinson’s in humans. Initial human studies will likely focus on the small population of patients with PINK1 mutations, and if successful in that group the drug could later be tested in a wider array of Parkinson’s patients.

Aug 16, 2013 · 74 notes
#parkinson's disease #kinetin #animal model #PINK1 mutations #genetics #neuroscience #science
Study debunks controversial MS theory

There is no evidence that impaired blood flow or blockage in the veins of the neck or head is involved in multiple sclerosis, says a McMaster University study.

The research, published online by PLOS ONE Wednesday, found no evidence of abnormalities in the internal jugular or vertebral veins or in the deep cerebral veins of any of 100 patients with multiple sclerosis (MS) compared with 100 people who had no history of any neurological condition.

The study contradicts a controversial theory that says that MS, a chronic, neurodegenerative and inflammatory disease of the central nervous system, is associated with abnormalities in the drainage of venous blood from the brain. In 2008 Italian researcher Paolo Zamboni said that angioplasty, a blockage-clearing procedure, would help MS patients with a condition he called chronic cerebrospinal venous insufficiency (CCSVI). This caused a flood of public response in Canada and elsewhere, with many concerned individuals lobbying for support of the ‘Liberation Treatment’ to clear the veins, as advocated by Zamboni.

“This is the first Canadian study to provide compelling evidence against the involvement of CCSVI in MS,” said principal investigator Ian Rodger, a professor emeritus of medicine in the Michael G. DeGroote School of Medicine. “Our findings bring a much-needed perspective to the debate surrounding venous angioplasty for MS patients.”

In the study, all participants received an ultrasound of the deep cerebral and neck veins, as well as magnetic resonance imaging (MRI) of the neck veins and brain. Each participant had both examinations performed on the same day. The McMaster research team included a radiologist and two ultrasound technicians who had trained in the Zamboni technique at the Department of Vascular Surgery of the University of Ferrara.

Aug 15, 2013 · 50 notes
#MS #neuroimaging #cerebral veins #vertebral veins #neurology #neuroscience #science
Brain scans could predict response to antipsychotic medication

Researchers from King’s College London and the University of Nottingham have identified neuroimaging markers in the brain which could help predict whether people with psychosis respond to antipsychotic medications or not.

In approximately half of young people experiencing their first episode of a psychosis (FEP), the symptoms do not improve considerably with the initial medication prescribed, increasing the risk of subsequent episodes and worse outcome. Identifying individuals at greatest risk of not responding to existing medications could help in the search for improved medications, and may eventually help clinicians personalize treatment plans.

In a study published today in JAMA Psychiatry, researchers used structural Magnetic Resonance Imaging (MRI) to scan the brains of 126 individuals – 80 presenting with FEP, and 46 healthy controls. Participants had an MRI scan shortly after their FEP, and another assessment 12 weeks later, to establish whether symptoms had improved following the first treatment with antipsychotic medications.

The researchers examined a particular feature of the brain called “cortical gyrification”: the extent of folding of the cerebral cortex, a marker of how it has developed. They found that individuals who did not respond to treatment already had significantly reduced gyrification across multiple brain regions, compared with patients who did respond and with individuals without psychosis. This reduced gyrification was particularly evident in brain areas considered important in psychosis, such as the temporal and frontal lobes. Those who responded to treatment were virtually indistinguishable from the healthy controls.
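
Cortical gyrification is commonly quantified as a gyrification index (GI): the ratio of the full folded cortical contour length to the length of a smooth outer hull enclosing it, with values near 1 indicating a smooth surface. A minimal sketch of that ratio follows; the contour lengths are hypothetical, and the study's actual surface-based pipeline is far more involved.

```python
# Minimal sketch of the classic 2-D gyrification index (GI).
# Contour lengths below are hypothetical, for illustration only.

def gyrification_index(pial_contour_len, outer_hull_len):
    """Total folded (pial) contour length divided by the length of the
    smooth outer hull enclosing it. Values near 1 mean little folding;
    healthy human cortex is typically around 2.5."""
    if outer_hull_len <= 0:
        raise ValueError("outer hull length must be positive")
    return pial_contour_len / outer_hull_len

# Hypothetical contour lengths in mm:
print(round(gyrification_index(250.0, 100.0), 2))  # folded cortex
print(round(gyrification_index(105.0, 100.0), 2))  # nearly smooth surface
```

In this framing, the study's non-responders would show systematically lower ratios across multiple regions than responders or controls.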

The researchers also investigated whether the differences could be explained by the type of psychosis diagnosed (e.g. with or without affective symptoms, such as depression or elated mood). They found that reduced gyrification predicted non-response to treatment independently of the diagnosis.

Dr Paola Dazzan from the Department of Psychosis Studies at King’s College London’s Institute of Psychiatry, and senior author of the paper, says: “Our study provides crucial evidence of a neuroimaging marker that, if validated, could be used early in psychosis to help identify those people less likely to respond to medications. It is possible that the alterations we observed are due to differences in the way the brain has developed early on in people who do not respond to medication compared to those who do.”

She continues: “There have been few advances in developing novel antipsychotic drugs over the past 50 years, and we still face the same problems with a sub-group of people who do not respond to the drugs we currently use. We could envisage using a marker like this one to identify people who are least likely to respond to existing medications, and focus our efforts on developing new medication specifically adapted to this group. In the longer term, if we were able to identify poor responders at the outset, we may be able to formulate personalized treatment plans for that individual patient.”

Dr Lena Palaniyappan from the University of Nottingham adds: “All of us have complex and varying patterns of folding in our brains. For the first time we are showing that the measurement of these variations could potentially guide us in treating psychosis. It is possible that people with specific patterns of brain structure respond better to treatments other than antipsychotics that are currently in use. Clearly, the time is ripe for us to focus on utilising neuroimaging to guide treatment decisions.”

Psychosis is a term used to indicate mental health disorders that present with symptoms like hallucinations (such as hearing voices) or delusions (unshakeable beliefs based on the person’s altered perception of reality, which may not correspond to the way others see the world). Psychotic episodes are present in conditions such as schizophrenia and bipolar disorder.

Approximately 1 in 100 people in England have at least one episode of psychosis throughout their lives. In most cases, psychosis develops during late adolescence (15 or above) or adulthood. Treatment involves a combination of antipsychotic medication, psychological therapies and social support. Many people with psychosis go on to lead ordinary lives and for about 60% of people, the symptoms disappear within 12 months from onset. However, for others, treatment is less straightforward and many do not respond to the initial antipsychotic treatment prescribed by their doctor. Early response to antipsychotic medication is known to be associated with better outcome and fewer subsequent episodes, and intervening early with effective treatments is therefore important.

Aug 15, 2013 · 115 notes
#brain scans #antipsychotic medications #neuroimaging #psychosis #cortical gyrification #neuroscience #science
Newly Discovered ‘Switch’ Plays Dual Role In Memory Formation

Researchers at Johns Hopkins have uncovered a protein switch that can either increase or decrease memory-building activity in brain cells, depending on the signals it detects. Its dual role means the protein is key to understanding the complex network of signals that shapes our brain’s circuitry, the researchers say. A description of their discovery appears in the July 31 issue of the Journal of Neuroscience.

“What’s interesting about this protein, AGAP3, is that it is effectively double-sided: One side beefs up synapses in response to brain activity, while the other side helps bring synapse-building back down to the brain’s resting state,” says Richard Huganir, Ph.D., a professor and director of the Solomon H. Snyder Department of Neuroscience at the Johns Hopkins University School of Medicine and co-director of the Brain Science Institute at Johns Hopkins. “The fact that it links these two opposing activities indicates AGAP3 may turn out to be central to controlling the strength of synapses.”

Huganir has long studied how connections between brain cells, known as synapses, are strengthened and weakened to form or erase memories. The new discovery came about when he and postdoctoral fellow Yuko Oku, Ph.D., investigated the chain reaction of signals involved in one type of synaptic strengthening.

In a study of the proteins that interact with one of the known proteins from that chain reaction, the previously unknown AGAP3 turned up. It contained not only a site designed to bind another protein involved in the chain reaction that leads from brain stimulation to learning, but also a second site involved in bringing synapse-building activity down to normal levels after a burst of activity.

Although it might seem the two different functions are behaving at cross-purposes, Oku says, it also could be that nature’s bundling of these functions together in a single protein is an elegant way of enabling learning and memory while preventing dangerous overstimulation. More research is needed, Oku says, to figure out whether AGAP3’s two sites coordinate by affecting each other’s activity, or are effectively free agents.

Aug 14, 2013 · 72 notes
#memory #synapses #AGAP3 #AMPA receptors #NMDA receptors #LTP #neuroscience #science
Study identifies new culprit that may make aging brains susceptible to neurodegenerative diseases

The steady accumulation of a protein in healthy, aging brains may explain seniors’ vulnerability to neurodegenerative disorders, a new study by researchers at the Stanford University School of Medicine reports.

The study’s unexpected findings could fundamentally change the way scientists think about neurodegenerative disease.

The pharmaceutical industry has spent billions of dollars on futile clinical trials directed at treating Alzheimer’s disease by ridding brains of a substance called amyloid plaque. But the new findings have identified another mechanism, involving an entirely different substance, that may lie at the root not only of Alzheimer’s but of many other neurodegenerative disorders — and, perhaps, even the more subtle decline that accompanies normal aging.

The study, published Aug. 14 in the Journal of Neuroscience, reveals that with advancing age, a protein called C1q, well-known as a key initiator of immune response, increasingly lodges at contact points connecting nerve cells in the brain to one another. Elevated C1q concentrations at these contact points, or synapses, may render them prone to catastrophic destruction by brain-dwelling immune cells, triggered when a catalytic event such as brain injury, systemic infection or a series of small strokes unleashes a second set of substances on the synapses.

“No other protein has ever been shown to increase nearly so profoundly with normal brain aging,” said Ben Barres, MD, PhD, professor and chair of neurobiology and senior author of the study. Examinations of mouse and human brain tissue showed as much as a 300-fold age-related buildup of C1q.

The finding was made possible by the diligence and ingenuity of the study’s lead author, Alexander Stephan, PhD, a postdoctoral scholar in Barres’ lab. Stephan screened about 1,000 antibodies before finding one that binds to C1q and nothing else. (Antibodies are proteins, generated by the immune system, that adhere to specific “biochemical shapes,” such as surface features of invading pathogens.)

Comparing brain tissue from mice of varying ages, as well as postmortem samples from a 2-month-old infant and an older person, the researchers showed that these C1q deposits weren’t randomly distributed along nerve cells but, rather, were heavily concentrated at synapses. Analyses of brain slices from mice across a range of ages showed that as the animals age, the deposits spread throughout the brain.

“The first regions of the brain to show a dramatic increase in C1q are places like the hippocampus and substantia nigra, the precise brain regions most vulnerable to neurodegenerative diseases like Alzheimer’s and Parkinson’s disease, respectively,” said Barres. Another region affected early on, the piriform cortex, is associated with the sense of smell, whose loss often heralds the onset of neurodegenerative disease.

Other scientists have observed moderate, age-associated increases (on the order of three- or four-fold) in brain levels of the messenger-RNA molecule responsible for transmitting the genetic instructions for manufacturing C1q to the protein-making machinery in cells. Testing for messenger-RNA levels — typically considered reasonable proxies for how much of a particular protein is being produced — is fast, easy and cheap compared with analyzing proteins.

But in this study, Barres and his colleagues used biochemical measures of the protein itself. “The 300-fold rise in C1q levels we saw in 2-year-old mice — equivalent to 70- or 80-year-old humans — knocked my socks off,” Barres said. “I was not expecting that at all.”
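
The protein-versus-mRNA contrast above boils down to fold-change ratios against a young baseline. A trivial sketch follows; the values are placeholders echoing the reported magnitudes, not data from the study.

```python
# Toy fold-change helper illustrating the gap between protein and mRNA
# readouts of C1q. All values are illustrative placeholders.

def fold_change(aged_level, young_level):
    """Level in aged tissue divided by the young-baseline level."""
    if young_level <= 0:
        raise ValueError("baseline level must be positive")
    return aged_level / young_level

protein = fold_change(300.0, 1.0)  # ~300-fold rise reported for C1q protein
mrna = fold_change(3.5, 1.0)       # ~3- to 4-fold rise reported for C1q mRNA
print(protein / mrna)              # the protein signal dwarfs the mRNA proxy
```

The sketch makes the article's point concrete: an mRNA assay alone would have understated the age-related C1q buildup by roughly two orders of magnitude.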

C1q is the first batter on a 20-member team of immune-response-triggering proteins, collectively called the complement system. C1q is capable of clinging to the surface of foreign bodies such as bacteria or to bits of our own dead or dying cells. This initiates a molecular chain reaction known as the complement cascade. One by one, the system’s other proteins glom on, coating the offending cell or piece of debris. This in turn draws the attention of omnivorous immune cells that gobble up the target.

The brain has its own set of immune cells, called microglia, which can secrete C1q. Still other brain cells, called astrocytes, secrete all of C1q’s complement-system “teammates.” The two cell types work analogously to the two tubes of an epoxy kit, in which one tube contains the resin, the other a catalyst.

Previous work in Barres’ lab has shown that the complement cascade plays a critical role in the developing brain. A young brain generates an excess of synapses, creating a huge range of options for the potential formation of new neural circuits. These synapses strengthen or weaken over time, in response to their heavy use or neglect. The presence of feckless connections contributes noise to the system, so the efficiency of the maturing brain’s architecture is improved if these underused synapses are pruned away.

In a 2007 paper in Cell, Barres’ group reported that the complement system is essential to synaptic pruning in normal, developing brains. Then in 2012, in Neuron, in a collaboration with the lab of Harvard neuroscientist Beth Stevens, PhD, they showed that it is specifically microglia — the brain’s in-house immune cells — that attack and ingest complement-coated synapses.

Barres now believes something similar is happening in the normal, aging brain. C1q, but not the other protein components of the complement system, gradually becomes highly prevalent at synapses. By itself, this C1q buildup doesn’t trigger wholesale synapse loss, the researchers found — although it does seem to impair their performance. Old mice whose capacity to produce C1q had been eliminated performed subtly better on memory and learning tests than normal older mice did.

Still, this leaves the aging brain’s synapses precariously perched on the brink of catastrophe. A subsequent event such as brain trauma, a bad case of pneumonia or perhaps a series of tiny strokes that some older people experience could incite astrocytes — the second tube in the epoxy kit — to start secreting the other complement-system proteins required for synapse destruction.

Most cells in the body have their own complement-inhibiting agents. This prevents the wholesale loss of healthy tissue during an immune attack on invading pathogens or debris from dead tissue during wound healing. But nerve cells lack their own supply of complement inhibitors. So, when astrocytes get activated, their ensuing release of C1q’s teammates may set off a synapse-destroying rampage that spreads “like a fire burning through the brain,” Barres said.

“Our findings may well explain the long-mysterious vulnerability specifically of the aging brain to neurodegenerative disease,” he said. “Kids don’t get Alzheimer’s or Parkinson’s. Profound activation of the complement cascade, associated with massive synapse loss, is the cardinal feature of Alzheimer’s disease and many other neurodegenerative disorders. People have thought this was because synapse loss triggers inflammation. But our findings here suggest that activation of the complement cascade is driving synapse loss, not the other way around.”

Aug 14, 2013 · 68 notes
#neurodegenerative diseases #aging #alzheimer's disease #immune cells #microglia #neuroscience #science
New clue on the origin of Huntington’s disease

The synapses in the brain act as key communication points between approximately one hundred billion neurons. They form a complex network connecting various centres in the brain through electrical impulses.

New research from Lund University suggests that it is precisely here, in the synapses, that Huntington’s disease might begin.

The researchers looked into the brains of mice using real-time imaging through advanced microscopes, following some of the very first stages of the disease. What they discovered was a previously unobserved degradation of synaptic activity. Long before the well-documented nerve cell death, synapses that are important for communication between the brain centres controlling memory and learning begin to wither. This process has never been mapped before and could be an important step towards understanding the serious non-motor symptoms that affect Huntington patients long before the movement disorders start to show.

“With the naked eye, we have now been able to follow, step by step, what happens when these synapses start to break down. If we are to halt or reverse this process in the future, it is necessary to understand exactly what happens in the initial phase of the disease. Now we know more”, says Professor Jia-Yi Li, the research group leader.

Huntington’s disease has long been characterized by the involuntary writhing movements faced by patients. But in fact, Huntington’s has a very broad and highly individual symptomatology. Depression, memory loss and sleep disorders are all common early on in the disease.

“Many patients testify that these symptoms affect quality of life significantly more than the involuntary jerky movements. Therefore, it is extremely important that we achieve progress in this field of research. Our goal now is to find new therapies that can increase the lifespan of these synapses and maintain their vital function”, explains postdoc Reena, who led the imaging experiments.

Aug 13, 2013 · 74 notes
#huntington's disease #synapses #synaptic activity #memory #learning #neuroscience #science