Neuroscience

September 2013

Shutting off Neurons Helps Bullied Mice Overcome Symptoms of Depression

Findings Point to New Potential Drug Target—GABA Neurons—to Treat Patients with Depression and Other Mood Disorders

A new drug target to treat depression and other mood disorders may lie in a group of GABA neurons (GABA, or gamma-aminobutyric acid, is a neurotransmitter that inhibits other cells) shown to contribute to symptoms like social withdrawal and increased anxiety, Penn Medicine researchers report in a new study in the Journal of Neuroscience.

Experts know that people suffering from depression and other mood disorders often react to rejection or bullying by withdrawing socially more than the average person, who takes it in stride, yet the biological processes behind these responses have remained unclear.

Now, a preclinical study from the labs of Olivier Berton, PhD, an assistant professor in the department of Psychiatry, with Collin Challis of the Neuroscience Graduate Group, and Sheryl Beck, PhD, a professor in the department of Anesthesiology at Children’s Hospital of Philadelphia, has found that bullying and other social stresses trigger symptoms of depression in mice by activating GABA neurons, a never-before-seen direct relationship between social stimuli and this neural circuitry. Activation of those neurons, they found, directly suppressed serotonin, long known to play a vital role in behavioral responses; with less serotonin, a depressed person is more likely to withdraw socially.

Conversely, when the researchers successfully put the brake on the GABA neurons, mice became more resilient to bullying and stopped avoiding once-perceived threats.

“This is the first time that GABA neuron activity—found deep in the brainstem—has been shown to play a key role in the cognitive processes associated with social approach or avoidance behavior in mammals,” said Dr. Berton. “The results help us to understand why current antidepressants may not work for everyone and how to make them work better—by targeting GABA neurons that put the brake on serotonin cells.”

Less serotonin elicits socially defensive responses such as avoidance or submission, whereas enhancing it (the main goal of antidepressants) induces a positive shift in the perception of socio-affective stimuli, promoting affiliation and dominance. However, current antidepressants that target serotonin, such as SSRIs, are effective in only about 50 percent of patients.

These new findings point to GABA neurons as a new neural drug target that could help treat the patients who don’t respond to today’s treatments.

For the study, “avoidant” mice were exposed to brief bouts of aggression from trained “bully” mice. By comparing gene expression in the brains of resilient and avoidant mice, Berton and colleagues discovered that in avoidant mice, bullying puts GABA neurons into a more excitable state, and the mice exhibit signs of social defeat. Resilient mice showed no such change in neuronal excitability or behavior.

To better understand the link between GABA and the development of stress resilience, Berton, Beck, and colleagues also devised an approach to directly manipulate GABA neuron activity: lifting GABA inhibition of serotonin neurons reduced social avoidance and anxiety symptoms in mice exposed to bullies and also fully prevented the neurobiological changes caused by stress.

“Our paper provides a novel cellular understanding of how social defensiveness and social withdrawal develop in mice and gives us a stepping stone to better understand the basis of similar social symptoms in humans,” said Berton. “This has important implications for the understanding and treatment of mood disorders.”

Sep 1, 2013 · 173 notes
#depression #mood disorders #GABA neurons #serotonin #social withdrawal #stress #neuroscience #science
Researchers Discover New Way to Track Huntington’s Disease Progression Using PET Scans

Investigators at The Feinstein Institute for Medical Research have discovered a new way to measure the progression of Huntington’s disease, using positron emission tomography (PET) to scan the brains of carriers of the gene. The findings are published in the September issue of The Journal of Clinical Investigation.

Huntington’s disease causes the progressive breakdown of nerve cells in the brain, which leads to impairments in movement, thinking and emotions. Most people with Huntington’s disease develop signs and symptoms in their 40s or 50s, but the onset of disease may be earlier or later in life. Medications are available to help manage the symptoms of Huntington’s disease, but treatments do not prevent the physical, mental and behavioral decline associated with the condition.

Huntington’s disease is an inherited disease, passed from parent to child through a mutation in the normal gene. Each child of a parent with Huntington’s disease has a 50/50 chance of inheriting the Huntington’s disease gene, and a child who inherits the gene will eventually develop the disease. Genetic testing for Huntington’s disease can be performed to determine whether a person carries the gene and is developing the disease even before symptoms appear. Having this ability provides an opportunity for scientists to study how the disease first develops and how it progresses in its early, presymptomatic stages. Even though a carrier of the Huntington’s disease gene may not have experienced symptoms, changes in the brain have already taken place, which ultimately lead to severe disability. Brain imaging is one tool that could be used to track how quickly Huntington’s disease progresses in gene carriers. Having a better way to track the disease at its earliest stages will make it easier to test drugs designed to delay or even prevent the onset of symptoms.
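The 50/50 risk follows from the mutation being autosomal dominant: a child inherits one of the affected parent’s two gene copies at random, and a single mutant copy is enough to cause disease. A toy simulation of that inheritance pattern (the allele labels and trial count are illustrative, not from the study):

```python
import random

def child_genotype(affected_parent=("H", "+"), healthy_parent=("+", "+")):
    """Each parent passes one randomly chosen allele to the child.
    'H' = mutant huntingtin allele (dominant), '+' = normal allele."""
    return (random.choice(affected_parent), random.choice(healthy_parent))

def will_develop_hd(genotype):
    # One mutant copy is sufficient because the mutation is dominant.
    return "H" in genotype

random.seed(0)
trials = 100_000
carriers = sum(will_develop_hd(child_genotype()) for _ in range(trials))
print(f"Fraction of children inheriting the mutation: {carriers / trials:.3f}")
# Converges on 0.5, the 50/50 risk described above.
```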

Researchers at the Feinstein Institute used PET scanning to map changes in brain metabolism in 12 people with the Huntington’s disease gene who had not developed clinical signs of the illness. The researchers scanned the subjects repeatedly over a seven-year period and found a characteristic set (network) of abnormalities in their brains. The network was used to measure the rate of disease progression in the study participants. The Feinstein Institute investigators then confirmed the progression rate through independent measurements in scans from a separate group of Huntington’s disease gene carriers who were studied in the Netherlands. The investigators believe that progression networks similar to the one identified in Huntington’s disease carriers will have an important role in evaluating new drugs for degenerative brain disorders.
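The article doesn’t spell out how a “network” yields a progression rate. In spatial-covariance PET analyses of the kind used at the Feinstein Institute, one common approach is to project each subject’s metabolic map onto a fixed network pattern and track that expression score over time; the slope of score versus time is the progression rate. A minimal sketch on synthetic data (the pattern, scans, and all numbers below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical data: a fixed "progression network" (a spatial weight
# pattern over voxels) and longitudinal scans for one gene carrier.
n_voxels = 1000
network_pattern = rng.standard_normal(n_voxels)
network_pattern /= np.linalg.norm(network_pattern)

def expression_score(scan, pattern):
    """Project a subject's (mean-centered) metabolic map onto the
    network pattern; higher scores = stronger network expression."""
    return float(np.dot(scan - scan.mean(), pattern))

# Simulate scans at years 0..7 whose network expression grows linearly.
years = np.arange(8)
scans = [0.5 * t * network_pattern + 0.1 * rng.standard_normal(n_voxels)
         for t in years]
scores = [expression_score(s, network_pattern) for s in scans]

# The slope of score vs. time is one simple "progression rate".
rate = np.polyfit(years, scores, 1)[0]
print(f"Estimated progression rate: {rate:.2f} score units/year")
```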

“Huntington’s disease is an extremely debilitating disease. The findings make it possible to evaluate the effects of new drugs on disease progression before symptoms actually appear. This is a major advance in the field,” said David Eidelberg, MD, Susan and Leonard Feinstein Professor and head of the Center for Neurosciences at the Feinstein Institute.

Sep 1, 2013 · 31 notes
#huntington's disease #brain imaging #PET scan #metabolic network #medicine #neuroscience #science
Why We Look At The Puppet, Not The Ventriloquist

The brain doesn’t require simultaneous visual and audio stimulation to locate the source of a sound

As ventriloquists have long known, your eyes can sometimes tell your brain where a sound is coming from more convincingly than your ears can.

A series of experiments in humans and monkeys by Duke University researchers has found that the brain does not require simultaneous visual and audio stimulation to locate the source of a sound. Rather, visual feedback obtained from trying to find a sound with the eyes had a stronger effect than visual stimuli presented at the same time as the audio, according to the Duke study.

The findings could help those with mild hearing loss learn to localize voices better, improving their ability to communicate in noisy environments, said Jennifer Groh, a professor of psychology and neuroscience at Duke.

Locating where a sound is coming from is partially learned with the aid of vision. Researchers sought to learn more about how the brain locates the source of a sound when the source is unclear and there are a number of possible visual matches.

"Our study is related to ventriloquism, in which the visual image of a puppet’s mouth ‘captures’ the sound of the puppeteer’s voice," Groh said. "It is thought that one reason this illusion occurs is because vision normally teaches the brain how to tell where sounds are coming from. We investigated how the brain knows which visual stimulus should capture the location of a sound, such as why it is the puppet’s mouth and not some other visual stimulus."

The study, which appears Thursday (Aug. 29) in the journal PLOS ONE, tested two competing hypotheses. In one, the brain determines the location of a sound based on the simultaneous occurrence of audio and its visual source. In the other, the brain uses a “guess and check” method. In this scenario, visual feedback sent to the brain after the eye focuses on a sound affects how the eye searches for that sound in the future, possibly through the brain’s reward-related circuitry.

In both paradigms, the visual stimulus — an LED — was displaced from the sound. Groh’s team then looked for evidence that the LED caused a persistent mislocation of the sound.

"Surprisingly, we found that visual feedback exerts the more powerful effect on altering localization of sounds," Groh said. "This suggests that the active behavior of looking at the puppet during a ventriloquism performance plays a role in causing the shift in where you hear the voice."

Participants in the study — 11 humans and two rhesus monkeys — shifted their gaze to a sound under different visual and audio scenarios.

In one scenario, called the “synchrony-only” task, a visual stimulus appeared at the same time as a sound but too briefly to provide feedback after an eye movement to that sound.

In another, the “feedback-only” task, the visual stimulus appeared during the execution of an eye movement to a sound, but was never on at the same time as the sound.

The study found that the “feedback-only” task exerted a much more powerful effect on the estimation of sound location, as measured with eye tracking, than did the synchrony-only task. This suggests that those who have difficulty localizing sounds may benefit from practice involving eye movements.

On average, participants altered their eye movements in the direction of the lights’ location to a greater degree, about a quarter of the way, when the visual stimulus was presented as feedback than when it was presented at the same time as the sound, the study found.
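That “quarter of the way” figure is just the fraction of the audio-visual discrepancy absorbed by the shift, which can be computed directly (the angles below are hypothetical, not the study’s):

```python
def ventriloquism_shift(sound_loc_deg, visual_loc_deg, perceived_loc_deg):
    """Fraction of the audio-visual discrepancy 'captured' by vision:
    0.0 = no shift toward the light, 1.0 = complete visual capture."""
    return (perceived_loc_deg - sound_loc_deg) / (visual_loc_deg - sound_loc_deg)

# Hypothetical numbers: sound at 0 degrees, LED displaced 8 degrees to
# the right, eye movements land at 2 degrees — a quarter of the way.
print(ventriloquism_shift(0.0, 8.0, 2.0))  # 0.25
```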

"This is about the brain’s self-improvement skills," said co-author Daniel Pages, a graduate student in Psychology & Neuroscience at Duke. "What we’re getting at is how the brain uses different types of information to improve how it does its job. In this case, it uses vision coupled with eye movements to improve hearing."

"We were surprised at how important the eye movements were," Groh said. "But finding sounds is really hard. Feedback about your performance is important for anything that is difficult, whether it is the B- you get on your homework or the error your eyes detect in localizing a sound."

Sep 1, 2013 · 90 notes
#science #eye movements #visual stimulus #hearing loss #sound location #neuroscience #psychology

August 2013

Researchers develop new model to study schizophrenia and other neurological conditions

Schizophrenia is one of the most devastating neurological conditions, with only 30 percent of sufferers ever experiencing full recovery. While current medications can control most psychotic symptoms, their side effects can leave individuals so severely impaired that the disease ranks among the top ten causes of disability in developed countries.

Now, in this week’s issue of the Proceedings of the National Academy of Sciences, Thomas Albright and Ricardo Gil-da-Costa of the Salk Institute for Biological Studies describe a model system that completes the bridge between cellular and human studies of schizophrenia, an advance that should help speed the development of therapeutics for schizophrenia and other neurological disorders.

"Part of the terror of schizophrenia is that the brain can’t properly integrate sensory information, so the world is a disorientating series of unrelated bits of input," says Albright, the Conrad T. Prebys Chair in Vision Research. "We’ve created a model that tests the ability to do sensory integration, which should be extremely useful for pharmaceutical research."

Currently, over 1.1 percent of the world’s population has schizophrenia, with an estimated three million individuals in the United States alone. The economic cost is high: in 2002, Americans spent nearly $63 billion on treatment and disability management. The emotional cost is higher still: ten percent of those with schizophrenia are driven to suicide by the burden of coping with the disease.

Initially, it was thought that excessive amounts of the neurotransmitter dopamine caused psychotic symptoms, and indeed, current antipsychotic drugs work by blocking dopamine receptors on brain cells. But nearly all of these drugs have severe cognitive side effects, which led researchers to speculate that some other mechanism must also be involved.

A major clue to understanding schizophrenia came with the development of phencyclidine (PCP) in 1956. It was intended to keep patients safely asleep during surgeries, but many woke up with symptoms similar to those experienced by people with schizophrenia, including hallucinations and the disorientation of feeling “dissociated” from their limbs, resulting in PCP being abandoned for clinical purposes. A decade later, it was replaced by a derivative called ketamine. At doses high enough to put patients to sleep, ketamine is an effective anesthetic. At lower doses, it temporarily produces the same schizophrenia-like effects as PCP.

The two drugs are part of a class called N-methyl-D-aspartate (NMDA) receptor antagonists. Essentially, they work by blocking the receptors through which glutamate, the main excitatory neurotransmitter, excites brain cells. Thus, while dopamine dysfunction clearly accounts for some of the symptoms of psychosis, it is probably not the full story.

"While dopamine has limited reach in the brain, any dysfunction in glutamate would be expected to have the sort of widespread effects we see in the perceptual disorders associated with schizophrenia," says Albright. "Nevertheless, which neurotransmitter was primary to these disorders—glutamate or dopamine—has been argued about for years."

Standing in the way of a definitive answer was a researcher’s Catch-22: many experiments designed to understand cognitive disorders such as schizophrenia or Alzheimer’s require a participant’s conscious attention, yet these disorders interfere with attention.

To get around this, scientists turned to electroencephalograms (EEGs), which can be used to detect changes in cases where a subject is not consciously paying attention to a stimulus, by recording the brain’s electrical signals through electrodes placed in a scalp cap. In one test, a series of tones is played, but an “oddball” tone breaks the pattern in the sequence. A healthy brain can still easily spot the differences, even if a participant is concentrating on another task, such as reading a magazine.
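As a concrete sketch, an oddball tone stream like the one described above might be generated as follows (the frequencies, deviant probability, and first-tone constraint are illustrative assumptions, not the study’s actual parameters):

```python
import random

def oddball_sequence(n_tones, deviant_prob=0.1,
                     standard=1000, deviant=1200, seed=1):
    """Generate a tone sequence (frequencies in Hz) in which a rare
    'oddball' deviant occasionally breaks the pattern of standards."""
    rng = random.Random(seed)
    seq = [deviant if rng.random() < deviant_prob else standard
           for _ in range(n_tones)]
    # Common constraint: never start on a deviant, so the brain can
    # establish the pattern before it is violated.
    seq[0] = standard
    return seq

tones = oddball_sequence(500)
print(f"{tones.count(1200)} deviants out of {len(tones)} tones")
```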

"The test works because the brain is a prediction machine: it’s built to anticipate what should come next," says Albright. "If you have healthy working memory, you should be able to perceive a pattern and notice when something violates it, but patients suffering from some mental health disorders lack that basic ability."

In their latest research, Albright’s team detected the difference through two signals, event-related brain potentials called mismatch negativity (MMN) and P3. The MMN reflects differential brain activity to the detected oddball tone, below the level of conscious awareness. P3 picks up the next phase: a subject’s attention orientation to the oddball tone.
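MMN is conventionally computed as a difference wave: the average response to deviant tones minus the average response to standards, evaluated in the post-stimulus window where the two diverge. A toy sketch with made-up epochs (the array shapes and voltage values are illustrative only):

```python
import numpy as np

def mismatch_negativity(epochs, labels, t):
    """MMN at time index t: mean deviant response minus mean standard
    response. epochs: (n_trials, n_samples) ERP array; labels: 1 = deviant."""
    epochs, labels = np.asarray(epochs, float), np.asarray(labels)
    deviant_erp = epochs[labels == 1].mean(axis=0)
    standard_erp = epochs[labels == 0].mean(axis=0)
    return (deviant_erp - standard_erp)[t]

# Toy epochs of 3 samples each: deviants dip more negative at sample 1.
epochs = [[0.0, -2.0, 0.0],   # deviant
          [0.0, -0.5, 0.0],   # standard
          [0.0, -2.2, 0.0],   # deviant
          [0.0, -0.3, 0.0]]   # standard
labels = [1, 0, 1, 0]
print(mismatch_negativity(epochs, labels, t=1))  # negative, as MMN should be
```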

Still, a gap in understanding remained. While scientists could do cellular work in animal models on the role of dopamine versus glutamate, and they could do EEGs in human beings, a bridge between the two remained elusive. Such a bridge can advance scientists’ understanding of how healthy and disordered brains work, from the cellular level all the way up to the interactions between brain areas. Moreover, it can enable preclinical and clinical trials that link the cellular and systems levels to identify successful therapeutic avenues.

Gil-da-Costa has at last built that bridge by crafting the first non-invasive scalp EEG setup that records accurately from the brains of non-human primates, with the same proportional density of electrodes as a human cap and no distortions in signal caused by an incorrect fit. This setup allows him to obtain accurate measurements of MMN and P3 using the same protocols followed in humans. As a result, the lab has come closer than ever before to untangling the roles of dopamine and glutamate.

"While rodents are essential for understanding mechanisms at a cellular or molecular level, at a higher cognitive level, the best you could do was a sort of rough analogy. Now, finally, we can have a one-to-one correspondence," says Gil-da-Costa. "For sensory integration, our findings with this model support the glutamate hypothesis."

Pharmaceutical companies are interested in the model because of the potential for more precise testing and the universality of the MMN/P3 assays. “These brain markers are the same across dozens of neurological diseases, as well as brain trauma, so you can test potential therapies not just for schizophrenia, but for conditions such as Parkinson’s, Alzheimer’s, bipolar disorder, and traumatic brain injuries,” says Gil-da-Costa. “We hope this will help begin a new era in neurological therapeutics.”

Aug 31, 2013 · 118 notes
#schizophrenia #psychosis #glutamate #dopamine #brain activity #neuroscience #science
Brain imaging study reveals the wandering mind behind insomnia

Study is the first to find functional MRI differences in working memory in people with primary insomnia

A new brain imaging study may help explain why people with insomnia often complain that they struggle to concentrate during the day even when objective evidence of a cognitive problem is lacking.

"We found that insomnia subjects did not properly turn on brain regions critical to a working memory task and did not turn off ‘mind-wandering’ brain regions irrelevant to the task," said lead author Sean P.A. Drummond, PhD, associate professor in the department of psychiatry at the University of California, San Diego, and the VA San Diego Healthcare System, and Secretary/Treasurer of the Sleep Research Society. "Based on these results, it is not surprising that someone with insomnia would feel like they are working harder to do the same job as a healthy sleeper."

The research team led by Drummond and co-principal investigator Matthew Walker, PhD, studied 25 people with primary insomnia and 25 good sleepers. Participants had an average age of 32 years. The study subjects underwent a functional magnetic resonance imaging scan while performing a working memory task.

Results published in the September issue of the journal Sleep show that participants with insomnia did not differ from good sleepers in objective cognitive performance on the working memory task. However, the MRI scans revealed that people with insomnia could not modulate activity in brain regions typically used to perform the task.

As the task got harder, good sleepers used more resources within the working memory network of the brain, especially the dorsolateral prefrontal cortex. Insomnia subjects, however, were unable to recruit more resources in these brain regions. Furthermore, as the task got harder, participants with insomnia did not dial down the “default mode” regions of the brain that are normally only active when our minds are wandering.

"The data help us understand that people with insomnia not only have trouble sleeping at night, but their brains are not functioning as efficiently during the day," said Drummond. "Some aspects of insomnia are as much of a daytime problem as a nighttime problem. These daytime problems are associated with organic, measurable abnormalities of brain activity, giving us a biological marker for treatment success."

According to the authors, the study is the largest to examine cerebral activation with functional MRI during cognitive performance in people with primary insomnia, relative to well-matched good sleepers. It also is the first to characterize functional MRI differences in working memory in people with primary insomnia.

The American Academy of Sleep Medicine reports that about 10 to 15 percent of adults have an insomnia disorder with distress or daytime impairment. Most often insomnia is a comorbid disorder occurring with another problem such as depression or chronic pain, or caused by a medication or substance. Fewer people suffering from insomnia are considered to have primary insomnia, which is defined as a difficulty falling asleep or maintaining sleep in the absence of a coexisting condition.

Aug 31, 2013 · 171 notes
#insomnia #working memory #cognitive performance #prefrontal cortex #neuroscience #psychology #science
Learning a new language alters brain development

The age at which children learn a second language can have a significant bearing on the structure of their adult brain, according to a new joint study by the Montreal Neurological Institute and Hospital - The Neuro at McGill University and Oxford University. The majority of people in the world learn to speak more than one language during their lifetime. Many do so with great proficiency particularly if the languages are learned simultaneously or from early in development.

The study concludes that the pattern of brain development is similar whether you learn one or two languages from birth. However, learning a second language later in childhood, after gaining proficiency in the first (native) language, does in fact modify the brain’s structure, specifically the inferior frontal cortex: the left inferior frontal cortex became thicker and the right inferior frontal cortex thinner. The cortex is a multi-layered mass of neurons that plays a major role in cognitive functions such as thought, language, consciousness and memory.

The study suggests that the task of acquiring a second language after infancy stimulates new neural growth and connections among neurons in ways seen in acquiring complex motor skills such as juggling. The study’s authors speculate that the difficulty that some people have in learning a second language later in life could be explained at the structural level.

“The later in childhood that the second language is acquired, the greater are the changes in the inferior frontal cortex,” said Dr. Denise Klein, researcher in The Neuro’s Cognitive Neuroscience Unit and a lead author on the paper published in the journal Brain and Language. “Our results provide structural evidence that age of acquisition is crucial in laying down the structure for language learning.”

Using a software program developed at The Neuro, the study examined MRI scans of 66 bilingual and 22 monolingual men and women living in Montreal. The work was supported by a grant from the Natural Sciences and Engineering Research Council of Canada and by an Oxford McGill Neuroscience Collaboration Pilot project.

Aug 30, 2013 · 335 notes
#brain development #language #frontal cortex #cognitive function #neuroscience #psychology #science
Hospital scientists identify ALS disease mechanism

Study strengthens link between amyotrophic lateral sclerosis (ALS) and problems in protein production machinery of cells and identifies possible treatment strategy

Researchers have tied mutations in a gene that causes amyotrophic lateral sclerosis (ALS) and other neurodegenerative disorders to the toxic buildup of certain proteins and related molecules in cells, including neurons. The research, published recently in the scientific journal Cell, offers a new approach for developing treatments against these devastating diseases.

Scientists at St. Jude Children’s Research Hospital and the University of Colorado, Boulder, led the work.

The findings provide the first evidence that a gene named VCP plays a role in the break-up and clearance of protein and RNA molecules that accumulate in temporary structures called RNA granules. RNAs perform a variety of vital cell functions, including protein production. RNA granules support proper functioning of RNA.

In ALS and related degenerative diseases, the process of assembling and clearing RNA granules is impaired. The proteins and RNAs associated with the granules often build up in nerve cells of patients. This study shows how mutations in VCP might contribute to that process and neurodegenerative disease.

“The results go a long way to explaining the process that links a variety of neurodegenerative diseases, including ALS, frontotemporal dementia and related diseases of the brain, muscle and bone known as multisystem proteinopathies,” said the study’s co-corresponding author, J. Paul Taylor, M.D., Ph.D., a member of the St. Jude Department of Developmental Neurobiology. Roy Parker, Ph.D., of the University of Colorado’s Department of Chemistry and Biochemistry and the Howard Hughes Medical Institute (HHMI), is the other corresponding author.

ALS, also known as Lou Gehrig’s disease, is diagnosed in about 5,600 Americans annually and is associated with progressive deterioration of nerve cells in the brain and spine that govern movement, including breathing. There is no effective treatment, and death usually occurs within five years.

“A strength of this study is that it provides a unifying hypothesis about how different genetic mutations all affect stress granules, which suggests that understanding stress granule dynamics and how they can be manipulated might be beneficial for treatment of these diseases,” Parker said.

Earlier work from Taylor’s laboratory identified mutations in VCP as a cause of ALS and related multisystem proteinopathies. Until now, however, little was known about how those mistakes caused disease. The latest findings appeared in the June 20 issue and are highlighted in a review article published in the August 15 issue of Cell.

The research also ties VCP mutations to disruption of RNA regulation, which prior studies have connected to the progression of neurodegenerative diseases, said Regina-Maria Kolaitis, Ph.D., a postdoctoral fellow in Taylor’s laboratory. She and Ross Buchan, Ph.D., a postdoctoral fellow in Parker’s laboratory, are co-first authors.

The work focused on a class of RNA granules called stress granules, which form from proteins and messenger RNA (mRNA) molecules that accumulate in the cell cytoplasm in response to stress. Stressed cells do not want to waste energy producing unnecessary proteins. Stress granules are one mechanism cells use to halt production until the cellular environment normalizes, which is when stress granules typically dissolve.

Proteins found in stress granules include RNA-binding proteins like TDP-43, FUS, hnRNPA1 and hnRNPA2B1 that regulate gene activity. Mutations in those proteins can also cause ALS and related disorders.

“VCP has many functions in cells, but it is not an RNA-binding protein and until now it was not connected to stress granules or RNA processing,” Kolaitis said. “This study provides a new window into the disease process, highlighting VCP’s role in keeping cells healthy.”

For this study, researchers used yeast to identify a network of 125 genes that affect the formation and behavior of stress granules. One of the genes that appeared to play a central role in the network was CDC48, which functions like VCP in yeast. In addition, many of the genes identified are involved in a process called autophagy that cells use to break down and recycle unneeded molecules, including proteins.

Working in yeast and mammalian cells, researchers showed that stress granules are cleared by autophagy, which stalled when VCP was mutated. Researchers also reported that stress granules accumulated following mutation of either CDC48 or VCP.

“This work suggests that activating autophagy to help rid cells of stress granules offers a new approach to neurodegenerative disease treatment,” Taylor said.

Aug 29, 2013 · 67 notes
#ALS #neurodegenerative diseases #stress granules #mRNA #mutations #neuroscience #science
Researchers Find Promising Therapeutic Target for Hard-To-Treat Brain Tumor

Specific protein found in nearly all high-grade meningiomas

Johns Hopkins researchers say they have found a specific protein in nearly 100 percent of high-grade meningiomas — the most common form of brain tumor — suggesting a new target for therapies for a cancer that does not respond to current chemotherapy.

image

Importantly, the investigators say, the protein — NY-ESO-1 — is already at the center of a clinical trial underway at the National Cancer Institute. That trial is designed to activate the immune systems of patients with other types of tumors that express the protein, training the body to attack the cancer and eradicate it.

“Typically there is a lag time before a laboratory finding like this leads to a clear path forward to help patients. But in this case, since there is already a clinical trial underway, we have a chance of helping people sooner rather than later,” says Gregory J. Riggins, M.D., Ph.D., a professor of neurosurgery at the Johns Hopkins University School of Medicine and the senior author of the study published online in the journal Cancer Immunology Research.

The tumors enrolled in the NCI trial express NY-ESO-1 at a much lower rate than Riggins and his team found in high-grade meningioma, suggesting the target could be even more significant for this brain cancer.

Most low-grade meningiomas in accessible locations can be treated successfully with surgery and radiation. But atypical, higher-grade tumors are much more difficult to eradicate and are deadlier.

Riggins and his colleagues, including Gilson S. Baia, Ph.D., and Otavia L. Caballero, M.D., Ph.D., set out to find cancer antigens in meningioma. Cancer antigens are proteins expressed in tumors but not in healthy cells, making them good targets for chemical or immune attack. They looked specifically at 37 cancer/testis (CT) genes, which are normally silent everywhere in the body except in germ cells sequestered in the testes or, in some cases, the ovaries.

CT genes are activated, however, in various cancers. Although the immune system sees their proteins as “foreign,” they are often hidden behind the sophisticated defenses cancers use to evade attack by immune cells. Finding a way to expose these protein antigens to the immune system could allow the body to recognize the invasion and go after the cancer cells. Various approaches are being pursued to do that, including vaccines and a strategy of removing T-cells from the body, reprogramming them, and returning them to be set loose on the cancer cells.

The Johns Hopkins researchers took tissue from 18 different meningioma samples, extracted the genetic material and protein, and measured the levels at which the 37 genes were turned on. The gene that is the blueprint for the NY-ESO-1 protein was turned on more frequently than any other, in five of the 18 patient samples.

Then they analyzed NY-ESO-1 expression in a larger group of 110 meningioma tissue samples and found the protein in 108 of them. They also determined that the more NY-ESO-1 a sample expressed, the higher its tumor grade, and that higher expression correlated with significantly lower disease-free and overall survival rates in the patients the samples came from.
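
As a back-of-the-envelope illustration of the expression frequencies reported above (the sample tallies come from the article; the code itself is only a sketch, not the study's actual analysis):

```python
# Expression tallies reported in the article; organizing them into a
# dict and computing percentages is illustrative only.
screen = {"expressed": 5, "total": 18}        # initial 37-gene screen
follow_up = {"expressed": 108, "total": 110}  # larger NY-ESO-1 cohort

for name, counts in (("screen", screen), ("follow-up", follow_up)):
    pct = 100 * counts["expressed"] / counts["total"]
    print(f"{name}: NY-ESO-1 in {counts['expressed']}/{counts['total']} "
          f"samples ({pct:.1f}%)")
# screen: NY-ESO-1 in 5/18 samples (27.8%)
# follow-up: NY-ESO-1 in 108/110 samples (98.2%)
```

The jump from roughly a quarter of samples in the small screen to nearly all samples in the follow-up cohort is what made NY-ESO-1 stand out as a candidate target.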

The NCI trial originally began in melanoma patients. NY-ESO-1 is expressed in roughly one-third of melanomas as well as approximately one-third of breast, prostate, lung, ovarian, thyroid and bladder cancers, as well as sarcomas. Riggins and his team did not find the protein in glioblastoma, the deadliest form of brain cancer.

He calls the fact that the NCI trial could now include meningioma patients a “stroke of luck.”

“If that therapy did not exist, there would be a lot of work that would have to be done to convince people to pursue this,” Riggins says. “Our goal is to get something that works to the patients. This puts us well on our way.”

Aug 28, 201366 notes
#brain tumor #meningioma #cancer cells #proteins #NY-ESO-1 #neuroscience #science
Aug 28, 20131,898 notes
#tech #brain-to-brain interface #transcranial magnetic stimulation #EEG #neuroscience #science
Not guilty: Parkinson's and protein phosphorylation

EPFL scientists have exonerated a process thought to play a role in causing Parkinson's disease: rather than triggering toxic aggregates in neurons, it turns out to slow the disease down, giving pharmaceutical companies new avenues to explore.

Clues left at the scene of the crime don’t always point to the guilty party, as EPFL researchers investigating Parkinson’s disease have discovered. It is generally accepted that the disease is aggravated when a specific protein is transformed by an enzyme. The EPFL neuroscientists were able to show that, on the contrary, this transformation tends to protect against the progression of the disease. This surprising conclusion could radically change therapeutic approaches that are currently being developed by pharmaceutical companies. The research is to appear in an article in the Proceedings of the National Academy of Sciences (PNAS).

Parkinson’s disease is characterized by the accumulation of a protein known as alpha-synuclein in the brain. If too much of it is produced or if it’s not eliminated properly, it then aggregates into small clumps inside the neurons, eventually killing them. Several years ago scientists discovered that these aggregated proteins in the brain had undergone a transformation known as “phosphorylation” — a process in which an enzyme adds an extra chemical element to a protein, thus modifying its properties.

The investigators’ conclusion that the enzyme’s activity could be responsible for the disease seemed eminently reasonable: if phosphorylation and protein aggregation go hand in hand, it makes sense that one should cause the other. This is the assumption researchers and pharmaceutical companies made as they tried to reduce phosphorylation by deactivating an enzyme involved in the process. But they had been following a false lead, as the EPFL team was able to show.

The scientists even discovered that the phosphorylation of the protein has positive effects. On the one hand, it considerably reduces the toxic aggregation of the protein, and on the other, it helps the cell eliminate the protein. “The two phenomena are undoubtedly related, and together could play a role in the reduction of alpha-synuclein toxicity, but we don’t yet understand the impact of both processes at each stage of the disease,” explains neurobiologist Abid Oueslati, first author on the study.

Going back to the beginning

To reach this conclusion, the biologists went back to the initial disease conditions. In rat neurons they induced what were thought to be the elements needed to trigger the disease: overexpression of alpha-synuclein together with the enzyme that phosphorylates it (PLK2).

To their surprise, the group of animals subjected to both factors, overproduction of the protein and its phosphorylation, lost nearly 70% fewer neurons than a group in which only the protein was overexpressed. Consequently, they had fewer lesions and fewer Parkinson's symptoms.

“We owe this discovery to unique tools that we developed in collaboration with the Aebischer group to study the effect of this transformation at the molecular level,” explains Hilal Lashuel, who directed the study. “Our study revealed the limitations of the most commonly used approach, which uses genetic mutations to mimic this process.”

Lashuel thinks it is highly probable that phosphorylation of the proteins takes place after they have aggregated, that is, once the disease is already established. Alternatively, it could be a defense mechanism of the neurons, an attempt to slow the progression of the disease from the beginning.

The scientists’ research opens doors for the development of future drug therapies. “The lesson we learned from this research is that everything you find at the scene of a crime is not necessarily involved in the crime. By remaining fixated on that assumption, we may lose sight of the bigger picture.”

Aug 27, 201347 notes
#parkinson's disease #alpha-synuclein #phosphorylation #neuroscience #science
Aug 27, 2013236 notes
#sensorimotor cortex #plasticity #neuroprosthetic limbs #brain activity #neuroscience #science
Language can reveal the invisible

It is natural to imagine that the sense of sight takes in the world as it is — simply passing on what the eyes collect from light reflected by the objects around us.

But the eyes do not work alone. What we see is a function not only of incoming visual information, but also of how that information is interpreted in light of other visual experiences, and it may even be influenced by language.

Words can play a powerful role in what we see, according to a study published this month by UW-Madison cognitive scientist and psychology professor Gary Lupyan, and Emily Ward, a Yale University graduate student, in the journal Proceedings of the National Academy of Sciences.

"Perceptual systems do the best they can with inherently ambiguous inputs by putting them in context of what we know, what we expect," Lupyan says. "Studies like this are helping us show that language is a powerful tool for shaping perceptual systems, acting as a top-down signal to perceptual processes. In the case of vision, what we consciously perceive seems to be deeply shaped by our knowledge and expectations."

And those expectations can be altered with a single word.

To show how deeply words can influence perception, Lupyan and Ward used a technique called continuous flash suppression to render a series of objects invisible for a group of volunteers.

Each person was shown a picture of a familiar object — such as a chair, a pumpkin or a kangaroo — in one eye. At the same time, their other eye saw a series of flashing, “squiggly” lines.

"Essentially, it’s visual noise," Lupyan says. "Because the noise patterns are high-contrast and constantly moving, they dominate, and the input from the other eye is suppressed."

Immediately before looking at the combination of the flashing lines and suppressed object, the study participants heard one of three things: the word for the suppressed object (“pumpkin,” when the object was a pumpkin), the word for a different object (“kangaroo,” when the object was actually a pumpkin), or just static.

Then researchers asked the participants to indicate whether they saw something or not. When the word they heard matched the object that was being wiped out by the visual noise, the subjects were more likely to report that they did indeed see something than in cases where the wrong word or no word at all was paired with the image.

"Hearing the word for the object that was being suppressed boosted that object into their vision," Lupyan says.

And hearing an unmatched word actually hurt study subjects’ chances of seeing an object.

"With the label, you’re expecting pumpkin-shaped things," Lupyan says. "When you get a visual input consistent with that expectation, it boosts it into perception. When you get an incorrect label, it further suppresses that."

Experiments have shown that continuous flash suppression interrupts sight so thoroughly that there are no signals in the brain to suggest the invisible objects are perceived, even implicitly.

"Unless they can tell us they saw it, there’s nothing to suggest the brain was taking it in at all," Lupyan says. "If language affects performance on a test like this, it indicates that language is influencing vision at a pretty early stage. It’s getting really deep into the visual system."

The study demonstrates a deeper connection between language and simple sensory perception than previously thought, and one that makes Lupyan wonder about the extent of language’s power. The influence of language may extend to other senses as well.

"A lot of previous work has focused on vision, and we have neglected to examine the role of knowledge and expectations on other modalities, especially smell and taste," Lupyan says. "What I want to see is whether we can really alter threshold abilities," he says. "Does expecting a particular taste for example, allow you to detect a substance at a lower concentration?"

If you’re drinking a glass of milk, but thinking about orange juice, he says, that may change the way you experience the milk.

"There’s no point in figuring out what some objective taste is," Lupyan says. "What’s important is whether the milk is spoiled or not. If you expect it to be orange juice, and it tastes like orange juice, it’s fine. But if you expected it to be milk, you’d think something was wrong."

Aug 27, 2013178 notes
#language #visual representations #perception #continuous flash suppression #neuroscience #science
Combination of Two Imaging Techniques Allows New Insights into Brain Function

The ability to measure brain function non-invasively is important both for clinical diagnosis and for research in neurology and psychology. Two main imaging techniques are used: positron emission tomography (PET), which reveals metabolic processes in the brain, and magnetic resonance imaging (MRI), which measures the activity of different brain regions on the basis of the cells’ oxygen consumption. A direct comparison of PET and MRI measurements was previously difficult because each had to be performed in a separate machine.

Researchers from the Werner Siemens Imaging Center at the University of Tübingen under the direction of Professor Bernd J. Pichler in collaboration with the Department of Diagnostic and Interventional Radiology, University Hospital Tübingen, and the Tübingen Max Planck Institute for Intelligent Systems have now successfully combined both methods. The researchers are able to explore functional processes in the brain in detail and can better assess what course of action to take. These results were achieved by the use of a PET insert enabling complementary, simultaneous PET/MRI scans. It was developed and built at the University of Tübingen.

The researchers identified, in certain regions, a mismatch between glucose-metabolism-related brain activation measured with PET and oxygenation-related signals measured with MRI. Furthermore, information about functional connectivity in the brain could be derived both from MRI and from dynamic PET data. These results help to further decipher the nature of brain function and are ultimately useful for basic research as well as clinical practice. The study, by lead author Dr. Hans Wehrl of Professor Bernd J. Pichler’s research team, is soon to be published in the journal Nature Medicine.

In PET imaging the distribution of a weakly radioactive substance is shown in cross sections of the body, enabling doctors to see many different metabolic and physiological functions at work. Functional MRI (fMRI) allows researchers to depict changes in blood oxygenation that are associated with brain function. This measurement of functional active brain regions is also important for the planning of brain surgeries, where particular care must be taken in certain areas. The ability to collect different kinds of data from different scans simultaneously represents a major step forward in the fields using these technologies.

Aug 26, 201348 notes
#PET #MRI #brain function #glucose metabolism #oxygenation #neuroscience #science
Aug 26, 2013180 notes
#inhibitory neurons #learning #cognitive functioning #plasticity #visual cortex #neuroscience #science
Study in mice links cocaine use to new brain structures

Mice given cocaine showed rapid growth in new brain structures associated with learning and memory, according to a research team from the Ernest Gallo Clinic and Research Center at UC San Francisco. The findings suggest a way in which drug use may lead to drug-seeking behavior that fosters continued drug use, according to the scientists.

The researchers used a microscope that allowed them to peer directly into nerve cells within the brains of living mice, and within two hours of administering cocaine they found significant increases in the density of dendritic spines – structures that bear the synapses required for signaling – in the animals’ frontal cortex. In contrast, mice given saline solution showed no such increase.

The researchers also found a relationship between the growth of new dendritic spines and drug-associated learning. Specifically, mice that grew the most new spines were those that developed the strongest preference for being in the enclosure where they received cocaine rather than in the enclosure where they received saline. The team published its findings online in Nature Neuroscience on August 25, 2013.

"This gives us a possible mechanism for how drug use fuels further drug-seeking behavior," said principal investigator Linda Wilbrecht, PhD, a Gallo investigator now at UC Berkeley, but who led the research while she was on the UCSF faculty.

"It’s been observed that long-term drug users show decreased function in the frontal cortex in connection with mundane cues or tasks, and increased function in response to drug-related activity or information," Wilbrecht said. "This research suggests how the brains of drug users might shift toward those drug-related associations."

In all living brains there is a baseline level of creation of new spines in response to, or in anticipation of, day-to-day learning, Wilbrecht said. By enhancing this growth, cocaine might be a super-learning stimulus that reinforces learning about the cocaine experience, she said.

The frontal cortex, which Wilbrecht called the “steering wheel” of the brain, controls functions such as long-term planning, decision-making and other behaviors involving higher reasoning and discipline.

The brain cells in the frontal cortex that Wilbrecht and her team studied regulate the output of this brain region, and may play a key role in decision-making. “These neurons, which are directly affected by cocaine use, have the potential to bias decision-making,” she said.

Wilbrecht said the findings could potentially advance research in human addiction “by helping us identify what is going awry in the frontal cortexes of drug-addicted humans, and by explaining how drug-related cues come to dominate the brain’s decision-making processes.”

In the first of a series of experiments, the scientists gave cocaine injections to one group of mice and saline injections to another. The next day, they observed the animals’ brain cells using a 2-photon laser scanning microscope. They were surprised to discover that even after the first dose, the mice treated with cocaine grew more new dendritic spines than the saline-treated mice.

In another experiment, they observed the mice before cocaine or saline treatment and then two hours afterward, and discovered that the animals that received cocaine were developing new dendritic spines within two hours after receiving the drug. Furthermore, the next morning, cocaine-induced spines accounted for almost four times more connections among nerve cells than was observed in saline-treated animals.

In a third experiment, the researchers for a week gave the mice cocaine in one distinctive chamber and saline in another, using identical procedures. Each chamber had its own characteristic visual design, texture and smell to distinguish it from the other chamber. They then let the mice choose which chamber to go to.

"The animals that showed the highest quantity of robust dendritic spines – the spines with the greatest likelihood of developing into synapses – showed the greatest change in preference toward the chamber where they received the cocaine," said Wilbrecht. "This suggests that the new spines might be material for the association that these mice have learned to make between the chamber and the drug."

Wilbrecht noted that the research would not have been possible without live brain imaging via the 2-photon laser scanning microscope, which was developed in 2002. “I grew up at the time of the famous public service campaign that showed a pan of frying eggs with the message, ‘this is your brain on drugs,’” recalled Wilbrecht. “Now, with this microscope, we can actually say, ‘this is a brain cell on drugs.’”

Aug 26, 2013106 notes
#cocaine #frontal cortex #dendritic spines #learning #animal model #neuroscience #science
Aug 26, 2013247 notes
#DNA methylation #reward memory #ventral tegmental area #pleasure #addiction #dopamine #neuroscience #science
Aug 25, 2013123 notes
#nicotine exposure #pregnancy #brain development #animal model #addictive behavior #neuroscience #science
Play
Aug 25, 2013242 notes
#sleep #sleep deprivation #circadian rhythms #memory consolidation #mental health #neuroscience #science
Brain Atrophy Seen in Patients With Diabetes

Brain atrophy rather than cerebrovascular lesions may explain the relationship between type 2 diabetes mellitus (T2DM) and cognitive impairment, according to a study published online Aug. 12 in Diabetes Care.

image

Chris Moran, M.B., B.Ch., from Monash University in Melbourne, Australia, and colleagues analyzed magnetic resonance imaging scans and cognitive tests in 350 participants with T2DM and 363 participants without T2DM. In a blinded fashion, cerebrovascular lesions (infarcts, microbleeds, and white matter hyperintensity [WMH] volume) and atrophy (gray matter, white matter, and hippocampal volumes) were evaluated.

The researchers found that T2DM was associated with significantly more cerebral infarcts and significantly lower total gray, white, and hippocampal volumes, but not with microbleeds or WMH. Gray matter loss was distributed mainly in medial temporal, anterior cingulate, and medial frontal lobe locations in patients with T2DM, while white matter loss was distributed in frontal and temporal regions. Independent of age, sex, education, and vascular risk factors, T2DM was associated with significantly poorer visuospatial construction, planning, visual memory, and speed. When adjusting for hippocampal and total gray volumes, the strength of these associations was cut by almost one-half, but was unchanged with adjustments for cerebrovascular lesions or white matter volume.

"Cortical atrophy in T2DM resembles patterns seen in preclinical Alzheimer’s disease," the authors write. "Neurodegeneration rather than cerebrovascular lesions may play a key role in T2DM-related cognitive impairment."

Aug 24, 2013102 notes
#diabetes #brain atrophy #gray matter #white matter #hippocampal volumes #neuroscience #science
Depressed people have a more accurate perception of time

People with mild depression underestimate their talents. However, new research carried out by researchers at the University of Limerick and the University of Hertfordshire shows that depressed people are more accurate when it comes to time estimation than their happier peers.
image

Depressed people often appear to distort the facts and view their lives more negatively than non-depressed people. Feelings of helplessness, hopelessness and worthlessness and of being out of control are some of the main symptoms of depression. For these people time seems to pass slowly, and they will often use phrases such as “time seems to drag” to describe their experiences and their life. However, depressed people sometimes have a more accurate perception of reality than their happier friends and family, who often look at life through rose-tinted glasses and hope for the best.

Dr Rachel Msetfi, senior lecturer in psychology at the University of Limerick and one of the study’s authors, said: “We found that depressed people tended to be more accurate when estimating time whereas non-depressed people tended to be less accurate. This finding, along with some of our other work, suggests that depression leads to more attention being paid to time passing. Sometimes this might lead to a phenomenon known as ‘depressive realism’, though on other occasions time might seem to be moving more slowly than usual.”
In the study, volunteers classified as mildly depressed or non-depressed estimated the length of time intervals ranging from two to sixty-five seconds. Overall, the mildly depressed volunteers were more accurate in their estimations.
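
One simple way to picture the accuracy comparison is to score each estimate by its absolute error relative to the true interval. The sketch below is purely illustrative: the estimates are invented, and this is not the psychophysical analysis used in the paper.

```python
# Hypothetical duration estimates (seconds) for a true 40-second
# interval; all values are invented for illustration only.
true_interval = 40.0

def mean_abs_error(estimates, truth):
    """Average absolute deviation of the estimates from the true duration."""
    return sum(abs(e - truth) for e in estimates) / len(estimates)

mildly_depressed = [38.0, 41.5, 39.0, 40.5]  # estimates close to the truth
non_depressed = [46.0, 33.0, 50.0, 36.0]     # larger over- and under-shoots

print(mean_abs_error(mildly_depressed, true_interval))  # 1.25
print(mean_abs_error(non_depressed, true_interval))     # 6.75
```

On this toy score, the lower mean error for the mildly depressed group corresponds to the greater accuracy the study reports.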

Dr Msetfi noted: “Time is a very important part of everyday experience; it flies when we are having fun or enjoying ourselves. One of the commonest experiences of depression is that people feel time passes slowly and sometimes painfully. Our findings may help to shed a little light on how people with depression can be treated. People with depression are often encouraged to check themselves against reality, but maybe this timing skill can be harnessed to help in the treatment of mildly depressed people. These findings may also link to successful mindfulness-based treatments for depression, which focus on encouraging present-moment awareness.”
The paper, “Time perception and depressive realism: Judgement type, psychophysical functions and bias”, is published in PLOS ONE.

Aug 24, 20131,368 notes
#time perception #depression #time estimation #psychology #neuroscience #science
Omega-3 reduces ADHD symptoms in rats

A new multidisciplinary study shows a clear connection between the intake of omega-3 fatty acids and a decline in ADHD symptoms in rats.

image

Researchers at the University of Oslo have observed the behaviour of rats and analyzed biochemical processes in their brains. The results show a clear improvement in ADHD-related behaviour with supplements of omega-3 fatty acids, as well as a faster turnover of the signalling substances dopamine, serotonin and glutamate in the nervous system. There are, however, clear sex differences: omega-3 fatty acids have a stronger effect in male rats than in females.

Unknown biology behind ADHD

Currently the psychiatric diagnosis ADHD (Attention Deficit/Hyperactivity Disorder) is purely based on behavioural criteria, while the molecular genetic background for the illness is largely unknown. The new findings indicate that ADHD has a biological component and that the intake of omega-3 may influence ADHD symptoms.

“In some research environments it is controversial to suggest that ADHD has something to do with biology. But we have without a doubt found molecular changes in the brain after rats with ADHD were given omega-3,” says Ivar Walaas, Professor of Biochemistry.

The fact that omega-3 can reduce ADHD behaviour in rats has also been indicated in previous international studies. What is unique about the study in question is a multidisciplinarity that has not previously been seen, with contributions from behavioural science in medicine as well as from psychology, nutritional science and biochemistry.

Hyperactive rats

The rats used in the study are called SHR rats – spontaneously hypertensive rats. Although they are otherwise ordinary rats, random mutations in their genes have produced damage that causes high blood pressure. It is therefore first and foremost blood-pressure researchers who have so far taken an interest in these rats.

However, the rats do not suffer from high blood pressure until they have reached puberty. Before that age they present totally different symptoms – namely hyperactivity, poor ability to concentrate and impulsiveness. It is exactly these three criteria that form the basis for making the ADHD diagnosis in humans. The animals also react to Ritalin, the central nervous system stimulant, in the same way as humans with ADHD: the hyperactive responses are stabilized. SHR rats are therefore increasingly used in research as a model for ADHD.

Supplements as early as the foetal stage

Researchers believe that omega-3 can have an effect from the very beginning of life. Omega-3 was therefore added to the food given to mother rats before they were impregnated, and this continued throughout their entire pregnancy and while they fed their young. The baby rats were also given omega-3 in their own food after they were separated from their mother at the age of 20 days. Another group of mother rats was given food without added omega-3, creating a control group of SHR offspring that had not received these fatty acids at the foetal stage or later.

The researchers started to analyze the behaviour of the offspring some days after they were separated from the mother. They studied behaviour driven by reward as well as spontaneous behaviour. Substantial differences were noted for both types of behaviour between the rats that had been given the omega-3 supplement as foetuses and as baby rats and those that had not.

Rewards made male rats more concentrated

In the reward-driven task, the rats were given access to a drop of water each time they pressed an illuminated button. The ADHD rats that had not been given omega-3 could not concentrate on pressing the button, whereas the rats that had been raised on omega-3 easily held their concentration for the seconds this takes and were able to enjoy a delicious drop of water as a reward.

Surprisingly enough, it was only male rats that showed an improvement in reward-driven behaviour. However, with regard to the rats’ spontaneous behavior, the same type of reduction in hyperactivity and attention difficulties was noted in both male and female rats that had been given the omega-3 supplement.

Changes in brain chemistry

Professor Walaas and his research group became involved in the study at this point in order to analyze the molecular processes in the rats’ brains.

The group analyzed the levels of the brain’s chemical messengers, the neurotransmitters that transfer nerve impulses from one nerve cell to another. The researchers measured how much of neurotransmitters such as dopamine, serotonin and glutamate was released and broken down within the nerve fibres. A key player in this work was Kine S. Dervola, PhD candidate, who reports clear sex differences in the turnover of the neurotransmitters – just as there had been in the reward-driven behaviour.

“We saw that the turnover of dopamine and serotonin took place much faster among the male rats that had been given omega-3 than among those that had not. For serotonin the turnover ratio was three times higher, and for dopamine it was just over two and a half times higher. These effects were not observed among the female rats. When we measured the turnover of glutamate, however, we saw that both sexes showed a small increase in turnover,” Ms Dervola tells us.
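
Turnover of a neurotransmitter is commonly estimated as the ratio of a metabolite to its parent transmitter (for example, 5-HIAA to 5-HT for serotonin). As a minimal sketch of how such a fold difference is computed, with entirely invented concentrations rather than the study's data:

```python
# All concentrations below are hypothetical (e.g., ng per mg protein),
# invented purely to illustrate the ratio arithmetic.
def turnover_ratio(metabolite, transmitter):
    """Metabolite-to-transmitter ratio, a common proxy for turnover."""
    return metabolite / transmitter

# Serotonin turnover (5-HIAA / 5-HT), omega-3 males vs. control males
omega3 = turnover_ratio(6.0, 2.0)    # 3.0
control = turnover_ratio(2.0, 2.0)   # 1.0
print(f"fold difference: {omega3 / control:.1f}x")  # fold difference: 3.0x
```

A threefold difference in this ratio, as in the serotonin result quoted above, means the metabolite accumulates three times faster relative to the transmitter in the supplemented animals.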

Transferrable to humans?

The researchers are cautious about drawing conclusions as to whether the results can be transferred to humans.

“In the first place there is of course a difference between rats and humans, and secondly the rats are sick at the outset. Thirdly the causes of ADHD in humans are in no way mapped sufficiently well. But the end result of what takes place in the brains of both rats and humans with ADHD is hyperactivity, poor ability to concentrate and impulsiveness,” says Professor Walaas, and concludes:

“Giving priority to basic research like this will greatly increase our detailed knowledge of ADHD.”

Reference:

Dervola, Kine-Susann Noren; Roberg, Bjørg Åse; Wøien, Grete; Bogen, Inger Lise; Sandvik, Torbjørn; Sagvolden, Terje; Drevon, Christian A.; Johansen, Espen B.; Walaas, Sven Ivar (2012). Marine omega-3 polyunsaturated fatty acids induce sex-specific changes in reinforcer-controlled behavior and neurotransmitter metabolism in a spontaneously hypertensive rat model of ADHD. Behavioral and Brain Functions, 8(56). ISSN 1744-9081.

Aug 24, 2013257 notes
#omega-3 #animal model #ADHD #blood pressure #neurotransmitters #neuroscience #science
Receptor may aid spread of Alzheimer’s and Parkinson’s in brain

Scientists at Washington University School of Medicine in St. Louis have found a way that corrupted, disease-causing proteins spread in the brain, potentially contributing to Alzheimer’s disease, Parkinson’s disease and other brain-damaging disorders.

image

Image: An electron micrograph shows clumps of corrupted tau protein outside a nerve cell. Scientists have identified a receptor that lets these clumps into the cell, where the corruption can spread. Blocking this receptor with drugs may help treat Alzheimer’s, Parkinson’s and other disorders.

The research identifies a specific type of receptor and suggests that blocking it may aid treatment of these illnesses. The receptors are called heparan sulfate proteoglycans (HSPGs).

“Many of the enzymes that create HSPGs or otherwise help them function are good targets for drug treatments,” said senior author Marc I. Diamond, MD, the David Clayson Professor of Neurology. “We ultimately should be able to hit these enzymes with drugs and potentially disrupt several neurodegenerative conditions.”

The study is available online in the Proceedings of the National Academy of Sciences.

Over the last decade, Diamond has gathered evidence that Alzheimer’s disease and other neurodegenerative diseases spread through the brain in a fashion similar to conditions such as mad cow disease, which are caused by misfolded proteins known as prions.

Proteins are long chains of amino acids that perform many basic biological functions. A protein’s abilities are partially determined by the way it folds into a 3-D shape. Prions are proteins that have become folded in a fashion that makes them harmful.

Prions spread across the brain by causing other copies of the same protein to misfold.

Among the most infamous prion diseases are mad cow disease, which rapidly destroys the brain in cows, and a similar, inherited condition in humans called Creutzfeldt-Jakob disease.

Diamond and his colleagues have shown that a part of nerve cells’ inner structure known as tau protein can misfold into a configuration called an amyloid. These corrupted versions of tau stick to each other in clumps within the cells. Like prions, the clumps spread from one cell to another, seeding further spread by causing copies of tau protein in the new cell to become amyloids.

In the new study, first author Brandon Holmes, an MD/PhD student, showed that HSPGs are essential for binding, internalizing and spreading clumps of tau. When he genetically disabled or chemically modified the HSPGs in cell cultures and in a mouse model, clumps of tau could not enter cells, thus inhibiting the spread of misfolded tau from cell to cell.

Holmes also found that HSPGs are essential for the cell-to-cell spread of corrupted forms of alpha-synuclein, a protein linked to Parkinson’s disease.

“This suggests that it may one day be possible to unify our understanding and treatment of two or more broad classes of neurodegenerative disease,” Diamond said. 

“We’re now sorting through about 15 genes to determine which are the most essential for HSPGs’ interaction with tau,” Holmes said. “That will tell us which proteins to target with new drug treatments.”

Aug 23, 2013 · 92 notes
#heparan sulfate proteoglycans #receptors #neurodegenerative diseases #prions #nerve cells #neuroscience #science
First to measure the concerted activity of a neuronal circuit

Neurobiologists from the Friedrich Miescher Institute for Biomedical Research have been the first to measure the concerted activity of a neuronal circuit in the retina as it extracts information about a moving object. With their novel and powerful approach they can now not only visualize networks of neurons but also measure their function. These insights are urgently needed for a better understanding of the processes in the brain in health and disease.


For many decades, electrophysiology and genetics have been the main tools used to study individual neurons in the central nervous system in order to understand perception and behavior. In the last five years, however, neurobiology has been riding a wave of technological advances that have brought unprecedented insights: optogenetics and genetically encoded activity sensors have allowed scientists to control and measure the activity of clearly defined neurons, and the application of rabies viruses has enabled the visualization of networks of interconnected nerve cells. What was still missing was the link between a neural circuit and the monitoring of its activity.

Scientists from the Friedrich Miescher Institute for Biomedical Research have now been the first to measure the concerted activity of a neuronal circuit in the retina as it extracts information about the movement of an object.

In a world defined through eyesight, it is crucial to be able to discern whether something moves towards us, moves away or moves next to us. It comes as no surprise then that in the retina several parallel neuronal circuits are reserved for the extraction of information about movement and that most of them are dedicated to the analysis of the direction of motion.

As they report online in Neuron, Keisuke Yonehara and Karl Farrow, two Postdoctoral Fellows in Botond Roska’s team at the FMI, have now been able to monitor the activity of all circuit elements in a motion sensitive retinal circuit at once, and pinpoint the site, at a subcellular level, where the information about the direction of the movement becomes encoded. To achieve this, they used genetically altered rabies viruses expressing calcium sensors developed by the laboratory of Klaus Conzelmann in Munich. The special property of rabies viruses is that they move across connected neurons and therefore are able to deliver the sensors to all circuit elements within a defined neuronal circuit. Simultaneous two-photon imaging allowed them then to monitor activity in every part of the neuronal circuit at once, even in subcellular compartments, such as axons, synapses and dendrites.

"We are extremely thrilled that with this new method, which combines the power of genetically altered rabies viruses with very powerful two-photon microscopy, we are now able to link circuit architecture with activity and ultimately function," comments Yonehara. "We have illustrated the power of the method for a better understanding of the perception of movement and are convinced that the method will allow us to reach a better understanding of many processes in the retina and in other parts of the brain."

Aug 23, 2013 · 65 notes
#optogenetics #neural activity #retina #retinal circuit #nerve cells #neuroscience #science
Aug 23, 2013 · 107 notes
#vascular dementia #memory #art #neurodegenerative diseases #Mary Hecht #neuroscience #science
Brain size may signal risk of developing an eating disorder

New research indicates that teens with anorexia nervosa have bigger brains than teens who do not have the eating disorder. That is according to a study by researchers at the University of Colorado School of Medicine that compared a group of adolescents with anorexia nervosa to a group without it. They found that girls with anorexia nervosa had a larger insula, a part of the brain that is active when we taste food, and a larger orbitofrontal cortex, a part of the brain that tells a person when to stop eating.

Guido Frank, MD, assistant professor of psychiatry and neuroscience at the CU School of Medicine, and his colleagues report that the bigger brain may be the reason people with anorexia are able to starve themselves. Similar results in children with anorexia nervosa and in adults who had recovered from the disease raise the possibility that insula and orbitofrontal cortex size could predispose a person to develop eating disorders.

"While eating disorders are often triggered by the environment, there are most likely biological mechanisms that have to come together for an individual to develop an eating disorder such as anorexia nervosa," Frank says.

The researchers recruited 19 adolescent girls with anorexia nervosa and 22 girls for a control group, and used magnetic resonance imaging (MRI) to measure brain volumes. Individuals with anorexia nervosa showed greater left orbitofrontal, right insular, and bilateral temporal cortex gray matter compared to the control group. In individuals with anorexia nervosa, orbitofrontal gray matter volume correlated negatively with responses to sweet tastes. An additional comparison of this study group with adults with anorexia nervosa and a healthy control group supported greater orbitofrontal cortex and insula volumes in the disorder across this age group as well.
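A group-volume finding like this typically rests on an independent-samples comparison of regional gray matter between the two groups. As a minimal sketch of that kind of analysis, here is Welch's t statistic computed on hypothetical insula volumes (the numbers below are illustrative, not the study's data):

```python
# Illustrative only: a two-group comparison of regional brain volumes,
# the kind of test behind a "larger insula" result. All values hypothetical.
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    na, nb = len(a), len(b)
    va, vb = stdev(a) ** 2, stdev(b) ** 2  # sample variances
    return (mean(a) - mean(b)) / ((va / na + vb / nb) ** 0.5)

# Hypothetical right-insula gray matter volumes in cm^3
anorexia = [8.1, 8.4, 7.9, 8.6, 8.3]
control  = [7.2, 7.5, 7.1, 7.6, 7.4]

t = welch_t(anorexia, control)
print(round(t, 2))  # → 5.91 (a large positive t: the anorexia group is larger)
```

Welch's variant is the usual choice here because it does not assume the two groups have equal variance or equal size.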

The medial orbitofrontal cortex has been associated with signaling when we feel satiated by a certain type of food (so-called "sensory-specific satiety"). This study suggests that larger volume in this brain area could be a trait across eating disorders that prompts these individuals to stop eating sooner than healthy individuals do, before they have eaten enough.

The right insula processes taste and also integrates body perception, which could contribute to the perception of being fat despite being underweight.

This study complements another, previously published in the American Journal of Psychiatry, which found that adults with anorexia and individuals who had recovered from the illness also showed differences in brain size.

Aug 23, 2013 · 127 notes
#eating disorders #anorexia nervosa #brain size #orbitofrontal cortex #adolescents
Aug 23, 2013 · 255 notes
#empathy #social cognition #brain activity #interpersonal relationships #psychology #neuroscience #science
Aug 22, 2013 · 49 notes
#gustatory receptors #fruit flies #taste #odorant-binding protein #neuroscience #science
Aug 22, 2013 · 317 notes
#science #amygdala #anxiety #hippocampus #PTSD #mental health #psychology #neuroscience
Mood is Influenced by Immune Cells Called to the Brain in Response to Stress

New research shows that, in a dynamic mind-body interaction, cells from the immune system are recruited to the brain in response to prolonged stress, where they promote symptoms of anxiety.

The findings, in a mouse model, offer a new explanation of how stress can lead to mood disorders and identify a subset of immune cells, called monocytes, that could be targeted by drugs for treatment of mood disorders.

The Ohio State University research also reveals new ways of thinking about the cellular mechanisms behind the effects of stress, identifying two-way communication from the central nervous system to the periphery – the rest of the body – and back to the central nervous system that ultimately influences behavior.

Unlike an infection, trauma or other problems that attract immune cells to the site of trouble in the body, this recruitment of monocytes that can promote inflammation doesn’t damage the brain’s tissue – but it does lead to symptoms of anxiety.

The research showed that the brain under prolonged stress sends signals out to the bone marrow, calling up monocytes. The cells travel to specific regions of the brain and generate inflammation that causes anxiety-like behavior.

In experiments conducted in mice, the research showed that repeated stress exposure caused the highest concentrations of monocytes to migrate to the brain. The cells surrounded blood vessels and penetrated brain tissue in several areas linked to fear and anxiety, including the prefrontal cortex, amygdala and hippocampus, and their presence led to anxiety-like behavior in the mice.

“In the absence of tissue damage, we have cells migrating to the brain in response to the region of the brain that is activated by the stressor,” said John Sheridan, senior author of the study, professor of oral biology and associate director of Ohio State’s Institute for Behavioral Medicine Research (IBMR).

“In this case, the cells are recruited to the brain by signals generated by the animal’s interpretation of social defeat as stressful.”

The research appears in the Aug. 21, 2013, issue of The Journal of Neuroscience.

Mice in this study were subjected to stress that might resemble a person’s response to persistent life stressors. In this model of stress, male mice living together are given time to establish a hierarchy, and then an aggressive male is added to the group for two hours. This elicits a “fight or flight” response in the resident mice as they are repeatedly defeated. The experience of social defeat leads to submissive behaviors and the development of anxiety-like behavior.

Mice subjected to zero, one, three or six cycles of this social defeat were then tested for anxiety symptoms. The more cycles of social defeat, the higher the anxiety symptoms; mice took longer to enter an open space and opted for darkness rather than light when given the choice. Anxiety symptoms corresponded to higher levels of monocytes that had traveled to the animals’ brains from the blood.

Additional experiments showed that these cells did not originate in the brain, but traveled there from the bone marrow. In previous studies, this same research group showed that cells in the brain called microglia, the brain’s first line of immune defense, are activated by prolonged stress and are partly responsible for the signals that call up monocytes from the bone marrow.

“There are different moving parts from the central and peripheral components, and what’s novel is them coming together to influence behavior,” said Jonathan Godbout, a senior co-author of the paper and an associate professor of neuroscience at Ohio State.

Exactly what happens at this point in the brain remains unknown, but the research offers clues. The monocytes that travel to the brain don’t respond to natural anti-inflammatory steroids in the body and have characteristics signifying they are in a more inflammatory state. These results indicate that inflammatory gene expression occurs in the brain in response to the stressor.

“The monocytes are coming out of the bone marrow and they are not responsive to steroid regulation, so they overproduce proinflammatory signals when they’re stimulated. We think this is the key to the prolonged anxiety-like disorders that we see in these animals,” Sheridan said.

These findings do not apply to all forms of anxiety, the scientists noted, but they are a game-changer in research on stress-related mood disorders.

“Our data alter the idea of the neurobiology of mood disorders,” said Eric Wohleb, first author of the study and a predoctoral fellow in Ohio State’s Neuroscience Graduate Studies Program. “These findings indicate that a bidirectional system rather than traditional neurotransmitter pathways may regulate some forms of anxiety responses. We’re saying something outside the central nervous system – something from the immune system – is having a profound effect on behavior.”

Aug 22, 2013 · 173 notes
#stress #anxiety #immune system #animal model #neuroscience #science
Aug 22, 2013 · 135 notes
#insula #frontal cortex #schizophrenia #neuroimaging #neuroscience #psychology #science
Playing video games can boost brain power

Certain types of video games can help to train the brain to become more agile and improve strategic thinking, according to scientists from Queen Mary University of London and University College London (UCL).


The researchers recruited 72 volunteers and measured their 'cognitive flexibility', defined as a person's ability to adapt, switch between tasks, and think about multiple ideas at a given time to solve problems.

Two groups of volunteers were trained to play different versions of StarCraft, a fast-paced real-time strategy game in which players construct and organise armies to battle an enemy. A third group played The Sims, a life simulation video game that does not require much memory or many tactics.

All the volunteers played the video games for 40 hours over six to eight weeks, and were subjected to a variety of psychological tests before and after. All the participants happened to be female as the study was unable to recruit a sufficient number of male volunteers who played video games for less than two hours a week.

The researchers discovered that those who played StarCraft were quicker and more accurate in performing cognitive flexibility tasks than those who played The Sims.

Dr Brian Glass from Queen Mary’s School of Biological and Chemical Sciences, said: “Previous research has demonstrated that action video games, such as Halo, can speed up decision making but the current work finds that real-time strategy games can promote our ability to think on the fly and learn from past mistakes.

“Our paper shows that cognitive flexibility, a cornerstone of human intelligence, is not a static trait but can be trained and improved using fun learning tools like gaming.”

Professor Brad Love from UCL, said:  “Cognitive flexibility varies across people and at different ages. For example, a fictional character like Sherlock Holmes has the ability to simultaneously engage in multiple aspects of thought and mentally shift in response to changing goals and environmental conditions.

“Creative problem solving and ‘thinking outside the box’ require cognitive flexibility. Perhaps in contrast to the repetitive nature of work in past centuries, the modern knowledge economy places a premium on cognitive flexibility.”

Dr Glass added: “The volunteers who played the most complex version of the video game performed the best in the post-game psychological tests. We need to understand now what exactly about these games is leading to these changes, and whether these cognitive boosts are permanent or if they dwindle over time. Once we have that understanding, it could become possible to develop clinical interventions for symptoms related to attention deficit hyperactivity disorder or traumatic brain injuries, for example.”

Aug 22, 2013 · 222 notes
#video games #cognition #technology #neuroscience #science
Researchers Identify Conditions Most Likely to Kill Encephalitis Patients

People with severe encephalitis — inflammation of the brain — are much more likely to die if they develop severe swelling in the brain, intractable seizures or low blood platelet counts, regardless of the cause of their illness, according to new Johns Hopkins research.

The Johns Hopkins investigators say the findings suggest that if physicians are on the lookout for these potentially reversible conditions and treat them aggressively at the first sign of trouble, patients are more likely to survive.

“The factors most associated with death in these patients are things that we know how to treat,” says Arun Venkatesan, M.D., Ph.D., an assistant professor of neurology at the Johns Hopkins University School of Medicine and leader of the study published in the Aug. 27 issue of the journal Neurology.

Experts consider encephalitis something of a mystery, and its origins and progress unpredictable. While encephalitis may be caused by a virus, bacteria or autoimmune disease, a precise cause remains unknown in 50 percent of cases. Symptoms range from fever, headache and confusion in some, to seizures, severe weakness or language disability in others. The most complex cases can land patients in intensive care units, on ventilators, for months. Drugs like the antiviral acyclovir are available for herpes encephalitis, which occurs in up to 15 percent of cases, but for most cases, doctors have only steroids and immunosuppressant drugs, which carry serious side effects.

“Encephalitis is really a syndrome with many potential causes, rather than a single disease, making it difficult to study,” says Venkatesan, director of the Johns Hopkins Encephalitis Center.

In an effort to better predict outcomes for his patients, Venkatesan and his colleagues reviewed records of all 487 patients with acute encephalitis admitted to The Johns Hopkins Hospital and Johns Hopkins Bayview Medical Center between January 1997 and July 2011. They focused further attention on patients who spent at least 48 hours in the ICU during their hospital stays and who were over the age of 16. Of those 103 patients, 19 died. Patients who had severe swelling in the brain were 18 times more likely to die, while those with continuous seizures were eight times more likely to die. Those with low counts in blood platelets, the cells responsible for clotting, were more than six times more likely to die than those without this condition.
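Figures like "18 times more likely to die" are typically odds ratios derived from a 2×2 table of risk factor versus outcome (often adjusted in a logistic regression). As a minimal sketch of where a crude odds ratio comes from, using hypothetical counts rather than the study's actual data:

```python
# Illustrative only: how a crude odds ratio is computed from a 2x2
# exposure-by-outcome table. The counts below are hypothetical,
# not taken from the Johns Hopkins study.

def odds_ratio(exposed_died, exposed_survived, unexposed_died, unexposed_survived):
    """Odds ratio: odds of death with the risk factor divided by odds without it."""
    return (exposed_died / exposed_survived) / (unexposed_died / unexposed_survived)

# Hypothetical: 12 of 20 patients with severe brain swelling died,
# versus 7 of 83 patients without it.
or_edema = odds_ratio(12, 8, 7, 76)
print(round(or_edema, 1))  # → 16.3, i.e. roughly "16 times the odds of death"
```

In practice, studies of this kind report the ratio with a confidence interval and adjust for other predictors, but the 2×2 table is the core of the arithmetic.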

The findings can help physicians know which conditions should be closely monitored and when the most aggressive treatments — some of which can come with serious side effects — should be tried, the researchers say. For example, it may be wise to more frequently image the brains of these patients to check for increased brain swelling and the pressure buildup that accompanies it.

Venkatesan says patients with cerebral edema may do better if intracranial pressure is monitored continuously and treated aggressively. He cautioned that although his research suggests such a course, further studies are needed to determine if it leads to better outcomes for patients.

Similarly, he says research has yet to determine whether aggressively treating seizures and low platelet counts also decreases mortality.

Venkatesan and his colleagues are also developing better guidelines for diagnosing encephalitis more quickly so as to minimize brain damage. Depending on where in the brain the inflammation is, he says, the illness can mimic other diseases, making diagnosis more difficult.

Another of the study’s co-authors, Romergryko G. Geocadin, M.D., an associate professor of neurology who co-directs the encephalitis center and specializes in neurocritical care, says encephalitis patients in the ICU are “the sickest of the sick,” and he fears that sometimes doctors give up on the possibility of them getting better.

“This research should give families — and physicians — hope that, despite how bad it is, it may be reversible,” he says.

Aug 21, 2013 · 42 notes
#brain #encephalitis #cerebral edema #neurology #neuroscience #science
Aug 21, 2013 · 105 notes
#learning #motor learning #sleep #neuroimaging #neuroscience #science