Neuroscience


Sep 11, 2013 · 111 notes
#memory formation #acetylcholine #nucleus basalis #neurons #plasticity #neuroscience #science
Western scientists discover a novel opiate addiction switch in the brain

Neuroscientists at Western University (London, Canada) have made a remarkable new discovery revealing the underlying molecular process by which opiate addiction develops in the brain. Opiate addiction is largely controlled by the formation of powerful reward memories that link the pleasurable effects of opiate-class drugs to environmental triggers that induce drug craving in individuals addicted to opiates. The research is published in the September 11th issue of The Journal of Neuroscience.


The Addiction Research Group led by Steven Laviolette of the Schulich School of Medicine & Dentistry identified how exposure to heroin induces a specific switch in a memory molecule in a region of the brain called the basolateral amygdala, which plays a critical role in controlling memories related to opiate addiction, withdrawal, and relapse. Using a rodent model of opiate addiction, Laviolette’s team found that the process of opiate addiction and withdrawal triggered a switch between two molecular pathways in the amygdala controlling how opiate addiction memories were formed. In the non-dependent state, they found that a molecule called extracellular signal-regulated kinase, or “ERK”, was recruited for early-stage addiction memories. However, once opiate addiction had developed, the scientists observed a functional switch to a separate molecular memory pathway, controlled by a molecule called calmodulin-dependent kinase II, or “CaMKII”.

“These findings will shed important new light on how the brain is altered by opiate drugs and provide exciting new targets for the development of novel pharmacotherapeutic treatments for individuals suffering from chronic opiate addiction,” says Laviolette, an associate professor in the Departments of Anatomy & Cell Biology, Psychiatry, and Psychology.

Sep 11, 2013 · 115 notes
#addiction #opiate addiction #basolateral amygdala #extracellular signal-related kinase #memory #neuroscience #science
Early-onset Parkinson’s disease linked to genetic deletion

Scientists at the Centre for Addiction and Mental Health (CAMH) and University Health Network (UHN) have found a new link between early-onset Parkinson’s disease and a piece of DNA missing from chromosome 22. The findings help shed new light on the molecular changes that lead to Parkinson’s disease.

The study appears online today in JAMA Neurology.

Among people aged 35 to 64 who were missing DNA from a specific part of chromosome 22, the research team found a marked increase in the number of cases of Parkinson’s disease, compared to expected rates of Parkinson’s disease in the general population from the same age group.

The deletion, which occurs when a person is born with about 50 genes missing on one chromosome 22, is associated with 22q11.2 deletion syndrome. People with this condition may have heart or other birth defects, learning or speech difficulties, and some develop schizophrenia. It occurs in an estimated 1 in 2,000 to 4,000 births, but is believed to be under-diagnosed.
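The prevalence estimate above translates directly into expected case counts. A quick sketch of the arithmetic, where the annual birth count is hypothetical and chosen only for illustration:

```python
# Rough arithmetic on the reported prevalence of 22q11.2 deletion
# syndrome: an estimated 1 in 2,000 to 4,000 births.
# The birth count below is hypothetical, for illustration only.

births_per_year = 400_000  # hypothetical annual births in a region

low_rate = 1 / 4000   # lower bound of the estimate
high_rate = 1 / 2000  # upper bound of the estimate

expected_low = births_per_year * low_rate
expected_high = births_per_year * high_rate

print(f"Expected new cases per year: {expected_low:.0f} to {expected_high:.0f}")
```

Because the syndrome is believed to be under-diagnosed, real registries would likely record fewer cases than this calculation predicts.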

“22q11.2 deletion syndrome has been fairly well studied in childhood and adolescence, but less is known about its effects as people age,” said Dr. Anne Bassett, Director of CAMH’s Clinical Genetics Research Program and Director of the Dalglish Family Hearts and Minds Clinic at UHN, the world’s first clinic dedicated to adults with 22q11.2 deletion syndrome. A few cases of patients with the syndrome who had Parkinson’s disease symptoms had been previously reported, which suggested that the two conditions might be linked.

Parkinson’s disease is one of the most common neurodegenerative disorders worldwide, typically affecting people over the age of 65. Earlier onset of Parkinson’s disease, before age 50, is rare and has been associated with several other genetic changes that are not on chromosome 22.

The researchers studied 159 adults with 22q11.2 deletion syndrome to discover how many had been clinically diagnosed with Parkinson’s disease. For three individuals with the deletion and Parkinson’s disease who were deceased, brain tissue was also examined.

“Through a post-mortem examination, we were able to show that all three patients had a loss of neurons typical of that seen in Parkinson’s disease. The examination also helped to show that the symptoms of Parkinson’s disease were not related to side effects of the medications commonly used to treat schizophrenia,” added Dr. Rasmus Kiehl, a neuropathologist in UHN’s Laboratory Medicine Program, who co-authored the report with CAMH graduate student Nancy Butcher. The team also found that Parkinson’s disease in 22q11.2 deletion syndrome is associated with abnormal protein accumulations called Lewy bodies in the brain in some, but not all, cases, just as in another genetic form of Parkinson’s disease.

The findings highlight the complexity of clinical care when both Parkinson’s disease and 22q11.2 deletion syndrome are present. “Our results may inform best practices in the clinic in these cases,” said Dr. Bassett, Senior Scientist in CAMH’s Campbell Family Mental Health Research Institute.

Because patients with 22q11.2DS who have schizophrenia are often prescribed anti-psychotic medications, they may experience side-effects such as tremors and muscle stiffness, similar to symptoms of Parkinson’s disease.

As a result, the researchers found that anti-psychotic use delayed the diagnosis of Parkinson’s disease – and the opportunity for treatment – by up to 10 years.

For people with early-onset Parkinson’s disease, who also have other features that could indicate 22q11.2 deletion syndrome, clinical genetic testing for the deletion on chromosome 22 should be considered, the researchers suggest.

“Our discovery that the 22q11.2 deletion syndrome is associated with Parkinson’s disease is very exciting,” said Dr. Anthony Lang, Director of the Movement Disorders Program at the Krembil Neuroscience Centre of Toronto Western Hospital. “The varying pathology that we found is reminiscent of certain other genetic causes of Parkinson’s disease, and opens new directions to search for novel genes that could cause its more common form. Studies of patients with 22q11.2 deletion syndrome before they ever develop clinical features of Parkinson’s disease may not only provide important information on the effectiveness of screening methods for early detection of the disease, but also allow for future ‘neuroprotective treatments’ to be introduced at the ultimate time when they can have a chance to make an important impact on preventing the disease or slowing its course.” 

“Most people with 22q11.2 deletion syndrome will not develop Parkinson’s disease,” emphasizes Dr. Bassett. “But it does occur at a rate higher than in the general population. We will now be on the look-out for this so we can provide the best care for patients.”

Sep 11, 2013 · 62 notes
#parkinson's disease #chromosome 22 #22q11.2 deletion syndrome #genetics #neuroscience #science
Therapy Slows Onset and Progression of Lou Gehrig’s Disease

Studies of a therapy designed to treat amyotrophic lateral sclerosis (ALS) suggest that the treatment dramatically slows onset and progression of the deadly disease, one of the most common neuromuscular disorders in the world. The researchers, led by teams from The Research Institute at Nationwide Children’s Hospital and the Ludwig Institute at the University of California, San Diego, found a survival increase of up to 39 percent in animal models with a one-time treatment, a crucial step toward moving the therapy into human clinical trials.

The therapy reduces expression of a gene called SOD1, which in some cases of familial ALS has a mutation that weakens and kills nerve cells called motor neurons that control muscle movement. While many drug studies involve only one type of animal model, this effort included analysis in two different models treated before and after disease onset. The in-depth study could vault the drug into human clinical trials, said Brian Kaspar, PhD, a principal investigator in the Center for Gene Therapy at Nationwide Children’s and a senior author on the research, which was published online Sept. 6 in Molecular Therapy.

“We designed these rigorous studies using two different models of the disease with the experimenters blinded to the treatment and in two separate laboratories,” said Dr. Kaspar, who collaborated on the study with a team led by Don Cleveland, PhD, at the University of California, San Diego. “We were very pleased with the results, and found that the delivery approach was successful in a larger species, enabling us to initiate a clinical translational plan for this horrible disease.”

There currently is no cure for ALS, also called Lou Gehrig’s disease. The Centers for Disease Control and Prevention estimates there are about 5,000 new cases in the U.S. each year, mostly in people age 50 to 60. Although the exact cause of ALS is unknown, more than 170 mutations in the SOD1 gene have been found in many patients with familial ALS, which accounts for about 2 percent of all cases.

SOD1 provides instructions for making an enzyme called superoxide dismutase, which is found throughout the body and breaks down toxic molecules that can be damaging to cells. When mutated, the SOD1 gene yields a faulty version of the enzyme that is especially harmful to motor neurons. One of the mutations, which is found in about half of all familial ALS patients, is particularly devastating, with death usually coming within 18 months of diagnosis. SOD1 has also been implicated in other types of ALS, called sporadic ALS, which means the therapy could prove beneficial for larger numbers of patients suffering with this disease.

Earlier work by Dr. Kaspar and others found that they could halt production of the mutated enzyme by blocking SOD1 expression, which in turn, they suspected, would slow ALS progression. To test this hypothesis, the researchers needed not only to come up with an approach that would block the gene, but also to figure out how to specifically target the cells implicated in the disease, which include motor neurons and glial cells. What’s more, the therapy would ideally be administered noninvasively rather than delivered directly through burr holes drilled into the skull.

Dr. Kaspar’s team accomplished the second part of this challenge in 2009, when they discovered that adeno-associated virus serotype 9 (AAV9) could cross the blood-brain barrier, making it an ideal transport system for delivering genes and RNA interference strategies designed to treat disease.

In this new work, funded by the National Institutes of Health, the researchers blocked human SOD1, using a technology known as short hairpin RNA, or shRNA. These single strands of RNA are designed in the lab to seek out specific sequences found in the human SOD1 gene, latch onto them and block gene expression.
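The core of an shRNA design is base-pairing: the guide strand is the reverse complement of the targeted stretch of mRNA. A toy sketch of that step (the 19-nucleotide sequence below is invented for illustration; it is not the actual SOD1 target site used in the study):

```python
# Toy illustration of shRNA targeting: the guide strand is the
# reverse complement of the targeted mRNA sequence, so it can
# base-pair with the transcript and trigger silencing.
# The target sequence below is invented; it is NOT the study's
# actual SOD1 site.

RNA_COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def guide_strand(target_mrna: str) -> str:
    """Return the reverse complement of an mRNA target sequence."""
    return "".join(RNA_COMPLEMENT[base] for base in reversed(target_mrna))

target = "GGAUGAAGAGAGGCAUGUU"  # hypothetical 19-nt target
print(guide_strand(target))
```

Real designs add further constraints (GC content, avoiding off-target matches elsewhere in the transcriptome), which this sketch omits.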

In one of the mouse models used in the study, ALS develops earlier and advances more quickly. In the other, the disease develops later and progresses more slowly. All of the mice received a single injection of AAV9-SOD1-shRNA before or after disease onset.

Results showed that in the rapid-disease-progressing model, mice treated before disease onset saw a 39 percent increase in survival compared to control-treated mice. Strikingly, in mice treated at 21 days of age, disease progression was slowed by 66 percent. Perhaps more surprising was the finding that even after symptoms surfaced in these models, treatment still resulted in a 23 percent increase in survival and a 36 percent reduction in disease progression. In the slower-disease-onset model, treatment extended survival by 22 percent and delayed disease progression by 38 percent.

“The extension of survival is fantastic, and the fact that we delayed disease progression in both models when treated at disease onset is what drives our excitement to advance this work to human clinical trials,” said Kevin Foust, PhD, co-first author on the manuscript and an assistant professor in neurosciences at The Ohio State University College of Medicine.

In addition to the potential therapeutic benefit, the study also offers some interesting insights into the biological underpinnings of ALS. The role of motor neurons in ALS has been well documented, but this study also highlighted another key player—astrocytes, the most abundant cell type in the human brain and supporters of neuronal function.

“Recent work from our collaborator Dr. Cleveland has demonstrated that astrocytes and other types of glia are as important if not more important in ALS, as they really drive disease progression,” said Dr. Kaspar. “Indeed, in looking at data from mice, more than 50 percent of astrocytes were targeted throughout the spinal cord by this gene-delivery approach.”

Ideally, a therapy would hit motor neurons and astrocytes equally hard. The best way to do that is to deliver the drug directly into the cerebrospinal fluid (CSF), which would reduce the amount of SOD1 suppression in cells outside the brain and reduce immune system exposure to AAV9—elements that would add weight to an argument for studying the drug in humans.

Injections directly into CSF cannot be done easily in mice, so the team took the study a crucial step further by injecting AAV9-SOD1-shRNA into the CSF of healthy nonhuman primates. The results were just as the team hoped—the amount of gene expression dropped by as much as 90 percent in motor neurons and nearly 70 percent in astrocytes and no side effects were reported, laying the groundwork towards moving to human clinical trials.

“We have a vast amount of work to do to move this toward a clinical trial, but we’re encouraged by the results to date and our team at Nationwide Children’s and our outstanding collaborators are fully committed to making a difference in this disease,” Dr. Kaspar said.

The findings could impact other studies underway in Dr. Kaspar’s lab, including research on Spinal Muscular Atrophy, an often fatal genetic disease in infants and children that can cause profoundly weakened muscles in the arms and legs and respiratory failure.

“This research provides further proof of targeting motor neurons and glial cells throughout the entire spinal cord for treatment of Spinal Muscular Atrophy and other degenerative diseases of the brain and spinal cord, through a less invasive manner than direct injections,” said Dr. Kaspar, who also is an associate professor of pediatrics and neurosciences at The Ohio State University College of Medicine.

Sep 10, 2013 · 51 notes
#ALS #neurodegeneration #neurodegenerative diseases #motor neurons #SOD1 gene #neuroscience #science
Brain circuitry loss may be a very early sign of cognitive decline in healthy elderly people

The degeneration of a small, wishbone-shaped structure deep inside the brain may provide the earliest clues to future cognitive decline, long before healthy older people exhibit clinical symptoms of memory loss or dementia, a study by researchers with the UC Davis Alzheimer’s Disease Center has found.


The longitudinal study found that the only discernible brain differences between cognitively normal people who later developed cognitive impairment and those who did not were changes in their fornix, a bundle of nerve fibers that carries messages to and from the hippocampus and has long been known to play a role in memory.

“This could be a very early and useful marker for future incipient decline,” said Evan Fletcher, the study’s lead author and a project scientist with the UC Davis Alzheimer’s Disease Center.

“Our results suggest that fornix variables are measurable brain factors that precede the earliest clinically relevant deterioration of cognitive function among cognitively normal elderly individuals,” Fletcher said.

The research is published online today in JAMA Neurology.

Hippocampal atrophy occurs in the later stages of cognitive decline and is one of the most studied changes associated with the Alzheimer’s disease process. However, changes to the fornix and other regions of the brain structurally connected to the hippocampus have not been as closely examined. The study found that degeneration of the fornix in relation to cognition was detectable even earlier than changes in the hippocampus.

“Although hippocampal measures have been studied much more deeply in relation to cognitive decline, our direct comparison between fornix and hippocampus measures suggests that fornix properties have a superior ability to identify incipient cognitive decline among healthy individuals,” Fletcher said.

The study was conducted over five years in a group of 102 diverse, cognitively normal people with an average age of 73 who were recruited through community outreach at the Alzheimer’s Disease Center. The researchers conducted magnetic resonance imaging (MRI) studies of the participants’ brains to characterize their volumes and integrity. A different type of MRI was used to determine the integrity of the myelin, the fatty coating that sheaths and protects the axons. The axons are analogous to the copper wiring of the brain’s circuitry, and the myelin is like the wiring’s plastic insulation.
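This summary does not name the exact imaging measures, but axonal integrity in diffusion MRI is conventionally quantified with fractional anisotropy (FA), computed from the three eigenvalues of the diffusion tensor at each voxel. A minimal sketch of the standard formula:

```python
# Fractional anisotropy (FA), the standard diffusion-MRI index of
# white-matter integrity: 0 for perfectly isotropic diffusion,
# approaching 1 when diffusion is confined to a single axis (as
# along an intact axon bundle like the fornix).
import math

def fractional_anisotropy(l1: float, l2: float, l3: float) -> float:
    """FA from the three eigenvalues of the diffusion tensor."""
    num = math.sqrt((l1 - l2) ** 2 + (l2 - l3) ** 2 + (l3 - l1) ** 2)
    den = math.sqrt(l1 ** 2 + l2 ** 2 + l3 ** 2)
    return math.sqrt(0.5) * num / den

print(fractional_anisotropy(1.0, 1.0, 1.0))  # isotropic: FA is 0
print(fractional_anisotropy(1.7, 0.3, 0.3))  # elongated: FA near 0.8
```

Degeneration of a tract such as the fornix shows up as a drop in FA, which is the kind of "reduced axonal integrity" signal described below.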

Either one of those things being lost will “degrade the signal transmission” in the brain, Fletcher said.

The researchers also conducted psychological tests and cognitive evaluations of the study participants to gauge their level of cognitive functioning. The participants returned for updated MRIs and cognitive testing at approximately one-year intervals. At the outset, none of the study participants exhibited symptoms of cognitive decline. Over time, about 20 percent began to show symptoms that led to diagnoses of either mild cognitive impairment (MCI) or, in a minority of cases, Alzheimer’s disease.

“We found that if you looked at various brain factors there was one — and only one — that seemed to be predictive of whether a person would have cognitive decline, and that was the degradation of the fornix,” Fletcher said.

The study measured two relevant fornix characteristics predicting future cognitive impairment — low fornix white matter volume and reduced axonal integrity. Each of these was stronger than any other brain factor in models predicting cognitive loss, Fletcher said. 

He said that routine MRI examination of the fornix could conceivably be used clinically in the future as a predictor of abnormal cognitive decline.

“Our findings suggest that if your fornix volume or integrity is within a certain range you’re at an increased risk of cognitive impairment down the road. But developing the use of the fornix as a predictor in a clinical setting will take some time, in the same way that it took time for evaluation of cholesterol levels to be used to predict future heart disease,” he said.

Fletcher also said that the finding may mark a paradigm shift toward evaluation of the brain’s white matter, rather than its gray matter, as among the very earliest indicators of developing cognitive loss. There is currently a strong research focus on understanding brain processes that lead eventually to Alzheimer’s disease. He said the current finding could fill in one piece of the picture and motivate new directions in research to understand why and how fornix and other white matter change is such an important harbinger of cognitive impairment. 

“The key importance of this finding is that it suggests that white matter tract measures may prove to be promising candidate biomarkers for predicting incipient cognitive decline among cognitively normal individuals in a clinical setting, possibly more so than gray matter measures,” he said.

Sep 10, 2013 · 41 notes
#alzheimer's disease #dementia #cognitive decline #fornix #hippocampus #neuroscience #science
A New Method Will Enable the Early Detection of Parkinson’s Disease Through Handwriting

Today’s primary tool for diagnosing Parkinson’s disease is the diagnostic ability of the physician, who can generally identify the clinical symptoms only when the disease is at a relatively advanced stage. A new joint study by researchers at the University of Haifa and Rambam Hospital that compared the handwriting of 40 sick and healthy subjects suggests an innovative and noninvasive method of diagnosing Parkinson’s at a fairly early stage.

“Identifying the changes in handwriting could lead to an early diagnosis of the illness and neurological intervention at a critical moment,” explains Prof. Sara Rosenblum, of the University of Haifa’s Department of Occupational Therapy, who initiated the study.

The methods for diagnosing Parkinson’s today are a physician evaluation or a test called SPECT, which uses radioactive material to image the brain. The latter, however, is no more effective in diagnosing the illness than an expert doctor and it exposes the patient to unnecessary radiation.

Studies from recent years show that there are unique and distinctive differences between the handwriting of patients with Parkinson’s disease and that of healthy people. However, most handwriting studies to date have focused on motor skills (such as drawing spirals) rather than on writing that involves cognitive abilities, such as signing a check or copying addresses.

According to Prof. Rosenblum, Parkinson’s patients report feeling a change in their cognitive abilities before detecting a change in their motor abilities and therefore a test of cognitive impairment like the one performed in this study could attest to the presence of the disease and offer a way to diagnose it earlier.

This research was conducted in cooperation with Dr. Ilana Schlesinger, head of the Center for Movement Disorders and Parkinson’s Disease at Haifa’s Rambam Medical Center and occupational therapists working in the hospital. In the study, the researchers asked the subjects to write their names and gave them addresses to copy, two everyday tasks that require cognitive abilities. Participants were 40 adults with at least 12 years of schooling, half healthy and half known to be in the early stages of Parkinson’s disease (before obvious motor signs are visible).

The writing was done on a regular piece of paper placed on an electronic tablet, using a special pen whose pressure-sensitive sensors registered contact with the writing surface. A computerized analysis of the results compared a number of parameters: writing form (length, width and height of the letters), the time required, and the pressure exerted on the surface while performing the assignment.
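The parameters compared here lend themselves to simple computation from digitized pen strokes. A hypothetical sketch (the sample format, units, and values are assumptions for illustration, not the study's actual pipeline):

```python
# Hypothetical sketch of handwriting-feature extraction of the kind
# described above. Each sample is (timestamp_s, x_mm, y_mm, pressure);
# pressure == 0 means the pen is in the air. This is NOT the study's
# actual analysis pipeline, only an illustration of the parameters.

def extract_features(samples):
    """Writing size, duration, in-air time, and mean on-paper pressure."""
    dt = samples[1][0] - samples[0][0]          # assume uniform sampling
    xs = [x for _, x, _, _ in samples]
    ys = [y for _, _, y, _ in samples]
    pressures = [p for *_, p in samples if p > 0]
    in_air = sum(dt for *_, p in samples if p == 0)
    return {
        "width_mm": max(xs) - min(xs),
        "height_mm": max(ys) - min(ys),
        "total_time_s": samples[-1][0] - samples[0][0] + dt,
        "in_air_time_s": in_air,
        "mean_pressure": sum(pressures) / len(pressures),
    }

strokes = [  # four samples at 100 Hz: pen down, down, in air, down
    (0.00, 0.0, 0.0, 0.6),
    (0.01, 2.0, 1.0, 0.7),
    (0.02, 2.5, 1.5, 0.0),
    (0.03, 4.0, 0.5, 0.5),
]
print(extract_features(strokes))
```

Smaller letter dimensions, lower mean pressure, longer total time, and longer in-air time are exactly the differences reported for the Parkinson's group below.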

Analysis of the results showed significant differences between the patients and the healthy group, and all subjects except one had their status correctly diagnosed (97.5% accuracy). The Parkinson’s disease patients wrote smaller letters (“micrographia”), exerted less pressure on the writing surface, and took more time to complete the task. According to Prof. Rosenblum, a particularly noticeable difference was the length of time the pen was in the air between the writing of each letter and each word.

“This finding is particularly important because while the patient holds the pen in the air, his mind is planning his next action in the writing process, and the need for more time reflects the subject’s reduced cognitive ability. Changes in handwriting can occur years before a clinical diagnosis and therefore can be an early signal of the approaching disease,” Prof. Rosenblum said.

According to Dr. Schlesinger, validating these findings in a broader study would allow this method to be used for a preliminary diagnosis of the disease in a safe and non-invasive fashion. “This study is a breakthrough toward an objective diagnosis of the disease,” said Dr. Schlesinger, adding, “Publication of the study in the journal of the European Neurological Society aroused great interest at the International Congress of Parkinson’s Disease and Movement Disorders held last week in Sydney, Australia.”

The researchers note that this diagnostic method has the added benefit of reducing the load on the health system, because the test can be performed by a professional other than a doctor. After the results are in, patients can be referred to a doctor for further treatment and testing if necessary. The researchers are currently using the method in a new experiment, in which they use handwriting analysis to evaluate the degree of Parkinson’s patients’ improved functioning after they have brain pacemakers implanted.

Sep 10, 2013 · 58 notes
#parkinson's disease #handwriting #SPECT #biomarker #neuroscience #science
Cell transplants may be a novel treatment for schizophrenia

Rodent research suggests feasibility of restoring neuron function

Research from the School of Medicine at The University of Texas Health Science Center at San Antonio suggests the exciting possibility of using cell transplants to treat schizophrenia.

Cells called “interneurons” inhibit activity within brain regions, but this braking or governing function is impaired in schizophrenia. Consequently, a group of nerve cells called the dopamine system goes into overdrive. Different branches of the dopamine system are involved in cognition, movement and emotions.

“Since these cells are not functioning properly, our idea is to replace them,” said study senior author Daniel Lodge, Ph.D., assistant professor of pharmacology in the School of Medicine.

Transplant restored normal function

Dr. Lodge and lead author Stephanie Perez, a graduate student in his laboratory, biopsied tissue from rat fetuses, isolated cells from the tissue and injected the cells into a brain center called the hippocampus. This center regulates the dopamine system and plays a role in learning, memory and executive functions such as decision making. Rats treated with the transplanted cells showed restored hippocampal and dopamine function.

Stem cells are able to become different types of cells, and in this case interneurons were selected. “We put in a lot of cells and not all survived, but a significant portion did and restored hippocampal and dopamine function back to normal,” Dr. Lodge said.

‘You can essentially fix the problem’

Unlike traditional approaches to treating schizophrenia, such as medications and deep-brain stimulation, transplantation of interneurons potentially can produce a permanent solution. “You can essentially fix the problem,” Dr. Lodge said. “Ultimately, if this is translated to humans, we want to reprogram a patient’s own cells and use them.”

After meeting with other students, Perez brought the research idea to Dr. Lodge. “The students have journal club, and somebody had done a similar experiment to restore motor deficits and had good results,” Perez said. “We thought, why can’t we use it for schizophrenia and have good results, and so far we have.”

The study is in Molecular Psychiatry.

Sep 10, 2013 · 120 notes
#schizophrenia #stem cells #interneurons #dopamine #hippocampus #neuroscience #science
Hypertensive smoking women have an exceptionally high risk of fatal brain bleeding

Subarachnoid haemorrhage (SAH) is one of the most devastating cerebrovascular catastrophes, causing death in 40 to 50% of cases. The most common cause of SAH is the rupture of an intracranial aneurysm. If an aneurysm is found, it can be treated before a possible rupture. However, some intracranial aneurysms will never rupture – the problem is that doctors cannot tell which aneurysms will rupture and which will not, and therefore which patients should be treated and which can safely be left untreated.

(Image: A middle cerebral artery bifurcation aneurysm. Credit: Miikka Korja)

A long-term, population-based Finnish study on SAH, which is based on the FINRISK health examination surveys, and published in PLOS ONE on 9th September, shows that the risk of SAH depends strongly on the combination of certain risk factors. The SAH incidence was shown to vary from 8 up to 171 per 100 000 person-years, depending on whether people had multiple risk factors for SAH – such as smoking, hypertension and female sex – or not.
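Incidence figures like these are computed directly from case counts and follow-up time. A sketch of the calculation, where the case and person-year counts are hypothetical and only the resulting rates (8 and 171 per 100,000 person-years) come from the study:

```python
# Incidence per 100,000 person-years, and the ratio between risk groups.
# The case and person-year counts below are hypothetical; only the
# resulting rates (8 vs 171 per 100,000 person-years) are from the study.

def incidence_per_100k(cases: int, person_years: float) -> float:
    """Incidence rate expressed per 100,000 person-years of follow-up."""
    return cases / person_years * 100_000

low_risk = incidence_per_100k(4, 50_000)      # hypothetical counts -> 8
high_risk = incidence_per_100k(171, 100_000)  # hypothetical counts -> 171

print(f"low-risk rate:  {low_risk:.0f} per 100,000 person-years")
print(f"high-risk rate: {high_risk:.0f} per 100,000 person-years")
print(f"rate ratio: {high_risk / low_risk:.1f}")
```

The resulting ratio of roughly 21 is what underlies the "20 times higher" comparison between the highest- and lowest-risk groups.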

“Such an extreme risk-factor-dependent variation in the incidence of any cardiovascular disease is exceptional, and may have significant clinical implications,” says one of the main authors, Associate Professor Miikka Korja from the Helsinki University Central Hospital and the Australian School of Advanced Medicine.

If smoking women with high systolic blood pressure have a rate of these brain bleeds 20 times higher than that of never-smoking men with low blood pressure, it may very well be that such women diagnosed with unruptured intracranial aneurysms should be treated. On the other hand, never-smoking men with low blood pressure and intracranial aneurysms may not need to be treated at all.

In this largest SAH risk factor study ever, the study group also identified three new risk factors for SAH: previous myocardial infarction, history of stroke in mother, and elevated cholesterol levels in men. The results revise the understanding of the epidemiology of SAH and indicate that the risk factors for SAH appear to be similar to those for other cardiovascular diseases.

“We have previously shown that lifestyle risk factors significantly affect the life expectancy of SAH survivors, and now we have shown that the same risk factors also dramatically affect the risk of SAH itself. Thus, it appears quite clear that smoking cessation and hypertension treatment in particular are important in preventing SAH and increasing life expectancy after SAH,” clarifies one of the study group members, Academy Professor Jaakko Kaprio of the University of Helsinki and the National Institute for Health and Welfare, referring to their previous publication on cause-specific mortality in SAH survivors (Korja et al., Neurology, 2013).

The study group members have also previously published the largest twin study to date, confirming that the heritability of SAH is very low (Korja et al., Stroke, 2010), and the first study on the incidence of SAH in type 1 diabetes, showing that the rate of non-aneurysmal SAHs in type 1 diabetes is unusually high (Korja et al., Diabetes Care, 2013).

“Many of the previous studies on the epidemiology of SAH have relied on retrospective and single-center databases, which are unfortunately not very reliable data sources. Due to the unique health care system and a common academic interest among doctors in the Nordic countries, it has been possible to conduct high-quality and unbiased studies on SAH. We hope that our studies truly help doctors and patients, and are not only of interest around coffee tables on university campuses,” says neurosurgeon Korja, before rushing to continue his working day in the operating room at Macquarie University Hospital, Sydney, one of his current appointments.

Sep 10, 2013 · 49 notes
#aneurysm #subarachnoid haemorrhage #cardiovascular disease #smoking #hypertension #neuroscience #science
Sep 9, 2013 · 63 notes
#robots #robotics #perception #technology #neuroscience #science
Sep 9, 2013 · 184 notes
#self-control #neuroimaging #brain activity #decision making #neuroscience #science
Old memories recombine to give a taste of the unknown

Ever tried beetroot custard? Probably not, but your brain can imagine how it might taste by reactivating old memories in a new pattern.


Helen Barron and her colleagues at University College London and Oxford University wondered if our brains combine existing memories to help us decide whether to try something new.

So the team used an fMRI scanner to look at the brains of 19 volunteers who were asked to remember specific foods they had tried.

Each volunteer was then given a menu of 13 unusual food combinations – including beetroot custard, tea jelly, and coffee yoghurt – and asked to imagine how good or bad they would taste, and whether or not they would eat them.

"Tea jelly was popular," says Barron. "Beetroot custard not so much."

When each volunteer imagined a new combination, they showed brain activity associated with each of the known ingredients at the same time. It is the first evidence to suggest that we use memory combination to make decisions, says Barron.

Sep 9, 2013 · 175 notes
#decision making #memory #medial prefrontal cortex #hippocampus #neuroscience #science
Genetic breakthrough another step to understanding schizophrenia

A consortium of scientists from 20 countries, including researchers from The University of Western Australia, has made a major breakthrough in understanding the genetic basis of the debilitating disorder, schizophrenia.

More than 175 scientists from 99 institutions across Europe, the United States of America and Australia contributed to a genome-wide association analysis which identified 13 new risk loci for schizophrenia.

In an article published in the journal, Nature Genetics, the study authors write that the results provide deeper insight into the genetic architecture of schizophrenia than ever before achieved, and provide a pathway to further research.

"For the first time, there is a clear path to increased knowledge of the etiology of schizophrenia through the application of standard, off-the-shelf genomic technologies for elucidating the effects of common variation," the authors wrote.

Schizophrenia is a complex mental disorder which affects about one per cent of people over their lifetime, leading to prolonged or recurrent episodes that severely impair social functioning and quality of life.

In terms of the ‘global burden of disease and disability’ index, developed by the World Health Organization, it ranks among the top 10 disorders, along with cancer, heart disease, diabetes and other non-communicable diseases.

Winthrop Professor Assen Jablensky, director of UWA’s Centre for Clinical Research in Neuropsychiatry (CCRN) at Graylands Hospital, and Professor Luba Kalaydjieva, of the UWA-affiliated Western Australian Institute for Medical Research (WAIMR), led the UWA research team which took part in the study.

Professor Jablensky said that while a strong genetic component in the causation of schizophrenia had been well established, the role of specific genes and the mechanisms of their regulation remained largely unknown.

"Until recently, results of genetic linkage and association studies could explain only a small fraction of the estimated heritability of the disorder and of its ‘genetic architecture’," Professor Jablensky said.

However, recent technological advances, enabling efficient coverage of the entire human genome with millions of single nucleotide polymorphisms (SNPs) as genetic markers, have given rise to a new generation of genome-wide association studies (GWAS), which trace the DNA differences between people affected by the disease and healthy control individuals.

"Since the effects of individual SNPs are quite tiny, their reliable measurement requires very large samples of adequately diagnosed patients and controls," Professor Jablensky said.

"This recent study reports on a major breakthrough in the understanding of the genetic basis of schizophrenia, achieved through meta-analysis of GWAS datasets contributed by a large international Psychiatric Genomics Consortium (PGC) - which includes the UWA research team."

A WA case-control sample consisting of 893 schizophrenia patients and healthy controls was part of a collection of 21,246 schizophrenia cases and 38,072 controls from 19 research centres and consortia across Europe, Australia and the USA.

The study found that a total of 8300 SNPs contribute to the risk for schizophrenia and account for at least 32 per cent of the variance in liability.
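
The scale of those numbers, thousands of individually negligible SNP effects summing to about a third of liability variance, can be illustrated with a toy simulation. This is a minimal sketch under invented assumptions: the effect sizes, genotypes, and liability model below are synthetic, not the consortium's data or method.

```python
import numpy as np

# Toy polygenic-liability simulation. Only "8,300 SNPs" and the "32% of
# liability variance" figure come from the article; everything else is made up.
rng = np.random.default_rng(0)
n_people, n_snps = 1000, 8300

# Hypothetical tiny per-SNP effect sizes (each alone explains almost nothing)
effects = rng.normal(0.0, 0.01, n_snps)

# Genotypes coded as 0/1/2 copies of the risk allele per SNP
genotypes = rng.integers(0, 3, size=(n_people, n_snps))

# A polygenic score is the effect-weighted sum of risk-allele counts
prs = genotypes @ effects

# Liability-scale model: the standardized genetic score is scaled to
# contribute ~32% of the variance, the rest coming from non-genetic noise
g = (prs - prs.mean()) / prs.std()
liability = np.sqrt(0.32) * g + np.sqrt(0.68) * rng.normal(0.0, 1.0, n_people)

genetic_share = np.var(np.sqrt(0.32) * g) / np.var(liability)
print(f"genetic share of liability variance: {genetic_share:.2f}")
```

The point of the sketch is only that many individually tiny effects can jointly explain a substantial share of variance, which is why reliable measurement required tens of thousands of cases and controls.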

"A particularly important result of this study is that many of these SNPs are located on a molecular pathway involved in neuronal calcium signalling, which suggests a novel pathogenetic link in the causation of schizophrenia and possibly other psychotic disorders," Professor Jablensky said.

He said ongoing and future studies by the UWA research team would aim to further refine the genetic analyses of the WA schizophrenia study (which at present includes 1259 persons), and to test neurobiological hypotheses about the treatment responses of genetically defined subsets of patients. 

Sep 9, 2013 · 111 notes
#schizophrenia #GWAS #genetics #neuroscience #science
Sep 9, 2013 · 57 notes
#brain function #nerve cells #C. elegans #nervous system #neural activity #neuroscience #science
Sep 8, 2013 · 144 notes
#glial cells #brain mapping #connectome #neuroscience #science
Sep 8, 2013 · 62 notes
#parkinson's disease #alpha synuclein #neurodegenerative diseases #protein #medicine #neuroscience #science
Sep 8, 2013 · 956 notes
#decision making #trust #betrayal #frontal cortex #psychology #neuroscience #science
Sep 8, 2013 · 121 notes
#vitamin B-12 #B-12 deficiency #cognitive decline #dementia #neuroscience #science
Finally mapped: The brain region that distinguishes bits from bounty

In comparing amounts of things — be it the grains of sand on a beach, or the size of a sea gull flock inhabiting it — humans use a part of the brain that is organized topographically, researchers have finally shown. In other words, the neurons that work to make this “numerosity” assessment are laid out in a shape that allows those most closely related to communicate and interact over the shortest possible distance.

image

This layout, referred to as a topographical map, is characteristic of all primary senses — sight, hearing, touch, smell and taste — and scientists have long assumed that numerosity, while not a primary sense (but perceived similarly to one), might be characterized by such a map, too.

But they have not been able to find it, which has caused some doubt in the field as to whether a map for numerosity exists.

Now, however, Utrecht University’s Benjamin Harvey and his colleagues have sussed out signals showing that the hypothesized numerosity map is real.

Numerosity, it is important to note, is distinct from symbolic numbers. “We use symbolic numbers to represent numerosity and other aspects of magnitude, but the symbol itself is only a representation,” Harvey said. He went on to explain that numerosity selectivity in the brain is derived from visual processing of image features, whereas symbolic number selectivity derives from recognizing the shapes of numerals, written words, and the linguistic sounds that represent numbers. “This latter task relies on very different parts of the brain that specialize in written and spoken language.”

Understanding whether the brain’s processing of numerosity and symbolic numbers is related, as we might be tempted to think, is just one area that will be better informed by Harvey’s new map.

To uncover it, he and his colleagues asked eight adult study participants to look at patterns of dots that varied in number over time, all the while analysing the neural response properties in a numerosity-linked part of their brain using high-field fMRI (functional magnetic resonance imaging). Use of this advanced neuroimaging method allowed them to scan the subjects for far fewer hours per sitting than would have been required with a less powerful scanning technology.

With the resulting fMRI data, Harvey and his team used population receptive field modelling, which aims to measure neural response as directly and quantitatively as possible. “This was the key to our success,” Harvey said. The approach allowed the researchers to model the human fMRI response properties they observed on the basis of recordings from macaque neurons, in which numerosity experiments had been conducted more extensively.
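
The core idea of receptive field modelling can be sketched in miniature: treat a voxel's response as a Gaussian tuning curve over log numerosity and fit its centre, the voxel's "preferred" count. The tuning shape, noise level, and grid search below are illustrative simplifications, not the study's actual pipeline.

```python
import numpy as np

# A simulated voxel responds most strongly near its preferred numerosity,
# modelled here as a Gaussian over log dot-count. All values are invented.

def tuning(log_n, centre, width):
    return np.exp(-0.5 * ((log_n - centre) / width) ** 2)

numerosities = np.arange(1, 8)       # dot counts shown in the scanner
log_n = np.log(numerosities)

rng = np.random.default_rng(1)
true_centre = np.log(3.0)            # this simulated voxel "prefers" ~3 dots
response = tuning(log_n, true_centre, 0.5) + rng.normal(0, 0.02, log_n.size)

# Exhaustive grid search for the (centre, width) pair whose predicted
# tuning curve best matches the measured response (least squares)
centres = np.log(np.linspace(1.0, 7.0, 200))
widths = np.linspace(0.2, 1.5, 50)
best_centre, best_width = min(
    ((c, w) for c in centres for w in widths),
    key=lambda cw: np.sum((response - tuning(log_n, cw[0], cw[1])) ** 2),
)
print(f"preferred numerosity: about {np.exp(best_centre):.1f} dots")
```

Mapping each voxel's fitted centre back onto the cortical surface is what reveals the topographic layout: neighbouring voxels prefer neighbouring quantities.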

Their efforts revealed a topographical layout of numerosity in the human brain; the small quantities of dots the participants observed were encoded by neurons in one part of the brain, and the larger quantities, in another.

This finding demonstrates that topography can emerge not just for lower-level cognitive functions, like the primary senses, but for higher-level cognitive functions, too.

"We are very excited that association cortex can produce emergent topographic structures," Harvey said.

Because scientists know a great deal about topographical maps (and have the tools to probe them), the work of Harvey et al. may help scientists better analyse the neural computation underlying number processing.

"We believe this will lead to a much more complete understanding of humans’ unique numerical and mathematical skills," Harvey said.

Having heard from others in the field about the difficulty associated with the hunt for a topographical map of numerosity, Harvey and colleagues were surprised to obtain the results they did.

They also found the variations between their subjects interesting.

"Every individual brain is a complex and very different system," Harvey explained. "I was very surprised then that the map we report is in such a consistent location between our subjects, and that numerosity preferences always increased in the same direction along the cortex."

"On the other hand," he continued, "the extent of individual differences … is also striking." Harvey explained that understanding the consequences of these differences for their subjects’ perception or task performance will require further study.

Sep 7, 2013 · 83 notes
#numerosity #parietal cortex #topographical map #neuroimaging #neuroscience #science
Salk scientists and colleagues discover important mechanism underlying Alzheimer's disease

Details of destructive neuronal pathway should help improve drug therapies

Alzheimer’s disease affects more than 26 million people worldwide, a number predicted to skyrocket as baby boomers age—nearly 106 million people are projected to have the disease by 2050. Fortunately, scientists are making progress towards therapies. A collaboration among several research entities, including the Salk Institute and the Sanford-Burnham Medical Research Institute, has defined a key mechanism behind the disease’s progress, giving hope that a newly modified Alzheimer’s drug will be effective.

In a previous study in 2009, Stephen F. Heinemann, a professor in Salk’s Molecular Neurobiology Laboratory, found that a nicotinic receptor called Alpha7 may help trigger Alzheimer’s disease. “Previous studies exposed a possible interaction between Alpha-7 nicotinic receptors (α7Rs) and amyloid beta, the toxic protein found in the disease’s hallmark plaques,” says Gustavo Dziewczapolski, a staff researcher in Heinemann’s lab. “We showed for the first time, in vivo, that the binding of these two proteins, α7Rs and amyloid beta, provokes detrimental effects in mice similar to the symptoms observed in Alzheimer’s disease.”

Their experiments, published in The Journal of Neuroscience, with Dziewczapolski as first author, consisted of testing Alzheimer’s disease-model mice with and without the gene for α7Rs. They found that while both types of mice developed plaques, only the ones with α7Rs showed the impairments associated with Alzheimer’s.

But that still left a key question: Why was the pairing deleterious?

In a recent paper in the Proceedings of the National Academy of Sciences, Heinemann and Dziewczapolski of Salk, together with Juan Piña-Crespo, Sara Sanz-Blasco, and Stuart A. Lipton of the Sanford-Burnham Medical Research Institute and their collaborators, announced they had found the answer in unexpected interactions among neurons and other brain cells.

Neurons communicate by sending electrical and chemical signals to each other across gaps called synapses. The biochemical mix at synapses resembles a major airport on a holiday weekend—it’s crowded, complicated and exquisitely sensitive to increases and decreases in traffic. One of these signaling chemicals is glutamate, an excitatory neurotransmitter, which is essential for learning and storing memories. In the right balance, glutamate is part of the normal functioning of neuronal synapses. But neurons are not the only cells in the brain capable of releasing glutamate. Astrocytes, once thought to be merely cellular glue between neurons, also release this neurotransmitter.

In this new understanding of Alzheimer’s disease, there is a cellular signaling cascade in which amyloid beta stimulates alpha-7 nicotinic receptors, which trigger astrocytes to release additional glutamate into the synapse, overwhelming it with excitatory (“go”) signals.

This release in turn activates another set of receptors outside of the synapse, called extrasynaptic-N-methyl-D-aspartate receptors (eNMDARs) that depress synaptic activity. Unfortunately, the eNMDARs seem to overly depress synaptic function, leading to the memory loss and confusion associated with Alzheimer’s.

Now that the team has finally determined the steps in this destructive pathway, the good news is that a drug developed by Lipton’s laboratory called NitroMemantine, a modification of the earlier Alzheimer’s medication memantine, may block the entry of eNMDARs into the cascade.

"Thanks to the joint effort of our colleagues and collaborators, we seem to finally have a clear mechanistic link between a key target of the amyloid beta in the brain, the Alpha7 nicotinic receptors, triggering downstream harmful effects associated with the initiation and progression of Alzheimer’s disease," says Dziewczapolski. "This is a clear demonstration of the value of basic biomedical research. Drug development cannot proceed without knowing the details of interactions at the molecular and cellular level. Our research revealed two potential targets, α7Rs and eNMDARs, for future disease-modifying therapeutics, which Dr. Heinemann and I both hope will translate into a better treatment for Alzheimer’s patients."

Sep 7, 2013 · 55 notes
#alzheimer's disease #amyloid beta #nicotine receptors #eNMDARs #neuroscience #science
Shout now! ‒ How Nerve Cells Initiate Voluntary Calls

University of Tübingen neuroscientists show that monkeys can decide to call out or keep silent

image

“Should I say something or not?” Human beings are not alone in pondering this dilemma – animals also face decisions when they communicate by voice. University of Tübingen neurobiologists Dr. Steffen Hage and Professor Andreas Nieder have now demonstrated that nerve cells in the brain signal the targeted initiation of calls – forming the basis of voluntary vocal expression. Their results are published in “Nature Communications.”

When we speak, we use the sounds we make for a specific purpose – we intentionally say what we think, or consciously withhold information. Animals, however, usually make sounds according to what they feel at that moment. Even our closest relations among the primates make sounds as a reflex based on their mood. Now, Tübingen neuroscientists have shown that rhesus monkeys are able to call (or be silent) on command. They can instrumentalize the sounds they make in a targeted way, an important behavioral ability which we also use to put language to a purpose.

To find out how nerve cells in the brain catalyse the production of controlled vocal noises, the researchers taught rhesus monkeys to call out quickly when a spot appeared on a computer screen. While the monkeys solved puzzles, measurements taken in their prefrontal cortex revealed astonishing reactions in the cells there. The nerve cells became active whenever the monkey saw the spot of light which was the instruction to call out. But if the monkey simply called out spontaneously, these nerve cells were not activated. The cells therefore did not signal just any vocalisation – only calls that the monkey actively decided to make.

The results published in “Nature Communications” provide valuable insights into the neurobiological foundations of vocalization. “We want to understand the physiological mechanisms in the brain which lead to the voluntary production of calls,” says Dr. Steffen Hage of the Institute for Neurobiology, “because they played a key role in the evolution of the human ability to use speech.” The study offers important indications of the function of a part of the brain which in humans has developed into one of the central locations for controlling speech. “Disorders in this part of the human brain lead to severe speech disorders or even complete loss of speech in the patient,” Professor Andreas Nieder explains. The results – giving insights into how the production of sound is initiated – may help us better understand speech disorders.

Sep 7, 2013 · 51 notes
#speech production #vocalizations #primates #nerve cells #Broca's area #neuroscience #science
Experimental Compound Reverses Down Syndrome-Like Learning Deficits In Mice

Researchers at Johns Hopkins and the National Institutes of Health have identified a compound that dramatically bolsters learning and memory when given to mice with a Down syndrome-like condition on the day of birth. As they report in the Sept. 4 issue of Science Translational Medicine, the single-dose treatment appears to enable the cerebellum of the rodents’ brains to grow to a normal size.

The scientists caution that use of the compound, a small molecule known as a sonic hedgehog pathway agonist, has not been proven safe to try in people with Down syndrome, but say their experiments hold promise for developing drugs like it.

“Most people with Down syndrome have a cerebellum that’s about 60 percent of the normal size,” says Roger Reeves, Ph.D., a professor in the McKusick-Nathans Institute of Genetic Medicine at the Johns Hopkins University School of Medicine. “We treated the Down syndrome-like mice with a compound we thought might normalize the cerebellum’s growth, and it worked beautifully. What we didn’t expect were the effects on learning and memory, which are generally controlled by the hippocampus, not the cerebellum.”

Reeves has devoted his career to studying Down syndrome, a condition that occurs when people have three, rather than the usual two, copies of chromosome 21. As a result of this “trisomy,” people with Down syndrome have extra copies of the more than 300 genes housed on that chromosome, which leads to intellectual disabilities, distinctive facial features and sometimes heart problems and other health effects. Since the condition involves so many genes, developing treatments for it is a formidable challenge, Reeves says.

For the current experiments, Reeves and his colleagues used mice that were genetically engineered to have extra copies of about half of the genes found on human chromosome 21. The mice have many characteristics similar to those of people with Down syndrome, including relatively small cerebellums and difficulty learning and remembering how to navigate through a familiar space. (In the case of the mice, this was tested by tracking how readily the animals located a platform while swimming in a so-called water maze.)

Based on previous experiments on how Down syndrome affects brain development, the researchers tried supercharging a biochemical chain of events known as the sonic hedgehog pathway that triggers growth and development. They used a compound — a sonic hedgehog pathway agonist — that could do just that.

The compound was injected into the Down syndrome-like mice just once, on the day of birth, while their cerebellums were still developing. “We were able to completely normalize growth of the cerebellum through adulthood with that single injection,” Reeves says.

But the research team went beyond measuring the cerebellums, looking for changes in behavior, too. “Making the animals, synthesizing the compound and guessing the right dose were so difficult and time-consuming that we wanted to get as much data out of the experiment as we could,” Reeves says. The team tested the treated mice against untreated Down syndrome-like mice and normal mice in a variety of ways, and found that the treated mice did just as well as the normal ones on the water maze test.

Reeves says further research is needed to learn why exactly the treatment works, because their examination of certain cells in the hippocampus known to be involved in learning and affected by Down syndrome appeared unchanged by the sonic hedgehog agonist treatment. One idea is that the treatment improved learning by strengthening communication between the cerebellum and the hippocampus, he says.

As for the compound’s potential to become a human drug, the problem, Reeves says, is that altering an important biological chain of events like sonic hedgehog would likely have many unintended effects throughout the body, such as raising the risk of cancer by triggering inappropriate growth. But now that the team has seen the potential of this strategy, they will look for more targeted ways to safely harness the power of sonic hedgehog in the cerebellum. Even if his team succeeds in developing a clinically useful drug, however, Reeves cautions that it wouldn’t constitute a “cure” for the learning and memory-related effects of Down syndrome. “Down syndrome is very complex, and nobody thinks there’s going to be a silver bullet that normalizes cognition,” he says. “Multiple approaches will be needed.”

Sep 7, 2013 · 52 notes
#down syndrome #trisomy #sonic hedgehog pathway #cerebellum #animal model #neuroscience #science
Sep 6, 2013 · 150 notes
#peripersonal space #premotor cortex #mirror neurons #fMRI #psychology #neuroscience #science
Sep 6, 2013 · 52 notes
#hyperactivity #inner-ear disorders #gene mutation #striatum #neuroscience #science
“Seeing” Faces Through Touch

Our sense of touch can contribute to our ability to perceive faces, according to new research published in Psychological Science, a journal of the Association for Psychological Science.

“In daily life, we usually recognize faces through sight and almost never explore them through touch,” says lead researcher Kazumichi Matsumiya of Tohoku University in Japan. “But we use information from multiple sensory modalities in order to perceive many everyday non-face objects and events, such as speech perception or object recognition — these new findings suggest that even face processing is essentially multisensory.”

In a series of studies, Matsumiya took advantage of a phenomenon called the “face aftereffect” to investigate whether our visual system responds to nonvisual signals for processing faces. In the face aftereffect, we adapt to a face with a particular expression — happiness, for example — which causes us to perceive a subsequent neutral face as having the opposite facial expression (i.e., sadness).

Matsumiya hypothesized that if the visual system really does respond to signals from another modality, then we should see evidence for face aftereffects from one modality to the other. So, adaptation to a face that is explored by touch should produce visual face aftereffects.

To test this, Matsumiya had participants explore face masks concealed below a mirror by touching them. After this adaptation period, the participants were visually presented with a series of faces that had varying expressions and were asked to classify the faces as happy or sad. The visual faces and the masks were created from the same exemplar.

In line with his hypothesis, Matsumiya found that exploring the face masks by touch shifted participants’ perception of the visually presented faces relative to participants who had no adaptation period: the visual faces were perceived as having the opposite facial expression.
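
Aftereffects like this are typically quantified as a shift in the psychometric curve: the morph level at which observers are equally likely to answer "happy" or "sad" (the point of subjective equality, or PSE) moves after adaptation. The sketch below uses invented logistic curves and an invented 0.3-unit shift purely to show the arithmetic; it is not Matsumiya's data.

```python
import numpy as np

# Morph continuum from clearly sad (-1) to clearly happy (+1)
morph = np.linspace(-1.0, 1.0, 9)

def pse(p_happy):
    """Morph level where P('happy') crosses 0.5, by linear interpolation."""
    return np.interp(0.5, p_happy, morph)

# Hypothetical response curves: without adaptation, and after adapting to a
# happy face (the whole curve shifts, so a neutral face reads as sadder)
baseline = 1.0 / (1.0 + np.exp(-5.0 * morph))
adapted = 1.0 / (1.0 + np.exp(-5.0 * (morph - 0.3)))

shift = pse(adapted) - pse(baseline)
print(f"aftereffect shift: {shift:.2f} morph units")
```

A crossmodal aftereffect, as reported here, means touching a happy mask produces this kind of shift in the *visual* curve.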

Further experiments ruled out other explanations for the results, including the possibility that the face aftereffects emerged because participants were intentionally imagining visual faces during the adaptation period.

And a fourth experiment revealed that the aftereffect also works the other way: Visual stimuli can influence how we perceive a face through touch.

According to Matsumiya, current views on face processing assume that the visual system only receives facial signals from the visual modality — but these experiments suggest that face perception is truly crossmodal.

“These findings suggest that facial information may be coded in a shared representation between vision and haptics in the brain,” notes Matsumiya, who suggests the findings may have implications for enhancing vision and telecommunication in the development of aids for the visually impaired.

Sep 6, 2013 · 65 notes
#face perception #face processing #face aftereffects #adaptation #psychology #neuroscience #science
Nasal inhalation of oxytocin improves face blindness

Prosopagnosia (face blindness) may be temporarily improved following inhalation of the hormone oxytocin.

image

This is the finding of research led by Dr Sarah Bate and Dr Rachel Bennetts of the Centre for Face Processing Disorders at Bournemouth University that will be presented today, Friday 6 September, at the British Psychological Society’s Joint Cognitive and Developmental annual conference at the University of Reading.

Dr Bate explained: “Prosopagnosia is characterised by a severe impairment in face recognition, whereby a person cannot identify the faces of their family or friends, or even their own face.”

The researchers tested twenty adults (10 with prosopagnosia and 10 control participants). Each participant visited the laboratory on two occasions, approximately two weeks apart. On one visit they inhaled the oxytocin nasal spray, and on the other visit they inhaled the placebo spray. The two sprays were prepared by an external pharmaceutical company in identical bottles, and neither the participants nor the researchers knew the identity of the sprays until the data had been analysed.
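
The blinding scheme described above can be sketched as a simple counterbalanced assignment, with spray identities hidden behind neutral codes held by a third party. The participant IDs, code labels, and randomization below are hypothetical, included only to show the logic of a double-blind crossover.

```python
import random

random.seed(42)

# The code-to-spray key is held by the external pharmacy and opened
# only after the data have been analysed
key = {"A": "oxytocin", "B": "placebo"}

# 20 adults, as in the study; each gets both sprays across two visits,
# in a random order
participants = [f"P{i:02d}" for i in range(1, 21)]
schedule = {p: random.sample(["A", "B"], k=2) for p in participants}

# During testing, researchers and participants see only the codes:
print(schedule["P01"])    # spray codes for participant P01's two visits
```

Because every participant serves as their own control across the two visits, any oxytocin effect shows up as a within-person difference between coded sessions, without anyone knowing which session was which.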

Regardless of which spray the person inhaled, the testing sessions had an identical format. Participants inhaled the spray, then sat quietly for 45 minutes to allow the spray to take effect. They then participated in two face processing tests: one testing their ability to remember faces and the other testing their ability to match faces of the same identity.

The researchers found that the participants with prosopagnosia achieved higher scores on both face processing tests in the oxytocin condition. Interestingly, no improvement was observed in the control participants, suggesting the hormone may be more effective in those with impaired face recognition systems.

The initial ten participants with prosopagnosia had a developmental form of the condition. Individuals with developmental prosopagnosia have never experienced brain damage, and this form of face blindness is thought to be very common, affecting one in 50 people. Much more rarely, people can acquire prosopagnosia following a brain injury. At a later date, the researchers had the opportunity to test one person with acquired prosopagnosia, and also observed a large improvement following oxytocin inhalation in this individual.

Dr Bate said: “This study provides the first evidence that oxytocin may be used to temporarily improve face recognition in people with either developmental or acquired prosopagnosia. The effects of the hormone are thought to last 2-3 hours, and it may be that the nasal spray can be used to improve face recognition on a special occasion. However, much more research needs to be carried out, as we don’t currently know whether there are benefits or risks associated with longer-term inhalation of the hormone.”

Sep 6, 2013 · 80 notes
#prosopagnosia #oxytocin #face recognition #psychology #neuroscience #science
Sep 6, 2013 · 435 notes
#science #schizophrenia #bipolar disorder #psychiatric disorders #endophenotype #neuroscience
Sep 6, 2013 · 61 notes
#aging #neurodegenerative diseases #prion proteins #amyloid beta #neuroscience #science
Sep 6, 2013 · 102 notes
#C. elegans #nerve cells #EBAX proteins #Hsp90 protein #neuroscience #science
Robots Could One Day Help Surgeons Remove Hard to Reach Brain Tumors

NIBIB-funded scientists and engineers are teaming up with neurosurgeons to develop technologies that enable less invasive, image-guided removal of hard-to-reach brain tumors. Their technologies combine novel imaging techniques that allow surgeons to see deep within the brain during surgery with robotic systems that enhance the precision of tissue removal.

A robot that worms its way in

image

The median survival for patients with glioblastoma, a high-grade primary brain cancer, is less than two years. One factor contributing to this poor prognosis is that many deep-seated and pervasive tumors are not entirely accessible, or even visible, using current neurosurgical tools and imaging techniques.

But several years ago, J. Marc Simard, M.D., a professor of neurosurgery at the University of Maryland School of Medicine in Baltimore (UMB), had an insight that he hoped might address this problem. At the time, he had been watching a TV show in which plastic surgeons were using sterile maggots to remove damaged or dead tissue from a patient.

“Here you had a natural system that recognized bad from good and good from bad,” said Simard. “In other words, the maggots removed all the bad stuff and left all the good stuff alone and they’re really small. I thought, if you had something equivalent to that to remove a brain tumor that would be an absolute home run.”

image

Image: Initial prototype for the minimally invasive neurosurgical intracranial robot. Image courtesy of University of Maryland.

And so Simard teamed up with Rao Gullapalli, Ph.D., professor of diagnostic radiology and nuclear medicine also at UMB, as well as Jaydev Desai, Ph.D., professor of mechanical engineering at the University of Maryland, College Park, to develop a small neurosurgical robot that could be used to remove deep-seated brain tumors.

Within four years, the team had designed, constructed, and tested their first prototype, a finger-like device with multiple joints, allowing it to move in many directions. At the tip of the robot is an electrocautery tool, which uses electricity to heat and ultimately destroy tumors, as well as a suction tube for removing debris.

“The idea was to have a device that’s small but that can do all the work a surgeon normally does,” said Simard. “You could place this small robotic device inside a tumor and have it work its way around from within, removing pieces of diseased tissue.”

A key component of the team’s device is its ability to be used while a patient is undergoing MRI. By replacing normal vision with continuously updated MRI, the surgeon is able to visualize deep-seated tumors and monitor the robot’s movement without having to create a large incision in the brain.

In addition to reducing incision size, Simard says the ability to view the brain under continuous MRI also helps surgeons keep track of tumor boundaries throughout an operation. “When we’re operating in a conventional way, we get an MRI on a patient before we do the surgery, and we use landmarks that can either be affixed to the scalp or are part of the skull to know where we are within the patient’s brain. But when the surgeon gets in there and starts to remove the tumor, the tissues shift around so that now the boundaries that were well-established when everything was in place don’t exist anymore, and you’re confronted once again with having to distinguish normal brain from tumor. This is very difficult for a surgeon using direct vision, but with MRI, the ability to discriminate tumor from non-tumor is much more powerful.”

Steve Krosnick, M.D., a program director at NIBIB, says real-time MRI guidance during brain tumor surgery would be a tremendous advantage. “Unlike pre-operative MRI or intermittent MRI, which requires interruption of the surgical procedure, real-time intra-operative MRI offers rapid delineation of normal tissue from tumor while accounting for brain shifts that occur during surgery.”

But designing a neurosurgical device that can be used inside an MRI magnet is no easy task. One of the first issues you have to consider, said Gullapalli, is a surgeon’s access to the brain. “When you scan a person’s brain during an MRI, he’s deep inside the machine’s tunnel. The problem is, how do you get your hands on the brain while the patient’s in the scanner?”

The team’s solution was to give the surgeon robotic control of the device in order to circumvent the need to access the brain directly. In other words, a surgeon can insert the robot into the brain while the patient is outside of the scanner. Then, when the patient moves into the scanner, the surgeon can sit in a different room and, while watching MRI images of the brain on a monitor, move the robot deep inside the brain and direct it to electrocauterize and aspirate the tissue.

Jaydev Desai, the team’s mechanical engineer, says the most challenging aspect of the project has been designing a robot that can be controlled inside the magnetic field of an MRI. While robots are often controlled via electromagnetic motors, this was not an option because, besides being magnetic, these motors create significant image distortion, making it impossible for the surgeon to perform the task. Other potential mechanisms such as hydraulic systems were off the table due to concerns about fluid leakage.

Instead, Desai decided to use shape memory alloy (SMA)—a material that alters its shape in response to changes in temperature—to control the robot’s movement. In the most recent prototype—developed by Desai and his team at the Robotics, Automation, and Medical Systems (RAMS) laboratory at the University of Maryland, College Park—a system of cables, pulleys and SMA springs is used. This cable-and-pulley system is an improvement over their previous prototype, which caused some image distortion.

image

Image: The newest prototype for the minimally invasive neurosurgical intracranial robot uses a system of pulleys and springs to move the robot. Source: Jaydev Desai, University of Maryland

With continued support from NIBIB, Desai and colleagues are now working to further reduce image distortion and to test the safety and efficacy of their device in swine as well as in human cadavers. Though it will be several years before their device finds its way into the operating room, Simard is excited by the prospect. “Advancing brain surgery to this level where tiny machines or robots could navigate inside people’s heads while being directed by neurosurgeons with the help of MRI imaging…It’s beyond anything that most people dream of.”

Scoping the brain

On the opposite side of the country, a different group of engineers and neurosurgeons is also working to develop an image-guided, robotically-controlled neurosurgical tool. Led by Eric Seibel, Ph.D., a professor of mechanical engineering at the University of Washington, the team is attempting to adapt a scanning fiber endoscope—a tool initially developed by Seibel to image inside the narrow bile ducts of the liver—so that it can be used to visualize the brain during surgery.

An endoscope is a thin, tube-like instrument with a video camera attached to its end that can be inserted through a small incision or natural opening in the body to produce real-time video during surgery. Endoscopes are an essential component of minimally invasive surgeries because they allow surgeons to view the inside of the body on a monitor without having to make a large incision.

However, there are many parts of the body such as small vessels and ducts as well as areas deep in the brain that are inaccessible to conventional endoscopes. Although ultrathin endoscopes have recently been developed, Seibel says these smaller scopes come with the price of greatly reduced image resolution.

“Right now, with the current state of the art ultrathin endoscopes, I calculate based on the field of view and their resolution that the person looking at that display would see so little as to be classified in the US as legally blind,” said Seibel.
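Seibel’s calculation isn’t given in the article, but the back-of-the-envelope version is easy to reproduce. The sketch below assumes hypothetical but plausible numbers for a conventional ultrathin fiber-bundle scope—roughly 10,000 fibers, so about 100 pixels across the image, spread over an assumed 70-degree field of view—and converts that to an equivalent Snellen acuity (20/20 vision resolves about 1 arcminute of angle).

```python
# Illustrative estimate only; the fiber count and field of view are assumptions,
# not figures from the article.
ARCMIN_PER_DEGREE = 60.0

def snellen_denominator(pixels_across: int, fov_degrees: float) -> float:
    """Approximate Snellen equivalent of a display: 20/20 vision resolves
    ~1 arcminute, so N arcminutes per pixel maps to roughly 20/(20*N)."""
    arcmin_per_pixel = fov_degrees * ARCMIN_PER_DEGREE / pixels_across
    return 20.0 * arcmin_per_pixel

denom = snellen_denominator(pixels_across=100, fov_degrees=70.0)
print(f"Snellen equivalent: 20/{denom:.0f}")
```

Under these assumptions the viewer’s effective acuity comes out around 20/840—far worse than the 20/200 threshold commonly used for legal blindness in the US, which is the point Seibel is making.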

image

Image: Microfabricated optical fiber scanner emitting red laser light, with scan amplitude of 1 mm peak-to-peak. Image courtesy of Eric Seibel, University of Washington

But with support from NIBIB beginning over ten years ago, Seibel started working on a new type of endoscope that could fit into tiny crevices in the body while retaining high image quality. The end product, despite having the diameter of a toothpick, can provide doctors with microscopic views of the inside of the body.

Seibel retained image quality while significantly reducing the size of his scope by eschewing traditional endoscope models. Instead of a light source and a video camera, Seibel’s scope consists of a single optical fiber—approximately the size of a human hair—located in the middle of the scope. The fiber releases white laser light (a combination of green, red, and blue lasers) when vibrated at a particular frequency. A series of lenses in the scope directs the laser light widely within the body, providing a 100-degree field of view. As the white laser light interacts with tissue, it picks up coloration and is scattered back to a ring of additional optical fibers, which transmit this information to a monitor.

“It’s almost like putting your eyes inside the body so you can see with the wide field view of your human vision,” said Seibel.

In collaboration with three neurosurgeons and an electrical engineer, Seibel is now working to secure his novel endoscope to the tip of a robotically controlled micro-dissection neurosurgical tool.

As opposed to larger traditional endoscopes, Seibel says his scanning fiber endoscope is barely noticeable.

“It’s like a piece of wet spaghetti,” said Seibel. “It’s even smaller than a piece of wet spaghetti in diameter, but it feels like that. So when it is actually at the tip of the surgeon’s tool, the surgeon wouldn’t feel it dragging behind her.”

One advantage of having the endoscope under robotic control is that the brain can be imaged at a higher magnification.

“A surgeon couldn’t hold a microscope steady in her hand while performing surgery, but the robot can,” said Seibel.

Microscopic detail is essential when trying to determine the border between healthy tissue—which if removed could lead to neurological deficits—and cancerous tissue—which if left in the brain could allow a tumor to return.

Krosnick says he’s excited by the combination of high-quality imaging and robotic enabled micro-neurosurgery. “It addresses a critical need, which is to discern tumor margins at high resolution while minimizing disruption to normal structures.”

Seibel believes this discrimination between cancerous and healthy tissue could be enhanced even further by taking advantage of the fact that his scanning endoscope is also able to detect fluorescence. One of the main focuses of his current research is a collaboration with Jim Olson, M.D., Ph.D. at the Fred Hutchinson Cancer Research Center, who is the inventor of a substance called “tumor paint”.

Tumor paint is a fluorescent probe that attaches to cancerous but not healthy cells when injected into the body. Seibel says the ultimate goal would be to give a patient an injection of tumor paint and then use his endoscope to create an image of the fluorescing cancer cells as well as a colored anatomic image of the brain. The two images could then be merged on a screen for the surgeon to view during an operation. “You would be able to see all the structure that a surgeon would see, but you’d also see those molecular pinpoints of light that are cancer cells…and from there the robot can be used to resect, or remove, these small cells of cancer, and it can do it very precisely because you don’t have the shaking of a human holding it.”

image

Image: Tumor paint is made of a compound extracted from scorpion venom that can travel through the blood brain barrier and bind specifically to tumor cells. Source: iStockphoto

Seibel concluded by saying, “There’s a real niche for video-quality, high-resolution, multi-modal imaging that’s in a tiny package so that it can be put on microscopic tools for minimally invasive medicine. I really feel it’s an enabling technology that could move the whole field forward.”

Krosnick is enthusiastic about the progress the two teams have made so far. “These are innovative technologies that, if effective, could significantly add to the brain surgery armamentarium. They’re still early in development, but I think both show considerable promise.” He concluded by emphasizing that, like all new devices, these technologies would need to undergo a series of clinical trials to ensure that they are safe and effective before making their way into an operating room.

Sep 6, 2013 · 119 notes
#brain tumors #robotics #glioblastoma #neurosurgery #neuroscience #science
TB and Parkinson’s Disease Linked By Unique Protein

UCSF Researchers Seek Way to Boost Parkin to Fight Both Diseases

A protein at the center of Parkinson’s disease research now also has been found to play a key role in causing the destruction of bacteria that cause tuberculosis, according to scientists led by UC San Francisco microbiologist and tuberculosis expert Jeffery Cox, PhD.

The protein, named Parkin, already is the focus of intense investigation in Parkinson’s disease, in which its malfunction is associated with a loss of nerve cells. Cox and colleagues now report that Parkin also acts on tuberculosis, triggering destruction of the bacteria by immune cells known as macrophages. Results appear online today (September 4, 2013) in the journal Nature.

The finding suggests that disease-fighting strategies already under investigation in pre-clinical studies for Parkinson’s disease might also prove useful in fighting tuberculosis, according to Cox, who is investigating ways to ramp up Parkin activity in mice infected with tuberculosis using a strategy similar to one being explored by his UCSF colleague Kevan Shokat, PhD, as a way to ward off neurodegeneration in Parkinson’s disease.

Globally, tuberculosis kills 1.4 million people each year, spreading from person to person through the air. Parkinson’s disease, the most common neurodegenerative movement disorder, also affects millions of mostly elderly people worldwide.

Cox homed in on the enzyme Parkin as a common element in Parkinson’s and tuberculosis through his investigations of how macrophages engulf and destroy bacteria. In a sense the macrophage — which translates from Greek as “big eater” — gobbles down foreign bacteria, through a process scientists call xenophagy.

Mycobacterium tuberculosis, along with a few other types of bacteria, including Salmonella and leprosy-causing Mycobacterium leprae, are different from other kinds of bacteria in that, like viruses, they need to get inside cells to mount a successful infection.

The battle between macrophage and mycobacterium can be especially intense. M. tuberculosis invades the macrophage, but then becomes engulfed in a sac within the macrophage that is pinched off from the cell’s outer membrane. The bacteria often escape this intracellular jail by secreting a protein that degrades the sac, only to be targeted yet again by molecular chains made from a protein called ubiquitin. Previously, Cox discovered molecules that escort these chained mycobacteria to more secure confinement within compartments inside cells called lysosomes, where the bacteria are destroyed.

The cells of non-bacterial organisms ranging in complexity from baker’s yeast to humans also use a similar mechanism — called autophagy — to dispose of their own unneeded molecules or worn out cellular components. Among the most abundant and crucial of these components are the cell’s mitochondria, metabolic powerhouses that convert food molecules into a source of energy that the cell can readily use to carry out its everyday housekeeping chores, as well as its more specialized functions.

Like other cellular components, mitochondria can wear out and malfunction, and often require replacement. The process through which mitochondria are disposed of, called mitophagy, depends on Parkin.

Cox became curious about the enzyme when he learned that specific, naturally occurring variations in the Parkin gene, called polymorphisms, are associated with increased susceptibility to tuberculosis infection.

“Because of the commonalities between mitophagy and the xenophagy of intracellular mycobacteria, as well as the links between Parkin gene polymorphisms and increased susceptibility to bacterial infection in humans, we speculated that Parkin may also be recruited to M. tuberculosis and target it for xenophagy,” Cox said.

In both mouse and human macrophages infected with M. tuberculosis in the lab, Parkin played a key role in fighting the bacteria, Cox and colleagues found. In addition, genetically engineered mice lacking Parkin died when infected with M. tuberculosis, while mice with normal Parkin survived infection.

The involvement of Parkin in targeting both damaged mitochondria and infectious mycobacteria arose long ago in evolution, Cox argues. As part of the Nature study, the research team found that Parkin-deficient mice and flies – creatures quite distant from humans in evolutionary time – also are more sensitive than normal mice and flies to intracellular bacterial infections.

Looking back more than 1 billion years, Cox noted that mitochondria evolved from bacteria that were taken up by cells in a symbiotic relationship.

In the same way that the immune system recognizes infectious bacteria as foreign, Cox said, “The evolutionary origin of mitochondria from bacteria suggests that perhaps mitochondrial dysfunction triggers the recognition of a mitochondrion as non-self.”

Having now demonstrated the importance of Parkin in fighting mycobacterial infection, Cox has begun working with Shokat to find a way to boost Parkin activity against cell-invading bacteria. “We are exploring the possibility that small-molecule drugs could be developed to activate Parkin to better fight tuberculosis infection,” Cox said.

Sep 5, 2013 · 65 notes
#parkinson's disease #tuberculosis #parkin protein #macrophages #lysosomes #medicine #neuroscience #science
Discovery helps to unlock brain’s speech-learning mechanism

USC scientists have discovered a population of neurons in the brains of juvenile songbirds that are necessary for allowing the birds to recognize the vocal sounds they are learning to imitate.

image

These neurons encode a memory of learned vocal sounds and form a crucial (and hitherto only theorized) part of the neural system that allows songbirds to hear, imitate and learn their species’ songs — just as human infants acquire speech sounds.

The discovery will allow scientists to uncover the exact neural mechanisms that allow songbirds to hear their own self-produced songs, compare them to the memory of the song that they are trying to imitate and then adjust their vocalizations accordingly.

Because this brain-behavior system is thought to be a model for how human infants learn to speak, understanding it could prove crucial to future understanding and treatment of language disorders in children. In both songbirds and humans, feedback of self-produced vocalizations is compared to memorized vocal sounds and progressively refined to achieve a correct imitation.

“Every neurodevelopmental disorder you can think of — including Tourette syndrome, autism and Rett syndrome — entails in some way a breakdown in auditory processing and vocal communication,” said Sarah Bottjer, senior author of an article on the research that appears in the Journal of Neuroscience on Sept. 4. “Understanding mechanisms of vocal learning at a cellular level is a huge step toward being able to someday address the biological issues behind the behavioral issues.”

Bottjer, a professor of neurobiology at the USC Dornsife College of Letters, Arts and Sciences, collaborated with lead author Jennifer Achiro, a graduate student at USC, to examine activity in songbirds’ brains using electrodes that record from individual neurons.

In the basal ganglia — a complex system of neurons in the brain responsible for, among other things, procedural learning — Bottjer and Achiro were able to isolate two different types of neurons in young songbirds: ones that were activated only when the birds heard themselves singing and others that were activated only when the birds heard the songs of adult birds that they were trying to imitate.

The two sets of neurons allow the songbirds to recognize both their current behavior and a goal behavior that they would like to achieve.

“The process of learning speech requires the brain to compare feedback of current vocal behavior to a memory of target vocal sounds,” Achiro said. “The discovery of these two distinct populations of neurons means that this brain region contains separate neural representation of current and goal behaviors. Now, for the first time, we can test how these two neural representations are compared so that correct matches between the two are somehow rewarded.”

The next step for scientists will be to learn how the brain rewards correct matches between feedback of current vocal behavior and the memorized goal sounds as songbirds bring their behavior closer to that goal, Bottjer said.

Sep 5, 2013 · 86 notes
#songbirds #neural activity #basal ganglia #vocal learning #speech #neuroscience #science
New laser-based tool could dramatically improve the accuracy of brain tumor surgery

Imaging technique tells tumor tissue from normal tissue, could be used in operating room for real-time guidance of surgery

A new laser-based technology may make brain tumor surgery much more accurate, allowing surgeons to tell cancer tissue from normal brain at the microscopic level while they are operating, and avoid leaving behind cells that could spawn a new tumor.

image

This image of a human glioblastoma brain tumor in the brain of a mouse was made with stimulated Raman scattering, or SRS, microscopy. The technique allows the tumor (blue) to be easily distinguished from normal tissue (green) based on faint signals emitted by tissue with different cellular structures.

In a new paper, featured on the cover of the journal Science Translational Medicine, a team of University of Michigan Medical School and Harvard University researchers describes how the technique allows them to “see” the tiniest areas of tumor cells in brain tissue.

They used this technique to distinguish tumor from healthy tissue in the brains of living mice — and then showed that the same was possible in tissue removed from a patient with glioblastoma multiforme, one of the most deadly brain tumors.

Now, the team is working to develop the approach, called SRS microscopy, for use during operations to guide tissue removal, and to test it in a clinical trial at U-M. The work was funded by the National Institutes of Health.

A need for improvement in tumor removal

On average, patients diagnosed with glioblastoma multiforme live only 18 months after diagnosis. Surgery is one of the most effective treatments for such tumors, but less than a quarter of patients’ operations achieve the best possible results, according to a study published last fall in the Journal of Neurosurgery.

“Though brain tumor surgery has advanced in many ways, survival for many patients is still poor, in part because surgeons can’t be sure that they’ve removed all tumor tissue before the operation is over,” says co-lead author Daniel Orringer, M.D., a lecturer in the U-M Department of Neurosurgery who has worked with the Harvard team since a chance meeting with a team member during his U-M residency.

image

On the left, the view of the brain that neurosurgeons currently see during an operation using bright-field microscopy. On the right, an SRS microscopy view of the same area of brain - in this case, a mouse brain that has had human brain tumor tissue transplanted into it. SRS might someday allow surgeons to see this same view of patients’ brains.

“We need better tools for visualizing tumor during surgery, and SRS microscopy is highly promising,” he continues. “With SRS we can see something that’s invisible through conventional surgical microscopy.”

The SRS in the technique’s name stands for stimulated Raman scattering. Named for C.V. Raman, the Indian scientist who co-discovered the effect and won the 1930 Nobel Prize in physics for it, Raman scattering allows researchers to measure the unique chemical signature of materials.

In the SRS technique, they can detect a weak light signal that comes out of a material after it’s hit with light from a non-invasive laser. By carefully analyzing the spectrum of colors in the light signal, the researchers can tell a lot about the chemical makeup of the sample.

Over the past 15 years, Sunney Xie, Ph.D., of the Department of Chemistry and Chemical Biology at Harvard University – the senior author of the new paper — has advanced the technique for high-speed chemical imaging. By amplifying the weak Raman signal by more than 10,000 times, it is now possible to make multicolor SRS images of living tissue or other materials. The team can even make 30 new images every second — the rate needed to create videos of the tissue in real time.
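The throughput behind that video rate is worth spelling out. In a point-scanned technique like SRS microscopy, every pixel is sampled in sequence, so the frame size and frame rate fix how little time the instrument has per pixel. The frame dimensions below are an assumption for illustration (the article does not give them), not a specification of Xie’s system.

```python
# Rough throughput arithmetic for video-rate scanned imaging.
# The 512x512 frame size is an assumed, illustrative value.
frame_pixels = 512 * 512        # assumed pixels per frame
frames_per_second = 30          # video rate quoted in the article

pixel_rate = frame_pixels * frames_per_second    # pixels sampled per second
dwell_time_ns = 1e9 / pixel_rate                 # time available per pixel

print(f"{pixel_rate / 1e6:.1f} Mpixels/s, ~{dwell_time_ns:.0f} ns per pixel")
```

Under these assumptions the system must sample nearly 8 million pixels per second, leaving only on the order of a hundred nanoseconds to collect each pixel’s weak Raman signal—which is why the >10,000-fold signal amplification matters.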

Seeing the brain’s microscopic architecture

A multidisciplinary team of chemists, neurosurgeons, pathologists and others worked to develop and test the tool. The new paper is the first time SRS microscopy has been used in a living organism to see the “margin” of a tumor – the boundary area where tumor cells infiltrate among normal cells. That’s the hardest area for a surgeon to operate in – especially when a tumor has invaded a region with an important function.

As the images in the paper show, the technique can distinguish brain tumor from normal tissue with remarkable accuracy, by detecting the difference between the signal given off by the dense cellular structure of tumor tissue, and the normal healthy grey and white matter.

The authors suggest that SRS microscopy may be as accurate for detecting tumor as the approach currently used in brain tumor diagnosis – called H&E staining.

image

This image shows the same areas of brain, imaged with SRS microscopy (left) and conventional H&E staining, which is the current technique used to diagnose brain tumors at the tissue level. The research suggests that SRS microscopy could be as accurate as H&E staining in allowing doctors to see tumors - without having to remove tissue or inject dyes into the patient.

The paper contains data from a test that pitted H&E staining directly against SRS microscopy. Three surgical pathologists, trained in studying brain tissue and spotting tumor cells, had nearly the same level of accuracy no matter which images they studied. But unlike H&E staining, SRS microscopy can be done in real time, and without dyeing, removing or processing the tissue.

Next steps: A smaller laser, a clinical trial

The current SRS microscopy system is not yet small or stable enough to use in an operating room. The team is collaborating with a start-up company formed by members of Xie’s group, called Invenio Imaging Inc., which is developing a laser to perform SRS through inexpensive fiber-optic components. The team is also working with AdvancedMEMS Inc. to reduce the size of the probe that makes the images possible.

A validation study, to examine tissue removed from consenting U-M brain tumor patients, may begin as soon as next year.

Sep 5, 2013 · 70 notes
#brain tumor #glioblastoma #brain tissue #neuroimaging #SRS microscopy #neuroscience #science
Brain study uncovers vital clue in bid to beat epilepsy

People with epilepsy could be helped by new research into the way a key molecule controls brain activity during a seizure.

Researchers have identified the role played by a protein called BDNF and say the discovery could lead to new drugs that calm the symptoms of epileptic seizures.

Scientists analysed the way cells communicate when the brain is most active – such as in epileptic seizures – when electrical signalling by the brain’s neurons is increased.

They found that the BDNF molecule – which is known to be released in the brain during seizures – blocks a specific process known as activity-dependent bulk endocytosis (ADBE).

By blocking this process during an epileptic seizure, BDNF increases the release of neurotransmitters and causes heightened electrical activity in the brain.

Since ADBE is only triggered during high brain activity, drugs designed to target this process could have fewer side effects for normal day to day brain function, researchers say.

Experts say that not all epilepsy patients respond to current drug treatments and the finding could lead to the development of new medicines.

The team, however, offered a word of caution. Since ADBE is also implicated in a range of brain functions, such as creating new memories, more research is needed to establish what the effects of manipulating this process might be on these key functions.

The study, led by the University of Edinburgh, is published in the journal Nature Communications. The research was funded by the Wellcome Trust and the Medical Research Council.

Dr Mike Cousin, of the University of Edinburgh’s Centre for Integrative Physiology, who led the research, said: “Around one third of people with epilepsy do not respond to the treatments we currently have available. By studying the way brain cells behave during seizures, we have been able to uncover an exciting new avenue for research into anti-epileptic therapies.”

Researchers will now focus on identifying specific genes that control this brain process to determine whether they hold the key to new drug treatments.

Sep 4, 2013 · 62 notes
#epilepsy #seizures #BDNF #activity-dependent bulk endocytosis #brain activity #neuroscience #science
Scientists fish for new epilepsy model and reel in potential drug

NIH-funded study finds zebrafish model may help identify treatments for a severe form of childhood epilepsy

image

According to new research on epilepsy, zebrafish have certainly earned their stripes. Results of a study in Nature Communications suggest that zebrafish carrying a specific mutation may help researchers discover treatments for Dravet syndrome (DS), a severe form of pediatric epilepsy that results in drug-resistant seizures and developmental delays.

Scott C. Baraban, Ph.D., and his colleagues at the University of California, San Francisco (UCSF), carefully assessed whether the mutated zebrafish could serve as a model for DS, and then developed a new screening method to quickly identify potential treatments for DS using these fish. This study was supported by the National Institute of Neurological Disorders and Stroke (NINDS), part of the National Institutes of Health, and builds on pioneering zebrafish epilepsy models first described by the Baraban laboratory in 2005.

Dravet syndrome is commonly caused by a mutation in the Scn1a gene, which encodes Nav1.1, a specific sodium ion channel found in the brain. Sodium ion channels are critical for communication between brain cells and proper brain functioning.

The researchers found that the zebrafish that were engineered to have the Scn1a mutation that causes DS in humans exhibited some of the same characteristics, such as spontaneous seizures, commonly seen in children with DS. Unprovoked seizure activity in the mutant fish resulted in hyperactivity and whole-body convulsions associated with very fast swimming. These types of behaviors are not seen in normal healthy zebrafish.

“We were also surprised at how similar the mutant zebrafish drug profile was to that of Dravet patients,” said Dr. Baraban. “Antiepileptic drugs shown to have some benefits in patients (such as benzodiazepines or stiripentol) also exhibited some antiepileptic activity in these mutants. Conversely, many of the antiepileptic drugs that do not reduce seizures in these patients showed no effect in the mutant zebrafish.”

In this study, the researchers developed a fast and automated drug screen to quickly test the effectiveness of various compounds in mutant zebrafish. The researchers tracked behavior and measured brain activity in the mutant zebrafish to determine if the compounds had an impact on seizures.

“Scn1a mutants seize often, so it is relatively easy to monitor their seizure behavior at baseline and then again after a drug application,” said Dr. Baraban. “Using zebrafish placed individually in the wells of a 96-well plate we can accurately quantify this seizure behavior. In this way, we can test almost 100 fish at one time and quickly determine whether a drug candidate has any effect on these spontaneous seizures.”

In the first such application of this approach, UCSF researchers screened 320 compounds and found that clemizole was most effective in inhibiting seizure activity. Clemizole is approved by the U.S. Food and Drug Administration and has a safe toxicology profile. “This finding was completely unexpected. Based on what is currently known about clemizole, we did not predict that it would have antiepileptic effects,” said Dr. Baraban.

These findings suggest that Scn1a mutant zebrafish may serve as a good model of DS and that the drug screen may be effective in quickly identifying novel therapies for epilepsy. 

Dr. Baraban also noted that someday these experiments can be “personalized,” by looking at mutated zebrafish that use genetic information from individual patients. 

Sep 4, 2013 · 51 notes
#Dravet syndrome #epilepsy #zebrafish #ion channels #Scn1a gene #mutations #neuroscience #science
Research confirms Mediterranean diet is good for the mind

The first systematic review of related research confirms a positive impact on cognitive function, but an inconsistent effect on mild cognitive impairment.

image

Over recent years many pieces of research have identified a link between adherence to a Mediterranean diet and a lower risk of age-related diseases such as dementia.

Until now there has been no systematic review of such research, where a number of studies regarding a Mediterranean diet and cognitive function are reviewed for consistencies, common trends and inconsistencies.

A team of researchers from the University of Exeter Medical School, supported by the National Institute for Health Research Collaboration for Leadership in Applied Health Research and Care in the South West Peninsula (NIHR PenCLAHRC), has carried out the first such systematic review and their findings are published in Epidemiology.

The team analysed 12 eligible pieces of research, 11 observational studies and one randomised controlled trial. In nine out of the 12 studies, a higher adherence to a Mediterranean diet was associated with better cognitive function, lower rates of cognitive decline and a reduced risk of Alzheimer’s disease.

However, results for mild cognitive impairment were inconsistent.

A Mediterranean diet typically consists of higher levels of olive oil, vegetables, fruit and fish. A higher adherence to the diet means higher daily intakes of fruit, vegetables and fish, and reduced intakes of meat and dairy products.

The study was led by researcher Iliana Lourida. She said: “Mediterranean food is both delicious and nutritious, and our systematic review shows it may help to protect the ageing brain by reducing the risk of dementia. While the link between adherence to a Mediterranean diet and dementia risk is not new, ours is the first study to systematically analyse all existing evidence.”

She added: “Our review also highlights inconsistencies in the literature and the need for further research. In particular research is needed to clarify the association with mild cognitive impairment and vascular dementia. It is also important to note that while observational studies provide suggestive evidence we now need randomized controlled trials to confirm whether or not adherence to a Mediterranean diet protects against dementia.”

Sep 4, 2013 · 157 notes
#Mediterranean diet #cognitive function #dementia #cognitive impairment #neuroscience #science
Aging really is ‘in your head’

Scientists answer hotly debated questions about how calorie restriction delays aging process

image

Among scientists, the role of proteins called sirtuins in enhancing longevity has been hotly debated, driven by contradictory results from many different scientists. But new research at Washington University School of Medicine in St. Louis may settle the dispute.

Reporting Sept. 3 in Cell Metabolism, Shin-ichiro Imai, MD, PhD, and his colleagues have identified the mechanism by which a specific sirtuin protein called Sirt1 operates in the brain to bring about a significant delay in aging and an increase in longevity. Both have been associated with a low-calorie diet.

The Japanese philosopher and scientist Ekiken Kaibara first described the concept of dietary control as a method to achieve good health and longevity in 1713. He died the following year at the ripe old age of 84—a long life for someone in the 18th century.

Since then, science has proven a link between a low-calorie diet (without malnutrition) and longevity in a variety of animal models. In the new study, Imai and his team have shown how Sirt1 prompts neural activity in specific areas of the hypothalamus of the brain, which triggers dramatic physical changes in skeletal muscle and increases in vigor and longevity.

“In our studies of mice that express Sirt1 in the brain, we found that the skeletal muscular structures of old mice resemble young muscle tissue,” said Imai. “Twenty-month-old mice (the equivalent of 70-year-old humans) look as active as five-month-olds.”

Imai and his team began their quest to define the critical junctures responsible for the connection between dietary restriction and longevity with the knowledge from previous studies that the Sirt1 protein played a role in delaying aging when calories are restricted. But the specific mechanisms by which it carried out its function were unknown.

Imai’s team studied mice that had been genetically modified to overproduce Sirt1 protein. Some of the mice had been engineered to overproduce Sirt1 in body tissues, while others were engineered to produce more of the Sirt1 protein only in the brain.

“We found that only the mice that overexpressed Sirt1 in the brain (called BRASTO) had significant lifespan extension and delay in aging, just like normal mice reared under dietary restriction regimens,” said Imai, an expert in aging research and a professor in the departments of Developmental Biology and Medicine.

The BRASTO mice demonstrated significant life span extension without undergoing dietary restriction. “They were free to eat regular chow whenever they wished,” he said.

In addition to positive skeletal muscle changes in the BRASTO mice, the investigators also observed significant increases in nighttime physical activity, body temperature and oxygen consumption compared with age-matched controls.

Mice are characteristically most active at night. The BRASTO mice also experienced better-quality, deeper sleep, and both males and females had significant increases in longevity.

The median life span of BRASTO mice in the study was extended by 16 percent for females and 9 percent for males. Translated to humans, this could mean an extra 13 or 14 years for women, making their average life span almost 100 years, Imai said. For men, this would add another seven years, increasing their average life span to the mid-80s.
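
Translating a percentage extension of median life span into extra years is simple proportional arithmetic. As a minimal sketch, assuming baseline life spans of roughly 86 years for women and 79 years for men (values chosen only to reproduce the figures quoted above, not taken from the study):

```python
def lifespan_extension(baseline_years, extension_pct):
    """Return (extra_years, new_lifespan) for a given percentage extension."""
    extra = baseline_years * extension_pct / 100.0
    return extra, baseline_years + extra

# Assumed baselines: ~86 years (women), ~79 years (men).
extra_f, total_f = lifespan_extension(86, 16)  # women: +16 percent
extra_m, total_m = lifespan_extension(79, 9)   # men:   +9 percent

print(f"Women: +{extra_f:.1f} years -> {total_f:.1f}")  # +13.8 -> 99.8, "almost 100"
print(f"Men:   +{extra_m:.1f} years -> {total_m:.1f}")  # +7.1  -> 86.1, "mid-80s"
```

Under these assumed baselines the arithmetic matches the article’s figures: about 13–14 extra years for women and about seven for men.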

Delay in cancer-dependent death also was observed in the BRASTO mice relative to control mice, the researchers noted.

Imai said that the longevity and health profile associated with the BRASTO mice appears to be the result of a shift in the onset of aging rather than the pace of aging. “What we have observed in BRASTO mice is a delay in the time when age-related decline begins, so while the rate of aging does not change, aging and the risk of cancer have been postponed.”

Having narrowed control of aging to the brain, Imai’s team then traced the control center of aging regulation to two areas of the hypothalamus called the dorsomedial and lateral hypothalamic nuclei. They then were able to identify specific genes within those areas that partner with Sirt1 to kick off the neural signals that elicit the physical and behavioral responses observed.

“We found that overexpression of Sirt1 in the brain leads to an increase in the cellular response of a receptor called orexin type 2 receptor in the two areas of the hypothalamus,” said first author Akiko Satoh, PhD, a postdoctoral staff scientist in Imai’s lab.

“We have demonstrated that the increased response by the receptor initiates signaling from the hypothalamus to skeletal muscles,” said Satoh. She noted that the mechanism by which the signal is specifically directed to skeletal muscle remains to be discovered.

According to Imai, the tight association discovered between Sirt1-prompted brain activation and the regulation of aging and longevity raises the tantalizing possibility of a “control center of aging and longevity” in the brain, which could be manipulated to maintain youthful physiology and extend life span in other mammals as well.

Sep 4, 2013 · 204 notes
#science #aging #calorie restriction #sirtuins #hypothalamus #Sirt1 #neuroscience
Administering Natural Substance Spermidine Stopped Dementia

Scientists from Freie Universität Berlin and the University of Graz Have Shown That Feeding Fruit Flies Spermidine Suppresses Age-dependent Memory Impairment

Age-induced memory impairment can be suppressed by administration of the natural substance spermidine. This was found in a recent study conducted by Prof. Dr. Stephan Sigrist of Freie Universität Berlin and the NeuroCure Cluster of Excellence and Prof. Dr. Frank Madeo of Karl-Franzens-Universität Graz. The two biologists were able to show that the endogenous substance spermidine triggers a cellular cleansing process, which is followed by an improvement in the memory performance of older fruit flies. At the molecular level, memory processes in animal organisms such as fruit flies and mice are similar to those in humans, so the work by Sigrist and Madeo has potential for the development of substances to treat age-related memory impairment. The study was first published in the online version of Nature Neuroscience.

Aggregated proteins are potential candidates for causing age-related dementia. With increasing age, these proteins accumulate in the brains of fruit flies, mice, and humans. In 2009, Madeo’s group in Graz had already found that spermidine has an anti-aging effect by setting off autophagy, a cleaning process at the cellular level: protein aggregates and other cellular waste are delivered to lysosomes, the digestive apparatus of the cell, and degraded.

Feeding the fruit flies spermidine significantly reduced the amount of protein aggregates in their brains, and their memory performance improved to juvenile levels. This could be measured because flies learn through classical Pavlovian conditioning and adjust their behavior accordingly.

In humans, memory capacity begins to decrease at around the age of 50, and this loss accelerates with increasing age. Because life expectancy is rising, age-related memory impairment is expected to increase drastically. The spermidine concentration decreases with age in flies as in humans. If the onset of age-related dementia could be delayed by giving individuals spermidine as a food supplement, it would be a great breakthrough for individuals and for society. Patient studies are the next step for Sigrist and Madeo.

Sep 2, 2013 · 74 notes
#spermidin #fruit flies #memory impairment #dementia #aging #neuroscience #science