Neuroscience

September 2013

Study finds cognitive enhancers do not improve cognition or function in people with mild cognitive impairment but may cause gastrointestinal issues

Cognitive enhancers—drugs taken to enhance concentration, memory, alertness and moods—do not improve cognition or function in people with mild cognitive impairment in the long term, according to a new study by researchers at St. Michael’s Hospital.

In fact, patients on these medications experienced significantly more nausea, diarrhea, vomiting and headaches, according to the study published today in the Canadian Medical Association Journal.

“Our findings do not support the use of cognitive enhancers for mild cognitive impairment,” wrote Dr. Andrea Tricco and Dr. Sharon Straus, who are both scientists in the hospital’s Li Ka Shing Knowledge Institute. Dr. Straus is also a geriatrician at the hospital.

Mild cognitive impairment is a condition characterized by memory complaints without significant limitations in everyday activity. Between 3 and 42 per cent of people are diagnosed with the condition each year, about 4.6 million people worldwide. Each year about 3 to 17 per cent of people with mild cognitive impairment will develop dementia, such as Alzheimer’s disease. Given the aging population, it’s estimated the number of Canadians with dementia will double to more than 1 million in the next 25 years.

It has been hypothesized that cognitive enhancers may delay the onset of dementia. Families and patients are increasingly requesting these drugs even though their efficacy for patients with mild cognitive impairment has not been established. In Canada, cognitive enhancers can be obtained only with special authorization.

Drs. Tricco and Straus conducted a review of existing evidence to understand the efficacy and safety of cognitive enhancers. They looked at eight randomized trials that compared one of four cognitive enhancers (donepezil, rivastigmine, galantamine or memantine) to a placebo among patients diagnosed with mild cognitive impairment.

While they found short-term benefits to using these drugs on one cognition scale, there were no long-term effects after about a year and a half. No other benefits were observed on the second cognition scale or on function, behaviour, and mortality. As well, patients on these medications experienced significantly more nausea, diarrhea, vomiting and headaches. One study also found a higher risk of a heart condition known as bradycardia (slow heartbeat) among patients who received galantamine.

“Our results do not support the use of cognitive enhancers for patients with mild cognitive impairment,” the authors wrote. “These agents were not associated with any benefit and led to an increase in harms. Patients and their families should consider this information when requesting these medications. Similarly, health care decision-makers may not wish to approve the use of these medications for mild cognitive impairment, because these drugs might not be effective and are likely associated with harm.”

This study was funded by the Drug Safety and Effectiveness Network/Canadian Institutes of Health Research.

Another St. Michael’s study published in the CMAJ in April found no evidence that drugs, herbal products or vitamin supplements help prevent cognitive decline in healthy older adults. That review, led by Dr. Raza Naqvi, a University of Toronto resident, found some evidence that mental exercises, such as computerized memory training programs, might help.

Sep 17, 2013 · 80 notes
#science #alzheimer's disease #dementia #memory loss #cognitive impairment #neuroscience
Girl who feels no pain could inspire new painkillers

A girl who does not feel physical pain has helped researchers identify a gene mutation that disrupts pain perception. The discovery may spur the development of new painkillers that will block pain signals in the same way.


People with congenital analgesia cannot feel physical pain and often injure themselves as a result – they might badly scald their skin, for example, through being unaware that they are touching something hot.

By comparing the gene sequence of a girl with the disorder against those of her parents, who do not have it, Ingo Kurth at Jena University Hospital in Germany and his colleagues identified a mutation in a gene called SCN11A.

This gene controls the development of channels on pain-sensing neurons. Sodium ions travel through these channels, creating electrical nerve impulses that are sent to the brain, which registers pain.

Blocked signals

Overactivity in the mutated version of SCN11A prevents the build-up of the charge that the neurons need to transmit an electrical impulse, numbing the body to pain. “The outcome is blocked transmission of pain signals,” says Kurth.

To confirm their findings, the team inserted a mutated version of SCN11A into mice and tested their ability to perceive pain. They found that 11 per cent of the mice with the modified gene developed injuries similar to those seen in people with congenital analgesia, such as bone fractures and skin wounds. They also tested a control group of mice with the normal SCN11A gene, none of which developed such injuries.

The altered mice also took 2.5 times longer on average than the control group to react to the “tail flick” pain test, which measures how long it takes for mice to flick their tails when exposed to a hot light beam. “What became clear from our experiments is that although there are similarities between mice and men with the mutation, the degree of pain insensitivity is more prominent in humans,” says Kurth.
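As a back-of-the-envelope illustration of that tail-flick comparison, the sketch below computes the group-average ratio. The individual latencies are invented numbers chosen only to reproduce the reported 2.5-fold difference; they are not data from the study.

```python
from statistics import mean

# Hypothetical tail-flick latencies in seconds (illustrative values only).
control_mice = [2.0, 2.2, 1.9, 1.9]   # normal SCN11A
mutant_mice = [5.1, 4.8, 5.3, 4.8]    # mutated SCN11A

# Ratio of average reaction times: how many times slower the mutants are.
ratio = mean(mutant_mice) / mean(control_mice)
print(f"mutant mice took {ratio:.1f}x longer to react")  # 2.5x
```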

The team has now begun the search for drugs that block the SCN11A channel. “It would require drugs that selectively block this but not other sodium channels, which is far from simple,” says Kurth.

Completely unexpected

"This is a cracking paper, and great science," says Geoffrey Woods of the University of Cambridge, whose team discovered in 2006 that mutations in another, closely related ion channel gene can cause insensitivity to pain. "It’s completely unexpected and not what people had been looking for," he says.

Woods says that there are three ion channels, called SCN9A, 10A and 11A, on pain-sensing neurons. People experience no pain when either of the first two doesn’t work, and agonising pain when they’re overactive. “With this new gene, it’s the opposite: when it’s overactive, they feel no pain. So maybe it’s some kind of gatekeeper that stops neurons from firing too often, but cancels pain signals completely when it’s overactive,” he says. “If you could get a drug that made SCN11A overactive, it should be a fantastic analgesic.”

"It’s fascinating that SCN11A appears to work the other way, and that could really advance our knowledge of the role of sodium channels in pain perception, which is a very hot topic,” says Jeffrey Mogil at McGill University in Canada, who was not involved in the new study.

Sep 16, 2013 · 272 notes
#pain #pain perception #gene mutation #congenital analgesia #ion channels #neuroscience #science
Genes for body symmetry may also control handedness

Lefties and righties can thank same DNA that puts hearts on left side for hand dominance

Left- or right-handedness may be determined by the genes that position people’s internal organs.


About 10 percent of people prefer using their left hand. That ratio is found in every population in the world and scientists have long suspected that genetics controls hand preference. But finding the genes has been no simple task, says Chris McManus, a neuropsychologist at University College London who studies handedness but was not involved in the new research.

“There’s no single gene for the direction of handedness. That’s clear,” McManus says. Dozens of genes are probably involved, he says, which means that one person’s left-handedness might be caused by a variant in one gene, while another lefty might carry variants in an entirely different gene.

To find handedness genes, William Brandler, a geneticist at the University of Oxford, and colleagues conducted a statistical sweep of DNA from 3,394 people. Statistical searches such as this are known as genome-wide association studies; scientists often do such studies to uncover genes that contribute to complex diseases or traits such as diabetes and height. The people in this study had taken tests involving moving pegs on a board. The difference in the amount of time they took with one hand versus the other reflected how strongly left- or right-handed they were.
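A relative-skill score like the one derived from the pegboard task can be sketched in a few lines. The function name, scoring convention, and sample times below are illustrative assumptions, not taken from the study:

```python
def laterality_index(left_time, right_time):
    """Relative hand-skill score from pegboard completion times (seconds).

    Positive values mean the right hand was faster (right-handed skill);
    negative values mean the left hand was faster. The magnitude reflects
    how strongly lateralized the person is.
    """
    return (left_time - right_time) / (left_time + right_time)

# A strong right-hander: the left hand is much slower than the right.
print(laterality_index(42.0, 30.0) > 0)   # True
# A mild left-hander: the left hand is slightly faster.
print(laterality_index(31.0, 33.0) < 0)   # True
```

Normalizing by the total time keeps the score comparable across fast and slow participants.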

A variant in a gene called PCSK6 was most tightly linked with strong hand preference, the researchers report in the Sept. 12 PLOS Genetics. The gene has been implicated in handedness before, including in a 2011 study by the same research group. PCSK6 is involved in the asymmetrical positioning of internal organs in organisms from snails to vertebrates.

Brandler, who happens to be a lefty, knew the gene wasn’t the only cause of hand preference, so he and his colleagues looked at other genetic variants that didn’t quite cross the threshold of statistical significance. Many of the genes the team uncovered had previously been shown in studies of mice to be necessary for correctly placing organs such as the heart and liver. Four of the genes, when disrupted in mice, can cause cilia-related diseases. Cilia are hairlike appendages on cells that act a bit like GPS units and direct many aspects of development in a wide range of species, including humans.

One of the cilia genes, GLI3, also helps build the corpus callosum, a bundle of nerves that connects the two hemispheres of the brain. Some studies have suggested that the structure is bigger in left-handers.

It’s still a mystery how these genes direct handedness, says Larissa Arning, a human geneticist at Ruhr University Bochum in Germany. In addition to genes that direct body plans, she says, the study suggests that many more yet-to-be-discovered genes probably play a role in handedness.

Brandler hopes the study will also help remove some of the stigma of being left-handed. Left-handedness isn’t a character flaw or a sign of being sinister, he says: “It’s an outcome of genetic variation.”

Sep 15, 2013 · 121 notes
#handedness #hand preference #genes #genetics #PCSK6 gene #psychology #neuroscience #science
Low Omega-3 could explain why some children struggle with reading

An Oxford University study has shown that a representative sample of UK schoolchildren aged seven to nine years had low levels of key Omega-3 fatty acids in their blood. Furthermore, the study found that children’s blood levels of the long-chain Omega-3 DHA (the form found in most abundance in the brain) ‘significantly predicted’ how well they were able to concentrate and learn. Oxford University researchers explained the findings, recently published in the journal PLOS ONE, at a conference in London on 4 September.


The study was presented at the conference by co-authors Dr Alex Richardson and Professor Paul Montgomery from Oxford University’s Centre for Evidence-Based Intervention in the Department of Social Policy and Intervention. It is one of the first to evaluate blood Omega-3 levels in UK schoolchildren. The long-chain Omega-3 fats (EPA and DHA), found in fish, seafood and some algae, are essential for the brain’s structure and function as well as for maintaining a healthy heart and immune system.

Parents also reported on their child’s diet, revealing that almost nine out of ten children in the sample ate fish less than twice a week, and nearly one in ten never ate fish at all. The government’s guidelines for a healthy diet recommend at least two portions of fish a week. This is because, like vitamins, Omega-3 fats have to come from our diets – and although humans can in theory make some EPA and DHA from shorter-chain Omega-3 (found in some vegetable oils), research has shown this conversion is unreliable, particularly for DHA, say the researchers.

Blood samples were taken from 493 schoolchildren, aged between seven and nine years, from 74 mainstream schools in Oxfordshire. All of the children were thought to have below-average reading skills, based on national assessments at the age of seven or their teachers’ current judgements. Analyses of their blood samples showed that, on average, just under two per cent of the children’s total blood fatty acids were Omega-3 DHA (Docosahexaenoic acid) and 0.5 per cent were Omega-3 EPA (Eicosapentaenoic acid), with a total of 2.45 per cent for these long-chain Omega-3 combined. This is below the minimum of 4 per cent recommended by leading scientists to maintain cardiovascular health in adults, with 8-12 per cent regarded as optimal for a healthy heart, the researchers reported.
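The arithmetic behind those figures is simple addition of each fatty acid's share of total blood fatty acids; a minimal sketch using the averages quoted above (the helper function and threshold constants are written here for illustration, not taken from the paper):

```python
def combined_omega3(dha_pct, epa_pct):
    """Combined long-chain Omega-3 share of total blood fatty acids (%)."""
    return dha_pct + epa_pct

CARDIO_MINIMUM_PCT = 4.0         # minimum suggested for adult heart health
OPTIMAL_RANGE_PCT = (8.0, 12.0)  # regarded as optimal for a healthy heart

# Sample averages reported above: just under 2% DHA plus 0.5% EPA.
total = combined_omega3(1.95, 0.5)
print(f"combined long-chain Omega-3: {total:.2f}%")  # 2.45%
print("below cardiovascular minimum:", total < CARDIO_MINIMUM_PCT)  # True
```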

Co-author Professor Paul Montgomery said: ‘From a sample of nearly 500 schoolchildren, we found that levels of Omega-3 fatty acids in the blood significantly predicted a child’s behaviour and ability to learn. Higher levels of Omega-3 in the blood, and DHA in particular, were associated with better reading and memory, as well as with fewer behaviour problems as rated by parents and teachers. These results are particularly noteworthy given that we had a restricted range of scores, especially with respect to blood DHA but also for reading ability, as around two-thirds of these children were still reading below their age-level when we assessed them. Although further research is needed, we think it is likely that these findings could be applied generally to schoolchildren throughout the UK.’

Co-author Dr Alex Richardson added: ‘The longer term health implications of such low blood Omega-3 levels in children obviously can’t be known. But this study suggests that many, if not most UK children, probably aren’t getting enough of the long-chain Omega-3 we all need for a healthy brain, heart and immune system. That gives serious cause for concern  because we found that lower blood DHA was linked with poorer behaviour and learning in these children.
‘Most of the children we studied had blood levels of long-chain Omega-3 that in adults would indicate a high risk of heart disease. This was consistent with their parents’ reports that most of them failed to meet current dietary guidelines for fish and seafood intake. Similarly, few took supplements or foods fortified with these Omega-3.’

The current findings build on earlier work by the same researchers, showing that dietary supplementation with Omega-3 DHA improved both reading progress and behaviour in children from the general school population who were behind on their reading. Their previous research has already shown benefits of supplementation with long-chain omega-3 (EPA+DHA) for children with ADHD, Dyspraxia, Dyslexia, and related conditions. The DHA Oxford Learning and Behaviour (DOLAB) Studies have now extended these findings to children from the general school population.

‘Technical advances in recent years have enabled the measurement of individual Omega-3 and other fatty acids from fingerstick blood samples. These new techniques have been revolutionary – because in the past, blood samples from a vein were needed for assessing fatty acids, and that has seriously restricted research into the blood Omega-3 status of healthy UK children until now,’ said Dr Richardson.

Sep 14, 2013 · 130 notes
#omega-3 #fatty acids #school children #reading #neuroscience #science
Researchers Pinpoint Molecular Path that Makes Antidepressants Act Quicker in Mouse Model

Understanding alternate pathways for how mental meds work could lead to faster-acting drug targets

The reason it often takes people several weeks to feel the effect of newly prescribed antidepressants remains something of a mystery – and, likely, a frustration to both patients and physicians.


(Image: Mouse hippocampus expressing the Cre- virus. Credit: Julie Blendy, PhD; Brigitta Gunderson, PhD; Perelman School of Medicine, University of Pennsylvania)

Julie Blendy, PhD, professor of Pharmacology at the Perelman School of Medicine, University of Pennsylvania; Brigitta Gunderson, PhD, a former postdoctoral fellow in the Blendy lab; and colleagues have been working to find out why – and whether anything can be done to shorten the time it takes antidepressants to kick in.

“Our goal is to find ways for antidepressants to work faster,” says Blendy.  

The proteins CREB and CREM are both transcription factors, which bind to specific DNA sequences to control the “reading” of genetic information from DNA to messenger RNA (mRNA). Both CREB and CREM bind to the same 8-base-pair DNA sequence in the cell nucleus. But, the comparative influence of CREM versus CREB on the action of antidepressants is a “big unknown,” says Blendy.

CREB, and CREM to some degree, has been implicated in the pathophysiology of depression, as well as in the efficacy of antidepressants. However, whenever CREB is deleted, CREM is upregulated, further complicating the story.

The team therefore compared how an antidepressant affects biochemistry and behavior in mice in which the CREB protein is deleted only in the hippocampus versus wild-type mice in which CREM is overexpressed, letting the researchers tease out the relative influence of CREB and CREM on the drug’s action. They saw the same results in each mouse line – increased nerve-cell generation in the hippocampus and a quicker response to the antidepressant. Their findings appear in the Journal of Neuroscience.

“This is the first demonstration of CREM within the brain playing a role in behavior, and specifically in behavioral outcomes, following antidepressant treatment,” says Blendy.

A Flood of Neurotransmitters

Antidepressants like SSRIs, NRIs, and older tricyclic drugs work by causing an immediate flood of neurotransmitters like serotonin, norepinephrine, and in some cases dopamine into the synaptic space. However, it can take three to four weeks for patients to feel changes in mental state. Long-term behavioral effects of the drugs may take longer to manifest because of the need to activate CREB downstream targets such as BDNF and trkB, or as-yet-unidentified targets, which could also be developed as new antidepressant drug targets.

The Penn team compared the behavior of control, wild-type mice with that of the CREB mutant mice using a test in which mice are trained to eat a treat – Reese’s Pieces, to be exact – in the comfort of their home cage. The treat-loving mice are then placed in a new cage to make them anxious. They are given the treat again, and the time it takes each mouse to approach the treat is recorded.

Animals that receive no drug treatment take a long time to venture out into the anxiety-inducing environment to retrieve the treat. If given an antidepressant drug for at least three weeks, however, the time it takes a mouse to get the treat decreases significantly, from about 400 seconds to 100 seconds. In mice in which CREB is deleted, or in which CREM is upregulated, this reduction happens in one to two days rather than the three weeks seen in wild-type mice.

The accelerated time to approach the treat in mice on the medication was accompanied by an increase in new nerve growth in the hippocampus.
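The size of that behavioral effect can be expressed as a fractional drop in approach latency; a toy calculation using the averages quoted above (the helper function is hypothetical, not from the study):

```python
def latency_reduction(before_s, after_s):
    """Fractional drop in approach latency after drug treatment."""
    return (before_s - after_s) / before_s

# Wild-type mice on an antidepressant: roughly 400 s down to 100 s,
# but only after about three weeks of treatment.
print(f"reduction: {latency_reduction(400, 100):.0%}")  # reduction: 75%
```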

“Our results suggest that activation of CREM may provide a means to accelerate the therapeutic efficacy of current antidepressant treatment,” says Blendy. The upregulation of CREM observed after CREB deletion appears to compensate functionally for the loss of CREB at the behavioral level, and leads to maintained or increased expression of some CREB target genes. The researchers’ next step is to identify any unique CREM target genes in brain areas such as the hippocampus, which may lead to the development of faster-acting antidepressants.

Sep 14, 2013 · 76 notes
#antidepressants #hippocampus #nerve cells #dopamine #norepinephrine #serotonin #SSRIs #neuroscience #science
Researchers Find Surprising Role of Critical Brain Protein

If the violins were taken away from the musicians performing Beethoven’s 9th symphony, the resulting composition would sound very different. If the violins were left on stage but the violinists were removed, the same mutant version of the symphony would be heard.

But what if it ended up sounding like “Hey Jude” instead?

This sort of surprise is what scientists from the Virginia Tech Carilion Research Institute had during what they assumed to be a routine experiment in neurodevelopment. Previous studies had shown that the glycoprotein Reelin is crucial to developing healthy neural networks. Logically, taking away the two receptors that Reelin is known to act on early in the brain’s development should create the same malformations as taking away Reelin itself.

It didn’t.

“We conducted the experiment thinking we’d see the same defects for both cases – Reelin deficiency and its receptors’ deletion – but we didn’t,” said Michael Fox, an associate professor at the research institute and the lead author of the study. “If you take away the receptors instead of the targeting molecule, you get an entirely separate set of abnormalities. The results raise the question of the identity of other molecules with which Reelin and the two receptors are interacting.”

The study, first published online in June in Neural Development, could prove useful for the development of therapies and diagnostics to combat brain disease.

In the early stages of neural development, neurons grow from the retina to a small portion of the brain called the thalamus. All sensory information coming into the brain gets routed through this region, before being transmitted to the cerebral cortex for further processing. Because these retinal neurons carry specific types of information, they must connect to specific places in the thalamus, which Reelin helps them find.

In the experiment, the scientists bred mice lacking both Reelin receptors known to be critical for neurons to navigate their targets during development. The scientists expected the neurons in the mutants to become lost and unable to find their targets, which is what happens in Reelin-deficient mice. Instead, the neurons were able to locate their targets, but those targets had wandered off.

While these results were surprising, they were not the experiment’s most interesting finding. Although most neurons look the same to anyone without advanced training in neuroscience, many different types are intermixed in distinct regions with strict borders. How these borders form, however, is still an open question.

“Many of us have questioned how you can have such a crisp boundary between two regions of the brain,” said Jianmin Su, a research assistant professor at the research institute and first author of the study. “I always thought it was a large number of cells creating some kind of cue or environment, but that isn’t what this experiment indicates.”

In the mice without the Reelin receptors, neurons from one part of the thalamus migrated to an area where they weren’t supposed to be. Even though only a handful of neurons were misplaced, they did not mingle with their new neighbors. They stayed separate.

“The result is a baffling curiosity that nobody in the lab expected – just how distinct these little regions can be,” Fox said. “How do just a few cells create such a barrier? How many cells does it take? Maybe these little islands can teach us something about how you create boundaries between larger regions of functionally similar cells.”

This experiment isn’t the only example Fox has had recently of neurons invading regions in which they weren’t supposed to be. In a second experiment, researchers examined how neurons from the cortex connect to the thalamus during the initial stages of development.

And neurons seem to be polite.

The results showed that neurons from the cortex grow to the edge of the part of the thalamus dedicated to visual signals, called the dorsal lateral geniculate nucleus, but then stop. In fact, they stay on standby for nearly two weeks before making their way into the region. It seems as though they’re waiting for the retinal neurons to make their connections before beginning to make their own. If researchers surgically removed the eyes or genetically removed the retinal cells connecting the eyes to the thalamus, neurons from the cortex invaded more than a week earlier than they were supposed to.

“It turns out that the cortical neurons are waiting for the retinal axons to mature and find the most appropriate spots to connect before they’re allowed to come in,” said Fox. “There’s some form of instructional role that retinal axons play in the timing of the cortical axons entering.”

Sep 13, 2013 · 74 notes
#brain development #reelin #retinal neurons #intergeniculate nucleus #cortical neurons #neuroscience #science
Alzheimer's patients show striking individual differences in molecular basis of disease

Alzheimer’s disease is thought to be caused by the buildup of abnormal, thread-like protein deposits in the brain, but little is known about the molecular structures of these so-called beta-amyloid fibrils. A study published by Cell Press September 12th in the journal Cell has revealed that distinct molecular structures of beta-amyloid fibrils may predominate in the brains of Alzheimer’s patients with different clinical histories and degrees of brain damage. The findings pave the way for new patient-specific strategies to improve diagnosis and treatment of this common and debilitating disease.


"This work represents the first detailed characterization of the molecular structures of beta-amyloid fibrils that develop in the brains of patients with Alzheimer’s disease," says senior study author Robert Tycko of the National Institutes of Health. "This detailed structural model may be used to guide the development of chemical compounds that bind to these fibrils with high specificity for purposes of diagnostic imaging, as well as compounds that inhibit fibril formation for purposes of prevention or therapy."

Tycko and his team had previously noticed that beta-amyloid fibrils grown in a dish have different molecular structures, depending on the specific growth conditions. Based on this observation, they suspected that fibrils found in the brains of patients with Alzheimer’s disease are also variable and that these structural variations might relate to each patient’s clinical history. But it has not been possible to directly study the structures of fibrils found in patients because of their low abundance in the brain.

To overcome this hurdle, Tycko and his collaborators developed a new experimental protocol. They extracted beta-amyloid fibril fragments from the brain tissue of two patients with different clinical histories and degrees of brain damage and then used these fragments to grow a large quantity of fibrils in a dish. They found that a single fibril structure prevailed in the brain tissue of each patient, but the molecular structures were different between the two patients.

"This may mean that fibrils in a given patient appear first at a single site in the brain, then spread to other locations while retaining the identical molecular structure," Tycko says. "Our study also shows that certain fibril structures may be more likely than others to cause Alzheimer’s disease, highlighting the importance of developing imaging agents that target specific fibril structures to improve the reliability and specificity of diagnosis."

Sep 13, 2013 · 57 notes
#alzheimer's disease #beta-amyloid fibrils #brain damage #brain tissue #neuroscience #science
Scientists Pinpoint Proteins Vital to Long-Term Memory

Scientists from the Florida campus of The Scripps Research Institute (TSRI) have found a group of proteins essential to the formation of long-term memories.

The study, published online ahead of print on September 12, 2013 by the journal Cell Reports, focuses on a family of proteins called Wnts. These proteins send signals from the outside to the inside of a cell, inducing a cellular response crucial for many aspects of embryonic development, including stem cell differentiation, as well as for normal functioning of the adult brain.

“By removing the function of three proteins in the Wnt signaling pathway, we produced a deficit in long-term but not short-term memory,” said Ron Davis, chair of the TSRI Department of Neuroscience. “The pathway is clearly part of the conversion of short-term memory to the long-term stable form, which occurs through changes in gene expression.”

The findings stem from experiments probing the role of Wnt signaling components in olfactory memory formation in Drosophila, the common fruit fly—a widely used doppelgänger for human memory studies. In the new study, the scientists inactivated the expression of several Wnt signaling proteins in the mushroom bodies of adult flies—part of the fly brain that plays a role in learning and memory.

The resulting memory disruption, Davis said, suggests that Wnt signaling participates actively in the formation of long-term memory, rather than having some general, non-specific effect on behavior.

“What is interesting is that the molecular mechanisms of adult memory use the same processes that guide the early development of the organism, except that they are repurposed for memory formation,” he said. “One difference, however, is that during early development the signals are intrinsic, while in adults they require an outside stimulus to create a memory.”

Sep 13, 2013 · 82 notes
#memory formation #long-term memory #learning #Wnts proteins #neuroscience #science
Dreaming is still possible even when the mind is blank

Isabelle Arnulf and colleagues from the Sleep Disorders Unit at the Université Pierre et Marie Curie (UPMC) have outlined case studies of patients with Auto-Activation Deficit who reported dreams when awakened from REM sleep – even when they demonstrated a mental blank during the daytime. The paper shows that even patients with Auto-Activation Deficit retain the ability to dream, and that a “bottom-up” process generates the dream state.


In a new paper in the neurology journal Brain, Arnulf et al. compare the dream states of patients with Auto-Activation Deficit (AAD) with those of healthy control subjects. AAD is caused by bilateral damage to the basal ganglia; it is a neuropsychological syndrome characterized by striking apathy, a lack of spontaneous activation of thought, and a loss of self-driven behaviour. AAD patients must be prompted by their caregivers to take part in everyday tasks such as standing up, eating, or drinking. If you were to ask an AAD patient, “What are you thinking?”, they would report that they have no thoughts.

During sleep, the brain is operating on an exclusively internal basis. In REM sleep, the higher cortex areas are internally stimulated by the brainstem. When awakened, most normal subjects will remember some dreams that were associated with their previous sleep state, especially in REM sleep. Would the self-stimulation of the cortex by the brainstem be sufficient to stimulate spontaneous dreams during sleep in AAD patients?

Discovering the answer to this question would go some way to proving either the top-down or bottom-up theories of dreaming. The top-down theory stipulates that dreaming begins in higher cortex memory structures and then proceeds backwards as imagination develops during wakefulness. The bottom-up theory posits that the brainstem structures which elicit rapid eye movements and cortex activation during REM sleep result in the emotional, visual, sensory, and auditory elements of dreaming.

Thirteen patients with AAD agreed to participate in the study and to record their dreams in dream diaries during the week leading up to the evaluation. These patients were compared with thirteen non-medicated, healthy control subjects. Video and sleep monitoring were performed on all twenty-six participants for two consecutive nights. The first night evaluated each participant’s sleep duration, structure, and architecture. During the second night, the researchers woke the subjects as they entered the second non-REM sleep cycle, and again after 10 minutes of established REM sleep during the following sleep cycle, and asked them what they had been dreaming about before being woken. The dream reports were then independently analysed and scored on three dimensions: complexity, bizarreness, and elaboration.
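As an illustration only, the group comparison implied by that scoring step can be sketched in a few lines of Python. The three dimensions come from the study description, but the 0–3 rating scale, the example values, and the function name are all invented here, not taken from the paper.

```python
# Hypothetical sketch: averaging dream-report ratings per group.
# The dimensions (complexity, bizarreness, elaboration) come from the study
# description; the 0-3 scale and all data values below are invented.

from statistics import mean

def score_summary(reports):
    """Average each scoring dimension across a group's dream reports."""
    dims = ("complexity", "bizarreness", "elaboration")
    return {d: mean(r[d] for r in reports) for d in dims}

# Invented example data: simple AAD dreams vs. richer control dreams.
aad_reports = [
    {"complexity": 1, "bizarreness": 0, "elaboration": 0},
    {"complexity": 0, "bizarreness": 0, "elaboration": 1},
]
control_reports = [
    {"complexity": 3, "bizarreness": 2, "elaboration": 3},
    {"complexity": 2, "bizarreness": 3, "elaboration": 2},
]

print("AAD:", score_summary(aad_reports))
print("controls:", score_summary(control_reports))
```

With group averages in hand, the contrast the authors report (simple AAD dreams versus elaborate control dreams) becomes a straightforward comparison of the two summaries.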

Four of the thirteen patients with AAD reported dreaming when awakened from REM sleep, even though they demonstrated a mental blank during the daytime, compared with 12 of the 13 control subjects. However, the four AAD patients’ dreams were devoid of any complex, bizarre, or emotional elements. The presence of simple yet spontaneous dreams in REM sleep, despite the absence of thoughts during wakefulness in AAD patients, supports the notion that simple dream imagery is generated by brainstem stimulation and sent to the sensory cortex. The lack of complexity in the four AAD patients’ dreams, as opposed to the complexity of the control subjects’ dreams, suggests that the full dreaming process requires these sensations to be interpreted by a higher-order cortical area.

This study therefore provides the first direct evidence that a bottom-up process drives the dream state.

Yet, despite the simplicity of the dreams, Isabelle Arnulf commented that the banal tasks that the AAD patients dreamt about were fascinating. For instance, Patient 10 dreamt of shaving – an activity he never initiated during the daytime without motivation from his caregivers, and an activity he could not do by himself due to severe hand dystonia. Similarly, Patient 5 dreamt about writing even though he would never write in the daytime without being invited by his caregivers to do so.

Interestingly, there were no real differences in the sleep measures between the AAD patients and the controls, except that six of the thirteen AAD patients (46%) showed a complete absence of sleep spindles (bursts of oscillatory brain activity, visible on an EEG, that occur during stage 2 sleep). The striking absence of sleep spindles in these six AAD patients, whose lesions were localized in the basal ganglia, highlights the role of the pallidum and striatum in spindling activity during non-REM sleep. This is a key distinction between the two groups: all thirteen control subjects displayed sleep spindles.

Sep 13, 2013 · 102 notes
#auto-activation deficit #brain activity #REM sleep #sensory cortex #basal ganglia #neuroscience #science
Sep 13, 2013 · 113 notes
#science #ASD #autism #neurons #neuronal growth #Christianson syndrome #neurotrophic factor #neuroscience
Sep 12, 2013 · 2,281 notes
#science #zebrafish #brainbow #fluorescence microscopy #neurons #neuroscience #Olympus BioScapes 2008
'Love hormone' may play wider role in social interaction than previously thought

Researchers at the Stanford University School of Medicine have shown that oxytocin — often referred to as “the love hormone” because of its importance in the formation and maintenance of strong mother-child and sexual attachments — is involved in a broader range of social interactions than previously understood.

The discovery may have implications for neurological disorders such as autism, as well as for scientific conceptions of our evolutionary heritage.

Scientists estimate that the advent of social living preceded the emergence of pair living by 35 million years. The new study suggests that oxytocin’s role in one-on-one bonding probably evolved from an existing, broader affinity for group living.

Oxytocin is the focus of intense scrutiny for its apparent roles in establishing trust between people, and has been administered to children with autism spectrum disorders in clinical trials. The new study, published Sept. 12 in Nature, pinpoints a unique way in which oxytocin alters activity in a part of the brain that is crucial to experiencing the pleasant sensation neuroscientists call “reward.” The findings not only provide validity for ongoing trials of oxytocin in autistic patients, but also suggest possible new treatments for neuropsychiatric conditions in which social activity is impaired.

"People with autism-spectrum disorders may not experience the normal reward the rest of us all get from being with our friends," said Robert Malenka, MD, PhD, the study’s senior author. "For them, social interactions can be downright painful. So we asked, what in the brain makes you enjoy hanging out with your buddies?"

Some genetic evidence suggests the awkward social interaction that is a hallmark of autism-spectrum disorders may be at least in part oxytocin-related. Certain variations in the gene that encodes the oxytocin receptor — a cell-surface protein that senses the substance’s presence — are associated with increased autism risk.

Malenka, the Nancy Friend Pritzker Professor in Psychiatry and Behavioral Sciences, has spent the better part of two decades studying the reward system — a network of interconnected brain regions responsible for our sensation of pleasure in response to a variety of activities such as finding or eating food when we’re hungry, sleeping when we’re tired, having sex or acquiring a mate, or, in a pathological twist, taking addictive drugs. The reward system has evolved to reinforce behaviors that promote our survival, he said.

For this study, Malenka and lead author Gül Dölen, MD, PhD, a postdoctoral scholar in his group with over 10 years of autism-research expertise, teamed up to untangle the complicated neurophysiological underpinnings of oxytocin’s role in social interactions. They focused on biochemical events taking place in a brain region called the nucleus accumbens, known for its centrality to the reward system.

In the 1970s, biologists learned that in prairie voles, which mate for life, the nucleus accumbens is replete with oxytocin receptors. Disrupting the binding of oxytocin to these receptors impaired prairie voles’ monogamous behavior. In many other species that are not monogamous by nature, such as mountain voles and common mice, the nucleus accumbens appeared to lack those receptors.

"From this observation sprang a dogma that pair bonding is a special type of social behavior tied to the presence of oxytocin receptors in the nucleus accumbens. But what’s driving the more common group behaviors that all mammals engage in — cooperation, altruism or just playing around — remained mysterious, since these oxytocin receptors were supposedly absent in the nucleus accumbens of most social animals," said Dölen.

The new discovery shows that mice do indeed have oxytocin receptors at a key location in the nucleus accumbens and, importantly, that blocking oxytocin’s activity there significantly diminishes these animals’ appetite for socializing. Dölen, Malenka and their Stanford colleagues also identified, for the first time, the nerve tract that secretes oxytocin in the region, and they pinpointed the effects of oxytocin release on other nerve tracts projecting to this area.

Mice can squeak, but they can’t talk, Malenka noted. “You can’t ask a mouse, ‘Hey, did hanging out with your buddies a while ago make you happier?’” So, to explore the social-interaction effects of oxytocin activity in the nucleus accumbens, the investigators used a standard measure called the conditioned place preference test.

"It’s very simple," Malenka said. "You like to hang out in places where you had fun, and avoid places where you didn’t. We give the mice a ‘house’ made of two rooms separated by a door they can walk through at any time. But first, we let them spend 24 hours in one room with their littermates, followed by 24 hours in the other room all by themselves. On the third day we put the two rooms together to make the house, give them complete freedom to go back and forth through the door and log the amount of time they spend in each room."

Mice normally prefer to spend time in the room that reminds them of the good times they enjoyed in the company of their buddies. But that preference vanished when oxytocin activity in their nucleus accumbens was blocked. Interestingly, only social activity appeared to be affected. There was no difference, for example, in the mice’s general propensity to move around. And when the researchers trained the mice to prefer one room over the other by giving them cocaine (which mice love) only when they went into one room, blocking oxytocin activity didn’t stop the mice from picking the cocaine den.
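The place-preference readout described above is, at its core, simple arithmetic. As a rough sketch (this is not the authors’ actual analysis, and the timings and the 0.5 no-preference convention are invented for illustration), the score can be computed as the fraction of test time spent in the socially conditioned room:

```python
# Hypothetical sketch of a conditioned place preference (CPP) readout.
# All numbers below are invented; the real study used its own measures.

def preference_index(time_social_room, time_alone_room):
    """Fraction of test time spent in the socially conditioned room.

    0.5 means no preference; values above 0.5 mean the mouse prefers the
    room it previously shared with littermates.
    """
    total = time_social_room + time_alone_room
    return time_social_room / total

# Invented example: a control mouse vs. one with oxytocin signaling in the
# nucleus accumbens blocked (times in seconds over a test session).
control_mouse = preference_index(720, 480)   # above 0.5: prefers social room
blocked_mouse = preference_index(610, 590)   # near 0.5: preference abolished

print(f"control: {control_mouse:.2f}, blocked: {blocked_mouse:.2f}")
```

In the study’s terms, the finding was that blocking oxytocin activity pushed this kind of preference measure back toward indifference for the social room, while leaving cocaine-conditioned preference intact.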

In an extensive series of sophisticated, highly technical experiments, Dölen, Malenka and their teammates located the oxytocin receptors in the murine nucleus accumbens. These receptors lie not on nucleus accumbens nerve cells that carry signals forward to numerous other reward-system nodes but, instead, at the tips of nerve cells forming a tract from a brain region called the dorsal raphe, which projects to the nucleus accumbens. The dorsal raphe secretes another important substance, serotonin, triggering changes in nucleus accumbens activity. In fact, popular antidepressants such as Prozac, Paxil and Zoloft belong to a class of drugs called serotonin-reuptake inhibitors that increase available amounts of serotonin in brain regions, including the nucleus accumbens.

As the Stanford team found, oxytocin acting at the nucleus accumbens wasn’t simply squirted into general circulation, as hormones typically are, but was secreted at this spot by another nerve tract originating in the hypothalamus, a multifunctional structure at the base of the forebrain. Oxytocin released by this tract binds to receptors on the dorsal raphe projections to the nucleus accumbens, in turn liberating serotonin in this key node of the brain’s reward circuitry. The serotonin causes changes in the activity of yet other nerve tracts terminating at the nucleus accumbens, ultimately resulting in altered nucleus accumbens activity — and a happy feeling.

"There are at least 14 different subtypes of serotonin receptor," said Dölen. "We’ve identified one in particular as being important for social reward. Drugs that selectively act on this receptor aren’t clinically available yet, but our study may encourage researchers to start looking at drugs that target it for the treatment of diseases such as autism, where social interactions are impaired."

Malenka and Dölen said they think their findings in mice are highly likely to generalize to humans because the brain’s reward circuitry has been so carefully conserved over the course of hundreds of millions of years of evolution. This extensive cross-species similarity probably stems from pleasure’s absolutely essential role in reinforcing behavior likely to boost an individual’s chance of survival and procreation.

Sep 12, 2013 · 197 notes
#oxytocin #love hormone #ASD #autism #reward system #nucleus accumbens #neuroscience #science
Sep 12, 2013 · 124 notes
#taste receptors #bitter taste #mRNA #papillae #perception #genetics #neuroscience #science
How schizophrenia affects the brain

UI study documents the illness’s effect on brain tissue

It’s hard to fully understand a mental disease like schizophrenia without peering into the human brain. Now, a study by University of Iowa psychiatry professor Nancy Andreasen uses brain scans to document how schizophrenia impacts brain tissue as well as the effects of anti-psychotic drugs on those who have relapses.

Andreasen’s study, published in the American Journal of Psychiatry, documented brain changes seen in MRI scans from more than 200 patients beginning with their first episode and continuing with scans at regular intervals for up to 15 years. The study is considered the largest longitudinal, brain-scan data set ever compiled, Andreasen says.

Schizophrenia affects roughly 3.5 million people, or about one percent of the U.S. population, according to the National Institutes of Health. Globally, some 24 million are affected, according to the World Health Organization.

The scans showed that people at their first episode had less brain tissue than healthy individuals. The findings suggest that those who have schizophrenia are being affected by something before they show outward signs of the disease.

“There are several studies, mine included, that show people with schizophrenia have smaller-than-average cranial size,” explains Andreasen, whose appointment is in the Carver College of Medicine. “Since cranial development is completed within the first few years of life, there may be some aspect of earliest development—perhaps things such as pregnancy complications or exposure to viruses—that on average, affected people with schizophrenia.”

Andreasen’s team learned from the brain scans that those affected with schizophrenia suffered the most brain tissue loss in the two years after the first episode, but then the damage curiously plateaued—to the group’s surprise. The finding may help doctors identify the most effective time periods to prevent tissue loss and other negative effects of the illness, Andreasen says.

The researchers also analyzed the effect of medication on the brain tissue. Although results were not the same for every patient, the group found that in general, the higher the anti-psychotic medication doses, the greater the loss of brain tissue.

“This was a very upsetting finding,” Andreasen says. “We spent a couple of years analyzing the data more or less hoping we had made a mistake. But in the end, it was a solid finding that wasn’t going to go away, so we decided to go ahead and publish it. The impact is painful because psychiatrists, patients, and family members don’t know how to interpret this finding. ‘Should we stop using antipsychotic medication? Should we be using less?’”

The group also examined how relapses could affect brain tissue, including whether long periods of psychosis could be toxic to the brain. The results suggest that longer relapses were associated with brain tissue loss.

The insight could change how physicians use anti-psychotic drugs to treat schizophrenia, with the view that those with the disorder can lead productive lives with the right balance of care.

“We used to have hundreds of thousands of people chronically hospitalized. Now, most are living in the community, and this is thanks to the medications we have,” Andreasen notes. “But antipsychotic treatment has a negative impact on the brain, so … we must get the word out that they should be used with great care, because even though they have fewer side effects than some of the other medications we use, they are certainly not trouble free and can have lifelong consequences for the health and happiness of the people and families we serve.”

Sep 12, 2013 · 201 notes
#schizophrenia #neuroimaging #brain mapping #psychology #neuroscience #science
Sep 12, 2013 · 101 notes
#alzheimer's disease #RNA splicing #tangles #plaques #beta amyloid #neuroscience #science
Faulty stem cell regulation may contribute to cognitive deficits associated with Down syndrome

The learning and physical disabilities that affect people with Down syndrome may be due at least in part to defective stem cell regulation throughout the body, according to researchers at the Stanford University School of Medicine. The defects in stem cell growth and self-renewal observed by the researchers can be alleviated by reducing the expression of just one gene on chromosome 21, they found.

The finding marks the first time Down syndrome has been linked to stem cells, and addresses some long-standing mysteries about the disorder. Although the gene, called Usp16, is unlikely to be the only contributor to the disease, the finding raises the possibility of an eventual therapy based on reducing its expression.

“There appear to be defects in the stem cells in all the tissues that we tested, including the brain,” said Michael Clarke, MD, Stanford’s Karel H. and Avice N. Beekhuis Professor in Cancer Biology. The researchers conducted their studies in both mouse and human cells. “We believe Usp16 overexpression is a major contributor to the neurological deficits seen in Down syndrome.”

Clarke is the senior author of the research, published Sept. 11 in Nature. Postdoctoral scholar Maddalena Adorno, PhD, is the lead author.

“Conceptually, this study suggests that drug-based strategies to slow the rate of stem cell use could have profound effects on cognitive function, aging and risk for Alzheimer’s disease in people with Down syndrome,” said co-author Craig Garner, PhD, who is the co-director of Stanford’s Center for Research and Treatment of Down Syndrome and a professor of psychiatry and behavioral sciences.

Down syndrome, which is caused by an extra copy of chromosome 21, affects about 400,000 people in the United States and 6 million worldwide. It causes both physical and cognitive problems. While many of the physical issues, such as vulnerability to heart problems, can now be treated, no treatments exist for poor cognitive function.

The new study’s findings suggest answers to many long-standing mysteries about the condition, including why people with Down syndrome appear to age faster and exhibit early Alzheimer’s disease.

“This study is the first to provide a possible explanation for these tendencies,” said Garner. The fact that people with Down syndrome have three copies of chromosome 21 and the Usp16 gene “accelerates the rate at which stem cells are used during early development, which likely exhausts stem cell pools and impairs tissue regeneration in adults with Down syndrome. As a result, their brains age faster and are susceptible to early onset neurodegenerative disorders.”

The researchers didn’t confine their studies to laboratory mice. They also investigated the effect of Usp16 overexpression in human cells. Adorno and colleagues in the laboratory of co-author Samuel Cheshier, MD, assistant professor of neurosurgery, found that the presence of excess Usp16 caused skin cells from unaffected people to grow more slowly. Furthermore, neural progenitor cells (those self-renewing cellular factories responsible for the development and maintenance of many of the cell types in the brain) were less able to form balls of cells called neurospheres — a laboratory test that reflects the number and robustness of nerve stem cells in a culture. Conversely, reducing Usp16 expression in skin and nerve-progenitor cells from people with Down syndrome allowed the cells, which usually proliferate slowly, to assume normal growth patterns.

“This gene is clearly regulating processes that are central to aging in mice and humans,” said Clarke, “and stem cells are severely compromised. Reducing Usp16 expression gives an unambiguous rescue at the stem cell level. The fact that it’s also involved in this human disorder highlights how critical stem cells are to our well-being.”

Adorno and Clarke didn’t set out to study Down syndrome. Clarke’s past research has focused on how normal stem cells and cancer stem cells regenerate themselves, and Adorno was searching for genes that could inhibit a specific molecular pathway involved in the self-renewal of these cells. Understanding how normal stem cells regenerate themselves could help to repair tissue and organ damage from disease, and understanding how cancer stem cells maintain themselves could help explain why they are unusually resistant to chemotherapy or radiation therapy — often resulting in a patient’s relapse after seemingly successful treatment. Usp16 seemed to fit the bill; it plays a critical role in a self-renewal pathway previously identified by Clarke and his colleagues.

But Adorno and Clarke soon realized that Usp16 had another interesting property: in humans, it is found on chromosome 21.

They turned to Garner and Cheshier to help them evaluate a possible link to Down syndrome. Garner supplied two strains of mice commonly used to study the condition. One, Ts65Dn, has three copies of 132 genes found on human chromosome 21 — including Usp16. The second, Ts1Cje, has three copies of 79 genes from the chromosome, but only two copies of Usp16. Although both mice display some symptoms of the disorder, Ts65Dn more closely mimics the craniofacial structure and learning and memory disabilities seen in affected humans.

Colleagues in the Cheshier laboratory found that neural stem cells from the more-severely affected Ts65Dn mice were less able to self-renew and grow normally than were cells from the Ts1Cje mice. Reducing the expression of Usp16 in the cells from the Ts65Dn mice to more normal levels largely corrected these functional defects.

“We demonstrated that central nervous system stem cells in Down syndrome mice were defective in their ability to self-renew — the process by which stem cells regenerate themselves upon cell division. Blocking Usp16 expression in these cells restored this ability,” said Cheshier. “We hope in the future that correcting this Usp16 defect can lead to therapeutics that will ameliorate the central nervous system defects seen in patients with Down syndrome.”

Finally, the researchers created a new, Ts65Dn-derived mouse strain in which one of the three copies of Usp16 was mutated. This normalized the expression level of that gene without affecting the overexpression of the other 131 triplicated genes in these mice. Nerve progenitor cells from these mice were as able as normal cells to form neurospheres. The researchers are now continuing their studies of these mice.

“We are really interested in learning how other genes in this chromosomal region may be affecting stem cell renewal,” said Clarke. “We also want to understand how much we’re able to rescue the neurological defect by normalizing the expression of Usp16 in this mouse model. How does this compare to what is happening in humans? We’re sure it plays some significant role.”

Sep 12, 2013 · 52 notes
#alzheimer's disease #down syndrome #chromosome 21 #stem cells #Usp16 gene #cognitive function #neuroscience #science
The eyes have it: Scientists reveal how organic mercury can interfere with vision

More than one billion people worldwide rely on fish as an important source of animal protein, states the United Nations Food and Agriculture Organization. And while fish provide slightly over 7 per cent of animal protein in North America, in Asia they represent about 23 per cent of consumption.

Humans consume low levels of methylmercury by eating fish and seafood. Methylmercury compounds specifically target the central nervous system, and among the many effects of their exposure are visual disturbances, which were previously thought to be solely due to methylmercury-induced damage to the brain visual cortex. However, after combining powerful synchrotron X-rays and methylmercury-poisoned zebrafish larvae, scientists have found that methylmercury may also directly affect vision by accumulating in the retinal photoreceptors, i.e. the cells that respond to light in our eyes.

(Image: A cross section of a zebrafish eye shows the localization of mercury in the outer segments of photoreceptor cells.)

Dr. Gosia Korbas, BioXAS staff scientist at the Canadian Light Source (CLS), says the results of this experiment show quite clearly that methylmercury localizes in the part of the photoreceptor cell called the outer segment, where the visual pigments that absorb light reside.

“There are many reports of people affected by methylmercury claiming a constricted field of vision or abnormal colour vision,” said Korbas. “Now we know that one of the reasons for their symptoms may be that methylmercury directly targets photoreceptors in the retina.”

Korbas and the team of researchers from the University of Saskatchewan, including Profs. Graham George, Patrick Krone and Ingrid Pickering, conducted their experiments using three X-ray fluorescence imaging beamlines (2-ID-D, 2-ID-E and 20-ID-B) at the Advanced Photon Source, Argonne National Laboratory near Chicago, US, as well as the scanning transmission X-ray microscopy (STXM) beamline at the Canadian Light Source in Saskatoon, Canada.

After exposing zebrafish larvae to methylmercury chloride in water, the team was able to obtain high-resolution maps of elemental distributions, and pinpoint the localization of mercury in the outer segments of photoreceptor cells in both the retina and pineal gland of zebrafish specimens. The results of the research were published in ACS Chemical Biology under the title “Methylmercury Targets Photoreceptor Outer Segments”.

Korbas said zebrafish are an excellent model for investigating the mechanisms of heavy metal toxicity in developing vertebrates. One reason is their high degree of genetic similarity to mammals: recent studies have demonstrated that about 70 per cent of protein-coding human genes have counterparts in zebrafish, and 84 per cent of genes linked to human diseases can be found in zebrafish.

“Researchers are studying the potential effects of low level chronic exposure to methylmercury, which is of global concern due to methylmercury presence in fish, but the message that I want to get across is that such exposures may negatively affect vision. Our study clearly shows that we need more research into the direct effects of methylmercury on the eye,” Korbas concluded. 

Sep 12, 2013 · 70 notes
#science #methylmercury #vision #zebrafish #photoreceptor cells #retina #neuroscience
Fat Marker Predicts Cognitive Decline in People With HIV

Similarities found between HIV-associated brain damage and impairment from genetic fat-storage disease

Johns Hopkins scientists have found that levels of certain fats found in cerebrospinal fluid can predict which patients with HIV are more likely to become intellectually impaired.

The researchers believe that these fat markers reflect disease-associated changes in how the brain metabolizes these fat molecules. These changes disrupt the brain cells’ ability to regulate the activity of cells’ “garbage disposals” meant to degrade and flush the brain of molecular debris. In this case, too much cholesterol and a fat known as sphingomyelin build up in the lysosomes — the garbage disposals — backing up waste and leading to often debilitating cognitive declines. 

As many as half of patients infected with HIV will develop some form of cognitive impairment, ranging from mild (trouble counting change or driving a car) to frank dementia (an inability to manage the activities of everyday life), but no tests have been available to predict which people are more likely to suffer cognitive losses.

 “Every researcher of neurodegenerative disease is chasing biomarkers for the same reason: It’s better to identify problems before they strike,” says Norman J. Haughey, Ph.D., an associate professor in the departments of neurology and psychiatry and behavioral sciences at the Johns Hopkins University School of Medicine. He led the current study described online in the journal Neurology.

“It’s very hard to reverse brain damage after it starts,” he says. “Instead, we want to figure out who is likely to lose cognitive function and stop the damage before it happens.”

Haughey and his team analyzed 321 cerebrospinal fluid samples collected from seven test sites across the continental United States, Hawaii and Puerto Rico. The samples came from 291 HIV-positive participants and 30 HIV-negative subjects. The investigators found that early accumulations of a small number of these fat molecules could predict the probability of cognitive decline. As cognitive function declined in these patients, the number of different types of accumulating fat molecules increased. The types of fat molecules that accumulated in HIV were very similar to those that accumulate in inherited forms of a class of diseases called lysosomal storage disorders. This suggests that in some HIV-infected patients the brain retains more of these fats, which may disrupt the function of lysosomes.
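To make the idea of a predictive lipid biomarker concrete, here is a toy logistic-model sketch. The predictor names are borrowed from lipids discussed in this article, but the model form, coefficients, threshold and values are entirely invented; the study reports an association, not this model or these numbers.

```python
# Toy illustration of a risk score from CSF lipid levels.
# Everything numeric here (coefficients, intercept, inputs) is invented.

import math

def decline_risk(sphingomyelin, cholesterol, coef=(0.8, 0.5), intercept=-3.0):
    """Toy logistic model: higher CSF lipid accumulation -> higher risk."""
    z = intercept + coef[0] * sphingomyelin + coef[1] * cholesterol
    return 1.0 / (1.0 + math.exp(-z))

# Invented, standardized lipid accumulation scores for two patients.
low_accumulation = decline_risk(0.2, 0.1)
high_accumulation = decline_risk(2.5, 2.0)

print(f"low accumulation: {low_accumulation:.2f}, "
      f"high accumulation: {high_accumulation:.2f}")
```

The point of the sketch is only the shape of the claim: a small set of lipid measurements, taken early, maps monotonically onto an estimated probability of later decline.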

Haughey says he believes that some of the impairments in fat metabolism found in people with HIV stem from the infection itself, while others may be linked to the lifesaving antiretroviral therapy taken by most people with HIV. The medications have been associated with elevated blood cholesterol and triglycerides, along with a host of other side effects. People with HIV are now taking these drugs for decades, and the complications of long-term use have not been well studied, he says.

The similarities between the metabolic disturbances in HIV-infected individuals and those apparent in lysosomal storage disorders are enabling Haughey and his team to collaborate with researchers who study genetic lysosomal storage diseases and who are developing experimental treatments to clear the fat buildup. They are currently exploring dietary and pharmacological interventions that could restore brain metabolism in HIV-infected individuals and, in doing so, promote brain health by ensuring that the lysosomes function properly.

Sep 12, 2013 · 41 notes
#cognitive decline #HIV #lysosomes #lysosomal storage disorders #sphingomyelin #neurology #neuroscience #science
New System Uses Nanodiamonds to Deliver Chemotherapy Directly to Brain Tumors

Researchers at UCLA’s Jonsson Comprehensive Cancer Center have developed a new drug delivery system using nanodiamonds (NDs) that allows for direct application of chemotherapy to brain tumors with fewer harmful side effects and better cancer-killing efficiency than existing treatments.

The study was a collaboration between Dean Ho (professor in the division of oral biology and medicine, the division of advanced prosthodontics and the department of bioengineering, and co-director of the Weintraub Center for Reconstructive Biotechnology at the UCLA School of Dentistry) and colleagues from the Lurie Children’s Hospital of Chicago and the Northwestern University Feinberg School of Medicine.

Glioblastoma is the most common and lethal type of brain tumor. Despite treatment with surgery, radiation and chemotherapy, median survival time of patients with glioblastoma is less than 1.5 years. This tumor is notoriously difficult to treat in part because chemotherapy drugs injected on their own often are unable to cross the blood-brain barrier, which is the system of protective blood vessels that surround the brain. Also, most drugs do not stay concentrated in the tumor tissue long enough to be effective.

The drug doxorubicin (DOX) is a common chemotherapy agent that is a promising treatment for a broad range of cancers, and it served as a model drug for treating brain tumors when injected directly into the tumor. Ho’s team originally developed a strategy for strongly attaching DOX molecules to ND surfaces, creating a combined substance called ND-DOX.

Nanodiamonds can carry a broad range of drug compounds, and they prevent drug molecules from being ejected by proteins found in cancer cells, a fate that often befalls drugs injected on their own. Thus ND-DOX stays in the tumor longer than DOX alone, exposing the tumor cells to the drug for much longer without affecting the tissue surrounding the tumor.

Ho and colleagues hypothesized that glioblastoma might be efficiently treated with a nanodiamond-modified drug using a technique called convection enhanced delivery (CED), by which they injected ND-DOX directly into brain tumors in rodent models.

The researchers found that the ND-DOX levels in the tumor were retained for a duration far beyond that of DOX alone. The DOX was taken into the tumor and stayed in the tumor longer when attached to NDs. ND-DOX also increased programmed cell death (apoptosis) and decreased cell viability in glioma (brain cancer) cell lines.

Their results also showed for the first time that ND-DOX delivery limited the amount of DOX distributed outside the tumor, reducing toxic side effects while keeping the drug in the tumor longer and increasing tumor-killing efficiency for brain cancer treatment. Treatment was more effective and survival time increased significantly in rats treated with ND-DOX compared to those given unmodified DOX. Further research will expand the list of brain cancer chemotherapy drugs that can be attached to the ND surface to improve treatment and reduce side effects.

“Nanomaterials are promising vehicles for treating different types of cancer,” Ho said. “We’re looking for the drugs and situations where nanotechnology actually helps chemotherapy function better, making it easier on the patient and harder on the cancer.” Ho adds that a project of this scale has been successful thanks to the multi-disciplinary and proactive interactions between his team of bioengineers and outstanding clinical collaborators from Northwestern and Lurie Children’s Hospital.

Ho went on to say that the ND has many facets, almost like the surface of a soccer ball, and can bind to DOX very strongly and quickly. For a nanoparticle to have translational significance, as many benefits as possible must be engineered into one system, as simply as possible. CED of ND-DOX offers a powerful treatment delivery system against these very difficult and deadly brain tumors.

The study appears in the advance online issue of the peer-reviewed journal Nanomedicine: Nanotechnology, Biology and Medicine.

Sep 11, 2013106 notes
#brain tumors #nanodiamonds #chemotherapy #glioblastoma #doxorubicin #medicine #neuroscience #science
Study Suggests Possibility of Selectively Erasing Unwanted Memories

The human brain is exquisitely adept at linking seemingly random details into a cohesive memory that can trigger myriad associations—some good, some not so good. For recovering addicts and individuals suffering from post-traumatic stress disorder (PTSD), unwanted memories can be devastating. Former meth addicts, for instance, report intense drug cravings triggered by associations with cigarettes, money, even gum (used to relieve dry mouth), pushing them back into the addiction they so desperately want to leave.

Now, for the first time, scientists from the Florida campus of The Scripps Research Institute (TSRI) have been able to erase dangerous drug-associated memories in mice and rats without affecting other more benign memories.

The surprising discovery, published this week online ahead of print by the journal Biological Psychiatry, points to a clear and workable method to disrupt unwanted memories while leaving the rest intact.

“Our memories make us who we are, but some of these memories can make life very difficult,” said Courtney Miller, a TSRI assistant professor who led the research. “Not unlike in the movie Eternal Sunshine of the Spotless Mind, we’re looking for strategies to selectively eliminate evidence of past experiences related to drug abuse or a traumatic event. Our study shows we can do just that in mice — wipe out deeply engrained drug-related memories without harming other memories.”

Changing the Structure of Memory

To produce a memory, a lot has to happen, including the alteration of the structure of nerve cells via changes in the dendritic spines—small bulb-like structures that receive electrochemical signals from other neurons. Normally, these structural changes occur via actin, the protein that makes up the infrastructure of all cells.

In the new study, the scientists inhibited actin polymerization—the creation of large chainlike molecules—by blocking a molecular motor called myosin II in the brains of mice and rats during the maintenance phase of methamphetamine-related memory formation.

Behavioral tests showed the animals immediately and persistently lost memories associated with methamphetamine—with no other memories affected.

In the tests, animals were trained to associate the rewarding effects of methamphetamine with a rich context of visual, tactile and scent cues. When injected with the inhibitor many days later in their home environment, they later showed a complete lack of interest when they encountered drug-associated cues. At the same time, the response to other memories, such as food rewards, was unaffected.

While the scientists are not yet sure why powerful methamphetamine-related memories are also so fragile, they think the provocative findings could be related to the role of dopamine, a neurotransmitter involved in reward and pleasure centers in the brain and known to modify dendritic spines. Previous studies had shown dopamine is released during both learning and drug withdrawal. Miller adds, “We are focused on understanding what makes these memories different. The hope is that our strategies may be applicable to other harmful memories, such as those that perpetuate smoking or PTSD.”

Sep 11, 2013217 notes
#PTSD #drug addiction #actin polymerization #methamphetamine #memory #neuroscience #science
Western scientists discover a novel opiate addiction switch in the brain

Neuroscientists at Western University (London, Canada) have made a remarkable new discovery revealing the underlying molecular process by which opiate addiction develops in the brain. Opiate addiction is largely controlled by the formation of powerful reward memories that link the pleasurable effects of opiate-class drugs to environmental triggers that induce drug craving in individuals addicted to opiates. The research is published in the September 11th issue of The Journal of Neuroscience.


The Addiction Research Group led by Steven Laviolette of the Schulich School of Medicine & Dentistry identified how exposure to heroin induces a specific switch in a memory molecule in a region of the brain called the basolateral amygdala, which plays an important role in controlling memories related to opiate addiction, withdrawal and relapse. Using a rodent model of opiate addiction, Laviolette’s team found that the process of opiate addiction and withdrawal triggered a switch between two molecular pathways in the amygdala controlling how opiate addiction memories were formed. In the non-dependent state, they found that a molecule called extracellular signal-related kinase, or “ERK,” was recruited for early-stage addiction memories. However, once opiate addiction had developed, the scientists observed a functional switch to a separate molecular memory pathway, controlled by a molecule called calmodulin-dependent kinase II, or “CaMKII.”

“These findings will shed important new light on how the brain is altered by opiate drugs and provide exciting new targets for the development of novel pharmacotherapeutic treatments for individuals suffering from chronic opiate addiction,” says Laviolette, an associate professor in the Departments of Anatomy & Cell Biology, Psychiatry, and Psychology.

Sep 11, 2013115 notes
#addiction #opiate addiction #basolateral amygdala #extracellular signal-related kinase #memory #neuroscience #science
Early-onset Parkinson’s disease linked to genetic deletion

Scientists at the Centre for Addiction and Mental Health (CAMH) and University Health Network (UHN) have found a new link between early-onset Parkinson’s disease and a piece of DNA missing from chromosome 22. The findings help shed new light on the molecular changes that lead to Parkinson’s disease.

The study appears online today in JAMA Neurology.

Among people aged 35 to 64 who were missing DNA from a specific part of chromosome 22, the research team found a marked increase in the number of cases of Parkinson’s disease, compared to expected rates of Parkinson’s disease in the general population from the same age group.

The deletion, which occurs when a person is born missing about 50 genes from one copy of chromosome 22, is associated with 22q11.2 deletion syndrome. People with this condition may have heart or other birth defects, learning or speech difficulties, and some develop schizophrenia. It occurs in an estimated 1 in 2,000 to 4,000 births, but is believed to be under-diagnosed.

“22q11.2 deletion syndrome has been fairly well studied in childhood and adolescence, but less is known about its effects as people age,” said Dr. Anne Bassett, Director of CAMH’s Clinical Genetics Research Program and Director of the Dalglish Family Hearts and Minds Clinic at UHN, the world’s first clinic dedicated to adults with 22q11.2 deletion syndrome. A few cases of patients with the syndrome who had Parkinson’s disease symptoms had been previously reported, which suggested that the two conditions might be linked.

Parkinson’s disease is one of the most common neurodegenerative disorders worldwide, typically affecting people over the age of 65. Earlier onset of Parkinson’s disease, before age 50, is rare and has been associated with several other genetic changes that are not on chromosome 22.

The researchers studied 159 adults with 22q11.2 deletion syndrome to discover how many had been clinically diagnosed with Parkinson’s disease. For three individuals with the deletion and Parkinson’s disease who were deceased, brain tissue was also examined.

“Through a post-mortem examination, we were able to show that all three patients had a loss of neurons that was typical of that seen in Parkinson’s disease. The examination also helped to show that the symptoms of Parkinson’s disease were not related to side effects of the medications commonly used to treat schizophrenia,” added Dr. Rasmus Kiehl, neuropathologist in UHN’s Laboratory Medicine Program, who co-authored the report with CAMH graduate student Nancy Butcher. The team also found that Parkinson’s disease in 22q11.2 deletion syndrome is associated with abnormal protein accumulations called Lewy bodies in the brain in some, but not all, cases, just as in another genetic form of Parkinson’s disease.

The findings highlight the complexity of clinical care when both Parkinson’s disease and 22q11.2 deletion syndrome are present. “Our results may inform best practices in the clinic in these cases,” said Dr. Bassett, Senior Scientist in CAMH’s Campbell Family Mental Health Research Institute.

Because patients with 22q11.2DS who have schizophrenia are often prescribed anti-psychotic medications, they may experience side-effects such as tremors and muscle stiffness, similar to symptoms of Parkinson’s disease.

As a result, the researchers found that anti-psychotic use delayed the diagnosis of Parkinson’s disease – and the opportunity for treatment – by up to 10 years.

For people with early-onset Parkinson’s disease, who also have other features that could indicate 22q11.2 deletion syndrome, clinical genetic testing for the deletion on chromosome 22 should be considered, the researchers suggest.

“Our discovery that the 22q11.2 deletion syndrome is associated with Parkinson’s disease is very exciting,” said Dr. Anthony Lang, Director of the Movement Disorders Program at the Krembil Neuroscience Centre of Toronto Western Hospital. “The varying pathology that we found is reminiscent of certain other genetic causes of Parkinson’s disease, and opens new directions to search for novel genes that could cause its more common form. Studies of patients with 22q11.2 deletion syndrome before they ever develop clinical features of Parkinson’s disease may not only provide important information on the effectiveness of screening methods for early detection of the disease, but also allow for future ‘neuroprotective treatments’ to be introduced at the ultimate time when they can have a chance to make an important impact on preventing the disease or slowing its course.” 

“Most people with 22q11.2 deletion syndrome will not develop Parkinson’s disease,” emphasizes Dr. Bassett. “But it does occur at a rate higher than in the general population. We will now be on the look-out for this so we can provide the best care for patients.”

Sep 11, 201362 notes
#parkinson's disease #chromosome 22 #22q11.2 deletion syndrome #genetics #neuroscience #science
Therapy Slows Onset and Progression of Lou Gehrig’s Disease

Studies of a therapy designed to treat amyotrophic lateral sclerosis (ALS) suggest that the treatment dramatically slows onset and progression of the deadly disease, one of the most common neuromuscular disorders in the world. The researchers, led by teams from The Research Institute at Nationwide Children’s Hospital and the Ludwig Institute at the University of California, San Diego, found a survival increase of up to 39 percent in animal models with a one-time treatment, a crucial step toward moving the therapy into human clinical trials.

The therapy reduces expression of a gene called SOD1, which in some cases of familial ALS has a mutation that weakens and kills nerve cells called motor neurons that control muscle movement. While many drug studies involve only one type of animal model, this effort included analysis in two different models treated before and after disease onset. The in-depth study could vault the drug into human clinical trials, said Brian Kaspar, PhD, a principal investigator in the Center for Gene Therapy at Nationwide Children’s and a senior author on the research, which was published online Sept. 6 in Molecular Therapy.

“We designed these rigorous studies using two different models of the disease with the experimenters blinded to the treatment and in two separate laboratories,” said Dr. Kaspar, who collaborated on the study with a team led by Don Cleveland, PhD, at the University of California, San Diego. “We were very pleased with the results, and found that the delivery approach was successful in a larger species, enabling us to initiate a clinical translational plan for this horrible disease.”

There currently is no cure for ALS, also called Lou Gehrig’s disease. The Centers for Disease Control and Prevention estimates there are about 5,000 new cases in the U.S. each year, mostly in people age 50 to 60. Although the exact cause of ALS is unknown, more than 170 mutations in the SOD1 gene have been found in patients with familial ALS; these SOD1-linked cases account for about 2 percent of all ALS cases.

SOD1 provides instructions for making an enzyme called superoxide dismutase, which is found throughout the body and breaks down toxic molecules that can be damaging to cells. When mutated, the SOD1 gene yields a faulty version of the enzyme that is especially harmful to motor neurons. One of the mutations, which is found in about half of all familial ALS patients, is particularly devastating, with death usually coming within 18 months of diagnosis. SOD1 has also been implicated in other types of ALS, called sporadic ALS, which means the therapy could prove beneficial for larger numbers of patients suffering with this disease.

Earlier work by Dr. Kaspar and others found that they could halt production of the mutated enzyme by blocking SOD1 expression, which in turn, they suspected, would slow ALS progression. To test this hypothesis, the researchers would not only need to come up with an approach that would block the gene, but also figure out how to specifically target the cells implicated in the disease, which include motor neurons and glial cells. What’s more, the therapy would preferably be administered noninvasively rather than delivered directly through burr holes drilled into the skull.

Dr. Kaspar’s team accomplished the second part of this challenge in 2009, when they discovered that adeno-associated virus serotype 9 (AAV9) could cross the blood-brain barrier, making it an ideal transport system for delivering genes and RNA interference strategies designed to treat disease.

In this new work, funded by the National Institutes of Health, the researchers blocked human SOD1, using a technology known as short hairpin RNA, or shRNA. These single strands of RNA are designed in the lab to seek out specific sequences found in the human SOD1 gene, latch onto them and block gene expression.

In one of the mouse models used in the study, ALS develops earlier and advances more quickly. In the other, the disease develops later and progresses more slowly. All of the mice received a single injection of AAV9-SOD1-shRNA before or after disease onset.

Results showed that in the rapid-disease-progressing model, mice treated before disease onset saw a 39 percent increase in survival compared to control-treated mice. Strikingly, in mice treated at 21 days of age, disease progression was slowed by 66 percent. Perhaps more surprising was the finding that even after symptoms surfaced in these models, treatment still resulted in a 23 percent increase in survival and a 36 percent reduction in disease progression. In the slower-disease-onset model, treatment extended survival by 22 percent and delayed disease progression by 38 percent.
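To make those relative figures concrete, a percent increase in survival translates to absolute time once a baseline is fixed. The sketch below uses a hypothetical control survival of 130 days purely for illustration; the study reports only the percentages.

```python
# Convert a reported relative survival increase into absolute days.
# The baseline value is hypothetical; the study reports only percentages.
control_survival_days = 130   # hypothetical control median survival (illustration only)
increase = 0.39               # 39% increase reported for pre-onset treatment

treated_survival_days = control_survival_days * (1 + increase)
print(f"treated median survival: {treated_survival_days:.0f} days")  # 181 days
```

The same conversion applies to any of the other reported percentages once a baseline survival time is known.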

“The extension of survival is fantastic, and the fact that we delayed disease progression in both models when treated at disease onset is what drives our excitement to advance this work to human clinical trials,” said Kevin Foust, PhD, co-first author on the manuscript and an assistant professor in neurosciences at The Ohio State University College of Medicine.

In addition to the potential therapeutic benefit, the study also offers some interesting insights into the biological underpinnings of ALS. The role of motor neurons in ALS has been well documented, but this study also highlighted another key player—astrocytes, the most abundant cell type in the human brain and supporters of neuronal function.

“Recent work from our collaborator Dr. Cleveland has demonstrated that astrocytes and other types of glia are as important if not more important in ALS, as they really drive disease progression,” said Dr. Kaspar. “Indeed, in looking at data from mice, more than 50 percent of astrocytes were targeted throughout the spinal cord by this gene-delivery approach.”

Ideally, a therapy would hit motor neurons and astrocytes equally hard. The best way to do that is to deliver the drug directly into the cerebrospinal fluid (CSF), which would reduce the amount of SOD1 suppression in cells outside the brain and reduce immune system exposure to AAV9—elements that would add weight to an argument for studying the drug in humans.

Injections directly into CSF cannot be done easily in mice, so the team took the study a crucial step further by injecting AAV9-SOD1-shRNA into the CSF of healthy nonhuman primates. The results were just as the team hoped: the amount of gene expression dropped by as much as 90 percent in motor neurons and nearly 70 percent in astrocytes, with no side effects reported, laying the groundwork for moving to human clinical trials.

“We have a vast amount of work to do to move this toward a clinical trial, but we’re encouraged by the results to date and our team at Nationwide Children’s and our outstanding collaborators are fully committed to making a difference in this disease,” Dr. Kaspar said.

The findings could impact other studies underway in Dr. Kaspar’s lab, including research on Spinal Muscular Atrophy, an often fatal genetic disease in infants and children that can cause profoundly weakened muscles in the arms and legs and respiratory failure.

“This research provides further proof of targeting motor neurons and glial cells throughout the entire spinal cord for treatment of Spinal Muscular Atrophy and other degenerative diseases of the brain and spinal cord, through a less invasive manner than direct injections,” said Dr. Kaspar, who also is an associate professor of pediatrics and neurosciences at The Ohio State University College of Medicine.

Sep 10, 201351 notes
#ALS #neurodegeneration #neurodegenerative diseases #motor neurons #SOD1 gene #neuroscience #science
Brain circuitry loss may be a very early sign of cognitive decline in healthy elderly people

The degeneration of a small, wishbone-shaped structure deep inside the brain may provide the earliest clues to future cognitive decline, long before healthy older people exhibit clinical symptoms of memory loss or dementia, a study by researchers with the UC Davis Alzheimer’s Disease Center has found.


The longitudinal study found that the only discernible brain differences between normal people who later developed cognitive impairment and those who did not were changes in their fornix, a bundle of nerve fibers that carries messages to and from the hippocampus and has long been known to play a role in memory.

“This could be a very early and useful marker for future incipient decline,” said Evan Fletcher, the study’s lead author and a project scientist with the UC Davis Alzheimer’s Disease Center.

“Our results suggest that fornix variables are measurable brain factors that precede the earliest clinically relevant deterioration of cognitive function among cognitively normal elderly individuals,” Fletcher said.

The research is published online today in JAMA Neurology.

Hippocampal atrophy occurs in the later stages of cognitive decline and is one of the most studied changes associated with the Alzheimer’s disease process. However, changes to the fornix and other regions of the brain structurally connected to the hippocampus have not been as closely examined. The study found that degeneration of the fornix in relation to cognition was detectable even earlier than changes in the hippocampus.

“Although hippocampal measures have been studied much more deeply in relation to cognitive decline, our direct comparison between fornix and hippocampus measures suggests that fornix properties have a superior ability to identify incipient cognitive decline among healthy individuals,” Fletcher said.

The study was conducted over five years in a group of 102 diverse, cognitively normal people with an average age of 73 who were recruited through community outreach at the Alzheimer’s Disease Center. The researchers conducted magnetic resonance imaging (MRI) studies of the participants’ brains to measure the volume and integrity of brain structures. A different type of MRI was used to assess the integrity of the myelin, the fatty coating that sheaths and protects the axons. The axons are analogous to the copper wiring of the brain’s circuitry, and the myelin is like the wiring’s plastic insulation.

Either one of those things being lost will “degrade the signal transmission” in the brain, Fletcher said.

The researchers also conducted psychological tests and cognitive evaluations of the study participants to gauge their level of cognitive functioning. The participants returned for updated MRIs and cognitive testing at approximately one-year intervals. At the outset, none of the study participants exhibited symptoms of cognitive decline. Over time, about 20 percent began to show symptoms that led to a diagnosis of either mild cognitive impairment (MCI) or, in a minority of cases, Alzheimer’s disease.

“We found that if you looked at various brain factors there was one — and only one — that seemed to be predictive of whether a person would have cognitive decline, and that was the degradation of the fornix,” Fletcher said.

The study measured two fornix characteristics that predicted future cognitive impairment: low fornix white matter volume and reduced axonal integrity. Each was a stronger predictor than any other brain factor in models of future cognitive loss, Fletcher said.

He said that routine MRI examination of the fornix could conceivably be used clinically in the future as a predictor of abnormal cognitive decline.

“Our findings suggest that if your fornix volume or integrity is within a certain range you’re at an increased risk of cognitive impairment down the road. But developing the use of the fornix as a predictor in a clinical setting will take some time, in the same way that it took time for evaluation of cholesterol levels to be used to predict future heart disease,” he said.

Fletcher also said that the finding may mark a paradigm shift toward evaluation of the brain’s white matter, rather than its gray matter, as among the very earliest indicators of developing cognitive loss. There is currently a strong research focus on understanding brain processes that lead eventually to Alzheimer’s disease. He said the current finding could fill in one piece of the picture and motivate new directions in research to understand why and how fornix and other white matter change is such an important harbinger of cognitive impairment. 

“The key importance of this finding is that it suggests that white matter tract measures may prove to be promising candidate biomarkers for predicting incipient cognitive decline among cognitively normal individuals in a clinical setting, possibly more so than gray matter measures,” he said.

Sep 10, 201341 notes
#alzheimer's disease #dementia #cognitive decline #fornix #hippocampus #neuroscience #science
A New Method Will Enable the Early Detection of Parkinson’s Disease Through Handwriting

Today’s primary tool for diagnosing Parkinson’s disease is the diagnostic ability of the physician, who can generally identify the clinical symptoms only when the disease is at a relatively advanced stage. A new joint study by researchers at the University of Haifa and Rambam Hospital that compared the handwriting of 40 subjects, half with Parkinson’s disease and half healthy, suggests an innovative and noninvasive method of diagnosing Parkinson’s at a fairly early stage.

“Identifying the changes in handwriting could lead to an early diagnosis of the illness and neurological intervention at a critical moment,” explains Prof. Sara Rosenblum, of the University of Haifa’s Department of Occupational Therapy, who initiated the study.

The methods for diagnosing Parkinson’s today are a physician evaluation or a test called SPECT, which uses radioactive material to image the brain. The latter, however, is no more effective in diagnosing the illness than an expert doctor and it exposes the patient to unnecessary radiation.

Studies from recent years show that there are unique and distinctive differences between the handwriting of patients with Parkinson’s disease and that of healthy people. However, most handwriting studies to date have focused on motor skills (such as drawing spirals) rather than on writing that involves cognitive abilities, such as signing a check or copying addresses.

According to Prof. Rosenblum, Parkinson’s patients report feeling a change in their cognitive abilities before detecting a change in their motor abilities and therefore a test of cognitive impairment like the one performed in this study could attest to the presence of the disease and offer a way to diagnose it earlier.

This research was conducted in cooperation with Dr. Ilana Schlesinger, head of the Center for Movement Disorders and Parkinson’s Disease at Haifa’s Rambam Medical Center and occupational therapists working in the hospital. In the study, the researchers asked the subjects to write their names and gave them addresses to copy, two everyday tasks that require cognitive abilities. Participants were 40 adults with at least 12 years of schooling, half healthy and half known to be in the early stages of Parkinson’s disease (before obvious motor signs are visible).

The writing was done on a regular piece of paper placed on an electronic tablet, using a special pen with pressure-sensitive sensors that registered contact with the writing surface. A computerized analysis of the results compared a number of parameters: writing form (the length, width and height of the letters), the time required, and the pressure exerted on the surface while performing the assignment.

Analysis of the results showed significant differences between the patients and the healthy group, and all subjects except one had their status correctly diagnosed (97.5% accuracy). The Parkinson’s disease patients wrote smaller letters (“micrographia”), exerted less pressure on the writing surface, and took more time to complete the task. According to Prof. Rosenblum, a particularly noticeable difference was the length of time the pen was in the air between the writing of each letter and each word.
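The 97.5 percent figure follows directly from the counts reported above: one misclassified subject out of 40. A minimal check:

```python
# Diagnostic accuracy from the study's counts: 40 subjects, one misclassified.
total_subjects = 40
misclassified = 1

accuracy = (total_subjects - misclassified) / total_subjects
print(f"accuracy = {accuracy:.1%}")  # accuracy = 97.5%
```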

“This finding is particularly important because while the patient holds the pen in the air, his mind is planning his next action in the writing process, and the need for more time reflects the subject’s reduced cognitive ability. Changes in handwriting can occur years before a clinical diagnosis and therefore can be an early signal of the approaching disease,” Prof. Rosenblum said.

According to Dr. Schlesinger, validating these findings in a broader study would allow this method to be used for a preliminary diagnosis of the disease in a safe and non-invasive fashion. “This study is a breakthrough toward an objective diagnosis of the disease,” said Dr. Schlesinger, adding, “Publication of the study in the journal of the European Neurological Society aroused great interest at the International Congress of Parkinson’s Disease and Movement Disorders held last week in Sydney, Australia.”

The researchers note that this diagnostic method has the added benefit of reducing the load on the health system, because the test can be performed by a professional other than a doctor. After the results are in, patients can be referred to a doctor for further treatment and testing if necessary. The researchers are currently using the method in a new experiment, in which they use handwriting analysis to evaluate the degree of Parkinson’s patients’ improved functioning after they have brain pacemakers implanted.

Sep 10, 201358 notes
#parkinson's disease #handwriting #SPECT #biomarker #neuroscience #science
Cell transplants may be a novel treatment for schizophrenia

Rodent research suggests feasibility of restoring neuron function

Research from the School of Medicine at The University of Texas Health Science Center at San Antonio suggests the exciting possibility of using cell transplants to treat schizophrenia.

Cells called “interneurons” inhibit activity within brain regions, but this braking or governing function is impaired in schizophrenia. Consequently, a group of nerve cells called the dopamine system goes into overdrive. Different branches of the dopamine system are involved in cognition, movement and emotions.

“Since these cells are not functioning properly, our idea is to replace them,” said study senior author Daniel Lodge, Ph.D., assistant professor of pharmacology in the School of Medicine.

Transplant restored normal function

Dr. Lodge and lead author Stephanie Perez, graduate student in his laboratory, biopsied tissue from rat fetuses, isolated cells from the tissue and injected the cells into a brain center called the hippocampus. This center regulates the dopamine system and plays a role in learning, memory and executive functions such as decision making. Rats treated with the transplanted cells have restored hippocampal and dopamine function.

Stem cells are able to become different types of cells, and in this case interneurons were selected. “We put in a lot of cells and not all survived, but a significant portion did and restored hippocampal and dopamine function back to normal,” Dr. Lodge said.

‘You can essentially fix the problem’

Unlike traditional approaches to treating schizophrenia, such as medications and deep-brain stimulation, transplantation of interneurons potentially can produce a permanent solution. “You can essentially fix the problem,” Dr. Lodge said. “Ultimately, if this is translated to humans, we want to reprogram a patient’s own cells and use them.”

After meeting with other students, Perez brought the research idea to Dr. Lodge. “The students have journal club, and somebody had done a similar experiment to restore motor deficits and had good results,” Perez said. “We thought, why can’t we use it for schizophrenia and have good results, and so far we have.”

The study is in Molecular Psychiatry.

Sep 10, 2013 · 120 notes
#schizophrenia #stem cells #interneurons #dopamine #hippocampus #neuroscience #science
Hypertensive smoking women have an exceptionally high risk of fatal brain bleeding

Subarachnoid haemorrhage (SAH) is one of the most devastating cerebrovascular catastrophes, causing death in 40 to 50% of cases. The most common cause of SAH is the rupture of an intracranial aneurysm. If an aneurysm is found, it can be treated before a possible rupture. However, some intracranial aneurysms will never rupture; the problem is that doctors do not know which aneurysms will rupture and which will not. So they do not know which patients should be treated and which can safely be left untreated.

(Image: a middle cerebral artery bifurcation aneurysm. Credit: Miikka Korja)

A long-term, population-based Finnish study on SAH, which is based on the FINRISK health examination surveys, and published in PLOS ONE on 9th September, shows that the risk of SAH depends strongly on the combination of certain risk factors. The SAH incidence was shown to vary from 8 up to 171 per 100 000 person-years, depending on whether people had multiple risk factors for SAH – such as smoking, hypertension and female sex – or not.

"Such extreme risk factor-dependent variation in the incidence of any cardiovascular disease is exceptional, and may have significant clinical implications," says one of the main authors, Associate Professor Miikka Korja of the Helsinki University Central Hospital and the Australian School of Advanced Medicine.

If smoking women with high systolic blood pressure values have a 20 times higher rate of these brain bleeds than never-smoking men with low blood pressure values, it may very well be that these women, when diagnosed with unruptured intracranial aneurysms, should be treated. On the other hand, never-smoking men with low blood pressure values and intracranial aneurysms may not need to be treated at all.
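The scale of the variation reported above comes down to simple rate arithmetic: incidence per 100,000 person-years in each risk stratum, and the ratio between strata. The sketch below illustrates that arithmetic with made-up follow-up counts chosen to reproduce the published rates of 8 and 171 per 100,000 person-years; the counts themselves are not the FINRISK study's data.

```python
# Incidence rate per 100,000 person-years, and the rate ratio between
# two risk strata. The case counts and person-years are illustrative,
# chosen only to land on the rates quoted in the article.

def incidence_rate(cases, person_years, per=100_000):
    """Events per `per` person-years of follow-up."""
    return cases / person_years * per

# Hypothetical follow-up in a low-risk and a high-risk stratum:
low_risk = incidence_rate(cases=4, person_years=50_000)      # 8 per 100k
high_risk = incidence_rate(cases=171, person_years=100_000)  # 171 per 100k

rate_ratio = high_risk / low_risk
print(f"low-risk: {low_risk:.0f} per 100,000 person-years")
print(f"high-risk: {high_risk:.0f} per 100,000 person-years")
print(f"rate ratio: {rate_ratio:.1f}")
```

A ratio of roughly 20 between the extreme strata is what motivates treating the two groups differently.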

In this largest SAH risk factor study to date, the group also identified three new risk factors for SAH: previous myocardial infarction, a history of stroke in the mother, and elevated cholesterol levels in men. The results revise the understanding of the epidemiology of SAH and indicate that the risk factors for SAH appear to be similar to those for other cardiovascular diseases.

"We have previously shown that lifestyle risk factors significantly affect the life expectancy of SAH survivors, and now we have shown that the same risk factors also dramatically affect the risk of SAH itself. Thus, it appears quite clear that smoking cessation and hypertension treatment in particular are important in preventing SAH and in increasing life expectancy after SAH," clarifies one of the study group members, Academy Professor Jaakko Kaprio of the University of Helsinki and the National Institute for Health and Welfare, referring to the group's previous publication on cause-specific mortality in SAH survivors (Korja et al., Neurology, 2013).

The study group has also previously published the largest twin study to date, confirming that the heritability of SAH is very low (Korja et al., Stroke, 2010), and the first study on the incidence of SAH in type 1 diabetes, showing that the rate of non-aneurysmal SAH in type 1 diabetes is unusually high (Korja et al., Diabetes Care, 2013).

"Many of the previous studies on the epidemiology of SAH have relied on retrospective, single-centre databases, which are unfortunately not very reliable data sources. Thanks to the unique health care system and a shared academic interest among doctors in the Nordic countries, it has been possible to conduct high-quality, unbiased studies on SAH. We hope that our studies truly help doctors and patients, and are not only of interest at coffee tables on university campuses," says neurosurgeon Korja, before rushing to continue his working day in the operating room of Macquarie University Hospital, Sydney, one of his current appointments.

Sep 10, 2013 · 49 notes
#aneurysm #subarachnoid haemorrhage #cardiovascular disease #smoking #hypertension #neuroscience #science
Sep 9, 2013 · 63 notes
#robots #robotics #perception #technology #neuroscience #science
Sep 9, 2013 · 184 notes
#self-control #neuroimaging #brain activity #decision making #neuroscience #science
Old memories recombine to give a taste of the unknown

Ever tried beetroot custard? Probably not, but your brain can imagine how it might taste by reactivating old memories in a new pattern.

Helen Barron and her colleagues at University College London and Oxford University wondered if our brains combine existing memories to help us decide whether to try something new.

So the team used an fMRI scanner to look at the brains of 19 volunteers who were asked to remember specific foods they had tried.

Each volunteer was then given a menu of 13 unusual food combinations – including beetroot custard, tea jelly, and coffee yoghurt – and asked to imagine how good or bad they would taste, and whether or not they would eat them.

"Tea jelly was popular," says Barron. "Beetroot custard not so much."

When each volunteer imagined a new combination, they showed brain activity associated with each of the known ingredients at the same time. It is the first evidence to suggest that we use memory combination to make decisions, says Barron.

Sep 9, 2013 · 175 notes
#decision making #memory #medial prefrontal cortex #hippocampus #neuroscience #science
Genetic breakthrough another step to understanding schizophrenia

A consortium of scientists from 20 countries, including researchers from The University of Western Australia, has made a major breakthrough in understanding the genetic basis of the debilitating disorder schizophrenia.

More than 175 scientists from 99 institutions across Europe, the United States of America and Australia contributed to a genome-wide association analysis which identified 13 new risk loci for schizophrenia.

In an article published in the journal Nature Genetics, the study authors write that the results provide deeper insight into the genetic architecture of schizophrenia than ever before achieved, and provide a pathway to further research.

"For the first time, there is a clear path to increased knowledge of the etiology of schizophrenia through the application of standard, off-the-shelf genomic technologies for elucidating the effects of common variation," the authors wrote.

Schizophrenia is a complex mental disorder which affects about one per cent of people over their lifetime, leading to prolonged or recurrent episodes that severely impair social functioning and quality of life.

In terms of the ‘global burden of disease and disability’ index, developed by the World Health Organization, it ranks among the top 10 disorders, along with cancer, heart disease, diabetes and other non-communicable diseases.

Winthrop Professor Assen Jablensky, director of UWA’s Centre for Clinical Research in Neuropsychiatry (CCRN) at Graylands Hospital, and Professor Luba Kalaydjieva, of the UWA-affiliated Western Australian Institute for Medical Research (WAIMR), led the UWA research team which took part in the study.

Professor Jablensky said that while a strong genetic component in the causation of schizophrenia had been well established, the role of specific genes and the mechanisms of their regulation remained largely unknown.

"Until recently, results of genetic linkage and association studies could explain only a small fraction of the estimated heritability of the disorder and of its ‘genetic architecture’," Professor Jablensky said.

However, recent technological advances, enabling efficient coverage of the entire human genome with millions of single nucleotide polymorphisms (SNPs) as genetic markers, had given rise to a new generation of genome-wide association studies (GWAS), which trace the DNA differences between people affected by the disease and healthy control individuals.

"Since the effects of individual SNPs are quite tiny, their reliable measurement requires very large samples of adequately diagnosed patients and controls," Professor Jablensky said.

"This recent study reports on a major breakthrough in the understanding of the genetic basis of schizophrenia, achieved through meta-analysis of GWAS datasets contributed by a large international Psychiatric Genomics Consortium (PGC) - which includes the UWA research team."

A WA case-control sample consisting of 893 schizophrenia patients and healthy controls was part of a collection of 21,246 schizophrenia cases and 38,072 controls from 19 research centres and consortia across Europe, Australia and the USA.

The study found that a total of 8300 SNPs contribute to the risk for schizophrenia and account for at least 32 per cent of the variance in liability.
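The finding that thousands of SNPs each contribute a tiny amount of risk is usually operationalised as a polygenic score: a weighted sum of each person's risk-allele counts. The sketch below is a minimal, hypothetical illustration of that idea; the effect sizes and genotypes are simulated, and only the rough SNP count comes from the study.

```python
import random

# A polygenic score sums many tiny per-SNP effects, each weighted by a
# person's risk-allele count at that SNP (0, 1, or 2 copies).
# All numbers below are simulated and purely illustrative.

random.seed(0)
n_snps = 8300  # order of magnitude reported for schizophrenia risk SNPs

# Tiny per-allele effect sizes, as GWAS typically finds for common variants:
effects = [random.gauss(0, 0.01) for _ in range(n_snps)]

def polygenic_score(genotype, effects):
    """Weighted sum of risk-allele counts; genotype[i] is 0, 1, or 2."""
    return sum(g * b for g, b in zip(genotype, effects))

person = [random.choice([0, 1, 2]) for _ in range(n_snps)]
print(f"polygenic score: {polygenic_score(person, effects):.3f}")
```

Because each individual effect is so small, only aggregating thousands of them yields a signal, which is why the very large case and control samples described above are needed.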

"A particularly important result of this study is that many of these SNPs are located on a molecular pathway involved in neuronal calcium signalling, which suggests a novel pathogenetic link in the causation of schizophrenia and possibly other psychotic disorders," Professor Jablensky said.

He said ongoing and future studies by the UWA research team would aim to further refine the genetic analyses of the WA schizophrenia study (which at present includes 1259 persons), and to test neurobiological hypotheses about the treatment responses of genetically defined subsets of patients. 

Sep 9, 2013 · 111 notes
#schizophrenia #GWAS #genetics #neuroscience #science
Sep 9, 2013 · 57 notes
#brain function #nerve cells #C. elegans #nervous system #neural activity #neuroscience #science
Sep 8, 2013 · 144 notes
#glial cells #brain mapping #connectome #neuroscience #science
Sep 8, 2013 · 62 notes
#parkinson's disease #alpha synuclein #neurodegenerative diseases #protein #medicine #neuroscience #science
Sep 8, 2013 · 956 notes
#decision making #trust #betrayal #frontal cortex #psychology #neuroscience #science
Sep 8, 2013 · 121 notes
#vitamin B-12 #B-12 deficiency #cognitive decline #dementia #neuroscience #science
Finally mapped: The brain region that distinguishes bits from bounty

In comparing amounts of things — be they the grains of sand on a beach or the size of a seagull flock inhabiting it — humans use a part of the brain that is organized topographically, researchers have finally shown. In other words, the neurons that work to make this “numerosity” assessment are laid out in a shape that allows those most closely related to communicate and interact over the shortest possible distance.

This layout, referred to as a topographical map, is characteristic of all primary senses — sight, hearing, touch, smell and taste — and scientists have long assumed that numerosity, while not a primary sense (but perceived similarly to one), might be characterized by such a map, too.

But they have not been able to find it, which has caused some doubt in the field as to whether a map for numerosity exists.

Now, however, Utrecht University’s Benjamin Harvey and his colleagues have sussed out signals showing that the hypothesized numerosity map is real.

Numerosity, it is important to note, is distinct from symbolic numbers. “We use symbolic numbers to represent numerosity and other aspects of magnitude, but the symbol itself is only a representation,” Harvey said. He went on to explain that numerosity selectivity in the brain is derived from visual processing of image features, whereas symbolic number selectivity is derived from recognizing the shapes of numerals, written words, and linguistic sounds that represent numbers. “This latter task relies on very different parts of the brain that specialize in written and spoken language.”

Understanding whether the brain’s processing of numerosity and symbolic numbers is related, as we might be tempted to think, is just one area that will be better informed by Harvey’s new map.

To uncover it, he and his colleagues asked eight adult study participants to look at patterns of dots that varied in number over time, all the while analysing the neural response properties in a numerosity-linked part of their brain using high-field fMRI (functional magnetic resonance imaging). Use of this advanced neuroimaging method allowed them to scan the subjects for far fewer hours per sitting than would have been required with a less powerful scanning technology.

With the fMRI data that resulted, Harvey and his team used population receptive field modelling, which aims to measure neural response as directly and quantitatively as possible. “This was the key to our success,” Harvey said. It allowed the researchers to model the human fMRI response properties they observed on the basis of recordings from macaque neurons, in which numerosity experiments had been conducted more extensively.
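The modelling step can be illustrated with a toy tuning-curve version of the approach: each neural population is assumed to respond according to a Gaussian centred on a preferred numerosity (here placed on a log scale, an assumption commonly made in numerosity modelling), and the preferred value is the parameter that best explains the measured responses. This is a simplified sketch, not the authors' actual pipeline, and all numbers in it are invented.

```python
import math

# Toy population-receptive-field model for numerosity: a voxel's response
# is modelled as a Gaussian tuning curve over log numerosity, with a
# preferred numerosity and a fixed tuning width. Purely illustrative.

def tuning_response(n, preferred, width):
    """Predicted response to `n` dots for a population tuned to `preferred`."""
    d = math.log(n) - math.log(preferred)
    return math.exp(-d * d / (2 * width ** 2))

def best_preferred(responses, candidates, width=0.5):
    """Grid-search the preferred numerosity that best explains `responses`
    (a dict mapping presented numerosity -> measured response)."""
    def sse(pref):
        return sum((r - tuning_response(n, pref, width)) ** 2
                   for n, r in responses.items())
    return min(candidates, key=sse)

# A made-up voxel that responds most strongly to about 3 dots:
measured = {1: 0.3, 2: 0.8, 3: 1.0, 4: 0.85, 5: 0.6, 6: 0.4}
print(best_preferred(measured, candidates=[1, 2, 3, 4, 5, 6]))
```

Fitting a preferred numerosity to every voxel and then looking at how those preferences vary across the cortical surface is, in spirit, how a topographic map like the one described below the scanner data emerges.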

Their efforts revealed a topographical layout of numerosity in the human brain; the small quantities of dots the participants observed were encoded by neurons in one part of the brain, and the larger quantities, in another.

This finding demonstrates that topography can emerge not just for lower-level cognitive functions, like the primary senses, but for higher-level cognitive functions, too.

"We are very excited that association cortex can produce emergent topographic structures," Harvey said.

Because scientists know a great deal about topographical maps (and have the tools to probe them), the work of Harvey et al. may help scientists better analyse the neural computation underlying number processing.

"We believe this will lead to a much more complete understanding of humans’ unique numerical and mathematical skills," Harvey said.

Having heard from others in the field about the difficulty associated with the hunt for a topographical map of numerosity, Harvey and colleagues were surprised to obtain the results they did.

They also found the variations between their subjects interesting.

"Every individual brain is a complex and very different system," Harvey explained. "I was very surprised then that the map we report is in such a consistent location between our subjects, and that numerosity preferences always increased in the same direction along the cortex."

"On the other hand," he continued, "the extent of individual differences … is also striking." Harvey explained that understanding the consequences of these differences for their subjects’ perception or task performance will require further study.

Sep 7, 2013 · 83 notes
#numerosity #parietal cortex #topographical map #neuroimaging #neuroscience #science