Neuroscience

November 2013

Brain Connectivity Can Predict Epilepsy Surgery Outcomes

A discovery from Case Western Reserve and Cleveland Clinic researchers could give epilepsy patients invaluable advance guidance about their chances of improving symptoms through surgery.

Assistant Professor of Neurosciences Roberto Fernández Galán, PhD, and his collaborators have identified a new, far more accurate way to determine precisely which portions of the brain suffer from the disease. This information can help patients and physicians judge whether temporal lobe surgery will provide the results they seek.

“Our analysis of neuronal activity in the temporal lobe allows us to determine whether it is diseased, and therefore, whether removing it with surgery will be beneficial for the patient,” said Galán, the paper’s senior author. “In terms of accuracy and efficiency, our analysis method is a significant improvement relative to current approaches.”

The findings were published October 30 in the open-access journal PLOS ONE.

About one-third of patients with temporal lobe epilepsy do not respond to medical treatment and opt for lobectomies to alleviate their symptoms. Yet the surgery’s success rate is only 60 to 70 percent, largely because of the difficulty of identifying the diseased brain tissue before the procedure.

Galán and investigators from Cleveland Clinic determined that using intracranial electroencephalography (iEEG) to measure patients’ functional neural connectivity – that is, the communication from one brain region to another – identified the epileptic lobe with 87 percent accuracy. An iEEG records electrical activity through electrodes implanted in the brain. Key indicators of a diseased lobe are connections that are both weak and unusually similar to one another.
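
The article does not specify how connectivity was computed from the recordings. As an illustrative sketch only, the code below uses plain Pearson correlation as the coupling measure (an assumption, not necessarily the paper’s estimator) to score the two properties the study flags: how strong the pairwise connections are, and how uniform:

```python
import numpy as np

def connectivity_summary(signals):
    """Mean and spread of pairwise coupling across channels.

    signals: array of shape (n_channels, n_samples).
    Returns (mean absolute pairwise correlation, std of those correlations),
    i.e. overall connection strength and how varied the connections are.
    """
    corr = np.corrcoef(signals)             # channel-by-channel correlation matrix
    i, j = np.triu_indices_from(corr, k=1)  # each channel pair counted once
    strengths = np.abs(corr[i, j])
    return strengths.mean(), strengths.std()

rng = np.random.default_rng(0)

# Synthetic stand-ins: a shared driving signal mimics strong, varied coupling;
# independent noise mimics weak, uniform coupling ("weak and similar").
driver = rng.standard_normal(2000)
gains = rng.uniform(0.5, 2.0, size=(8, 1))
coupled = gains * driver + rng.standard_normal((8, 2000))
independent = rng.standard_normal((8, 2000))

mean_c, std_c = connectivity_summary(coupled)
mean_i, std_i = connectivity_summary(independent)
```

Under the indicators described above, a lobe whose channels showed low mean strength and low spread would be the suspect one; actual clinical use would require thresholds fit to surgical-outcome data.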

In the retrospective study, Galán and Arun Antony, MD, formerly a senior clinical fellow in the Epilepsy Center at Cleveland Clinic and now an assistant professor of neurology at the University of Pittsburgh, examined data from 23 patients with temporal lobe epilepsy who had all or part of their temporal lobes removed after iEEG evaluations performed at Cleveland Clinic. The researchers examined the results of patients’ preoperative iEEG to determine the degree of functional connectivity that was associated with successful surgical outcomes.

“The concept of functional connectivity has been extensively studied by basic science researchers, but has not found a way into the realm of clinical epilepsy treatment yet,” said Antony, the paper’s first author. “Our discovery is another step towards the use of measures of functional connectivity in making clinical decisions in the treatment of epilepsy.”

As a standard preoperative test for lobectomy, physicians analyze iEEG traces for simultaneous neuronal discharges that appear as spikes in the recordings and indicate epileptic activity. The PLOS ONE study evaluates the data differently, examining normal brain activity in the absence of spikes and inferring connectivity from it.
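
The spike-based reading can be caricatured in a few lines. This is a hypothetical threshold detector for illustration only (clinical review is visual and far more nuanced): it flags samples that stray several robust standard deviations from baseline.

```python
import numpy as np

def spike_times(trace, fs, threshold_sd=5.0):
    """Return times (in seconds) of candidate interictal spikes.

    A crude stand-in for clinicians' visual spike-marking: any sample
    more than threshold_sd robust standard deviations from the median
    is flagged. The MAD is used so large spikes don't inflate the baseline.
    """
    med = np.median(trace)
    mad = np.median(np.abs(trace - med))   # median absolute deviation
    sd = 1.4826 * mad                      # MAD -> std for Gaussian background
    idx = np.flatnonzero(np.abs(trace - med) > threshold_sd * sd)
    return idx / fs

fs = 1000.0                                # samples per second
rng = np.random.default_rng(1)
trace = rng.standard_normal(5000)          # 5 s of background "activity"
trace[2500] += 20.0                        # one injected artificial spike
t = spike_times(trace, fs)
```

Counting such events per channel is the conventional localization cue that the connectivity analysis above sidesteps.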

Nov 1, 2013
#epilepsy #brain activity #lobectomy #intracranial electroencephalography #neuroscience #science
Gene Found To Foster Synapse Formation In The Brain

Researchers at Johns Hopkins say they have found that a gene already implicated in human speech disorders and epilepsy is also needed for vocalizations and synapse formation in mice. The finding, they say, adds to scientific understanding of how language develops, as well as the way synapses — the connections among brain cells that enable us to think — are formed. A description of their experiments appears in Science Express on Oct. 31.

A group led by Richard Huganir, Ph.D., director of the Solomon H. Snyder Department of Neuroscience and a Howard Hughes Medical Institute investigator, set out to investigate genes involved in synapse formation. Gek-Ming Sia, Ph.D., a research associate in Huganir’s laboratory, first screened hundreds of human genes for their effects on lab-grown mouse brain cells. When one gene, SRPX2, was turned up higher than normal, it caused the brain cells to erupt with new synapses, Sia found.

When Huganir’s team injected fetal mice with an SRPX2-blocking compound, the mice showed fewer synapses than normal mice even as adults, the researchers found. In addition, when SRPX2-deficient mouse pups were separated from their mothers, they did not emit high-pitched distress calls as other pups do, indicating they lacked the rodent equivalent of early language ability.

Other researchers’ analyses of the human genome have found that mutations in SRPX2 are associated with language disorders and epilepsy. When Huganir’s team introduced human SRPX2 carrying these mutations into fetal mice, the pups also showed deficits in their vocalization.

Another research group at Institut de Neurobiologie de la Méditerranée in France had previously shown that SRPX2 interacts with FoxP2, a gene that has gained wide attention for its apparently crucial role in language ability.

Huganir’s team confirmed this, showing that FoxP2 controls how much protein the SRPX2 gene makes and may affect language in this way. “FoxP2 is famous for its role in language, but it’s actually involved in other functions as well,” Huganir comments. “SRPX2 appears to be more specialized to language ability.” Huganir suspects that the gene may also be involved in autism, since autistic patients often have language impairments, and the condition has been linked to defects in synapse formation.

This study is only the beginning of teasing out how SRPX2 acts on the brain, Sia says. “We’d like to find out what other proteins it acts on, and how exactly it regulates synapses and enables language development.”

Nov 1, 2013
#synapses #language development #autism #epilepsy #genetics #neuroscience #science
Exposure to Cortisol-Like Medications Before Birth May Contribute to Emotional Problems and Brain Changes

Neonatologists seem to perform miracles in the fight to support the survival of babies born prematurely.

To promote their survival, cortisol-like drugs called glucocorticoids are frequently administered to women in preterm labor to accelerate their babies’ lung maturation before birth. Cortisol is a substance naturally released by the body when stressed, but the levels of glucocorticoids administered to promote lung development are higher than those produced by typical stress, perhaps matched only by the body’s reaction to extreme stress.

The benefit of glucocorticoids is undisputed, and the drugs have certainly saved the lives of countless babies, but the exposure may also have negative consequences. Indeed, excessive glucocorticoid levels may affect brain development, perhaps contributing to emotional problems later in life.

In this issue of Biological Psychiatry, Dr. Elysia Davis at the University of Denver and her colleagues report new findings on the effects of synthetic glucocorticoid on human brain development. Their study focused on healthy children who were born full-term, avoiding the confounding effects of premature birth.

The investigators conducted brain imaging and careful assessments of 54 children, 6 to 10 years of age. The mothers of the participating children also completed reports on their child’s behavior. The researchers then divided the children into two groups: those who were exposed to glucocorticoids prenatally and those who were not.

In this study, children with fetal glucocorticoid exposure showed significant cortical thinning, and a thinner cortex in turn predicted more emotional problems. One particularly affected region, the rostral anterior cingulate cortex, was 8 to 9 percent thinner among children exposed to glucocorticoids. Interestingly, other studies have shown that this region of the brain is affected in individuals diagnosed with mood and anxiety disorders.

"Fetal exposure to a frequently administered stress hormone is associated with consequences for child brain development that persist for at least 6 to 10 years. These neurological changes are associated with increased risk for stress and emotional problems," Davis explained of their findings. "Importantly, these findings were observed among healthy children born full term."

Although such a finding does not prove that glucocorticoids caused these changes, the researchers did determine that the results cannot be explained by any obvious confounding differences between the groups: the two groups did not differ in weight or gestational age at birth, Apgar scores, maternal factors, or any other basic demographics. Thus, the findings suggest that glucocorticoid administration may somehow alter the trajectory of brain development in exposed children.

"This study provides evidence that prenatal exposure to stress hormones shapes the construction of the fetal nervous system with consequences for the developing brain that persist into the preadolescent period," she added.

"This study highlights potential links between early cortisol exposure, cortical thinning and mood symptoms in children. It may provide important insights into the development of the brain and the long-term impact of maternal stress," commented Dr. John Krystal, Editor of Biological Psychiatry.

Nov 1, 2013
#stress #glucocorticoids #cortisol #brain development #psychology #neuroscience #science
Critical Gene in Retinal Development and Motion Sensing Identified

Our vision depends on exquisitely organized layers of cells within the eye’s retina, each with a distinct role in perception. Johns Hopkins researchers say they have taken an important step toward understanding how those cells are organized to produce what the brain “sees.” Specifically, they report identification of a gene that guides the separation of two types of motion-sensing cells, offering insight into how cellular layering develops in the retina, with possible implications for the brain’s cerebral cortex. A report on the discovery is published in the Nov. 1 issue of the journal Science.

“The separation of different types of cells into layers is critical to their ability to form the precise sets of connections with each other — the circuitry — that lets us process visual information,” says Alex Kolodkin, Ph.D., a professor in the Johns Hopkins University School of Medicine’s Solomon H. Snyder Department of Neuroscience and an investigator at the Howard Hughes Medical Institute. “There is still much to learn about how that separation happens during development, but we’ve identified for the first time proteins that enable two very similar types of cells to segregate into their own distinct neuronal layers.”

Kolodkin’s research group specializes in studying how circuitry forms among neurons (brain and nerve cells). Past experiments revealed that two types of proteins, called semaphorins and plexins, help guide this process. In the current study, Lu Sun, a graduate student in Kolodkin’s laboratory, focused on the genes that carry the blueprint for these proteins in two of the 10 layers of cells in the mammalian retina.

Those two layers are made up of so-called starburst amacrine cells (SACs). One type of SAC, known as “Off,” detects motion by sensing decreases in the amount of light hitting the retina, while the other type, “On,” detects increases in light. Sun examined the amounts of several semaphorin and plexin proteins being made by each type of cell, and found that only the “On” SACs were making a semaphorin called Sema6A. Sema6A can only work in the retina by interacting with its receptor, a plexin called PlexA2, but Sun found both types of SAC were churning out roughly equal amounts of PlexA2.

Reasoning that Sema6A might be the key difference that enabled the “On” and “Off” SACs to segregate from one another, Kolodkin’s team analyzed mice in which the genes for either Sema6A, PlexA2 or both could be switched off, and looked at the effects of this manipulation on their retinas. “Knocking out” either gene during development led the “On” and “Off” layers to run together, the team found, and caused abnormalities in the “On” SACs’ tree-like extensions. However, the “Off” SACs, which hadn’t been using their Sema6A gene in the first place, still looked and functioned normally.

“When signaling between Sema6A and PlexA2 was lost, not only was layering compromised, but the ‘On’ SACs lost both their distinctive symmetrical appearance, and, importantly, their motion-detecting ability,” Sun says. “This is evidence that the beautiful symmetric shape that gives starburst amacrine cells their name is necessary for their function.”

Adds Kolodkin, “We hope that learning how layering occurs in these very specific cell types will help us begin sorting out how connections are made not just in the retina, but also in neurons throughout the nervous system. Layering also occurs in the cerebral cortex, for example, which is responsible for thought and consciousness, and we really want to know how this is organized during neural development.”

Nov 1, 2013
#retinal development #retina #nerve cells #amacrine cells #cerebral cortex #neuroscience #science

October 2013

Incurable Brain Cancer Gene Is Silenced

Gene regulation technology increases survival rates in mice with glioblastoma

Glioblastoma multiforme (GBM), the brain cancer that killed Sen. Edward Kennedy and kills approximately 13,000 Americans a year, is aggressive and incurable. Now a Northwestern University research team is the first to demonstrate delivery of a drug that turns off a critical gene in this complex cancer, increasing survival rates significantly in animals with the deadly disease.

Image: Researchers combined gold nanoparticles (in yellow) with small interfering RNAs (in green) to knock down an oncogene that is overexpressed in glioblastoma.

The novel therapeutic, which is based on nanotechnology, is small and nimble enough to cross the blood-brain barrier and get to where it is needed — the brain tumor. Designed to target a specific cancer-causing gene in cells, the drug simply flips the switch of the troublesome oncogene to “off,” silencing the gene. This knocks out the proteins that keep cancer cells immortal.

In a study of mice, the nontoxic drug was delivered by intravenous injection. In animals with GBM, the survival rate increased nearly 20 percent, and tumor size was reduced three- to fourfold compared with the control group. The results are published today (Oct. 30) in Science Translational Medicine.

“This is a beautiful marriage of a new technology with the genes of a terrible disease,” said Chad A. Mirkin, a nanomedicine expert and a senior co-author of the study. “Using highly adaptable spherical nucleic acids, we specifically targeted a gene associated with GBM and turned it off in vivo. This proof-of-concept further establishes a broad platform for treating a wide range of diseases, from lung and colon cancers to rheumatoid arthritis and psoriasis.”

Mirkin is the George B. Rathmann Professor of Chemistry in the Weinberg College of Arts and Sciences and professor of medicine, chemical and biological engineering, biomedical engineering and materials science and engineering.

Glioblastoma expert Alexander H. Stegh came to Northwestern University in 2009, attracted by the University’s reputation for interdisciplinary research, and within weeks was paired up with Mirkin to tackle the difficult problem of developing better treatments for glioblastoma. 

Help is critical for patients with GBM: The median survival rate is 14 to 16 months, and approximately 16,000 new cases are reported in the U.S. every year.

In their research partnership, Mirkin had the perfect tool to tackle the deadly cancer: spherical nucleic acids (SNAs), new globular forms of DNA and RNA, which he had invented at Northwestern in 1996, and which are nontoxic to humans. The nucleic acid sequence is designed to match the target gene.

And Stegh had the gene: In 2007, he and colleagues identified the gene Bcl2Like12 as one that is overexpressed in glioblastoma tumors and related to glioblastoma’s resistance to conventional therapies.

“My research group is working to uncover the secrets of cancer and, more importantly, how to stop it,” said Stegh, a senior co-author of the study. “Glioblastoma is a very challenging cancer, and most chemo-therapeutic drugs fail in the clinic. The beauty of the gene we silenced in this study is that it plays many different roles in therapy resistance. Taking the gene out of the picture should allow conventional therapies to be more effective.”

Stegh is an assistant professor in the Ken and Ruth Davee Department of Neurology at the Northwestern University Feinberg School of Medicine and an investigator in the Northwestern Brain Tumor Institute.

The power of gene regulation technology is that a disease with a genetic basis can be attacked and treated if scientists have the right tools. Thanks to the Human Genome Project and genomics research over the last two decades, there is an enormous number of genetic targets; having the right therapeutic agents and delivery materials has been the challenge.

“The RNA interference-based SNAs are a completely novel approach in thinking about cancer therapy,” Stegh said. “One of the problems is that we have large lists of genes that are somehow dysregulated in glioblastoma, but we have absolutely no way of targeting all of them using standard pharmacological approaches. That’s where we think nanomaterials can play a fundamental role in allowing us to implement the concept of personalized medicine in cancer therapy.”

Stegh and Mirkin’s drug for GBM is specially designed to target the Bcl2Like12 gene in cancer cells. Key is the nanostructure’s spherical shape and nucleic acid density. Normal (linear) nucleic acids cannot get into cells, but these spherical nucleic acids can. Small interfering RNA (siRNA) surrounds a gold nanoparticle like a shell; the nucleic acids are highly oriented, densely packed and form a tiny sphere. (The gold nanoparticle core is only 13 nanometers in diameter.) The RNA’s sequence is programmed to silence the disease-causing gene.

“The problems posed by glioblastoma and many other diseases are simply too big for one research group to handle,” said Mirkin, who also is the director of Northwestern’s International Institute for Nanotechnology. “This work highlights the power of scientists and engineers from different fields coming together to address a difficult medical issue.”

Mirkin first developed the nanostructure platform used in this study in 1996 at Northwestern, and the technology now is the basis of powerful commercialized and FDA-cleared medical diagnostic tools. This new development, however, is the first realization that the nanostructures injected into an animal naturally find their target in the brain and can deliver an effective payload of therapeutics.

The next step for the therapeutic will be to test it in clinical trials.

The nanostructures used in this study were developed in Mirkin’s lab on the Evanston campus and then used in cell and animal studies in Stegh’s lab on the Chicago campus.

Oct 31, 2013
#glioblastoma #brain tumors #brain cancer #medicine #science
High blood sugar makes Alzheimer’s plaque more toxic to the brain

High blood-sugar levels, such as those linked with Type 2 diabetes, make the beta amyloid protein associated with Alzheimer’s disease dramatically more toxic to cells lining blood vessels in the brain, according to a new Tulane University study published in the latest issue of the Journal of Alzheimer’s Disease.

The study supports growing evidence pointing to glucose levels and vascular damage as contributors to dementia.

“Previously, it was believed that Alzheimer’s disease was due to the accumulation of ‘tangles’ in neurons in the brain from overproduction and reduced removal of beta amyloid protein,” said senior investigator Dr. David Busija, regents professor and chair of pharmacology at Tulane University School of Medicine. “While neuronal involvement is a major factor in Alzheimer’s development, recent evidence indicates damaged cerebral blood vessels compromised by high blood sugar play a role. Even though the links among Type 2 diabetes, brain blood vessels and Alzheimer’s progression are unclear, hyperglycemia appears to play a role.”

Drs. Cristina Carvalho and Paula Moreira from the University of Coimbra in Portugal were co-investigators in the study.  

Researchers studied cell cultures taken from the lining of cerebral blood vessels, one from normal rats and another from mice with uncontrolled chronic diabetes. They exposed the cells to beta amyloid and different levels of glucose and then measured their viability. Cells exposed to high glucose or beta amyloid alone showed no change in viability. However, when exposed to hyperglycemic conditions and beta amyloid together, viability decreased by 40 percent. Researchers suspect the damage is due to oxidative stress originating in the cells’ mitochondria.

The cells from diabetic mice were more susceptible to damage and death from beta amyloid protein − even at normal glucose levels. The increased toxicity of beta amyloid may damage the blood-brain barrier, disrupt normal blood flow to the brain and decrease clearance of beta amyloid protein.

The study’s findings underscore the need to aggressively control blood sugar levels in diabetic individuals, Busija said.

Oct 30, 2013
#alzheimer's disease #glucose #Type II diabetes #beta amyloid #neuroscience #science
Study with totally blind people shows how light helps activate the brain

Light enhances brain activity during a cognitive task even in some people who are totally blind, according to a study conducted by researchers at the University of Montreal and Boston’s Brigham and Women’s Hospital. The findings contribute to scientists’ understanding of everyone’s brains, as they also revealed how quickly light affects cognition.

“We were stunned to discover that the brain still responds significantly to light in these three rare, completely blind patients despite their having absolutely no conscious vision at all,” said senior co-author Steven Lockley. “Light doesn’t just allow us to see. It tells the brain whether it’s night or day, which in turn ensures that our physiology, metabolism and behavior are synchronized with environmental time.”

“For diurnal species like ours, light stimulates day-like brain activity, improving alertness and mood, and enhancing performance on many cognitive tasks,” explained senior co-author Julie Carrier.

The results indicate that the patients’ brains can still “see,” or detect, light via a novel photoreceptor in the ganglion cell layer of the retina, distinct from the rods and cones we use to see.

image

Scientists believe, however, that these specialized photoreceptors in the retina also contribute to visual function in the brain even when the retinal cells responsible for normal image formation have lost their ability to receive or process light. A previous study in a single blind patient suggested that this was possible, but the research team wanted to confirm the result in other patients. To test this hypothesis, the three participants were asked to say whether a blue light was on or off, even though they could not see it. “We found that the participants did indeed have a non-conscious awareness of the light – they determined correctly when the light was on at rates greater than chance, without being able to see it,” explained first author Gilles Vandewalle.

The next steps involved looking closely at what happened to brain activation when light was flashed at their eyes at the same time as their attentiveness to a sound was monitored. “The objective of this second test was to determine whether the light affected the brain patterns associated with attentiveness – and it did,” said first author Olivier Collignon.

Finally, the participants underwent a functional MRI brain scan as they performed a simple sound-matching task while lights were flashed in their eyes. “The fMRI further showed that during an auditory working memory task, less than a minute of blue light activated brain regions important to perform the task. These regions are involved in alertness and cognition regulation, as well as being key areas of the default mode network,” Vandewalle explained. Researchers believe that the default network is linked to keeping a minimal amount of resources available for monitoring the environment when we are not actively doing something. “If our understanding of the default network is correct, our results raise the intriguing possibility that light is key to maintaining sustained attention,” agreed Lockley and Carrier. “This theory may explain why the brain’s performance is improved when light is present during tasks.”

Oct 29, 2013
#brain activity #blindness #photoreceptors #neuroimaging #neuroscience #science
Untangling Alzheimer's Disease

TAU researchers identify specific molecules that could be targeted to treat the disorder

Plaques and tangles made of proteins are believed to contribute to the debilitating progression of Alzheimer’s disease. But proteins also play a positive role in important brain functions, like cell-to-cell communication and immunological response. Molecules called microRNAs regulate both good and bad protein levels in the brain, binding to messenger RNAs to prevent them from developing into proteins.

Now, Dr. Boaz Barak and a team of researchers in the lab of Prof. Uri Ashery of Tel Aviv University’s Department of Neurobiology at the George S. Wise Faculty of Life Sciences and the Sagol School of Neuroscience have identified a specific set of microRNAs that detrimentally regulate protein levels in the brains of mice with Alzheimer’s disease and beneficially regulate protein levels in the brains of other mice living in a stimulating environment.

"We were able to create two lists of microRNAs — those that contribute to brain performance and those that detract — depending on their levels in the brain," says Dr. Barak. "By targeting these molecules, we hope to move closer toward earlier detection and better treatment of Alzheimer’s disease."

Prof. Daniel Michaelson of TAU’s Department of Neurobiology in the George S. Wise Faculty of Life Sciences and the Sagol School of Neuroscience, Dr. Noam Shomron of TAU’s Department of Cell and Developmental Biology and Sagol School of Neuroscience, Dr. Eitan Okun of Bar-Ilan University, and Dr. Mark Mattson of the National Institute on Aging collaborated on the study, published in Translational Psychiatry.

A double-edged sword

Alzheimer’s disease is the most common form of dementia. Currently incurable, it increasingly impairs brain function over time, ultimately leading to death. The TAU researchers became interested in the disease while studying the brains of mice living in an “enriched environment” — an enlarged cage with running wheels, bedding and nesting material, a house, and frequently changing toys. Such environments have been shown to improve and maintain brain function in animals much as intellectual activity and physical fitness do in people.

The researchers ran a series of tests on a part of the mice’s brains called the hippocampus, which plays a major role in memory and spatial navigation and is one of the earliest targets of Alzheimer’s disease in humans. They found that, compared to mice in normal cages, the mice from the enriched environment developed higher levels of good proteins and lower levels of bad proteins. Then, for the first time, they identified the microRNAs responsible for regulating the expression of both good and bad proteins.

Armed with this new information, the researchers analyzed changes in the levels of microRNAs in the hippocampi of young, middle-aged, and old mice with an Alzheimer’s-disease-like condition. They found that some of the microRNAs were expressed in exactly inverse amounts in mice with Alzheimer’s disease as they were in mice from the enriched environment. The results were higher levels of bad proteins and lower levels of good proteins in the hippocampi of old mice with Alzheimer’s disease. The microRNAs the researchers identified had already been shown or predicted to regulate the expression of proteins in ways that contributed to Alzheimer’s disease. Their finding that the microRNAs are inversely regulated in mice from the enriched environment is important, because it suggests the molecules can be targeted by activities or drugs to preserve brain function.

Brain-busting potential

Two findings appear to have particular potential for treating people with Alzheimer’s disease. In the brains of old mice with the disease, microRNA-325 was diminished, leading to higher levels of tomosyn, a protein well known to inhibit cellular communication in the brain. The researchers hope that eventually microRNA-325 can be used to create a drug that helps Alzheimer’s patients maintain low levels of tomosyn and preserve brain function. Additionally, the researchers found that several important microRNAs were already at low levels in the brains of young mice. If the same proves true in humans, these microRNAs could be used as biomarkers to detect Alzheimer’s disease at a much earlier age than is now possible — at 30 years of age, for example, instead of 60.

"Our biggest hope is to be able to one day use microRNAs to detect Alzheimer’s disease in people at a young age and begin a tailor-made treatment based on our findings, right away," says Dr. Barak.

Oct 29, 2013
#alzheimer's disease #hippocampus #microRNA #tomosyn #synaptic plasticity #neuroscience #science
Snakes on the brain: Are primates hard-wired to see snakes?

Was the evolution of high-quality vision in our ancestors driven by the threat of snakes? Work by neuroscientists in Japan and Brazil is supporting the theory originally put forward by Lynne Isbell, professor of anthropology at the University of California, Davis.

In a paper published Oct. 28 in the journal Proceedings of the National Academy of Sciences, Isbell; Hisao Nishijo and Quan Van Le at Toyama University, Japan; Rafael Maior and Carlos Tomaz at the University of Brasilia, Brazil; and colleagues show that there are specific nerve cells in the brains of rhesus macaque monkeys that respond to images of snakes.

The snake-sensitive neurons were more numerous, and responded more strongly and rapidly, than other nerve cells that fired in response to images of macaque faces or hands, or to geometric shapes. Isbell said she was surprised that more neurons responded to snakes than to faces, given that primates are highly social animals.

"We’re finding results consistent with the idea that snakes have exerted strong selective pressure on primates," Isbell said.

Isbell originally published her hypothesis in 2006, following up with a book, “The Fruit, the Tree and the Serpent” (Harvard University Press, 2009) in which she argued that our primate ancestors evolved good, close-range vision primarily to spot and avoid dangerous snakes.

Modern mammals and snakes big enough to eat them evolved at about the same time, 100 million years ago. Venomous snakes are thought to have appeared about 60 million years ago — “ambush predators” that have shared the trees and grasslands with primates.

Nishijo’s laboratory studies the neural mechanisms responsible for emotion and fear in rhesus macaque monkeys, especially instinctive responses that occur without learning or memory. Previous researchers have used snakes to provoke fear in monkeys, he noted. When Nishijo heard of Isbell’s theory, he thought it might explain why monkeys are so afraid of snakes.

"The results show that the brain has special neural circuits to detect snakes, and this suggests that the neural circuits to detect snakes have been genetically encoded," Nishijo said.

The monkeys tested in the experiment were reared in a walled colony, and neither had previously encountered a real snake.

"I don’t see another way to explain the sensitivity of these neurons to snakes except through an evolutionary path," Isbell said.

Isbell said she’s pleased to be able to collaborate with neuroscientists.

"I don’t do neuroscience and they don’t do evolution, but we can put our brains together and I think it brings a wider perspective to neuroscience and new insights for evolution," she said.

Oct 29, 2013 · 83 notes
#evolution #emotion #fear #brain mapping #neuroscience #science
Rare Childhood Disease May Hold Clues to Treating Alzheimer's and Parkinson's

Scientists at Rutgers University studying the cause of a rare childhood disease that leaves children unable to walk by adolescence say new findings may provide clues to understanding more common neurodegenerative diseases like Alzheimer’s and Parkinson’s and developing better tools to treat them.

(Image courtesy of the A-T Children’s Project: Andrew, 14, who has A-T disease, with his brother Brendan, 12, who did not inherit the rare childhood neurodegenerative disorder.)

In today’s online edition of Nature Neuroscience, professors Karl Herrup, Ronald Hart and Jiali Li in the Department of Cell Biology and Neuroscience, and Alexander Kusnecov, associate professor in behavioral and systems neuroscience in the Department of Psychology, provide new information about A-T disease, a rare genetic childhood disorder that occurs in an estimated 1 in 40,000 births.

Children born with A-T disease have mutations in both of their copies of the ATM gene and cannot make normal ATM protein. This leads to problems in movement, coordination, equilibrium and muscle control as well as a number of other deficiencies outside the nervous system.

Using mouse and human brain tissue studies, Rutgers researchers found that without ATM, the levels of a regulatory protein known as EZH2 go up. Looking through the characteristics of A-T disease in cells in tissue culture and in brain samples from both humans and mice with ATM mutation, they found that the increase in EZH2 was a major contributing factor to the neuromuscular problems caused by A-T.

“We hope that this work will lead to new therapies to prevent symptoms in those with A-T disease,” says Hart. “But on a larger level, this research provides a strong clue toward understanding more common neurodegenerative disorders that may use similar pathways. It is a theme that has not yet been examined.”

The EZH2 protein has been shown to help determine whether genes are turned on or off, altering the body’s ability to perform biological functions necessary for maintaining good health. The Rutgers study is the first time this protein – which can cause adverse health effects if there is too much of it – has been examined in the mature nerve cells of the brain.

By reducing the excess EZH2 protein that accumulated in mice genetically engineered with A-T disease, and creating a better protein balance within the nerve cells, Rutgers scientists found that mice exhibited improved muscle control, movement and coordination.

In the study, mutant mice that had A-T disease and increased levels of EZH2 were “cured” when the excess EZH2 protein was reduced. The treated mice were able to stay on a rotating rod without falling off almost as long as mice that did not have A-T disease. By contrast, untreated A-T animals lost their balance and fell off the device almost immediately. The mice were also studied in an open-field setting. While the treated A-T mice and normal mice explored a wide area of the open field, the untreated A-T mice, with their excess EZH2 protein, were not as adventurous and stayed back.

Rutgers scientists say the implications of these findings now need to be validated in a clinical setting. They have begun working with the A-T Clinical Center at Johns Hopkins University, collecting blood samples from children with the disease as well as their parents who carry the genes in order to reprogram them into stem cells. This will allow scientists to create human neurons like those in A-T patients and study the mechanisms that lead from ATM mutations to nerve cell disease in more detail.

The hope is that this new information can be used to develop therapeutic drugs that may result in better neuromuscular control and coordination for those with A-T disease. In addition, the scientists will work to determine whether the EZH2 protein plays a role in other more common neurodegenerative diseases, like Parkinson’s and Alzheimer’s and could offer a target for developing drugs to treat those brain disorders.

“What is interesting about human health and this research in particular is that it illustrates how a disease that is thought of as 100 percent genetic, actually has a component that is sensitive to the environment,” says Herrup, lead author of the study.

Oct 28, 2013 · 56 notes
#neurodegenerative diseases #a-t disease #ATM gene #genetics #EZH2 #neuromuscular control #neuroscience #science
Epigenetics: A Key to Controlling Acute and Chronic Pain

Epigenetics, the study of changes in gene expression through mechanisms outside of the DNA structure, has been found to control a key pain receptor related to surgical incision pain, according to a study in the November issue of Anesthesiology. This study reveals new information about pain regulation in the spinal cord.

“Postoperative pain is an incompletely understood and only partially controllable condition that can result in suffering, medical complications, unplanned hospital admissions and disappointing surgery outcomes,” said David J. Clark, M.D., Ph.D., Professor of Anesthesia at Stanford University and Director of Pain Management at the VA Palo Alto Health Care System. “We know that histone acetylation and deacetylation modifies many cellular processes and produces distinct outcomes. In this study we found that histones can epigenetically activate or silence gene expression to either increase or decrease incision pain.”

Human DNA is wrapped around proteins called histones, much like thread is wrapped around a spool. When a histone undergoes deacetylation, the DNA wraps more tightly around the spool, effectively silencing genes. Conversely, when it undergoes acetylation, the DNA is loosened, allowing for transcription or modifications of genes to occur.

In this study, groups of mice had small surgical incisions made in their hind paws after being anesthetized. These mice were then regularly injected with suberoylanilide hydroxamic acid (SAHA), which prevents deacetylation (thus promoting gene transcription), or anacardic acid, which prevents acetylation (thus reducing gene transcription). The authors tested the animals daily for the degree of pain sensitivity in their hind paws.

The study found that regulation of histone acetylation can control pain sensitization after an incision. Specifically, maintaining histone in a relatively deacetylated state reduced hypersensitivity after incision. This is due, in part, to the epigenetic regulation of a specific gene known as CXCR2 and one of its chemokine ligands (KC). The authors also found that these epigenetic changes far outlasted the recovery of animals from their incisions, a property that might help explain why some patients suffer from chronic postoperative pain. Study authors suggest that looking into the roles of these epigenetic mechanisms may help scientists find new ways to treat or prevent acute and chronic postoperative pain in the future.

“Epigenetics is a relatively underappreciated area of science, but the discoveries yet to be made in this field will be many,” said Dr. Clark. “While fascinating information has been found by studying specific genes, we need to bridge the gap in science and focus on groups or systems of many genes simultaneously, which could give us clues to greater breakthroughs in pain control and other areas of medicine.”

Oct 27, 2013 · 132 notes
#epigenetics #pain #acute pain #postoperative pain #histones #CXCR2 #neuroscience #science
Ultrasound device combined with clot-buster safe for stroke

A study led by researchers at The University of Texas Health Science Center at Houston (UTHealth) showed that a hands-free ultrasound device combined with a clot-busting drug was safe for ischemic stroke patients.

The results of the phase II pilot study were reported today in the American Heart Association journal Stroke. Lead author is Andrew D. Barreto, M.D., assistant professor of neurology in the Stroke Program at the UTHealth Medical School. Principal investigator is James C. Grotta, M.D., professor and chair of the Department of Neurology at the UTHealth Medical School, the Roy M. & Phyllis Gough Huffington Distinguished Chair and co-director of the Mischer Neuroscience Institute at Memorial Hermann-Texas Medical Center.

The device, which uses UTHealth technology licensed to Cerevast Therapeutics, Inc., is placed on the stroke patient’s head and delivers ultrasound to enhance the effectiveness of the clot-busting drug tissue plasminogen activator (tPA). Unlike the traditional hand-held ultrasound probe that is aimed at a blood clot, the hands-free device uses 18 separate probes and showers ultrasound over the deep areas of the brain where large blood clots cause severe strokes.

“Our goal is to open up more arteries in the brain and help stroke patients recover,” said Barreto, an attending physician at Mischer Neuroscience Institute. “This technology would have a significant impact on patients, families and society if we could improve outcomes by another 10 percent or more by adding ultrasound to patients who’ve already received tPA.”

In the first study of its kind, 20 patients with moderately severe ischemic stroke (12 men and eight women, average age 63 years) received intravenous tPA up to 4.5 hours after symptom onset, along with two hours of exposure to 2-MHz pulsed-wave transcranial ultrasound.

Researchers reported that 13 patients (65 percent) either returned home or went to rehabilitation 90 days after the combination treatment. After three months, five of the 20 patients had no disability from the stroke and one had only slight disability.

Oct 27, 2013 · 73 notes
#science #stroke #ultrasound device #tissue plasminogen activator #clotbust #neuroscience
Lou Gehrig's Disease: From Patient Stem Cells to Potential Treatment Strategy in One Study

Although the technology has existed for just a few years, scientists increasingly use “disease in a dish” models to study genetic, molecular and cellular defects. But a team of doctors and scientists led by researchers at the Cedars-Sinai Regenerative Medicine Institute went further in a study of Lou Gehrig’s disease, a fatal disorder that attacks muscle-controlling nerve cells in the brain and spinal cord.

After using an innovative stem cell technique to create neurons in a lab dish from skin scrapings of patients who have the disorder, the researchers inserted molecules made of small stretches of genetic material, blocking the damaging effects of a defective gene and, in the process, providing “proof of concept” for a new therapeutic strategy – an important step in moving research findings into clinical trials.

The study, published Oct. 23 in Science Translational Medicine, is believed to be one of the first in which a specific form of Lou Gehrig’s disease, or amyotrophic lateral sclerosis, was replicated in a dish, analyzed and “treated,” suggesting a potential future therapy all in a single study.

"In a sense, this represents the full spectrum of what we are trying to accomplish with patient-based stem cell modeling. It gives researchers the opportunity to conduct extensive studies of a disease’s genetic and molecular makeup and develop potential treatments in the laboratory before translating them into patient trials," said Robert H. Baloh, MD, PhD, director of Cedars-Sinai’s Neuromuscular Division in the Department of Neurology and director of the multidisciplinary ALS Program. He is the lead researcher and the article’s senior author.

Laboratory models of diseases have been made possible by a recently invented process using induced pluripotent stem cells – cells derived from a patient’s own skin samples and “sent back in time” through genetic manipulation to an embryonic state. From there, they can be made into any cell of the human body.

The cells used in the study were produced by the Induced Pluripotent Stem Cell Core Facility of Cedars-Sinai’s Regenerative Medicine Institute. Dhruv Sareen, PhD, director of the iPSC facility and a faculty research scientist with the Department of Biomedical Sciences, is the article’s first author and one of several institute researchers who participated in the study.

"In these studies, we turned skin cells of patients who have ALS into motor neurons that retained the genetic defects of the disease," Baloh said. "We focused on a gene, C9ORF72, that two years ago was found to be the most common cause of familial ALS and frontotemporal lobar degeneration, and even causes some cases of Alzheimer’s and Parkinson’s disease. What we needed to know, however, was how the defect triggered the disease so we could find a way to treat it."

Frontotemporal lobar degeneration is a brain disorder that typically leads to dementia and sometimes occurs in tandem with ALS.

The researchers found that the genetic defect of C9ORF72 may cause disease because it changes the structure of ribonucleic acid (RNA) coming from the gene, creating an abnormal buildup of a repeated set of nucleotides, the basic components of RNA.

"We think this buildup of thousands of copies of the repeated sequence GGGGCC in the nucleus of patients’ cells may become ‘toxic’ by altering the normal behavior of other genes in motor neurons," Baloh said. "Because our studies supported the toxic RNA mechanism theory, we used two small segments of genetic material called antisense oligonucleotides – ASOs – to block the buildup and degrade the toxic RNA. One ASO knocked down overall C9ORF72 levels. The other knocked down the toxic RNA coming from the gene without suppressing overall gene expression levels. The absence of such potentially toxic RNA, and no evidence of detrimental effect on the motor neurons, provides a strong basis for using this strategy to treat patients suffering from these diseases."

Researchers from another institution recently led a phase one trial of a similar ASO strategy to treat ALS caused by a different genetic mutation and reportedly uncovered no safety issues.

Oct 26, 2013 · 50 notes
#neurodegenerative diseases #ALS #lou gehrig’s disease #motor neurons #genetics #neuroscience #science
Genetic analysis reveals insights into the genetic architecture of OCD, Tourette syndrome

An international research consortium led by investigators at Massachusetts General Hospital (MGH) and the University of Chicago has answered several questions about the genetic background of obsessive-compulsive disorder (OCD) and Tourette syndrome (TS), providing the first direct confirmation that both are highly heritable and also revealing major differences between the underlying genetic makeup of the disorders. Their report is being published in the October issue of the open-access journal PLOS Genetics.

"Both TS and OCD appear to have a genetic architecture of many different genes – perhaps hundreds in each person – acting in concert to cause disease,” says Jeremiah Scharf, MD, PhD, of the Psychiatric and Neurodevelopmental Genetics Unit in the MGH Departments of Psychiatry and Neurology, senior corresponding author of the report. “By directly comparing and contrasting both disorders, we found that OCD heritability appears to be concentrated in particular chromosomes – particularly chromosome 15 – while TS heritability is spread across many different chromosomes.”

An anxiety disorder characterized by obsessions and compulsions that disrupt the lives of patients, OCD is the fourth most common psychiatric illness. TS is a chronic disorder characterized by motor and vocal tics that usually begins in childhood and is often accompanied by conditions like OCD or attention-deficit hyperactivity disorder. Both conditions have been considered to be heritable, since they are known to often recur in close relatives of affected individuals, but identifying specific genes that confer risk has been challenging.

Two reports published last year in the journal Molecular Psychiatry (1, 2), with leadership from Scharf and several co-authors of the current study, described genome-wide association studies (GWAS) of thousands of affected individuals and controls. While those studies identified several gene variants that appeared to increase the risk of each disorder, none of the associations were strong enough to meet the strict standards of genome-wide significance. Since the GWAS approach is designed to identify relatively common gene variants, and it has been proposed that OCD and TS might be influenced by a number of rare variants, the research team adopted a different method. Called genome-wide complex trait analysis (GCTA), the approach allows simultaneous comparison of genetic variation across the entire genome, rather than testing sites one at a time as in GWAS, and can also estimate the proportion of disease heritability attributable to rare and common variants.

"Trying to find a single causative gene for diseases with a complex genetic background is like looking for the proverbial needle in a haystack,” says Lea Davis, PhD, of the section of Genetic Medicine at the University of Chicago, co-corresponding author of the PLOS Genetics report. “With this approach, we aren’t looking for individual genes. By examining the properties of all genes that could contribute to TS or OCD at once, we’re actually testing the whole haystack and asking where we’re more likely to find the needles.”

Using GCTA, the researchers analyzed the same genetic datasets screened in the Molecular Psychiatry reports – almost 1,500 individuals affected with OCD compared with more than 5,500 controls, and nearly 1,500 TS patients compared with more than 5,200 controls. To minimize variation that might result from slight differences in experimental techniques, all genotyping was done by collaborators at the Broad Institute of Harvard and MIT, who generated the data at the same time using the same equipment. Davis was then able to analyze the resulting data on a chromosome-by-chromosome basis, along with the frequency of the identified variants and the function of the variants associated with each condition.

The analysis found that the degree of heritability for both disorders captured by GWAS variants is actually quite close to what was previously predicted based on studies of families affected by the disorders. “This is a crucial point for genetic researchers, as there has been a lot of controversy in human genetics about what is called ‘missing heritability’,” explains Scharf. “For many diseases, definitive genome-wide significant variants account for only a minute fraction of overall heritability, raising questions about the validity of the approach. Our findings demonstrate that the vast majority of genetic susceptibility to TS and OCD can be discovered using GWAS methods. In fact, the degree of heritability captured by GWAS variants is higher for TS and OCD than for any other complex trait studied to date.”
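The distinction the researchers draw between single-variant GWAS tests and genome-wide aggregation can be illustrated with a toy simulation (an illustrative sketch only, not the study’s actual GCTA pipeline; the sample size, SNP count, and heritability below are arbitrary assumptions): when risk is spread across hundreds of variants with tiny individual effects, no single SNP explains much phenotypic variance, yet the combined genetic value explains a great deal.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 1000, 500  # individuals, SNPs (arbitrary illustrative sizes)

# Simulated genotypes (0/1/2 allele counts), standardized per SNP
G = rng.binomial(2, 0.5, size=(n, m)).astype(float)
G = (G - G.mean(axis=0)) / G.std(axis=0)

# Many tiny effects summing to ~50% heritability
beta = rng.normal(0.0, np.sqrt(0.5 / m), size=m)
g = G @ beta                              # aggregate genetic value
y = g + rng.normal(0.0, np.sqrt(0.5), n)  # phenotype = genetics + noise

# GWAS-style view: test one SNP at a time; each explains almost nothing
per_snp_r2 = np.array(
    [np.corrcoef(G[:, j], y)[0, 1] ** 2 for j in range(m)]
)
print("largest single-SNP r^2:", per_snp_r2.max())

# GCTA-style intuition: variance explained by all SNPs jointly is large
agg_r2 = np.corrcoef(g, y)[0, 1] ** 2
print("aggregate genetic r^2:", agg_r2)
```

The single-SNP values hover near zero, so no variant reaches "genome-wide significance," yet the aggregate value recovers roughly the simulated heritability, which mirrors the article's point that heritability can be captured by common variants collectively even when no individual association stands out.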

Nancy Cox, PhD, section chief of Genetic Medicine at the University of Chicago and co-senior author of the PLOS Genetics report, adds, “Despite the fact that we confirm there is shared genetic liability between these two disorders, we also show there are notable differences in the types of genetic variants that contribute to risk. TS appears to derive about 20 percent of genetic susceptibility from rare variants, while OCD appears to derive all of its susceptibility from variants that are quite common, which is something that has not been seen before.”

In terms of the potential impact of the risk-associated variants, about half the risk for both disorders appears to be accounted for by variants already known to influence the expression of genes in the brain. Further investigation of those findings could lead to identification of the affected genes and how the expression changes contribute to the development of TS and OCD. Additional studies in even larger patient populations, some of which are in the planning stages, could identify the biologic pathways disrupted in the disorder, potentially leading to new therapeutic approaches.

Oct 25, 2013 · 119 notes
#tourette syndrome #GWAS #genetics #chromosome 15 #OCD #psychology #neuroscience #science
NIH funds development of novel robots to assist people with disabilities, aid doctors

Three projects have been awarded funding by the National Institutes of Health to develop innovative robots that work cooperatively with people and adapt to changing environments to improve human capabilities and enhance medical procedures. Funding for these projects totals approximately $2.4 million over the next five years, subject to the availability of funds.

The awards mark the second year of NIH’s participation in the National Robotics Initiative (NRI), a commitment among multiple federal agencies to support the development of a new generation of robots that work cooperatively with people, known as co-robots.

“These projects have the potential to transform common medical aids into sophisticated robotic devices that enhance mobility for individuals with visual and physical impairments in ways only dreamed of before,” said NIH Director Francis S. Collins, M.D., Ph.D. “In addition, as we continue to rely on robots to carry out complex medical procedures, it will become increasingly important for these robots to be able to sense and react to changing and unpredictable environments within the body. By supporting projects that develop these capabilities, we hope to increase the accuracy and safety of current and future medical robots.”

NIH is participating in the NRI with the National Science Foundation, the National Aeronautics and Space Administration, and the U.S. Department of Agriculture. NIH has funded three projects to help develop co-robots that can assist researchers, patients, and clinicians.

A Co-Robotic Navigation Aid for the Visually Impaired: The goal is to develop a co-robotic cane for the visually impaired that has enhanced navigation capabilities and that can relay critical information about the environment to its user. Using computer vision, the proposed cane will be able to recognize indoor structures such as stairways and doors, as well as detect potential obstacles. Using an intuitive human-device interaction mechanism, the cane will then convey the appropriate travel direction to the user. In addition to increasing mobility for the visually impaired and thus quality of life, methods developed in the creation of this technology could lead to general improvements in the autonomy of small robots and portable robotics that have many applications in military surveillance, law enforcement, and search and rescue efforts. Cang Ye, Ph.D., University of Arkansas at Little Rock (co-funded by the National Institute of Biomedical Imaging and Bioengineering [NIBIB] and the National Eye Institute).

MRI-Guided Co-Robotic Active Catheter: Atrial fibrillation is an irregular heartbeat that can increase the risk of stroke and heart disease. By purposefully ablating (destroying) specific areas of the heart in a controlled fashion, the propagation of irregular heart activity can be prevented. This is generally achieved by threading a catheter with an electrode at its tip through a vein in the groin until it reaches the patient’s heart. However, the constant movement of the heart as well as unpredictable changes in blood flow can make it difficult to maintain consistent contact with the heart during the ablation procedure, occasionally resulting in too large or too small of a lesion. The aim is to develop a co-robotic catheter that uses novel robotic planning strategies to compensate for physiological movements of the heart and blood and that can be used while a patient undergoes MRI — an imaging method used to take pictures of soft tissues in the body such as the heart. By combining state-of-the art robotics with high-resolution, real-time imaging, the co-robotic catheter could significantly increase the accuracy and repeatability of atrial fibrillation ablation procedures. M. Cenk Cavusoglu, Ph.D., Case Western Reserve University, Cleveland (funded by NIBIB).

Novel Platform for Rapid Exploration of Robotic Ankle Exoskeleton Control: Wearable robots, such as powered braces for the lower extremities, can improve mobility for individuals with impaired strength and coordination due to aging, spinal cord injury, cerebral palsy, or stroke. However, methods for determining the optimal design of an assistive device for use within a specific patient population are lacking. This project proposes to create an experimental platform for an assistive ankle robot to be used in patients recovering from stroke. The platform will allow investigators to systematically test various robotic control methods and to compare them based on measurable physiological outcomes. Results from these tests will provide evidence for making more effective, less expensive, and more manageable assistive technologies. Stephen G. Sawicki, Ph.D., North Carolina State University, Raleigh; Steven Collins, Ph.D., Carnegie Mellon University, Pittsburgh (co-funded by the National Institute of Nursing Research and NSF).

These projects are supported by grants EB018117-01, EB018108-01, and NR014756-01 from the National Institute of Biomedical Imaging and Bioengineering (NIBIB), the National Eye Institute (NEI), and the National Institute of Nursing Research (NINR), and by award #1355716 from the National Science Foundation.

Oct 25, 2013 · 63 notes
#robotics #neuroimaging #neuroscience #technology #science
Child neurologist finds potential route to better treatments for Fragile X, autism

When you experience something, neurons in the brain send chemical signals called neurotransmitters across synapses to receptors on other neurons. How well that process unfolds determines how you comprehend the experience and what behaviors might follow. In people with Fragile X syndrome, a third of whom are eventually diagnosed with Autism Spectrum Disorder, that process is severely hindered, leading to intellectual impairments and abnormal behaviors.

In a study published in the online journal PLoS One, a team of UNC School of Medicine researchers led by pharmacologist C.J. Malanga, MD, PhD, describes a major reason why current medications only moderately alleviate Fragile X symptoms. Using mouse models, Malanga discovered that three specific drugs affect three different kinds of neurotransmitter receptors that all seem to play roles in Fragile X. As a result, current Fragile X drugs have limited benefit because most of them only affect one receptor.

Nearly one million people in the United States have Fragile X Syndrome, which is the result of a single mutated gene called FMR1. In people without Fragile X, the gene produces a protein that helps maintain the proper strength of synaptic communication between neurons. In people with Fragile X, FMR1 doesn’t produce the protein, the synaptic connection weakens, and there’s a decrease in synaptic input, leading to mild to severe learning disabilities and behavioral issues, such as hyperactivity, anxiety, and sensitivity to sensory stimulation, especially touch and noise.

More than two decades ago, researchers discovered that – in people with mental and behavior problems – a receptor called mGluR5 could not properly regulate the effect of the neurotransmitter, glutamate. Since then, pharmaceutical companies have been trying to develop drugs that target glutamate receptors. “It’s been a challenging goal,” Malanga said. “No one so far has made it work very well, and kids with Fragile X have been illustrative of this.”

But there are other receptors that regulate other neurotransmitters in similar ways to mGluR5. And there are drugs already available for human use that act on those receptors. So Malanga’s team checked how those drugs might affect mice in which the Fragile X gene has been knocked out.

By electrically stimulating specific brain circuits, Malanga’s team first learned how the mice perceived reward. The mice quickly learned that if they pressed a lever, they were rewarded with a mild electrical stimulation. His team then provided a drug molecule that acts on the same reward circuitry to see how each drug affected response patterns and other behaviors in the mice.

His team studied one drug that blocked dopamine receptors, another drug that blocked mGluR5 receptors, and another drug that blocked mAChR1, or M1, receptors. Three different types of neurotransmitters – dopamine, glutamate, and acetylcholine – act on those receptors. And there were big differences in how sensitive the mice were to each drug.

“Turns out, based on our study and a previous study we did with my UNC colleague Ben Philpot, that Fragile X mice and Angelman Syndrome mice are very different,” Malanga said. “And how the same pharmaceuticals act in these mouse models of Autism Spectrum Disorder is very different.”

Malanga’s finding suggests that not all people with Fragile X share the same biological hurdles. The same is likely true, he said, for people with other autism-related disorders, such as Rett syndrome and Angelman syndrome.

“Fragile X kids likely have very different sensitivities to prescribed drugs than do other kids with different biological causes of autism,” Malanga said.

Oct 24, 2013
#fragile x syndrome #glutamate #neurotransmitters #autism #acetylcholine #dopamine #neuroscience #science
A step towards early Alzheimer’s diagnosis

Treating Alzheimer’s disease in the future will require early diagnosis, which is not yet possible. Now researchers at Linköping University and other institutions have identified six proteins in spinal fluid that can be used as markers for the illness.

Alzheimer’s causes great suffering and is invariably fatal. By the time symptoms begin to appear, the breakdown of brain cells has been in progress for ten years or more. Currently there is no treatment that can stop the process.

(Image: Human neuroblastoma with cell nucleus in blue; beta amyloid as red aggregates within green-tinted lysosomes. Photo: Lotta Agholme.)

Most researchers now agree that one cause of the illness is toxic accumulations – plaques – of the beta amyloid protein. In a healthy brain, the cells are cleansed of such surplus products through lysosomes, the cells’ “waste disposal facilities” (green in the picture).

“In victims of Alzheimer’s, something happens to the lysosomes so that they can’t manage to take care of the surplus of beta amyloid. They fill up with junk that normally is broken down into its component parts and recycled,” says Katarina Kågedal, reader in Experimental Pathology at Linköping University. She led the study that is now being published in Neuromolecular Medicine.

The researchers’ hypothesis was that these changes in the brain’s lysosomal network could be reflected in the spinal fluid, which surrounds the brain’s various parts and drains down into the spinal column. They studied samples of spinal fluid from 20 Alzheimer’s patients and an equal number of healthy control subjects. The screening was aimed at 35 proteins that are associated with the lysosomal network.

“Six of these had clearly increased in the patients; none of them were previously known as markers for Alzheimer’s,” says Kågedal.

Her hope is that the group’s discovery will contribute to early diagnosis of the illness, a necessary first step toward reliable clinical trials of drug candidates. But perhaps the six lysosomal proteins could also serve as drug targets – targets for developing drugs.

“It may be a question of strengthening protection against plaque formation or reactivating the lysosomes so that they manage to break down the plaque,” Kågedal says.

The study was conducted on 20 anonymised, archived spinal fluid samples, and the results were afterwards confirmed on an independent set of samples of equal size. All samples were provided by the Laboratory for Clinical Chemistry at Sahlgrenska University Hospital.

Oct 24, 2013
#alzheimer's disease #memory #lysosomes #neuroblastoma #neuroscience #science
Lower Blood Sugars May Be Good for the Brain

Even for people who don’t have diabetes or high blood sugar, those with higher blood sugar levels are more likely to have memory problems, according to a new study published in the October 23, 2013, online issue of Neurology®, the medical journal of the American Academy of Neurology.

The study involved 141 people with an average age of 63 who did not have diabetes or pre-diabetes (also called impaired glucose tolerance). People who were overweight, who drank more than three and a half servings of alcohol per day, or who had memory and thinking impairment were excluded from the study.

The participants’ memory skills were tested, along with their blood glucose, or sugar, levels. Participants also had brain scans to measure the size of the hippocampus area of the brain, which plays an important role in memory.

People with lower blood sugar levels were more likely to have better scores on the memory tests. On a test where participants needed to recall a list of 15 words 30 minutes after hearing them, recalling fewer words was associated with higher blood sugar levels. For example, an increase of about 7 mmol/mol of a long-term marker of glucose control called HbA1c went along with recalling 2 fewer words. People with higher blood sugar levels also had smaller volumes in the hippocampus.

“These results suggest that even for people within the normal range of blood sugar, lowering their blood sugar levels could be a promising strategy for preventing memory problems and cognitive decline as they age,” said study author Agnes Flöel, MD, of Charité University Medicine in Berlin, Germany. “Strategies such as lowering calorie intake and increasing physical activity should be tested.”

Oct 24, 2013
#glucose #diabetes #hippocampus #neuroimaging #neuroscience #science
Baby's Innate Number Sense Predicts Future Math Skill

Innate ability to identify quantities previews future mathematics performance

Babies who are good at telling the difference between large and small groups of items even before learning how to count are more likely to do better with numbers in the future, according to new research from the Duke Institute for Brain Sciences. 

The use of Arabic numerals to represent different values is a skill unique to humans. But we aren’t born with it: infants don’t have the words to count to 10. So scientists have hypothesized that the rudimentary sense of numbers in infants is the foundation for higher-level math understanding.

A new study, appearing online in the Oct. 21 Proceedings of the National Academy of Sciences, suggests that children do, in fact, tap into this innate numerical ability when learning symbolic mathematical systems. The Duke researchers found that the strength of an infant’s inborn number sense can be predictive of the child’s future mathematical abilities.  

"When children are acquiring the symbolic system for representing numbers and learning about math in school, they’re tapping into this primitive number sense," said Elizabeth Brannon, Ph.D., a professor of psychology and neuroscience, who led the study. "It’s the conceptual building block upon which mathematical ability is built."

Brannon explained that babies come into the world with a rudimentary understanding referred to as a primitive number sense. When looking at two collections of objects, primitive number sense allows them to identify which set is numerically larger even without verbal counting or using Arabic numerals. For example, just by glancing, a person instinctively knows that a group of 15 strawberries is larger than a group of six oranges.

Understanding how infants and young children conceptualize and understand number can lead to the development of new mathematics education strategies, said Brannon’s colleague, Duke psychology and neuroscience graduate student Ariel Starr. In particular, this knowledge can be used to design interventions for young children who have trouble learning mathematics symbols and basic methodologies.

To test for primitive number sense, Brannon and Starr analyzed 48 6-month-old infants to see whether they could recognize numerical changes, capitalizing on the interest most babies show in things that change. They placed each baby in front of two screens, one that always showed the same number of dots (e.g., eight), changing in size and position, and another that switched between two different numerical values (e.g., eight and 16 dots). All the arrays of dots changed frequently in size and position. In this task, babies that could tell the difference between the two numerical values (e.g., eight and 16) looked longer at the numerically changing screen.  
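The preference measure in looking-time studies of this kind is typically the fraction of total looking time spent on the novel display, here the numerically changing screen. A minimal sketch of that convention (the function name and timings are illustrative, not taken from the paper):

```python
# Hypothetical sketch of a looking-time preference score, as commonly
# computed in infant looking-time studies (illustrative, not from the paper).
def preference_score(changing_ms: float, constant_ms: float) -> float:
    """Fraction of total looking time spent on the numerically changing
    screen. 0.5 means no preference; higher values suggest the infant
    discriminated the two numerosities and preferred the changing display."""
    total = changing_ms + constant_ms
    if total <= 0:
        raise ValueError("no looking time recorded")
    return changing_ms / total

# An infant who looks 12 s at the changing screen and 8 s at the
# constant screen gets a score of 12000 / 20000 = 0.6.
score = preference_score(changing_ms=12_000, constant_ms=8_000)
print(score)  # 0.6
```

Under this convention, scores reliably above 0.5 across trials are taken as evidence that the infant detected the numerical change.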

Brannon and Starr then tested the same children at 3.5 years of age with a non-symbolic number comparison game. The children were shown two different arrays and asked to choose which one had more dots without counting them. In addition, the children took a standardized math test scaled for pre-schoolers, as well as a standardized IQ test. Finally, the researchers gave the children a simple verbal task to identify the largest number word each child could concretely understand.

"We found that infants with higher preference scores for looking at the numerically changing screen had better primitive number sense three years later compared to those infants with lower scores," Starr said. "Likewise, children with higher scores in infancy performed better on standardized math tests."

Brannon said the findings point to a real connection between symbolic math and quantitative abilities that are present in infancy before education takes hold and shapes our mathematical abilities.

"Our study shows that infant number sense is a predictor of symbolic math," Brannon said. "We believe that when children learn the meaning of number words and symbols, they’re likely mapping those meanings onto pre-verbal representations of number that they already have in infancy," she said. 

"We can’t measure a baby’s number sense ability at 6 months and know how they’ll do on their SATs," Brannon added. "In fact, our infant task only explains a small percentage of the variance in young children’s math performance. But our findings suggest that there is cognitive overlap between primitive number sense and symbolic math. These are fundamental building blocks."

Oct 23, 2013
#numerical cognition #infants #child development #psychology #neuroscience #science
Major Alzheimer's Risk Factor Linked to Red Wine Target

Buck Institute study provides insight for new therapeutics that target the interaction between ApoE4 and a Sirtuin protein

The major genetic risk factor for Alzheimer’s disease (AD), present in about two-thirds of people who develop the disease, is ApoE4, a cholesterol-carrying protein that about a quarter of us are born with. But one of the unsolved mysteries of AD is how ApoE4 confers that risk for the incurable, neurodegenerative disease. In research published this week in the Proceedings of the National Academy of Sciences, researchers at the Buck Institute found a link between ApoE4 and SirT1, an “anti-aging protein” targeted by resveratrol, a compound found in red wine.

The Buck researchers found that ApoE4 causes a dramatic reduction in SirT1, which is one of seven human Sirtuins. Lead scientists Rammohan Rao, PhD, and Dale Bredesen, MD, founding CEO of the Buck Institute, say the reduction was found both in cultured neural cells and in brain samples from patients with ApoE4 and AD. “The biochemical mechanisms that link ApoE4 to Alzheimer’s disease have been something of a black box. However, recent work from a number of labs, including our own, has begun to open the box,” said Bredesen.

The Buck group also found that the abnormalities associated with ApoE4 and AD, such as the creation of phospho-tau and amyloid-beta, could be prevented by increasing SirT1, and they have identified drug candidates that exert the same effect. “This research offers a new type of screen for Alzheimer’s prevention and treatment,” said Rao, an associate research professor at the Buck. “One of our goals is to identify a safe, non-toxic treatment that could be given to anyone who carries the ApoE4 gene to prevent the development of AD.”

In particular, the researchers discovered that the reduction in SirT1 was associated with a change in the way the amyloid precursor protein (APP) is processed. Rao said that ApoE4 favored the formation of the amyloid-beta peptide associated with the sticky plaques that are one of the hallmarks of the disease. With ApoE3 (which confers no increased risk of AD), a higher ratio of the anti-Alzheimer’s peptide sAPP alpha to the pro-Alzheimer’s amyloid-beta peptide was produced. This finding fits well with the reduction in SirT1, since overexpressing SirT1 has previously been shown to increase ADAM10, the protease that cleaves APP to produce sAPP alpha and prevent amyloid-beta formation.

AD affects over 5 million Americans, and there are no treatments known to cure, or even halt, the progression of symptoms, which include loss of memory and language. Preventive treatments are particularly needed for the 2.5 percent of the population who carry two copies of the ApoE4 gene, which puts them at an approximately 10-fold higher risk of developing AD, as well as for the 25 percent of the population with a single copy. The group hopes that the current work will identify simple, safe therapeutics that can be given to ApoE4 carriers to prevent the development of Alzheimer’s disease.

Oct 22, 2013
#alzheimer's disease #dementia #resveratrol #ApoE4 #SirT1 #amyloid beta #genetics #neuroscience #science
Shorter Sleep Duration and Poorer Sleep Quality Linked to Alzheimer’s Disease Biomarker

Poor sleep quality may impact Alzheimer’s disease onset and progression. This is according to a new study led by researchers at the Johns Hopkins Bloomberg School of Public Health who examined the association between sleep variables and a biomarker for Alzheimer’s disease in older adults. The researchers found that reports of shorter sleep duration and poorer sleep quality were associated with a greater β-Amyloid burden, a hallmark of the disease. The results are featured online in the October issue of JAMA Neurology.

“Our study found that among older adults, reports of shorter sleep duration and poorer sleep quality were associated with higher levels of β-Amyloid measured by PET scans of the brain,” said Adam Spira, PhD, lead author of the study and an assistant professor with the Bloomberg School’s Department of Mental Health. “These results could have significant public health implications as Alzheimer’s disease is the most common cause of dementia, and approximately half of older adults have insomnia symptoms.”

Alzheimer’s disease is an irreversible, progressive brain disease that slowly destroys memory and thinking skills. According to the National Institutes of Health, as many as 5.1 million Americans may have the disease, with first symptoms appearing after age 60. Previous studies have linked disturbed sleep to cognitive impairment in older people.

In a cross-sectional study of adults (average age 76) from the neuroimaging substudy of the Baltimore Longitudinal Study of Aging, the researchers examined the association between self-reported sleep variables and β-Amyloid deposition. Study participants reported sleep durations ranging from more than seven hours to no more than five hours. β-Amyloid deposition was measured with the Pittsburgh compound B tracer and PET (positron emission tomography) scans of the brain. Reports of shorter sleep duration and lower sleep quality were both associated with greater Aβ buildup.

“These findings are important in part because sleep disturbances can be treated in older people. To the degree that poor sleep promotes the development of Alzheimer’s disease, treatments for poor sleep or efforts to maintain healthy sleep patterns may help prevent or slow the progression of Alzheimer’s disease,” said Spira. He added that the findings cannot demonstrate a causal link between poor sleep and Alzheimer’s disease, and that longitudinal studies with objective sleep measures are needed to further examine whether poor sleep contributes to or accelerates the disease.

Oct 22, 2013
#alzheimer's disease #dementia #sleep #neuroimaging #beta amyloid #insomnia #neuroscience #science