Posts tagged neuroscience

Although drugs have been developed to counteract the imbalance of neurotransmitters in the brain – a condition underlying many brain disorders and nervous system diseases – the exact mechanism by which these drugs work has not been fully explained.

Now, researchers at the Hebrew University of Jerusalem, using baker’s yeast as a model, have deciphered the mode by which the inhibitors affect the neurological transmission process and have even been able to manipulate it.
Their work, reported in a recent article in the Journal of Biological Chemistry, raises hopes that these insights could eventually guide clinical scientists to develop new and more effective drugs for brain disorders associated with neurotransmitter imbalance.
All of the basic tasks of our existence – breathing, heartbeat, memory formation, physical movement – are executed by the brain, and they depend on the highly regulated and efficient release of neurotransmitters: chemicals that act as messengers, enabling extremely rapid communication between neurons in the brain.
When even one part of the everyday “conversation” between neighboring neurons breaks down, the results can be devastating. Many brain disorders and nervous system diseases, including Huntington’s disease, various motor dysfunctions and even Parkinson’s disease, have been linked to problems with neurotransmitter transport.
Neurotransmitters are stored in the neuron in small, bubble-like compartments called vesicles, which contain transport proteins responsible for loading the neurotransmitters into the vesicles.
The storage of certain neurotransmitters is controlled by what is called the vesicular monoamine transporter (VMAT), which is known to transport a variety of vital neurotransmitters, such as adrenaline, dopamine and serotonin.
It can also transport the detrimental MPP+, a neurotoxin used in models of Parkinson’s disease.
A number of studies demonstrated the significance of VMAT as a target for drug therapy in a variety of pathologic states, such as high blood pressure, hyperkinetic movement disorders and Tourette syndrome.
Many of the drugs that target VMAT act as inhibitors, including the classical VMAT2 inhibitor, tetrabenazine. Tetrabenazine has long been used for the treatment of motor dysfunctions associated with Huntington’s disease and other movement disorders. However, the mechanism by which the drug affects the storage of neurotransmitters was not fully understood.
The Hebrew University study set out, therefore, to achieve an understanding of the basic biochemical mechanism underlying the VMAT reaction, with a view towards better controlling it through new drug designs.
The research was conducted in the laboratory of Prof. Shimon Schuldiner of the Hebrew University’s Department of Biological Chemistry by Dr. Yelena Ugolev, a postdoctoral fellow in the laboratory, and research students Tali Segal, Dana Yaffe and Yael Gros.
To identify protein sequences responsible for tetrabenazine binding, the Hebrew University scientists harnessed the power of yeast genetics along with the method of directed evolution.
Expressing the human protein VMAT in baker’s yeast cells gives them the ability to grow in the presence of toxic substrates, such as the neurotoxin MPP+. Directed evolution, a method used in protein engineering, mimics natural evolution in the laboratory.
Through rounds of random mutations targeted to the gene encoding the protein of interest, proteins can be tuned to acquire new properties or to adapt to new functions or environments.
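The mutate-and-select loop at the heart of directed evolution can be illustrated with a toy simulation. This is a hypothetical sketch, not the actual yeast screen used in the study: fitness here is a simple score against a target "phenotype", standing in for survival in the presence of a toxic substrate.

```python
import random

def directed_evolution(target, generations=30, pop_size=50, mut_rate=0.1, seed=0):
    """Toy directed-evolution loop: repeatedly mutate a bit-string 'gene'
    at random and keep the fittest variant, mimicking rounds of random
    mutation plus selection."""
    rng = random.Random(seed)
    fitness = lambda v: sum(a == b for a, b in zip(v, target))
    parent = [rng.randint(0, 1) for _ in range(len(target))]
    for _ in range(generations):
        # Each round: generate mutant offspring, then select the best.
        pool = [[b ^ (rng.random() < mut_rate) for b in parent]
                for _ in range(pop_size)]
        parent = max(pool, key=fitness)
    return parent, fitness(parent)

target = [1] * 20
best, score = directed_evolution(target)
print(score)  # fitness of the evolved variant (maximum is 20)
```

The same mutate–select cycle, with survival on toxic substrates as the selection step, is what lets the approach home in on the protein regions that matter.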
The study led to the identification of important flexible domains (regions) in the structure of VMAT that are responsible for structural rearrangements involved in tetrabenazine binding and that also regulate the velocity of neurotransmitter transport.
Utilizing these new, controllable adaptations could serve as a guide for clinical scientists to develop more efficient drugs for brain disorders associated with neurotransmitter imbalance, say the Hebrew University researchers.
(Source: eurekalert.org)
Although brain growth slows as individuals age, some regions of the brain continue to develop for longer than others, creating new connections and remodeling existing circuitry. How this happens is a key question in neuroscience, with implications for brain health and neurodegenerative diseases. New research published today shows that those areas of the adult brain that consume more fuel than scientists might expect also share key characteristics with the developing brain. Two Allen Brain Atlas resources – the Allen Human Brain Atlas and the BrainSpan Atlas of the Developing Human Brain – were crucial to uncovering the significance of these sugar-hungry regions. The results are published this month in the journal Cell Metabolism.

"These experiments and analysis represent the first union of its kind between functional imaging data and a biological mechanism, with the Allen Brain Atlas resources helping to bridge that gap," comments Michael Hawrylycz, Ph.D., Investigator with the Allen Institute for Brain Science and co-author of the study. Data from PET scans provide a functional picture of the brain, but until now could not elucidate the biology underlying that activity. "Now we can make the comparison between the functional data and the gene expression data," says Hawrylycz, "so instead of just the ‘where,’ we now also have the ‘what’ and ‘how.’"
The brain must constantly metabolize fuel to keep running, most often through glycolysis: the breaking down of stored sugar into usable energy. PET scans of the brain, which illuminate regions consuming sugar, show that select areas of the brain exhibit fuel consumption above and beyond what is needed for basic functioning. In cancer biology, this same well-known phenomenon of consuming extra fuel—called “aerobic glycolysis”—is thought to provide support pathways for cell proliferation. In the brain, aerobic glycolysis increases dramatically during childhood and accounts for as much as one third of total brain glucose consumption at its peak around 5 years of age, which is also the peak of synapse development.
Since aerobic glycolysis varies by region of the brain, Hawrylycz and co-author Marcus Raichle, Ph.D., at Washington University in St. Louis, wondered whether regions of the brain with higher levels of aerobic glycolysis might be associated with equivalent growth processes, like synapse formation. If so, this would point to aerobic glycolysis as a reflection of “neoteny,” or persistent brain development like the kind that takes place during early childhood.
In order to delve into the significance of aerobic glycolysis, researchers examined the genes expressed at high levels in those regions where aerobic glycolysis was taking place. The team identified 16 regions of the brain with elevated levels of aerobic glycolysis and ranked their neotenous characteristics. True to prediction, they found that gene expression data from those 16 regions suggested highly neotenous behavior.
The next phase was to identify which genes were specifically correlated with aerobic glycolysis in those regions. The Allen Brain Atlas resources proved crucial in this task, helping to pinpoint gene expression in different regions at various points in development. The Allen Human Brain Atlas was used to investigate the adult human brain, while the BrainSpan Atlas of the Developing Human Brain, developed by a consortium of partners and funded by the National Institutes of Health, provided a window into how gene expression changes as the brain ages.
Analysis of those genes pointed clearly toward roles in growth and development; top genes included those responsible for axon guidance, potassium ion channel development, synaptic transmission and plasticity, and many more. The consistent theme was development, pointing to aerobic glycolysis as a hallmark of neotenous, continually developing regions of the brain.
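The correlation step described above can be pictured with a toy example. The data and gene names below are invented for illustration; the study used Allen Brain Atlas expression values across the 16 glycolysis-elevated regions. The idea is simply to rank genes by the Pearson correlation between their regional expression and each region's level of aerobic glycolysis.

```python
# Toy ranking of genes by correlation with regional aerobic glycolysis.
# All values and gene names here are illustrative stand-ins.
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

glycolysis = [0.9, 0.7, 0.8, 0.3, 0.2, 0.4]        # per-region glycolysis level
expression = {                                      # per-region expression, by gene
    "AXON_GUIDE_X": [0.85, 0.65, 0.90, 0.35, 0.25, 0.30],
    "HOUSEKEEP_Y":  [0.50, 0.50, 0.50, 0.50, 0.60, 0.50],
    "SYNAPSE_Z":    [0.80, 0.60, 0.75, 0.20, 0.30, 0.35],
}
ranked = sorted(expression,
                key=lambda g: pearson(expression[g], glycolysis),
                reverse=True)
print(ranked)  # development-related genes rank above the housekeeping gene
```

Genes whose expression tracks glycolysis across regions rise to the top of the ranking, while a flat "housekeeping" profile falls to the bottom; in the study, the top of the list was dominated by development-related genes.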
"Using both the adult and developmental data, we were able to study gene expression at each point in time," describes Hawrylycz. "From there, we were able to see the roles of those genes that were highly expressed in regions with aerobic glycolysis. As it turns out, those genes are consistently involved in the remodeling and maturation process, synaptic growth and neurogenesis—all factors in neoteny." "The regions we identified as being neotenous are areas of the cortex particularly associated with development of intelligence and learning," explains Hawrylycz. "Our results suggest that aerobic glycolysis, or extra fuel consumption, is a marker for regions of the brain that continue to grow and develop in similar ways to the early human brain."
(Source: eurekalert.org)

Neuroscience Study Uncovers New Player in Obesity
A new neuroscience study sheds light on the biological underpinnings of obesity. The in vivo study, published in the January 8 issue of The Journal of Neuroscience, reveals how a protein in the brain helps regulate food intake and body weight. The findings reveal a potential new avenue for the treatment of obesity and may help explain why medications that are prescribed for epilepsy and other conditions that interfere with this protein, such as gabapentin and pregabalin, can cause weight gain.
The protein – alpha2/delta-1 – has not been linked previously to obesity. A team led by Maribel Rios, Ph.D., associate professor in the department of neuroscience at Tufts University School of Medicine, discovered that alpha2/delta-1 facilitates the function of another protein called brain-derived neurotrophic factor (BDNF). A previous study by Rios determined that BDNF plays a critical role in appetite suppression, while the current study identifies a central mechanism mediating the inhibitory effects of BDNF on overeating.
“We know that low levels of the BDNF protein in the brain lead to overeating and dramatic obesity in mice. Deficiencies in BDNF have also been linked to obesity in humans. Now, we have discovered that the alpha2/delta-1 protein is necessary for normal BDNF function, giving us a potential new target for novel obesity treatments,” said Rios, also a member of the cellular and molecular physiology and neuroscience program faculties at the Sackler School of Graduate Biomedical Sciences at Tufts.
Rios and colleagues discovered that low levels of BDNF were associated with decreased function of alpha2/delta-1 in the hypothalamus, a brain region that is critical to the regulation of food intake and weight. When the team inhibited the alpha2/delta-1 protein in normal mice, the mice ate significantly more food and gained weight. Conversely, when the team corrected the alpha2/delta-1 deficiency in mice with reduced BDNF levels, overeating and weight gain were mitigated. In addition, blood sugar levels (related to diabetes in humans) were normalized.
“We blocked activity of the alpha2/delta-1 protein in mice using gabapentin. These mice ate 39 percent more food, and as a consequence gained substantially more weight than control mice over a seven-day period,” said first author Joshua Cordeira, Ph.D., a graduate of the neuroscience program at the Sackler School and member of Rios’s lab. This study is related to his Ph.D. thesis.
“When we re-introduced alpha2/delta-1 in obese mice lacking BDNF in the brain, we saw a 15-20 percent reduction in food intake and a significant reduction in weight gain. Importantly, metabolic disturbances associated with obesity, including hyperglycemia and deficient glucose metabolism, were greatly reduced by restoring the function of alpha2/delta-1,” added Rios.
Some individuals who take gabapentin and pregabalin report weight gain. Both gabapentin and pregabalin are anticonvulsants, also used to treat nerve pain from, for example, shingles or diabetes. The findings from the Rios lab suggest that these drugs might contribute to weight gain by interfering with alpha2/delta-1 in the hypothalamus. This new understanding of alpha2/delta-1’s role in appetite may allow researchers to develop complementary treatments that can prevent weight gain for patients taking these medications.
“We now know that alpha2/delta-1 plays a critical role in healthy BDNF function. The finding improves our understanding of the intricate neuroscience involved in appetite control. The next phase of our research will be to unravel the mechanisms mediating the satiety effects of alpha2/delta-1 in the hypothalamus,” said Rios.
This latest finding builds on Rios’s previous studies of BDNF and its role in regulating body weight. Earlier work by Rios established BDNF as an essential component of the neural circuits governing body weight in adult mice. Rios also determined that BDNF expression in two regions of the brain is required to suppress appetite.
Tiny Proteins Have Outsized Influence On Nerve Health
Mutations in small proteins that help convey electrical signals throughout the body may have a surprisingly large effect on health, according to results of a new Johns Hopkins study, published in December in Proceedings of the National Academy of Sciences, that used spider, scorpion and sea anemone venom.
The tiny conduits carrying those electrical signals are sodium channels that are vital to our well-being—they trigger action potentials, or spurts of electrical energy that course from body to brain to deliver messages that invoke feelings like pain or temperature sensitivity. When such channels go awry, they contribute to a slew of diseases, one of which is epilepsy.
In the new research, Frank Bosmans, Ph.D., an assistant professor of physiology at the Johns Hopkins University School of Medicine, has found that auxiliary “helper” proteins that interact with sodium channels also play a crucial role. And that, he says, could affect drug development for epilepsy, neurological diseases, muscular disorders and pain syndromes.
“Nobody had thought these tiny molecules that don’t even form the main sodium channel were capable of changing the response of the channel to certain compounds,” Bosmans says. “But in what we consider a new concept, these auxiliary subunits can be considered as drug targets.”
Over the past few decades, there have been hints that these auxiliary proteins were influencing sodium channels, but few analyzed the problem very closely. John Gilchrist, a graduate student in Bosmans’ lab, began evaluating each of the four proteins, one at a time.
Gilchrist engineered frog eggs that made sodium channels and exposed them to the toxins released by tarantulas, scorpions, wasps and sea anemones, an extension of Bosmans’ earlier doctoral research on the effects of animal venoms on sodium channels. He found that one auxiliary protein in particular, beta4, altered the whole sodium channel system. When exposed to tarantula venom, for instance, tissue in the presence of beta4 showed decreased sensitivity in the sodium channels, meaning that the protein changed the way the nerve fired. This suggests that if a person were bitten by a tarantula in a region of the body where beta4 was active, the whole experience might be just a little less painful, says Bosmans.
To figure out what was going on in the altered channels, Bosmans needed to know what the protein looked like, he says. He contacted Filip Van Petegem, a crystallographer at the University of British Columbia in Vancouver, Canada. Van Petegem was able to map the 3-D structure of beta4 down to 1.7 angstroms, the highest possible resolution. Crystal structure in hand, Bosmans could now mutate beta4 and watch what happened.
Purely by chance, Van Petegem had already started that mutation process. To diagram the crystal, he had been forced to substitute one amino acid for another due to quirks in the test system. Bosmans found that this tiny mutation thwarted beta4’s interaction with the sodium channel system.
That finding promptly overturned conventional wisdom about how these proteins behave, Bosmans says.
Back in 1998, Bosmans says, physicians determined that a mutation in the beta1 protein seemed to be triggering a case of epilepsy, a disease with hundreds of causes. It was known at the time that a chemical bridge within the sodium channel held the beta proteins in place. If that bridge, known as a disulfide bond, is broken, the proteins fall apart. The physicians theorized that the mutation they found must have destroyed the bridge and, with it, the attachment of the accompanying proteins. That broken-bridge theory has remained dominant ever since.
But when Bosmans introduced that same mutation in beta4, the structure stayed intact. The changes he saw were much more subtle. The position of the protein Van Petegem had mutated changed slightly so that it was farther away from the channel. And only when that mutated crystal was exposed to a toxin did beta4 lose its ability to communicate with the sodium channel.
Bosmans says that even as evidence of the auxiliary proteins’ importance has mounted, such as in the epilepsy study, drug developers have continued to overlook the proteins as treatment opportunities. Most efforts to develop new drugs for epilepsy still focus exclusively on modifying the sodium channels, which don’t need the beta proteins to operate. But Bosmans believes this is only part of the story.
His new finding suggests that such an approach is shortsighted, because mutations in these beta proteins may very well be causing the disease at hand. Drugs that target the beta proteins have the potential to deliver a much more focused treatment, he says.
"That’s one of the new concepts that we’re trying to launch—keep an eye on these little guy proteins, because they are important. If they have a mutation in them, they can cause a disease,” Bosmans says.
Babbling babies – responding to one-on-one ‘baby talk’ – master more words
Common advice to new parents is that the more words babies hear the faster their vocabulary grows. Now new findings show that what spurs early language development isn’t so much the quantity of words as the style of speech and social context in which speech occurs.
Researchers at the University of Washington and University of Connecticut examined thousands of 30-second snippets of verbal exchanges between parents and babies. They measured parents’ use of a regular speaking voice versus an exaggerated, animated baby talk style, and whether speech occurred one-on-one between parent and child or in group settings.
“What our analysis shows is that the prevalence of baby talk in one-on-one conversations with children is linked to better language development, both concurrent and future,” said Patricia Kuhl, co-author and co-director of UW’s Institute for Learning & Brain Sciences.
The more parents exaggerated vowels – for example, “How are youuuuu?” – and raised the pitch of their voices, the more the 1-year-olds babbled, and babbling is a forerunner of word production. Baby talk was most effective when a parent spoke with a child individually, without other adults or children around.
“The fact that the infant’s babbling itself plays a role in future language development shows how important the interchange between parent and child is,” Kuhl said.
The findings will be published in an upcoming issue of the journal Developmental Science.
Twenty-six babies about 1 year of age wore vests containing audio recorders that collected sounds from the children’s auditory environment for eight hours a day over four days. The researchers used LENA (“language environment analysis”) software to examine 4,075 30-second intervals of recorded speech. Within each segment, they identified who was talking, how many people were present, whether baby talk – also known as “parentese” – or a regular voice was used, and other variables.
When the babies were 2 years old, parents filled out a questionnaire measuring how many words their children knew. Infants who had heard more baby talk knew more words. In the study, 2-year-olds in families who used the most baby talk in one-on-one settings knew 433 words, on average, compared with the 169 words known by 2-year-olds in families who used the least baby talk in one-on-one settings.
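The shape of this analysis — label each family by its style of speech, then compare vocabulary outcomes by group — can be sketched with a toy computation. The per-family rows below are invented; only the rough 433-versus-169 group contrast echoes figures reported in the article.

```python
# Toy grouping of families by parentese exposure vs. vocabulary at age 2.
# Family rows are illustrative; the real study used 26 families and
# LENA-coded 30-second segments, not a single exposure fraction.
from collections import defaultdict
from statistics import mean

# (family_id, fraction of one-on-one segments using baby talk, words at age 2)
families = [
    ("A", 0.62, 433), ("B", 0.58, 410),   # high parentese, one-on-one
    ("C", 0.12, 169), ("D", 0.15, 190),   # low parentese, one-on-one
]

groups = defaultdict(list)
for _, frac, words in families:
    groups["high" if frac >= 0.5 else "low"].append(words)

high, low = mean(groups["high"]), mean(groups["low"])
print(high, low, round(high / low, 2))  # high-exposure group knows ~2.3x more words
```

Even in this cartoon version, the high-exposure group ends up knowing well over twice as many words, which is the size of the contrast the study reported between its extreme groups.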
The relationship between baby talk and language development held across socioeconomic status, even with only 26 families in the study.
“Some parents produce baby talk naturally and they don’t realize they’re benefiting their children,” said first author Nairán Ramírez-Esparza, an assistant psychology professor at the University of Connecticut. “Some families are more quiet, not talking all the time. But it helps to make an effort to talk more.”
Previous studies have focused on the amount of language babies hear, without considering the social context. The new study shows that quality, not quantity, is what matters.
"What this study is adding is that how you talk to children matters. Parentese is much better at developing language than regular speech, and even better if it occurs in a one-on-one interaction," Ramírez-Esparza said.
Parents can use baby talk when going about everyday activities, saying things like, “Where are your shoooes?,” “Let’s change your diiiiaper,” and “Oh, this tastes goooood!,” emphasizing important words and speaking slowly using a happy tone of voice.
“It’s not just talk, talk, talk at the child,” said Kuhl. “It’s more important to work toward interaction and engagement around language. You want to engage the infant and get the baby to babble back. The more you get that serve and volley going, the more language advances.”
QBI scientists at The University of Queensland have found that honeybees use the pattern of polarised light in the sky – invisible to humans – to direct one another to a honey source.

The study, conducted in Professor Mandyam Srinivasan’s laboratory at the Queensland Brain Institute, a member of the Australian Research Council Centre of Excellence in Vision Science (ACEVS), demonstrated that bees navigate to and from honey sources by reading the pattern of polarised light in the sky.
“The bees tell each other where the nectar is by converting their polarised ‘light map’ into dance movements,” Professor Srinivasan said.
“The more we find out how honeybees make their way around the landscape, the more awed we feel at the elegant way they solve very complicated problems of navigation that would floor most people – and then communicate them to other bees,” he said.
The discovery shines new light on the astonishing navigational and communication skills of an insect with a brain the size of a pinhead.
The researchers allowed bees to fly down a tunnel to a sugar source, shining only polarised light from above, either aligned with the tunnel or at right angles to the tunnel.
They then filmed what the bees ‘told’ their peers, by waggling their bodies when they got back to the hive.
“It is well known that bees steer by the sun, adjusting their compass as it moves across the sky, and then convert that information into instructions for other bees by waggling their body to signal the direction of the honey,” Professor Srinivasan said.
“Other laboratories have shown from studying their eyes that bees can see a pattern of polarised light in the sky even when the sun isn’t shining. The big question was whether they could translate the navigational information it provides into their waggle dance.”
The researchers conclude that even when the sun is not shining, bees can tell one another where to find food by reading and dancing to their polarised sky map.
In addition to revealing how bees perform their remarkable tasks, Professor Srinivasan says it also adds to our understanding of some of the most basic machinery of the brain itself.
Professor Srinivasan’s team conjectures that flight under polarised illumination activates discrete populations of cells in the insect’s brain.
When the polarised light was aligned with the tunnel, one pair of ‘place cells’ – neurons important for spatial navigation – became activated, whereas when the light was oriented across the tunnel a different pair of place cells was activated.
The researchers suggest that depending on which set of cells is activated, the bee can work out if the food source lies in a direction toward or opposite the direction of the sun, or in a direction ninety degrees to the left or right of it.
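The conjectured read-out can be sketched as a simple lookup. This is a hypothetical simplification of the authors' proposal, not their model: cell-pair names, the 45-degree decision boundary, and the direction labels below are all illustrative assumptions.

```python
# Toy sketch of the conjectured place-cell read-out: the orientation of
# polarised light relative to the bee's flight axis activates one of two
# place-cell pairs, and each pair narrows the food direction to one axis
# relative to the sun. All names and thresholds are illustrative.
def active_place_cells(evector_deg):
    """Return the place-cell pair activated by a polarisation angle
    (degrees relative to the flight axis)."""
    angle = evector_deg % 180
    aligned = angle < 45 or angle > 135      # roughly along the flight axis
    return "pair_1" if aligned else "pair_2"

def candidate_directions(pair):
    """Each pair constrains the food direction to one axis vs. the sun."""
    return {"pair_1": ("toward sun", "away from sun"),
            "pair_2": ("90° left of sun", "90° right of sun")}[pair]

pair = active_place_cells(0)                 # light aligned with the tunnel
print(pair, candidate_directions(pair))
```

Note that in this scheme each cell pair only narrows the answer to an axis with two possible directions; resolving which end of the axis the food lies on would require additional cues, which is consistent with the hedged "toward or opposite" phrasing above.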
(Source: qbi.uq.edu.au)
Epilepsy drug turns out to help adults acquire perfect pitch and learn language like kids
A team of researchers from across the globe believe they have discovered a means of re-opening “critical periods” in brain development, allowing adults to acquire abilities — such as perfect pitch or fluency in language — that could previously only be acquired early in life.
According to the study in Frontiers in Systems Neuroscience, the mood-stabilizing drug valproate allows the adult brain to absorb new information as effortlessly as it did during critical windows in childhood.
A critical period is “a fixed window of time, usually early in an organism’s lifespan, during which experience has lasting effects on the development of brain function and behavior.” They are, for example, what allows children to enter into language without any formal training in grammar or vocabulary.
The researchers postulated that because such periods close when enzymes “impose ‘brakes’ on neuroplasticity,” a drug that blocks the production of those enzymes might be able to “reopen critical-period neuroplasticity.”
Mind-controlled prostheses offer hope for disabled
The first kick of the 2014 FIFA World Cup may be delivered in Sao Paulo next June by a Brazilian who is paralyzed from the waist down. If all goes according to plan, the teenager will walk onto the field, cock back a foot and swing at the soccer ball, using a mechanical exoskeleton controlled by the teen’s brain.
Motorized metal braces tested on monkeys will support and bend the kicker’s legs. The braces will be stabilized by gyroscopes and powered by a battery carried by the kicker in a backpack. German-made sensors will relay a feeling of pressure when each foot touches the ground. And months of training on a virtual-reality simulator will have prepared the teenager — selected from a pool of 10 candidates — to do all this using a device that translates thoughts into actions.
“We want to galvanize people’s imaginations,” says Miguel Nicolelis, the Brazilian neuroscientist at Duke University who is leading the Walk Again Project’s efforts to create the robotic suit. “With enough political will and investment, we could make wheelchairs obsolete.”
Mind-controlled leg armor may sound more like the movie “Iron Man” than modern medicine. But after decades of testing on rats and monkeys, neuroprosthetics are finally beginning to show promise for people. Devices plugged directly into the brain seem capable of restoring some self-reliance to stroke victims, car crash survivors, injured soldiers and others hampered by incapacitated or missing limbs.
In the Human Brain, Size Really Isn’t Everything
There are many things that make humans a unique species, but a couple stand out. One is our mind, the other our brain.
The human mind can carry out cognitive tasks that other animals cannot, like using language, envisioning the distant future and inferring what other people are thinking.
The human brain is exceptional, too. At three pounds, it is gigantic relative to our body size. Our closest living relatives, chimpanzees, have brains that are only a third as big.
Scientists have long suspected that our big brain and powerful mind are intimately connected. Starting about three million years ago, fossils of our ancient relatives record a huge increase in brain size. Once that cranial growth was underway, our forerunners started leaving behind signs of increasingly sophisticated minds, like stone tools and cave paintings.
But scientists have long struggled to understand how a simple increase in size could lead to the evolution of those faculties. Now, two Harvard neuroscientists, Randy L. Buckner and Fenna M. Krienen, have offered a powerful yet simple explanation.
In our smaller-brained ancestors, the researchers argue, neurons were tightly tethered in a relatively simple pattern of connections. When our ancestors’ brains expanded, those tethers ripped apart, enabling our neurons to form new circuits.
Dr. Buckner and Dr. Krienen call their idea the tether hypothesis, and present it in a paper in the December issue of the journal Trends in Cognitive Sciences.
“I think it presents some pretty exciting ideas,” said Chet C. Sherwood, an expert on human brain evolution at George Washington University who was not involved in the research.

This is how your brain tells time
Did you make it to work on time this morning? Go ahead and thank the traffic gods, but also take a moment to thank your brain. The brain’s impressively accurate internal clock allows us to detect the passage of time, a skill essential for many critical daily functions. Without the ability to track elapsed time, our morning shower could continue indefinitely. Without that nagging feeling to remind us we’ve been driving too long, we might easily miss our exit.
But how does the brain generate this finely tuned mental clock? Neuroscientists believe that we have distinct neural systems for processing different types of time, for example, to maintain a circadian rhythm, to control the timing of fine body movements, and for conscious awareness of time passage. Until recently, most neuroscientists believed that this latter type of temporal processing – the kind that alerts you when you’ve lingered over breakfast for too long – is supported by a single brain system. However, emerging research indicates that the model of a single neural clock might be too simplistic. A new study, recently published in the Journal of Neuroscience by neuroscientists at the University of California, Irvine, reveals that the brain may in fact have a second method for sensing elapsed time. What’s more, the authors propose that this second internal clock not only works in parallel with our primary neural clock, but may even compete with it.