Neuroscience


Walking and Running Again After Spinal Cord Injury

ScienceDaily (May 31, 2012) — Rats with spinal cord injuries and severe paralysis are now walking (and running) thanks to researchers at EPFL. Published in the June 1, 2012 issue of Science, the results show that a severed section of the spinal cord can make a comeback when its innate intelligence and regenerative capacity are awakened. The study, begun five years ago at the University of Zurich, points to a profound change in our understanding of the central nervous system. According to lead author Grégoire Courtine, it is not yet clear whether similar rehabilitation techniques could work for humans, but the observed nerve growth hints at new methods for treating paralysis.

Test subject takes first steps up stairs after neurorehabilitation with a combination of robotic harness and electrical-chemical stimulation. (Credit: EPFL/Grégoire Courtine)

"After a couple of weeks of neurorehabilitation with a combination of a robotic harness and electrical-chemical stimulation, our rats are not only voluntarily initiating a walking gait, but they are soon sprinting, climbing up stairs and avoiding obstacles when stimulated," explains Courtine, who holds the International Paraplegic Foundation (IRP) Chair in Spinal Cord Repair at EPFL.

Waking up the spinal cord

It is well known that the brain and spinal cord can adapt and recover from moderate injury, a quality known as neuroplasticity. But until now the spinal cord expressed so little plasticity after severe injury that recovery was impossible. Courtine’s research proves that, under certain conditions, plasticity and recovery can take place in these severe cases — but only if the dormant spinal column is first woken up.

To do this, Courtine and his team injected a chemical solution of monoamine agonists into the rats. These chemicals trigger cell responses by binding to specific dopamine, adrenaline, and serotonin receptors located on the spinal neurons. This cocktail replaces neurotransmitters released by brainstem pathways in healthy subjects and acts to excite neurons and ready them to coordinate lower body movement when the time is right.

Five to 10 minutes after the injection, the scientists electrically stimulated the spinal cord with electrodes implanted in the outermost layer of the spinal canal, called the epidural space. “This localized epidural stimulation sends continuous electrical signals through nerve fibers to the chemically excited neurons that control leg movement. All that was left was to initiate that movement,” explains Rubia van den Brand, a contributing author of the study.

The innate intelligence of the spinal column

In 2009, Courtine had already reported restoring movement, albeit involuntary movement. He discovered that a stimulated rat spinal column — physically isolated from the brain from the lesion down — developed in a surprising way: It started taking over the task of modulating leg movement, allowing previously paralyzed animals to walk on a treadmill. These experiments revealed that the movement of the treadmill created sensory feedback that initiated walking — the innate intelligence of the spinal column took over, and walking essentially occurred without any input from the rat’s actual brain. This surprised the researchers and led them to believe that only a very weak signal from the brain was needed for the animals to initiate movement of their own volition.

To test this theory, Courtine replaced the treadmill with a robotic harness that supported the rats vertically. The harness did not facilitate forward movement and came into play only when the animals lost balance, giving them the impression of having a healthy and working spinal column. This encouraged the rats to will themselves toward a chocolate reward at the other end of the platform. “What we deemed willpower-based training translated into a fourfold increase in nerve fibers throughout the brain and spine — a regrowth that proves the tremendous potential for neuroplasticity even after severe central nervous system injury,” says Janine Heutschi, a co-author of the study.

First human rehabilitation on the horizon

Courtine calls this regrowth “new ontogeny,” a sort of duplication of an infant’s growth phase. The researchers found that the newly formed fibers bypassed the original spinal lesion and allowed signals from the brain to reach the electrochemically awakened spine. And the signal was sufficiently strong to initiate movement over ground — without the treadmill — meaning the rats began to walk voluntarily toward the reward, entirely supporting their own weight with their hind legs.

"This is the World Cup of neurorehabilitation," exclaims Courtine. "Our rats have become athletes when just weeks before they were completely paralyzed. I am talking about 100% recuperation of voluntary movement."

In principle, the radical reaction of the rat spinal cord to treatment offers reason to believe that people with spinal cord injuries may soon have similar options on the horizon. Courtine is optimistic that human phase-two trials will begin in a year or two at the Balgrist University Hospital Spinal Cord Injury Centre in Zurich, Switzerland. Meanwhile, researchers at EPFL are coordinating a nine-million-euro project called NeuWalk that aims to design a fully operative spinal neuroprosthetic system, much like the one used here with rats, for implantation in humans.

Source: Science Daily

May 31, 2012 · 21 notes
#science #neuroscience #CNS #psychology
Alzheimer's Protein Structure Suggests New Treatment Directions

ScienceDaily (May 31, 2012) — The molecular structure of a protein involved in Alzheimer’s disease — and the surprising discovery that it binds cholesterol — could lead to new therapeutics for the disease, Vanderbilt University investigators report in the June 1 issue of the journal Science.

Vanderbilt Center for Structural Biology investigators determined the structure of the C99 protein (shown in green and blue), which participates in triggering Alzheimer’s disease. Their discovery that C99 binds to cholesterol (shown in black, white and red) suggests a mechanism for cholesterol’s recognized role in promoting the memory-robbing disease and may lead to new therapeutics. (Credit: Charles Sanders and colleagues/Vanderbilt University)

Charles Sanders, Ph.D., professor of Biochemistry, and colleagues in the Center for Structural Biology determined the structure of part of the amyloid precursor protein (APP) — the source of amyloid-beta, which is believed to trigger Alzheimer’s disease. Amyloid-beta clumps together into oligomers that kill neurons, causing dementia and memory loss. The amyloid-beta oligomers eventually form plaques in the brain — one of the hallmarks of the disease.

"Anything that lowers amyloid-beta production should help prevent, or possibly treat, Alzheimer’s disease," Sanders said.

Amyloid-beta production requires two “cuts” of the APP protein. The first cut, by the enzyme beta-secretase, generates the C99 protein, which is then cut by gamma-secretase to release amyloid-beta. The Vanderbilt researchers used nuclear magnetic resonance and electron paramagnetic resonance spectroscopy to determine the structure of C99, which has one membrane-spanning region.

They were surprised to discover what appeared to be a “binding” domain in the protein. Based on previously reported evidence that cholesterol promotes Alzheimer’s disease, they suspected that cholesterol might be the binding partner. The researchers used a model membrane system called “bicelles” (that Sanders developed as a postdoctoral fellow) to demonstrate that C99 binds cholesterol.

"It has long been thought that cholesterol somehow promotes Alzheimer’s disease, but the mechanisms haven’t been clear," Sanders said. "Cholesterol binding to APP and its C99 fragment is probably one of the ways it makes the disease more likely."

Sanders and his team propose that cholesterol binding moves APP to special regions of the cell membrane called “lipid rafts,” which contain “cliques of molecules that like to hang out together,” he said.

Beta- and gamma-secretase are part of the lipid raft clique.

"We think that when APP doesn’t have cholesterol around, it doesn’t care what part of the membrane it’s in," Sanders said. "But when it binds cholesterol, that drives it to lipid rafts, where these ‘bad’ secretases are waiting to clip it and produce amyloid-beta."

The findings suggest a new therapeutic strategy to reduce amyloid-beta production, he said.

"If you could develop a drug that blocks cholesterol from binding to APP, then you would keep the protein from going to lipid rafts. Instead it would be cleaved by alpha-secretase — a ‘good’ secretase that isn’t in rafts and doesn’t generate amyloid-beta."

Drugs that inhibit beta- or gamma-secretase — to directly limit amyloid-beta production — have been developed and tested, but they have toxic side effects. A drug that blocks cholesterol binding to APP may be more specific and effective in reducing amyloid-beta levels and in preventing, or treating, Alzheimer’s disease.

The C99 structure had some other interesting details, Sanders said.

The membrane domain of C99 is curved, which was unexpected but fits perfectly into the predicted active site of gamma-secretase. Also, a certain sequence of amino acids (GXXXG) that usually promotes membrane protein dimerization (two of the same proteins interacting with each other) turned out to be central to the cholesterol-binding domain. This is a completely new function for GXXXG motifs, Sanders said.

"This revealing new information on the structure of the amyloid precursor protein and its interaction with cholesterol is a perfect example of the power of team science," said Janna Wehrle, Ph.D., who oversees grants focused on the biophysical properties of proteins at the National Institutes of Health’s National Institute of General Medical Sciences (NIGMS), which partially funded the work. "The researchers at Vanderbilt brought together biological and medical insight, cutting-edge physical techniques and powerful instruments, each providing a valuable tool for piecing together the puzzle."

"When we were developing bicelles 20 years ago, no one was saying, ‘someday these things are going to lead to discoveries in Alzheimer’s disease,’" Sanders said. "It was interesting basic science research that is now paying off."

Source: Science Daily

May 31, 2012 · 35 notes
#science #neuroscience #brain #psychology #alzheimer
Memory Training Unlikely to Help in Treating ADHD, Boosting IQ

ScienceDaily (May 31, 2012) — Working memory training is unlikely to be an effective treatment for children with disorders such as attention-deficit/hyperactivity disorder (ADHD) or dyslexia, according to a research analysis published by the American Psychological Association. In addition, memory training tasks appear to have limited effect on healthy adults and children looking to do better in school or improve their cognitive skills.

"The success of working memory training programs is often based on the idea that you can train your brain to perform better, using repetitive memory trials, much like lifting weights builds muscle mass," said the study’s lead author, Monica Melby-Lervåg, PhD, of the University of Oslo. "However, this analysis shows that simply loading up the brain with training exercises will not lead to better performance outside of the tasks presented within these tests." The article was published online in Developmental Psychology.

Working memory enables people to complete tasks at hand by allowing the brain to retain pertinent information temporarily. Working memory enhancing tasks usually involve trying to get people to remember information presented to them while they are performing distracting activities. For example, participants may be presented with a series of numbers one at a time on a computer screen. The computer presents a new digit and then prompts participants to recall the number immediately preceding. More difficult versions might ask participants to recall what number appeared two, three or four digits ago.
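The recall task described above can be sketched in a few lines. This is a minimal sketch with an illustrative digit stream and function name — the commercial training programs analyzed in the study are more elaborate — but it captures the core rule: at each trial, report the digit presented n trials earlier.

```python
# A sketch of the digit-recall ("n-back" style) task described above.
# The digit stream and function name are illustrative assumptions,
# not the actual procedure of any specific training program.

def n_back_answers(stream, n):
    """For each trial from position n onward, return the digit a
    participant should report: the one presented n trials earlier."""
    return [stream[i - n] for i in range(n, len(stream))]

digits = [3, 7, 1, 7, 2, 9]
easy = n_back_answers(digits, 1)  # recall the immediately preceding digit
hard = n_back_answers(digits, 3)  # harder: recall the digit from three trials ago
```

Raising n forces the participant to hold more items in mind at once, which is what makes the "two, three or four digits ago" versions progressively more demanding.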

In this meta-analysis, researchers from the University of Oslo and University College London examined 23 peer-reviewed studies with 30 different comparisons of groups that met their criteria. The studies were randomized controlled trials or controlled experiments that included some form of working memory treatment and a control group. The studies comprised a wide range of participants, including young children, children with cognitive impairments, such as ADHD, and healthy adults. Most of the studies had been published within the last 10 years.

Overall, working memory training improved performance on tasks related to the training itself but did not have an impact on more general cognitive performance such as verbal skills, attention, reading or arithmetic. “In other words, the training may help you improve your short-term memory when it’s related to the task implemented in training but it won’t improve reading difficulties or help you pay more attention in school,” said Melby-Lervåg.

In recent years, several commercial, computer-based working memory training programs have been developed and purport to benefit students suffering from ADHD, dyslexia, language disorders, poor academic performance or other issues. Some even claim to boost people’s IQs. These programs are widely used around the world in schools and clinics, and most involve tasks in which participants are given many memory tests that are designed to be challenging, the study said.

"In the light of such evidence, it seems very difficult to justify the use of working memory training programs in relation to the treatment of reading and language disorders," said Melby-Lervåg. "Our findings also cast strong doubt on claims that working memory training is effective in improving cognitive ability and scholastic attainment."

Source: Science Daily

May 31, 2012 · 4 notes
#science #neuroscience #brain #psychology #memory
Fantasizing About Your Dream Vacation Could Lead to Poor Decision-Making

ScienceDaily (May 31, 2012) — Summer vacation time is upon us. If you have been saving up for your dream vacation for years, you may want to make sure your dream spot is still the best place to go. A new study has found that when we fantasize about such trips before they are possible, we tend to overlook the negatives — thus influencing our decision-making down the line.

(Credit: © XtravaganT / Fotolia)

"We were interested in the effects of positive fantasies — what happens when people imagine an idealized, best-case-scenario version of the future, compared to when they imagine a less idealized version," says Heather Barry Kappes of New York University, co-author of the study published online this week in Personality and Social Psychology Bulletin. “This is one of the first papers to examine selective information acquisition at this early stage, before people are seriously considering a possibility.”

Say, for example, that you would like to take a trip to Australia this year but think you are very unlikely to do so — you have no more vacation time left, cannot afford it, or would rather save up for a new car. But you still daydream about how nice it would be to see the Australian Outback and lie on the white sand beaches, perhaps without thinking about the long plane ride there or the poisonous animals. Those daydreams, Kappes says, have powerful effects.

To test those effects, Kappes and co-author Gabriele Oettingen asked people to imagine a particular future about one of three topics: wearing glamorous high-heeled shoes, making money in the stock market, or taking a vacation. To induce positive fantasies for each topic, the study participants were prompted to think about how great it would be to do each activity. In the control condition, participants also imagined experiencing the future, but were prompted to think about the negatives as well, with questions like “Would it really be so great?” In both conditions, participants wrote down what they were thinking, for the researchers to ensure they were engaged in the imagery.

After that exercise, the researchers offered the participants a choice of different types of information. For example, participants could browse a website describing the positive and negative health consequences of wearing high heels, and researchers noted how much more time they spent reading about positive versus negative consequences. Or, they could choose which of five (fictitious) tripadvisor.com reviews they wanted to read, and researchers recorded whether they chose one that was more pro-trip (i.e., five stars) or con-trip (i.e., one star).

Kappes’ team found that for each topic, imagining the idealized version made people prefer to learn about the pros rather than the cons of the future event. “These effects are pronounced when people are not seriously considering pursuing a given future,” Kappes says.

The work has important implications for even the most deliberate of decision-makers. “When people are seriously considering implementing a decision like taking a trip, they often engage in careful deliberations about the pros versus cons,” Kappes says. “Our work suggests that before getting to this point, positive fantasies might lead people to acquire biased information — to learn more about the pros rather than the cons. Thus, even if people deliberate very carefully on the information they’ve acquired, they could still make poor decisions.”

People need to be aware of these effects to ensure that they acquire balanced information before it is time to make a decision, she says. The study also contributes to a larger body of research about the powerful consequences of mental imagery — and shows that positive thinking may not always be best. “Although there are benefits to imagining a positive future, there are also drawbacks, and it’s important to recognize them in order to most effectively pursue our goals.”

Source: Science Daily

May 31, 2012 · 5 notes
#science #neuroscience #psychology #brain
The Special Scent of Age: Body Odor Gives Away Age

ScienceDaily (May 30, 2012) — New findings from the Monell Center reveal that humans can identify the age of other humans based on differences in body odor. Much of this ability is based on the capacity to identify odors of elderly individuals, and contrary to popular supposition, the so-called ‘old-person smell’ is rated as less intense and less unpleasant than body odors of middle-aged and young individuals.

Baby-smell. Humans can identify the age of other humans based on differences in body odor. (Credit: © S.Kobold / Fotolia)

"Similar to other animals, humans can extract signals from body odors that allow us to identify biological age, avoid sick individuals, pick a suitable partner, and distinguish kin from non-kin," said senior author Johan Lundström, a sensory neuroscientist at Monell.

Like those of non-human animals, human body odors contain a rich array of chemical components that can transmit various types of social information. The perceptual characteristics of these odors are reported to change across the lifespan, as are the concentrations of the underlying chemicals.

Scientists theorize that age-related odors may help animals select suitable mates: older males might be desirable because they contribute genes that enable offspring to live longer, while older females might be avoided because their reproductive systems are more fragile.

In humans, a unique ‘old person smell’ is recognized across cultures. This phenomenon is so widely acknowledged in Japan that there is a special word for the odor: kareishū.

Because studies with non-human animals at Monell and other institutions have demonstrated the ability to identify age via body odor, Lundström’s team examined whether humans are able to do the same.

In the study, published in the journal PLoS ONE, body odors were collected from three age groups, with 12-16 individuals in each group: Young (20-30 years old), Middle-age (45-55), and Old-age (75-95). Each donor slept for five nights in unscented t-shirts containing underarm pads, which were then cut into quadrants and placed in glass jars.

Odors were assessed by 41 young (20-30 years old) evaluators, who were presented with pairs of body odor jars in nine combinations and asked to identify which came from the older donor. Evaluators also rated the intensity and pleasantness of each odor. Finally, evaluators were asked to estimate the donor’s age for each odor sample.

Evaluators were able to discriminate the three donor age categories based on odor cues. Statistical analyses revealed that odors from the old-age group were driving the ability to differentiate age. Interestingly, evaluators rated body odors from the old-age group as less intense and less unpleasant than odors from the other two age groups.

"Elderly people have a discernible underarm odor that younger people consider to be fairly neutral and not very unpleasant," said Lundström. "This was surprising given the popular conception of old age odor as disagreeable. However, it is possible that other sources of body odors, such as skin or breath, may have different qualities."

Future studies will attempt to identify the underlying biomarkers that evaluators use to recognize age-related odors and to determine how the brain identifies and evaluates this information.

Source: Science Daily

May 31, 2012 · 21 notes
#science #neuroscience #brain #psychology
Despite Less Play, Children's Use of Imagination Increases Over Two Decades

ScienceDaily (May 30, 2012) — Children today may be busier than ever, but Case Western Reserve University psychologists have found that their imagination hasn’t suffered — in fact, it appears to have increased.

(Credit: © BeTa-Artworks / Fotolia)

Psychologists Jessica Dillon and Sandra Russ expected the opposite outcome when they analyzed 14 play studies that Russ conducted between 1985 and 2008.

But as they report in “Changes in Children’s Play Over Two Decades,” an article in the Creativity Research Journal, the data told a story contrary to common assumptions. First, children’s use of imagination in play and their overall comfort and engagement with play activities actually increased over time. In addition, the results suggested that children today expressed fewer negative feelings in play. Finally, their capacity to express a wide range of positive emotions, to tell stories and to organize thoughts stayed consistent.

Dillon, a fifth-year doctoral student, and Russ, a professor in psychological sciences at Case Western Reserve, decided to revisit the play data after a 2007 report from the American Academy of Pediatrics showed children played less.

They set out to see if having less time for unstructured play affected the processes in play that influence cognition and emotional development, a focus of the play research.

The pretend play studies focused on children between the ages of 6 and 10. The children’s play was measured for comfort, imagination, the range and amount of positive to negative emotions used and expressed, and the quality of storytelling by using Russ’ Affect in Play Scale (APS).

The APS is a five-minute, unstructured play session. Children are asked to play freely with three wooden blocks and two human hand puppets. The play is videotaped, and later reviewed and scored for imagination, expression of emotions, actions and storytelling.

Russ explains that children who exhibit good play skills with imaginative and emotional play situations have shown better skills at coping, creativity and problem solving. She stresses there is no link between being a good player and intelligence.

The APS data provided a consistent measurement and research structure over the 23-year period. Russ said the consistency of having the same tool to measure play provided this unique opportunity to track changes in play.

"We were surprised that outside of imagination and comfort, play was consistent over time," said Dillon.

Russ did voice concern about the decrease in displayed negative emotions and actions. “Past studies have linked negative emotions in play with creativity,” she said.

But even with less time to play, Russ said, children, like other higher mammals, have a drive to play and will always find ways to do it.

As new stimuli, like video games and the Internet, have crept into everyday life, Russ explains that children might gain cognitive skills from using technology where they once gained them from acting out situations in play. Skills might also develop from daydreaming.

Russ said future research will need to focus on whether acting out emotions and creating stories in play is as important as it once was in helping children to be creative.

Even though children have less time these days for play, Russ still advises giving children time for it, adding that it helps children develop emotional and cognitive abilities.

Video: Studying imagination in children’s play

Source: Science Daily

May 31, 2012 · 10 notes
#science #neuroscience #brain #psychology
Could Sarcastic Computers Be in Our Future? New Math Model Can Help Computers Understand Inference

ScienceDaily (May 30, 2012) — In a new paper, Stanford researchers describe a mathematical model they created that helps predict pragmatic reasoning and may eventually lead to machines that can better understand inference, context and social rules.

Noah Goodman, right, and Michael Frank, both assistant professors of psychology, discuss their research at the white board that covers the wall in Goodman’s office. (Credit: L.A. Cicero)

Language is so much more than a string of words. To understand what someone means, you need context.

Consider the phrase, “Man on first.” It doesn’t make much sense unless you’re at a baseball game. Or imagine a sign outside a children’s boutique that reads, “Baby sale — One week only!” You easily infer from the situation that the store isn’t selling babies but advertising bargains on gear for them.

Present these widely quoted scenarios to a computer, however, and there would likely be a communication breakdown. Computers aren’t very good at pragmatics — how language is used in social situations.

But a pair of Stanford psychologists has taken the first steps toward changing that.

In a new paper published recently in the journal Science, Assistant Professors Michael Frank and Noah Goodman describe a quantitative theory of pragmatics that promises to help open the door to more human-like computer systems, ones that use language as flexibly as we do.

The mathematical model they created helps predict pragmatic reasoning and may eventually lead to the manufacture of machines that can better understand inference, context and social rules. The work could help researchers understand language better and treat people with language disorders.

It also could make speaking to a computerized customer service attendant a little less frustrating.

"If you’ve ever called an airline, you know the computer voice recognizes words but it doesn’t necessarily understand what you mean," Frank said. "That’s the key feature of human language. In some sense it’s all about what the other person is trying to tell you, not what they’re actually saying."

Frank and Goodman’s work is part of a broader trend to try to understand language using mathematical tools. That trend has led to technologies like Siri, the iPhone’s speech recognition personal assistant.

But turning speech and language into numbers has its obstacles, mainly the difficulty of formalizing notions such as “common knowledge” or “informativeness.”

That is what Frank and Goodman sought to address.

The researchers enlisted 745 participants for an online experiment. The participants saw a set of objects and were asked to bet on which one was being referred to by a particular word.

For example, one group of participants saw a blue square, a blue circle and a red square. The question for that group was: Imagine you are talking to someone and you want to refer to the middle object. Which word would you use, “blue” or “circle”?

The other group was asked: Imagine someone is talking to you and uses the word “blue” to refer to one of these objects. Which object are they talking about?

"We modeled how a listener understands a speaker and how a speaker decides what to say," Goodman explained.

The results allowed Frank and Goodman to create a mathematical equation to predict human behavior and determine the likelihood of referring to a particular object.
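The style of equation at work can be illustrated with a standard "rational speech act" calculation on the blue square / blue circle / red square example above. This is a minimal sketch, not the paper's exact formulation: the word set, the uniform priors, and the function names are illustrative assumptions.

```python
# A sketch of nested listener/speaker reasoning over the three objects
# from the experiment. Lexicon, priors, and names are assumptions.

objects = ["blue square", "blue circle", "red square"]
words = {
    "blue": {"blue square", "blue circle"},
    "circle": {"blue circle"},
    "square": {"blue square", "red square"},
    "red": {"red square"},
}

def literal_listener(word):
    # A literal listener spreads belief uniformly over the objects
    # the word applies to.
    consistent = words[word]
    return {o: (1.0 / len(consistent) if o in consistent else 0.0)
            for o in objects}

def speaker(obj):
    # A speaker chooses words in proportion to how reliably a literal
    # listener would recover the intended object from them.
    scores = {w: literal_listener(w)[obj] for w in words}
    total = sum(scores.values())
    return {w: s / total for w, s in scores.items()}

def pragmatic_listener(word):
    # A pragmatic listener reasons about which object a speaker would
    # most likely have described with this word.
    scores = {o: speaker(o)[word] for o in objects}
    total = sum(scores.values())
    return {o: s / total for o, s in scores.items()}

probs = pragmatic_listener("blue")
# "Blue" points more strongly to the blue square, because a speaker who
# meant the blue circle would have preferred the unambiguous "circle".
```

Under these assumptions, hearing "blue" yields a graded preference for the blue square over the blue circle — the kind of probabilistic prediction that the betting task was designed to measure.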

"Before, you couldn’t take these informal theories of linguistics and put them into a computer. Now we’re starting to be able to do that," Goodman said.

The researchers are already applying the model to studies on hyperbole, sarcasm and other aspects of language.

"It will take years of work but the dream is of a computer that really is thinking about what you want and what you mean rather than just what you said," Frank said.

Source: Science Daily

May 31, 2012 · 2 notes
#science #neuroscience #brain #psychology
Genes Predict If Medication Can Help You Quit Smoking

ScienceDaily (May 30, 2012) — The same gene variations that make it difficult to stop smoking also increase the likelihood that heavy smokers will respond to nicotine-replacement therapy and drugs that thwart cravings, a new study shows.

High-risk genetic variations can increase the risk for nicotine dependence, but the same gene variants predict a more robust response to anti-smoking medications. (Credit: Li-Shiun Chen)

The research, led by investigators at Washington University School of Medicine in St. Louis, will appear online May 30 in the American Journal of Psychiatry.

The study suggests it may one day be possible to predict which patients are most likely to benefit from drug treatments for nicotine addiction.

"Smokers whose genetic makeup puts them at the greatest risk for heavy smoking, nicotine addiction and problems kicking the habit also appear to be the same people who respond most robustly to pharmacologic therapy for smoking cessation," says senior investigator Laura Jean Bierut, MD, professor of psychiatry. "Our research suggests that a person’s genetic makeup can help us better predict who is most likely to respond to drug therapy so we can make sure those individuals are treated with medication in addition to counseling or other interventions."

For the new study, the researchers analyzed data from more than 5,000 smokers who participated in community-based studies and more than 1,000 smokers in a clinical treatment study. The scientists focused on the relationship between their ability to quit smoking successfully and genetic variations that have been associated with risk for heavy smoking and nicotine dependence.

"People with the high-risk genetic markers smoked an average of two years longer than those without these high-risk genes, and they were less likely to quit smoking without medication," says first author Li-Shiun Chen, MD, assistant professor of psychiatry at Washington University. "The same gene variants can predict a person’s response to smoking-cessation medication, and those with the high-risk genes are more likely to respond to the medication."

In the clinical treatment trial, individuals with the high-risk variants were three times more likely to respond to drug therapy, such as nicotine gum, nicotine patches, the antidepressant bupropion and other drugs used to help people quit.

Tobacco use is the leading cause of preventable illness and death in the United States and a major public health problem worldwide. Cigarette smoking contributes to the deaths of an estimated 443,000 Americans each year. Although lung cancer is the leading cause of smoking-related cancer death among both men and women, tobacco also contributes to other lung problems, many other cancers and heart attacks.

Bierut and Chen say that the gene variations they studied are not the only ones involved in whether a person smokes, becomes addicted to nicotine or has difficulty quitting. But they contend that because the same genes can predict both heavy smoking and enhanced response to drug treatment, the genetic variants are important to the addiction puzzle.

"It’s almost like we have a ‘corner piece’ here," Bierut says. "It’s a key piece of the puzzle, and now we can build on it. Clearly these genes aren’t the entire story — other genes play a role, and environmental factors also are important. But we’ve identified a group that’s responding to pharmacologic treatment and a group that’s not responding, and that’s a key step in improving, and eventually tailoring, treatments to help people quit smoking."

Since people without the risky genetic variants aren’t as likely to respond to drugs, Bierut says they should get counseling or other non-drug therapies.

"This is an actionable genetic finding," Chen says. "Scientific journals publish genetic findings every day, but this one is actionable because treatment could be based on a person’s genetic makeup. I think this study is moving us closer to personalized medicine, which is where we want to go."

And Bierut says that although earlier studies suggested the genes had only a modest influence on smoking and addiction, the new clinical findings indicate the genetic variations are having a big effect on treatment response.

"These variants make a very modest contribution to the development of nicotine addiction, but they have a much greater effect on the response to treatment. That’s a huge finding," she says.

Source: Science Daily

May 31, 2012
#science #neuroscience #brain #psychology #genes
Tiny Genetic Variations Led to Big Changes in the Evolving Human Brain

ScienceDaily (May 30, 2012) — Changes to just three genetic letters among billions contributed to the evolution and development of the mammalian motor sensory circuits and laid the groundwork for the defining characteristics of the human brain, Yale University researchers report.

image

Illustration of neurons. Changes to just three genetic letters among billions contributed to the evolution and development of the mammalian motor sensory circuits and laid the groundwork for the defining characteristics of the human brain. (Credit: © nobeastsofierce / Fotolia)

In a study published in the May 31 issue of the journal Nature, Yale researchers found that a small, simple change in the mammalian genome was critical to the evolution of the corticospinal neural circuits. This circuitry directly connects the cerebral cortex, the conscious part of the human brain, with the brainstem and the spinal cord to make possible the fine, skilled movements necessary for functions such as tool use and speech. The evolutionary mechanisms that drive the formation of the corticospinal circuit, which is a mammalian-specific advance, had remained largely mysterious.

"What we found is a small genetic element that is part of the gene regulatory network directing neurons in the cerebral cortex to form the motor sensory circuits," said Nenad Sestan, professor of neurobiology, researcher for the Kavli Institute for Neuroscience, and senior author of the paper.

Most mammalian genomes contain approximately 22,000 protein-encoding genes. The critical drivers of evolution and development, however, are thought to reside in the non-coding portions of the genome that regulate when and where genes are active. These so-called cis-regulatory elements control the activation of genes that carry out the formation of basic body plans in all organisms.

Sungbo Shim, the first author, and other members of Sestan’s lab identified one such regulatory DNA region they named E4, which specifically drives the development of the corticospinal system by controlling the dynamic activity of a gene called Fezf2 — which, in turn, directs the formation of the corticospinal circuits. E4 is conserved in all mammals but divergent in other craniates, suggesting that it is important to both the emergence and survival of mammalian species. The species differences within E4 are tiny, but crucially drive the regulation of E4 activity by a group of regulatory proteins, or transcription factors, that include SOX4, SOX11, and SOX5. In cooperation, they control the dynamic activation and repression of E4 to shape the development of the corticospinal circuits in the developing embryo.

Source: Science Daily

May 31, 2012
#science #neuroscience #brain #psychology
Speeding Up Drug Discovery With Rapid 3-D Mapping of Proteins

ScienceDaily (May 30, 2012) — A new method for rapidly solving the three-dimensional structures of a special group of proteins, known as integral membrane proteins, may speed drug discovery by providing scientists with precise targets for new therapies, according to a paper published May 20 in Nature Methods.

image

Using their new rapid technique, Choe’s team generated the structure of a hIMP known as TMEM14A, shown here in multiple three-dimensional conformations. (Credit: Courtesy of the Salk Institute for Biological Studies)

The technique, developed by scientists at the Salk Institute for Biological Studies, provides a shortcut for determining the structure of human integral membrane proteins (hIMPs), molecules found on the surface of cells that serve as the targets for about half of all current drugs.

Knowing the exact three-dimensional shape of hIMPs allows drug developers to understand the precise biochemical mechanisms by which current drugs work and to develop new drugs that target the proteins.

"Our cells contain around 8,000 of these proteins, but structural biologists have known the three-dimensional structure of only 30 hIMPs reported by the entire field over many years," says Senyon Choe, a professor in Salk’s Structural Biology Laboratory and lead author on the paper. "We solved six more in a matter of months using this new technique. The very limited information on the shape of human membrane proteins hampers structure-driven drug design, but our method should help address this by dramatically increasing the library of known hIMP structures."

Integral membrane proteins are attached to the membrane surrounding each cell, serving as gateways for absorbing nutrients, hormones and drugs, removing waste products, and allowing cells to communicate with their environment. Many diseases, including Alzheimer’s, heart disease and cancer have been linked to malfunctioning hIMPs, and many drugs, ranging from aspirin to schizophrenia medications, target these proteins.

Most of the existing drugs were discovered through brute force methods that required screening thousands of potential molecules in laboratory studies to determine if they had a therapeutic effect. Given a blueprint of the 3D structure of a hIMP involved in a specific disease, however, drug developers could focus only on molecules that are most likely to interact with the target hIMP, saving time and expense.

In the past, it was extremely difficult to solve the structure of hIMPs, due to the difficulty of harvesting them from cells and the difficulty of labeling the amino acids that compose the proteins, a key step in determining their three-dimensional configuration.

"One problem was that hIMPs serve many functions in a cell, so if you tried to engineer cells with many copies of the proteins on their membrane, they would die before you could harvest the hIMPs," says Christian Klammt, a postdoctoral researcher in Choe’s lab and a first author on the paper.

To get around this, the scientists synthesized the proteins in an outside-the-cell environment called a cell-free expression system. They used a plexiglass chamber containing all the biochemical elements necessary to manufacture hIMPs as if they were inside a cell. This system provided the researchers with enough of the proteins to conduct structural analysis.

The cell-free method also allowed them to easily add labeled amino acids into the biochemical stew, which were then incorporated into the proteins. These amino acids gave off telltale structural clues when analyzed with nuclear magnetic resonance spectroscopy, a method for using the magnetic properties of atoms to determine a molecule’s physical and chemical properties.

"It was very difficult and inefficient to introduce labeled amino acids selectively into the protein produced in live cells," says Innokentiy Maslennikov, a Salk staff scientist and co-first author on the paper. "With a cell-free system, we can precisely control what amino acids are available for protein production, giving us isotope-labeled hIMPs in large quantities. Using a proprietary labeling strategy we devised a means to minimize the number of samples to prepare."

Prior methods might take up to a year to determine a single protein structure, but using their new method, the Salk scientists determined the structure of six hIMPs within just 18 months. They have already identified 38 more hIMPs that are suitable for analysis with their technique, and expect it will be used to solve the structure for many more.

Source: Science Daily

May 31, 2012
#science #neuroscience #brain #psychology #proteins
Neural protective protein has two faces

May 30, 2012

(Medical Xpress) — A protein produced by the central nervous system’s support cells seems to play two opposing roles in protecting nerve cells from damage, an animal study by Johns Hopkins researchers suggests: Decreasing its activity seems to trigger support cells to gear up their protective powers, but increasing its activity appears to be key to actually use those powers to defend cells from harm.

Seth Blackshaw, Ph.D., an associate professor in the Solomon H. Snyder Department of Neuroscience at the Johns Hopkins University School of Medicine, explains that researchers have long suspected that central nervous system cells called glia play an important role in saving nerve cells from almost certain death after either an acute injury, such as a blow to the head, or chronic damage, such as that caused by Alzheimer’s or Parkinson’s disease. Glia — named after the Greek word for glue, since decades ago they were thought to play a very passive role in holding the central nervous system together — respond to an assault on nearby neurons in a dramatic way, puffing up to a larger size and turning off several genes involved in routine maintenance functions.

Previous research in cell cultures containing both neurons and glia showed that when the entire group was exposed to an assault, the reaction of the glia seemed to drive a response that protects cells from subsequent damage. However, Blackshaw says, it’s been unclear exactly what glia are doing when they change in size and gene expression. Even whether this response is actually important for protection was uncertain, he adds, since it’s been impossible to study this so-called glial reactivity without treating whole tissues that include neurons and other types of cells that may exert their own protective effects.

Hoping to find a way to trigger glial reactivity without assaulting entire tissues, Blackshaw and his colleagues searched for proteins that could play an important role in this response. The team used Müller glia as their model system. These glia are the most abundant type in the retina, and are highly likely to behave like other glia throughout the central nervous system, Blackshaw says.

The researchers’ investigation eventually zeroed in on a protein called Lhx2. When they bred mutant mice that selectively lacked Lhx2 in the glia of the eye, these cells displayed the physical and genetic characteristics of being reactive all the time, even without any damaging stimulus. However, to the researchers’ surprise, hitting the mutant animals’ eyes with extraordinarily bright light caused considerably more damage to their retinas compared to the same stimulus in normal mice.

To understand why these reactive glia didn’t produce the expected protective response, the researchers looked for other pro-survival proteins that glia produce under assault. In the mutant animals, these other proteins were conspicuously missing, Blackshaw says, suggesting that Lhx2 is necessary for glia to produce other protective proteins.

“Lhx2 seems to be a master regulator of glial reactivity, and we’ve shown here that it has two faces,” Blackshaw says of these results, reported in the March 20 issue of the Proceedings of the National Academy of Sciences. While the protein’s absence seems to be critical for triggering the physical and genetic changes glia use to bring their protective proteins to bear to help neurons survive, its presence is vital to produce these proteins in the first place. Levels of Lhx2 activity likely dip and then increase in glia exposed to an attack, he says, explaining both the initial glial reactivity researchers see under a microscope as well as the resulting neural protection.

Once researchers understand this mechanism better, Blackshaw adds, they may be able to craft drugs that stimulate glia to pump out more pro-survival proteins, making novel therapies for neurodegenerative diseases.

Provided by Johns Hopkins University

Source: medicalxpress.com

May 30, 2012
#science #neuroscience #psychology
Hear to see: New method for the treatment of visual field defects

May 30, 2012

Patients who are blind in one side of their visual field benefit from presentation of sounds on the affected side. After passively hearing sounds for an hour, their visual detection of light stimuli in the blind half of their visual field improved significantly. Neural pathways that simultaneously process information from different senses are responsible for this effect.

"We have embarked on a whole new therapy approach" says PD Dr. Jörg Lewald from the RUB’s Cognitive Psychology Unit. Together with colleagues from the Neurological University Clinic at Bergmannsheil (Prof. Dr. Martin Tegenthoff) and Durham University (PD Dr. Markus Hausmann), he describes the results in PLoS ONE.

To investigate the effectiveness of the auditory stimulation, the research team carried out a visual test before and after the acoustic stimulation. Patients were asked to determine the position of light flashes in the healthy and in the blind field of vision. While performance was stable in the intact half of their field of vision, the number of correct answers in the blind half increased after the auditory stimulation. This effect lasted for 1.5 hours. “In other treatments, the patients undergo arduous and time-consuming visual training” explains Lewald. “The therapeutic results are moderate and vary greatly from patient to patient. Our result suggests that passive hearing alone can improve vision temporarily.”

If strokes or injuries cause damage to the area of the brain that processes the information of the visual sense, this results in a visual field defect. The area most commonly affected is the primary visual cortex, the first processing point for visual input to the cerebral cortex. The more neurons die in this brain area, the bigger the visual deficit. Usually the entire half of the visual field is affected, a condition known as hemianopia. “Hemianopia restricts patients immensely in their everyday life” says Lewald. “When objects or people are missed on the blind side, this can quickly lead to accidents.”

"There is increasing evidence that processing of incoming sensory information is not strictly separated in the brain", says Lewald. "At various stages there are connections between the sensory systems." In particular the nerve cells in the so-termed superior colliculus, part of the midbrain, process auditory and visual information simultaneously. This area is not usually affected by visual field defects, and thus continues to analyse visual stimuli. Therefore, remaining visual functions are retained in the blind half, which the patients, however, are not aware of. “Since the same nerve cells also receive auditory information, we had the idea to use acoustic stimuli to increase their sensitivity to light stimuli” says Lewald.

The team of researchers now aims to further refine their therapy approach in order to reveal sustained improvement in visual functioning. They will also investigate whether the stimulation of the sense of hearing also has an effect on more complex visual functions. Finally, they aim to explore the mechanisms that underlie the effect observed.

Provided by Ruhr-Universitaet-Bochum

Source: medicalxpress.com

May 30, 2012
#science #neuroscience #brain #vision #psychology
Fish study raises hope for spinal injury repair

May 30, 2012

(Medical Xpress) — Scientists have unlocked the secrets of the zebra fish’s ability to heal its spinal cord after injury, in research that could deliver therapy for paraplegics and quadriplegics in the future.

image

Scientists discovered the role of a protein in the remarkable self-healing ability of the fish

A team from Monash University’s Australian Regenerative Medicine Institute (ARMI), led by Dr Yona Goldshmit and Professor Peter Currie, discovered the role of a protein in the remarkable self-healing ability of the fish.

The findings, detailed in The Journal of Neuroscience, could eventually lead to ways to stimulate spinal cord regeneration in humans.

When the spinal cord is severed in humans and other mammals, the immune system kicks in, activating specialised cells called glia to prevent bleeding into it, Professor Currie said.

“Glia are the workmen of the nervous system. The glia proliferate, forming bigger cells that span the wound site in order to prevent bleeding into it. They come in and try to sort out problems. A glial scar forms,” Professor Currie said.

However, the scar prevents the axons of neighbouring nerve cells, threadlike structures that carry impulses to the brain, from penetrating the wound. The result is paralysis.

“The axons upstream and downstream of the lesion sites are never able to penetrate the glial scar to reform. This is a major barrier in mammalian spinal cord regeneration,” Professor Currie said.

In contrast, the zebra fish glia form a bridge that spans the injury site but allows axons to penetrate it.

The fish can fully regenerate its spinal cord within two months of injury. “You can’t tell there’s been any wound at all,” Professor Currie said.

The scientists discovered that a protein called fibroblast growth factor (fgf) controls the shape of the glia and accounts for the difference between the human and zebra fish responses to spinal cord injury.

The scientists showed the protein could be manipulated in the zebra fish to speed up tissue repair even more.

“The hope is that fgf could eventually be used to promote better results in spinal cord repair in people,” Professor Currie said.

Provided by Monash University

Source: medicalxpress.com

May 30, 2012
#science #neuroscience
Ketamine Improved Bipolar Depression Within Minutes, Study Suggests

ScienceDaily (May 30, 2012) — Bipolar disorder is a serious and debilitating condition where individuals experience severe swings in mood between mania and depression. The episodes of low or elevated mood can last days or months, and the risk of suicide is high.

Antidepressants are commonly prescribed to treat or prevent the depressive episodes, but they are not universally effective. Many patients continue to experience periods of depression even while being treated, and many must try several different antidepressants before finding one that works for them. In addition, it may take several weeks of treatment before a patient begins to feel relief from a drug’s effects.

For these reasons, better treatments for depression are desperately needed. A new study in Biological Psychiatry this week confirms that scientists may have found one in a drug called ketamine.

A group of researchers at the National Institute of Mental Health, led by Dr. Carlos Zarate, previously found that a single dose of ketamine produced rapid antidepressant effects in depressed patients with bipolar disorder. They have now replicated that finding in an independent group of depressed patients, also with bipolar disorder. Replication is an important component of the scientific method, as it helps ensure that the initial finding wasn’t accidental and can be repeated.

In this new study, they administered a single dose of ketamine and a single dose of placebo to a group of patients on two different days, two weeks apart. The patients were then carefully monitored and repeatedly completed ratings to ‘score’ their depressive symptoms and suicidal thoughts.

When the patients received ketamine, their depression symptoms improved significantly within 40 minutes and remained improved over three days. Overall, 79% of the patients improved with ketamine, while none reported improvement when they received placebo.

Importantly, and for the first time in a group of patients with bipolar depression, they also found that ketamine significantly reduced suicidal thoughts. These antisuicidal effects also occurred within one hour. Considering that bipolar disorder is one of the most lethal of all psychiatric disorders, these study findings could have a major impact on public health.

"Our finding that a single infusion of ketamine produces rapid antidepressant and antisuicidal effects within one hour and that is fairly sustained is truly exciting," Dr. Zarate commented. "We think that these findings are of true importance given that we only have a few treatments approved for acute bipolar depression, and none of them have this rapid onset of action; they usually take weeks or longer to have comparable antidepressant effects as ketamine does."

Ketamine is an N-methyl-D-aspartate (NMDA) receptor antagonist, which means that it works by blocking NMDA receptors. Dr. Zarate added, “Importantly, confirmation that blocking the NMDA receptor complex is involved in generating rapid antidepressant and antisuicidal effects offers an avenue for developing the next generation of treatments for depression that are radically different than existing ones.”

Source: Science Daily

May 30, 2012
#science #neuroscience #psychology #brain #depression
Study looks at effects of cannabis on MS progression

May 30, 2012

(Medical Xpress) — The first large non-commercial study to investigate whether the main active constituent of cannabis (tetrahydrocannabinol, or THC) is effective in slowing the course of progressive multiple sclerosis (MS) has found no evidence that it is, although benefits were noted for those at the lower end of the disability scale.

The CUPID (Cannabinoid Use in Progressive Inflammatory brain Disease) study was carried out by researchers from the Peninsula College of Medicine and Dentistry (PCMD), Plymouth University. The study was funded by the Medical Research Council (MRC) and managed by the National Institute for Health Research (NIHR) on behalf of the MRC-NIHR partnership, the Multiple Sclerosis Society and the Multiple Sclerosis Trust.

The preliminary results of CUPID were presented by lead researcher Professor John Zajicek at the Association of British Neurologists’ Annual Meeting in Brighton on Tuesday 29th May.

CUPID enrolled nearly 500 people with MS from 27 centres around the UK, and has taken eight years to complete. People with progressive MS were randomised to receive either THC capsules or identical placebo capsules for three years, and were carefully followed to see how their MS changed over this period. The two main outcomes of the trial were a disability scale administered by neurologists (the Expanded Disability Status Scale), and a patient report scale of the impact of MS on people with the condition (the Multiple Sclerosis Impact Scale 29).

Overall the study found no evidence to support an effect of THC on MS progression in either of the main outcomes. However, there was some evidence to suggest a beneficial effect in participants who were at the lower end of the disability scale at the time of enrolment but, as the benefit was only found in a small group of people rather than the whole population, further studies will be needed to assess the robustness of this finding. One of the other findings of the trial was that MS in the study population as a whole progressed more slowly than expected. This makes it more challenging to detect a treatment effect when the aim of the treatment is to slow progression.

As well as evaluating the potential neuroprotective effects and safety of THC over the long-term, one of the aims of the CUPID study was to improve the way that clinical trial research is done by exploring newer methods of measuring MS and using the latest statistical methods to make the most of every piece of information collected. This analysis will continue for several months. The CUPID study will therefore provide important information about conducting further large scale clinical trials in MS.

Professor John Zajicek, Professor of Clinical Neuroscience at PCMD, Plymouth University, said: “To put this study into context: current treatments for MS are limited, either being targeted at the immune system in the early stages of the disease or aimed at easing specific symptoms such as muscle spasms, fatigue or bladder problems. At present there is no treatment available to slow MS when it becomes progressive. Progression of MS is thought to be due to death of nerve cells, and researchers around the world are desperately searching for treatments that may be ‘neuroprotective’. Laboratory experiments have suggested that certain cannabis derivatives may be neuroprotective.”

He added: “Overall our research has not supported the laboratory-based findings and has shown that, although there is a suggestion of benefit to those at the lower end of the disability scale when they joined CUPID, there is little evidence to suggest that THC has a long-term impact on the slowing of progressive MS.”

Dr. Doug Brown, Head of Biomedical Research at the MS Society, said: “There are currently no treatments for people with progressive MS to slow or stop the worsening of disability. The MS Society is committed to supporting research in this area and this was an important study for us to fund. While this study sadly suggests THC is ineffective at slowing the course of progressive MS, we will not stop our search for effective treatments. We are encouraged by the possibility shown by this study that THC may have potential benefits for some people with MS and we welcome further investigation in this area.”

Provided by University of Plymouth

Source: medicalxpress.com

May 30, 2012
#science #neuroscience #psychology #cannabis
Antioxidant Shows Promise as Treatment for Certain Features of Autism

ScienceDaily (May 29, 2012) — A specific antioxidant supplement may be an effective therapy for some features of autism, according to a pilot trial from the Stanford University School of Medicine and Lucile Packard Children’s Hospital that involved 31 children with the disorder.

The antioxidant, called N-acetylcysteine, or NAC, lowered irritability in children with autism and reduced the children’s repetitive behaviors. The researchers emphasized that the findings must be confirmed in a larger trial before NAC can be recommended for children with autism.

Irritability affects 60 to 70 percent of children with autism. “We’re not talking about mild things: This is throwing, kicking, hitting, the child needing to be restrained,” said Antonio Hardan, MD, the primary author of the new study. “It can affect learning, vocational activities and the child’s ability to participate in autism therapies.”

The study appears in the June 1 issue of Biological Psychiatry. Hardan is an associate professor of psychiatry and behavioral sciences at Stanford and director of the Autism and Developmental Disabilities Clinic at Packard Children’s. Stanford is filing a patent for the use of NAC in autism, and one of the study authors has a financial stake in a company that makes and sells the NAC used in the trial.

Finding new medications to treat autism and its symptoms is a high priority for researchers. Currently, irritability, mood swings and aggression, all of which are considered associated features of autism, are treated with second-generation antipsychotics. But these drugs cause significant side effects, including weight gain, involuntary motor movements and metabolic syndrome, which increases diabetes risk. By contrast, side effects of NAC are generally mild, with gastrointestinal problems such as constipation, nausea, diarrhea and decreased appetite being the most common.

The state of drug treatments for autism’s core features, such as social deficits, language impairment and repetitive behaviors, is also a major problem. “Today, in 2012, we have no effective medication to treat repetitive behavior such as hand flapping or any other core features of autism,” Hardan said. NAC could be the first medication available to treat repetitive behavior in autism — if the findings hold up when scrutinized further.

The study tested children with autism ages 3 to 12. They were physically healthy and were not planning any changes in their established autism treatments during the trial. In a double-blind study design, children received NAC or a placebo for 12 weeks. The NAC used was a pharmaceutical-grade preparation donated by the drug manufacturer Bioadvantex Inc. Subjects were evaluated before the trial began and every four weeks during the study using several standardized surveys that measure problem behaviors, social behaviors, autistic preoccupations and drug side effects.

During the 12-week trial, NAC treatment decreased irritability scores from 13.1 to 7.2 on the Aberrant Behavior Checklist, a widely used clinical scale for assessing irritability. The change is not as large as that seen in children taking antipsychotics. “But this is still a potentially valuable tool to have before jumping on these big guns,” Hardan said.

In addition, according to two standardized measures of autism mannerisms and stereotypic behavior, children taking NAC showed a decrease in repetitive and stereotyped behaviors.

"One of the reasons I wanted to do this trial was that NAC is being used by community practitioners who focus on alternative, non-traditional therapies," Hardan said. "But there is no strong scientific evidence to support these interventions. Somebody needs to look at them."

Hardan cautioned that the NAC for sale as a dietary supplement at drugstores and grocery stores differs in some important respects from the individually packaged doses of pharmaceutical-grade NAC used in the study, and that the over-the-counter version may not produce the same results. “When you open the bottle from the drugstore and expose the pills to air and sunlight, it gets oxidized and becomes less effective,” he said.

Although the study did not test how NAC works, the researchers speculated on two possible mechanisms of action. NAC increases the capacity of the body’s main antioxidant network, which some previous studies have suggested is deficient in autism. In addition, other research has suggested that autism is related to an imbalance in excitatory and inhibitory neurotransmitters in the brain. NAC can modulate the glutamatergic family of excitatory neurotransmitters, which might be useful in autism.

The scientists are now applying for funding to conduct a large, multicenter trial in which they hope to replicate their findings.

"This was a pilot study," Hardan said. "Final conclusions cannot be made before we do a larger trial."

Source: Science Daily

May 30, 2012
#science #neuroscience #psychology #autism
Diabetes Drug Could Be a Promising Therapy for Traumatic Brain Injury

ScienceDaily (May 29, 2012) — TAU research finds that existing diabetes medication may ease damage caused by brain-addling explosions.

Although the death toll is relatively low for people who suffer from traumatic brain injury (TBI), it can have severe, life-long consequences for brain function. TBI can impair a patient’s mental abilities, impact memory and behavior, and lead to dramatic personality changes. And long-term medical treatment carries a high economic cost.

Now, in research commissioned by the United States Air Force, Prof. Chaim Pick of Tel Aviv University’s Sackler Faculty of Medicine and Dr. Nigel Greig of the National Institute of Aging in the US have discovered that Exendin-4, an FDA-approved diabetes drug, significantly minimizes damage in TBI animal models when administered shortly after the initial incident. Originally designed to control sugar levels in the body, the drug has recently been found effective in protecting neurons in disorders such as Alzheimer’s disease.

Prof. Pick’s collaborators include his TAU colleagues Dr. Vardit Rubovitch, Lital Rachmany-Raber, and Prof. Shaul Schreiber, and Dr. David Tweedie of the National Institute of Aging in the US. Detailed in the journal Experimental Neurology, this breakthrough is the first step towards developing a cocktail of medications to prevent as much brain damage as possible following injury.

Diabetes medication to halt trauma

Prof. Pick has been researching TBI for many years, beginning with the effects of everyday injuries such as hitting the windshield in a car accident. As a result of his work for the Air Force, he has expanded his research to include trauma sustained when a person is exposed to an explosion, such as during a terrorist attack.

TBI causes long-term damage by changing the chemistry of the brain. During an explosion, increased pressure followed by an intense vacuum shakes the fluid inside the brain and damages the brain’s structure. This damage cannot be reversed, but mapping the injury through behavioral and physical tests is crucial to understanding and quantifying the damage and forming a treatment plan through therapy or medication.

Prof. Pick and his colleagues designed a pre-clinical experiment that exposed mice to controlled explosions from 23 and 33 feet away, and then analyzed the resulting injuries. They also studied the effect of Exendin-4 as an additional parameter in minimizing brain damage.

The researchers divided their mice into four groups: a control group; a second group that was exposed to the blast without medication; a third group that received the medication but was not exposed to the blast; and a fourth group, exposed to the explosion but given the medication within an hour after the blast and continuing for seven days afterwards. The mice were placed under anesthesia before the explosion.

Behavioral and physical tests showed that the mice that had been exposed to the blast had severely impaired brain function compared to the control group. However, the mice that had also received the Exendin-4 treatment were almost on a par with the control group in terms of brain function, proving that Exendin-4 significantly reduced the long-term damage done by an explosion. In separate experiments, the drug was also associated with an improved outcome in mice that sustained TBI by blunt force.

Finding the ideal drug cocktail

Prof. Pick says this promising discovery can help researchers find the ideal combination of medications to minimize the lasting impact of TBI. “We are moving in the right direction. Now we need to find the right dosage and delivery system, then build a cocktail of drugs that will increase the therapeutic value of this concept,” he explains. He adds that in treating such traumatic injuries, one drug is unlikely to be sufficient.

Source: Science Daily

May 29, 2012
#science #brain #psychology #neuroscience
Evil Eyebrows and Pointy Chin of a Cartoon Villain Make Our ‘Threat’ Instinct Kick In

ScienceDaily (May 29, 2012) — New research from the University of Warwick could explain why the evil eyebrows and pointy chin of a cartoon villain make our ‘threat’ instinct kick in.

image

Triangular-shaped face. Psychologists have found that a downward pointing triangle can be perceived to carry a threat. (Credit: © Viktor Kuryan / Fotolia)

Psychologists have found that a downward pointing triangle can be perceived to carry threat just like a negative face in a crowd.

In a paper published in Emotion, a journal of the American Psychological Association, Dr Derrick Watson and Dr Elisabeth Blagrove have carried out a series of experiments with volunteers to find out if simple geometric shapes can convey positive or negative emotions.

Previous research by these scientists showed that people could pick out a negative face in a crowd more quickly than a positive or neutral face and also that it was difficult to ignore faces in general. The researchers carried out a series of experiments asking volunteers to respond to computer-generated images. They were shown positive, negative and neutral faces, and triangles facing upwards, downwards, inward and outward. This latest study shows that downward triangles are detected just as quickly as a negative face.

Dr Watson said: “We know from previous studies that simple geometric shapes are effective at capturing or guiding attention, particularly if these shapes carry the features present within negative or positive faces.”

"Our study shows that downward pointing triangles in particular convey negative emotions and we can pick up on them quickly and perceive them as a threat."

Dr Blagrove added: “If we look at cartoon characters, the classic baddie will often be drawn with the evil eyebrows that come to a downward point in the middle. This could go some way to explain why we associate the downward pointing triangle with negative faces. These shapes correspond with our own facial features and we are unconsciously making that link.”

Source: Science Daily

May 29, 2012
#science #neuroscience #brain #psychology
Researchers restore neuron function to brains damaged by Huntington's disease

May 29, 2012

Researchers from South Korea, Sweden, and the United States have collaborated on a project to restore neuron function to parts of the brain damaged by Huntington’s disease (HD) by successfully transplanting HD-induced pluripotent stem cells into animal models.

Induced pluripotent stem cells (iPSCs) can be genetically engineered from human somatic cells such as skin, and can be used to model numerous human diseases. They may also serve as sources of transplantable cells that can be used in novel cell therapies. In the latter case, the patient provides a sample of his or her own skin to the laboratory.

In the current study, experimental animals with damage to a deep brain structure called the striatum (an experimental model of HD) exhibited significant behavioral recovery after receiving transplanted iPS cells. The researchers hope that this approach eventually could be tested in patients for the treatment of HD.

"The unique features of the iPSC approach mean that the transplanted cells will be genetically identical to the patient, and therefore no medications that dampen the immune system to prevent graft rejection will be needed,” said Jihwan Song, D.Phil., Associate Professor and Director of the Laboratory of Developmental & Stem Cell Biology at the CHA Stem Cell Institute, CHA University, Seoul, South Korea, and co-author of the study.

The study, published online this week in Stem Cells, found that transplanted iPSCs initially formed neurons producing GABA, the chief inhibitory neurotransmitter in the mammalian central nervous system, which plays a critical role in regulating neuronal excitability and acts at inhibitory synapses in the brain. GABAergic neurons, located in the striatum, are the cell type most susceptible to degeneration in HD.

Another key point in the study involves the new disease models for HD presented by this method, allowing researchers to study the underlying disease process in detail. Being able to control disease development from such an early stage, using iPS cells, may provide important clues about the very start of disease development in HD. An animal model that closely imitates the real conditions of HD also opens up new and improved opportunities for drug screening.

"Having created a model that mimics HD progression from the initial stages of the disease provides us with a unique experimental platform to study Huntington’s disease pathology," said Patrik Brundin, M.D., Ph.D., Director of the Center for Neurodegenerative Science at Van Andel Research Institute (VARI), Head of the Neuronal Survival Unit at Lund University, Sweden, and co-author of the study.

Huntington’s disease (HD) is a neurodegenerative genetic disorder that affects muscle coordination and leads to cognitive decline and psychiatric problems. It typically becomes noticeable in mid-adult life, with symptoms beginning between 35 and 44 years of age. Life expectancy following onset of visible symptoms is about 20 years. The worldwide prevalence of HD is 5-10 cases per 100,000 persons. Key to the disease process is the formation of specific protein aggregates (essentially abnormal clumps) inside some neurons.

Provided by Van Andel Research Institute

Source: medicalxpress.com

May 29, 2012
#science #neuroscience #brain #psychology
Physical sciences illuminate neurodegenerative diseases

May 29, 2012

What do physicists, chemists, mathematicians and biologists have in common? One of the answers at Cambridge is a shared interest in unravelling the processes behind neurodegenerative diseases such as Alzheimer’s, Parkinson’s and Motor Neurone Disease.

image

Dementia. Credit: ©freshidea Fotolia

As more people live to a ripe old age, an increasing number of us will develop neurodegenerative diseases such as Alzheimer’s. Despite the escalating economic costs and human misery associated with these diseases, we still know relatively little about how they develop or how best to tackle them.

Alzheimer’s is the most common neurodegenerative disease. “It’s an enormous problem and we’re not doing very well at the moment in slowing the disease or treating its symptoms effectively,” says Professor Peter St George-Hyslop.

Neurodegenerative diseases such as Alzheimer’s are difficult to study for several reasons. “One is that it’s not easy to get pieces of living brain,” he explains. “It’s also a disease where patients become unable to speak for themselves, so unlike people with AIDS or breast cancer they aren’t demonstrating outside the houses of Parliament demanding funding.”

Although charities and campaigners are doing sterling work raising the profile of Alzheimer’s, until recently attitudes to neurodegenerative disease had much in common with the way we viewed cancer 50 years ago.

“We are, for Alzheimer’s, like where we were for cancer in the 1950s, when people didn’t like to talk about it, were frightened or ashamed of it. And therapeutically we are in the same place; although we are beginning to learn about these diseases we don’t yet have much in the way of effective therapies,” Professor St George-Hyslop says.

One crucial discovery is that proteins misfolding in the brain form clumps or aggregates and these play a major role in causing neurodegenerative diseases. When these proteins misfold they take on certain characteristics that become noxious to cells, but what we need to know now is why these proteins misfold, which aggregates do the damage, and how that damage occurs. Which is where physics, chemistry and mathematics enter the biological picture.

Professor St George-Hyslop leads a group of experts from disparate disciplines, each bringing different tools and different ways of working to the study of neurodegenerative diseases.

What began in late 2008 as a series of meetings has now developed into a 12-strong group funded by a £5.3 million Strategic Award from the Wellcome Trust and Medical Research Council. “It’s a very interesting group of people who came together because they wanted to come together. They each knew they had something to contribute but also that they needed something else – some skills, some knowledge, some point of view – from another member of the group,” he says.

“The biologists among us knew there were techniques that the physicists and chemists had that could help us. They in turn knew we had some biological knowledge that would help them apply, in a sensible way, their very good and insightful physical and chemical tools.”

Among the group is Professor David Klenerman from the Department of Chemistry. One of the inventors of rapid, high-throughput DNA sequencing, he is now applying this knowledge to protein misfolding. From the same department comes Professor Michele Vendruscolo, a theoretical physicist working on the mechanics and thermodynamics of protein misfolding. Professor Chris Dobson, who is also from the Department of Chemistry, works on protein misfolding in neurodegenerative diseases, while from the Department of Chemical Engineering and Biotechnology, Dr. Clemens Kaminski brings modern laser spectroscopy tools that allow researchers to watch these proteins misfold inside living cells in real time.

The group has applied these physical tools to study nematode worms in which a mutation produces the same protein misfolding that causes disease in humans. “That ability to see these things as they happen in a living model gives us a much greater understanding compared with previous techniques, which essentially involved grinding up biological samples and examining them after these processes had occurred,” Professor St George-Hyslop explains.

“What’s important is the marriage of the physical tool with the biological question,” he says. And he hopes that by revealing where these misfolded proteins act, these new tools could help researchers develop ways of blocking the damage they cause in both Alzheimer’s and other neurodegenerative diseases.

“The primary goal is to understand what the beginning and the middle parts of the process are. We know what the end is – the cell dies and you get a disease – but if you know why the cells get sick and what the mechanisms are then you have a better chance of preventing or halting it,” says Professor St George-Hyslop. “Our goal is to provide that fundamental knowledge of cause and mechanism. Hopefully from that will come some idea of which parts of those pathways you can monitor as a diagnostic and which parts you can block or change as a treatment.”

More recently, the group has been enlarged by a £4.5 million grant from the National Institute of Health Research to support an extension of the Cambridge Biomedical Research Centre via the creation of a Biomedical Research Unit in Dementia for translational research. This has allowed the inclusion of researchers in immunology and in brain imaging from the Department of Medicine and the Wolfson Brain Imaging Centre.

Provided by University of Cambridge

Source: medicalxpress.com

May 29, 2012
#science #neuroscience #brain #psychology
New Effective Treatment for Tinnitus?

ScienceDaily (May 28, 2012) — A team of researchers from Maastricht, Leuven, Bristol and Cambridge demonstrated the effectiveness of a new tinnitus treatment approach in the journal The Lancet. Tinnitus is the perception of a noxious disabling internal sound without an external source. Roughly fifteen percent of the population suffers from this disorder in varying degrees along with the associated concentration problems, sleep disturbances, anxiety, depression and extreme fatigue.

image

Tinnitus is the perception of a noxious disabling internal sound without an external source. (Credit: © BildPix.de / Fotolia)

Sometimes this disorder is so disruptive that it seriously impairs patients’ daily functioning and, unfortunately, there is no cure.

The research conducted by Rilana Cima and her colleagues, however, indicates that cognitive behavioural therapy can help improve the daily functioning of tinnitus patients.

The study, conducted at Adelante Audiology & Communication, followed 492 adult tinnitus patients for a period of twelve months. The effectiveness of an innovative tinnitus treatment protocol was compared to the standard treatment methods offered throughout the Netherlands. The ground-breaking, stepped treatment plan consists of cognitive behavioural therapy and combines elements from psychology and audiology. The therapy aims at reducing the negative thoughts and feelings surrounding tinnitus symptoms through exposure techniques, movement and relaxation exercises, and mindfulness-based elements.

This is supplemented with elements from the so-called tinnitus retraining therapy (TRT), which examines the problems on a sound perception level. The treatment is offered by a multidisciplinary team of audiologists, psychologists, speech and movement therapists, physical therapists and social workers. The project was funded by the Netherlands Organisation for Health Research and Development (ZonMW), and directed by Johan Vlaeyen, professor behavioural medicine at KU Leuven and Maastricht University.

The results offer compelling evidence to support the effectiveness of this innovative and specialised tinnitus therapy over more traditional forms of treatment. The overall health of the tinnitus patient improves, and the severity of their symptoms and perceived impairment decreases after therapy. Moreover, the new treatment is far more effective in reducing negative mood, dysfunctional beliefs and tinnitus-related fear. The specialised tinnitus treatment is effective for both milder and more severe forms of the disorder. The researchers are therefore advocating a widespread implementation of this new treatment protocol.

Source: Science Daily

May 29, 2012
#science #psychology #neuroscience #brain
Brain activity revealed when watching a feature film

May 29, 2012

Human brain functions have been studied in the past using relatively simple stimuli, such as pictures of faces and isolated sounds or words. Researchers from the Aalto University Department of Biomedical Engineering and Computational Science have now taken a very different approach: they have studied brain functions in lifelike circumstances.

In their new study, published in PLoS ONE, the group examined how the brain processes the film The Match Factory Girl by Aki Kaurismäki.

Films have been previously used to study brain activity, but the brain activity patterns have been integrated over the whole duration of the film, and thus time information is lost. This is like compressing a whole film into just one frame. In some studies, scientists have looked at dynamic brain activity, but focusing on a single brain region at a time.

The Aalto University scientists, on the other hand, studied the full brain activity patterns with the time resolution allowed by functional magnetic resonance imaging. This way it is possible to find out which events in the film cause changes in brain activity, and which brain areas are activated at each moment.

This analysis revealed, for example, that parts of a brain network that usually respond to speech also become activated during other types of communication, such as writing. Some other areas of the network were very selective to speech.

The researchers combined two complementary approaches to disclose the brain activity. One is based on dependencies between activation in different parts of the brain; the other begins from a detailed analysis of the visual and acoustic features of which the film is composed.

The results revealed brain networks in which activity follows remarkably well the complex model of the auditory and visual features of the film. For example, brain activity in the auditory cortex followed the soundtrack extremely well over the whole length of the film, and viewing the motions of characters’ hands reliably activated widespread areas of the brain.
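The soundtrack-tracking result described above boils down to correlating a stimulus feature time series with a voxel's response time series. A minimal sketch of that idea, assuming invented toy data (this is not Aalto's actual analysis pipeline, which used full fMRI model-based methods):

```python
# Illustrative sketch only: correlate a film feature time series (e.g. the
# soundtrack's loudness envelope) with a single voxel's response sampled at
# the same rate. The toy data and names below are assumptions for illustration.
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Toy soundtrack envelope and two toy voxel time series.
envelope      = [0, 1, 2, 3, 2, 1, 0, 1, 2, 3]
auditory_like = [0.1, 1.1, 2.0, 2.9, 2.1, 0.9, 0.2, 1.0, 1.9, 3.1]  # tracks sound
unrelated     = [3, 0, 3, 0, 3, 0, 3, 0, 3, 0]                       # does not

print(round(pearson(envelope, auditory_like), 2))  # close to 1.0
print(round(pearson(envelope, unrelated), 2))      # much weaker
```

In practice this comparison is run for every voxel over the whole film, which is how regions like the auditory cortex emerge as following the soundtrack.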

"Our study opens new ways for studying human brain functions. Many brain areas that process sensory information reveal their principles only if sufficiently complex and naturalistic stimuli are used,” explain researcher Juha Lahnakoski and Professor Mikko Sams from Aalto University Department of Biomedical Engineering and Computational Science.

The new methods also make it possible to study the brain mechanisms underlying behaviour in normal everyday conditions – by simulating them in films.

Provided by Aalto University

Source: medicalxpress.com

May 29, 2012
#science #neuroscience #brain #psychology
People Smile When They Are Frustrated, and the Computer Knows the Difference

ScienceDaily (May 28, 2012) — Do you smile when you’re frustrated? Most people think they don’t — but they actually do, a new study from MIT has found. What’s more, it turns out that computers programmed with the latest information from this research do a better job of differentiating smiles of delight and frustration than human observers do.

image

Can you tell which of these smiles is showing happiness? Or which one is the result of frustration? A computer system developed at MIT can. The answer: The smile on the right is the sign of frustration. (Credit: Images courtesy of Hoque et al.)

The research could pave the way for computers that better assess the emotional states of their users and respond accordingly. It could also help train those who have difficulty interpreting expressions, such as people with autism, to more accurately gauge the expressions they see.

"The goal is to help people with face-to-face communication," says Ehsan Hoque, a graduate student in the Affective Computing Group of MIT’s Media Lab who is lead author of a paper just published in the IEEE Transactions on Affective Computing. Hoque’s co-authors are Rosalind Picard, a professor of media arts and sciences, and Media Lab graduate student Daniel McDuff.

In experiments conducted at the Media Lab, people were first asked to act out expressions of delight or frustration, as webcams recorded their expressions. Then, they were either asked to fill out an online form designed to cause frustration or invited to watch a video designed to elicit a delighted response — also while being recorded.

When asked to feign frustration, Hoque says, 90 percent of subjects did not smile. But when presented with a task that caused genuine frustration — filling out a detailed online form, only to then find the information deleted after pressing the “submit” button — 90 percent of them did smile, he says. Still images showed little difference between these frustrated smiles and the delighted smiles elicited by a video of a cute baby, but video analysis showed that the progression of the two kinds of smiles was quite different: Often, the happy smiles built up gradually, while frustrated smiles appeared quickly but faded fast.

In such experiments, researchers usually rely on acted expressions of emotion, Hoque says, which may provide misleading results. “The acted data was much easier to classify accurately” than the real responses, he says. But when trying to interpret images of real responses, people performed no better than chance, assessing these correctly only about 50 percent of the time.
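The temporal signature described above (happy smiles building gradually, frustrated smiles appearing quickly and fading fast) lends itself to a simple rise-time heuristic. A minimal sketch on synthetic smile-intensity data; the threshold, function names, and toy series are assumptions for illustration, not the MIT group's actual classifier:

```python
# Illustrative sketch only: label a smile-intensity time series by how quickly
# it reaches its peak, loosely inspired by the reported finding that frustrated
# smiles appear quickly while delighted smiles build up gradually.

def rise_time(intensity, peak_fraction=0.9):
    """Index of the first frame at which the signal reaches 90% of its peak."""
    peak = max(intensity)
    for frame, value in enumerate(intensity):
        if value >= peak_fraction * peak:
            return frame
    return len(intensity)

def classify_smile(intensity, quick_rise_frames=10):
    """Call a smile 'frustrated' if it peaks quickly, else 'delighted'."""
    return "frustrated" if rise_time(intensity) < quick_rise_frames else "delighted"

# Synthetic 30-frame examples: an abrupt spike vs. a slow build-up.
fast_spike = [0.9 * (i < 5) for i in range(30)]  # jumps up immediately, then fades
slow_build = [i / 29 for i in range(30)]         # ramps up gradually

print(classify_smile(fast_spike))  # frustrated
print(classify_smile(slow_build))  # delighted
```

A real system would of course work on many extracted facial features rather than a single intensity curve, but the point stands: the information that separates the two smiles lives in the dynamics, which still images discard.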

Understanding the subtleties that reveal underlying emotions is a major goal of this research, Hoque says. “People with autism are taught that a smile means someone is happy,” he says, but research shows that it’s not that simple.

While people may not know exactly what cues they are responding to, timing has a lot to do with how people interpret expressions, he says. For example, former British prime minister Gordon Brown was widely seen as having a phony smile, largely because of the unnatural timing of his grin, Hoque says. Similarly, a campaign commercial for former presidential candidate Herman Cain featured a smile that developed so slowly — it took nine seconds to appear — that it was widely parodied, including a spoof by comedian Stephen Colbert. “Getting the timing right is very crucial if you want to be perceived as sincere and genuine with your smiles,” Hoque says.

Jeffrey Cohn, a professor of psychology at the University of Pittsburgh who was not involved in this research, says this work “breaks new ground with its focus on frustration, a fundamental human experience. While pain researchers have identified smiling in the context of expressions of pain, the MIT group may be the first to implicate smiles in expressions of negative emotion.”

Cohn adds, “This is very exciting work in computational behavioral science that integrates psychology, computer vision, speech processing and machine learning to generate new knowledge … with clinical implications.” He says this “is an important reminder that not all smiles are positive. There has been a tendency to ‘read’ enjoyment whenever smiles are found. For human-computer interaction, among other fields and applications, a more nuanced view is needed.”

In addition to providing training for people who have difficulty with expressions, the findings may be of interest to marketers, Hoque says. “Just because a customer is smiling, that doesn’t necessarily mean they’re satisfied,” he says. And knowing the difference could be important in gauging how best to respond to the customer, he says: “The underlying meaning behind the smile is crucial.”

The analysis could also be useful in creating computers that respond in ways appropriate to the moods of their users. One goal of the Affective Computing Group’s research is to “make a computer that’s more intelligent and respectful,” Hoque says.

Source: Science Daily

May 28, 2012
#science #neuroscience #psychology #emotion
Working with solvents tied to cognitive problems for less-educated people

May 28, 2012

Exposure to solvents at work may be associated with reduced thinking skills later in life for those who have less than a high school education, according to a study published in the May 29, 2012, print issue of Neurology, the medical journal of the American Academy of Neurology.

The thinking skills of people with more education were not affected, even if they had the same amount of exposure to solvents.

"People with more education may have a greater cognitive reserve that acts like a buffer allowing the brain to maintain its ability to function in spite of damage," said study author Lisa F. Berkman, PhD, of Harvard University in Cambridge, Mass. "This may be because education helps build up a dense network of connections among brain cells.”

The study involved 4,134 people who worked at the French national gas and electric company. The majority of the people worked at the company for their entire career. Their lifetime exposure to four types of solvents—chlorinated solvents, petroleum solvents, benzene and non-benzene aromatic solvents—was assessed. The participants took a test of thinking skills when they were an average of 59 years old and 91 percent were retired.

A total of 58 percent of the participants had less than a high school education. Of those, 32 percent had cognitive impairment, or problems with thinking skills, compared to 16 percent of those with more education. Among the less-educated, those who were highly exposed to chlorinated and petroleum solvents were 14 percent more likely to have cognitive problems than those with no exposure. People highly exposed to benzene were 24 percent more likely to have cognitive problems, and those highly exposed to non-benzene aromatic solvents were 36 percent more likely to have cognitive problems.

"These findings suggest that efforts to improve the quality and quantity of education early in life could help protect people’s cognitive abilities later in life," said Berkman, who worked alongside study author Erika Sabbath, ScD. "Investment in education could serve as a broad shield against both known and unknown exposures across the lifetime. This is especially important given that some evidence shows that federal levels of permissible exposure for some solvents may be insufficient to protect workers against the health consequences of exposure."

Provided by American Academy of Neurology

Source: medicalxpress.com

May 28, 2012
#science #neuroscience #psychology #brain
Scientists uncover deja vu mystery

May 28, 2012

In a groundbreaking study, researchers from the Czech Republic and the United Kingdom have discovered a link between the déjà vu phenomenon and structures in the human brain, effectively confirming the neurological origin of this phenomenon. Despite past studies investigating this phenomenon in healthy individuals, no concrete evidence had ever emerged … until now. The study is presented in the journal Cortex.

image

Led by the Central European Institute of Technology, Masaryk University (CEITEC MU) and Masaryk University’s Faculty of Medicine in the Czech Republic, researchers discovered that specific brain structures have a direct impact on the déjà vu experience. The findings of their study showed that these structures are considerably smaller in the brains of people who experience déjà vu than in those of individuals with no personal experience of the phenomenon.

The team from CEITEC MU, along with colleagues from other Brno research institutions as well as the University of Exeter in the United Kingdom, succeeded in providing significant insight into a phenomenon that has perplexed many over the years.

The team observed that small structures in the brain’s medial temporal lobes, in which memory and recollections originate, were considerably smaller in individuals who experience déjà vu than in individuals who have never experienced it. Their findings also showed that the more often the examined individuals experience déjà vu, the smaller these brain structures are.

"One hundred and thirteen healthy subjects underwent a structural examination of their brain by means of magnetic resonance and subsequently by using a new sensitive method for an automatic analysis of brain morphology (source-based morphometry) [and] the size of individual brain regions was compared among the individuals who have never experienced déjà vu and those who have experienced it," said lead author Milan Brázdil from CEITEC.

"Except for the presence of the examined phenomenon, both groups of individuals were fully comparable. When we stimulate the hippocampus, we are able to induce déjà vu in neurological patients. By finding the structural differences in hippocampus in healthy individuals who do and do not experience déjà vu, we have unambiguously proved that déjà vu is directly linked to the function of these brain structures. We think that it is probably a certain small “error in the system” caused by higher excitability of hippocampuses. It is the consequence of changes in the most sensitive brain regions which probably occurred in the course of the development of the neural system.”
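The group comparison Brázdil describes, testing whether a brain region's mean volume differs between people who do and do not experience déjà vu, can be sketched with Welch's t statistic. The volumes below are invented toy numbers (in arbitrary units), not the study's data, and the study's actual method (source-based morphometry) is considerably more sophisticated:

```python
# Illustrative sketch only: Welch's two-sample t statistic on toy region
# volumes. The déjà vu group is drawn slightly smaller, mirroring only the
# direction of the reported effect; all numbers are invented.
from math import sqrt

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variance, group a
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)  # sample variance, group b
    return (ma - mb) / sqrt(va / na + vb / nb)

# Hypothetical hippocampal volumes for the two groups.
deja_vu    = [3.1, 3.0, 2.9, 3.2, 2.8, 3.0]
no_deja_vu = [3.5, 3.4, 3.6, 3.3, 3.5, 3.4]

t = welch_t(deja_vu, no_deja_vu)
print(round(t, 2))  # negative: the déjà vu group's mean volume is smaller
```

A strongly negative t (compared against the appropriate t distribution for a p-value) is the kind of evidence that would support the reported structural difference.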

Experts say déjà vu, while fascinating, is not an uncommon experience. Between 60% and 80% of healthy individuals have reported occasional occurrences of déjà vu.

Provided by CORDIS

Source: medicalxpress.com

May 28, 2012
#science #neuroscience #brain #psychology
Understanding how our brain perceives space

May 28, 2012

European scientists looked into the cellular properties of neurons responsible for space coordination. Insight into the neuronal network of the entorhinal cortex will help understand what determines space and movement perception, and also how it is linked to brain-related disorders.

image

The ability to find one’s way resides in a special site of the mammalian cortex known as the entorhinal cortex. Information regarding place, direction and destination is processed in specialised neurons called grid cells. These cells have spatially periodic firing fields that repeat at regular intervals and have been found to scale up progressively along the dorsal-ventral axis.

Further dissection of this neural map was the subject of the EU-funded project ‘Spatial representation in the entorhinal neural circuit’ (Entorhinal Circuits). More specifically, scientists hypothesised that the topographic expansion of grid cells paralleled changes in cellular properties and particularly in the current (Ih) which went through hyperpolarisation-activated cyclic nucleotide-gated (HCN) channels.

Using transgenic animals with forebrain-specific knockout of the transmembrane protein HCN1, researchers found that HCN1 modulated grid cell properties, especially the size and spacing of the grid fields. This clearly indicated that HCN1 was crucial for the spatial representation in the entorhinal circuit. It also implies that during self-motion–based navigation, the current that goes through HCN1 is responsible for transforming movement signals to spatial firing fields.

Entorhinal Circuits results offered unique insights into some of the fundamental principles of neuronal assembly and microcircuit operation in the mammalian cortex. The generated knowledge will hopefully shed light into the role of the entorhinal cortex in various neuronal diseases like Alzheimer’s and schizophrenia.

Provided by CORDIS

Source: medicalxpress.com

May 28, 2012 · 25 notes
#science #neuroscience #brain #psychology
CCR2 Involved in Removing Beta-Amyloid, Could Slow Alzheimer’s Progression

May 25th, 2012

First study to suggest that the immune system may protect against Alzheimer’s changes in humans

Recent work in mice suggested that the immune system is involved in removing beta-amyloid, the main Alzheimer’s-causing substance in the brain. Researchers have now shown for the first time that this may apply in humans.

Researchers at the Peninsula College of Medicine and Dentistry, University of Exeter with colleagues in the National Institute on Aging in the USA and in Italy screened the expression levels of thousands of genes in blood samples from nearly 700 people. The telltale marker of immune system activity against beta-amyloid, a gene called CCR2, emerged as the top marker associated with memory in people.

The team used a common clinical measure called the Mini Mental State Examination to measure memory and other cognitive functions.

image

CCR2 might protect against Alzheimer’s changes, a new study claims. Image adapted from Wikimedia Commons user Pleiotrope.

The previous work in mice showed that augmenting the CCR2-activated part of the immune system in the blood stream resulted in improved memory and functioning in mice susceptible to Alzheimer’s disease.

Professor David Melzer, who led the work, commented: “This is a very exciting result. It may be that CCR2-associated immunity could be strengthened in humans to slow Alzheimer’s disease, but much more work will be needed to ensure that this approach is safe and effective”.

Dr Lorna Harries, co-author, commented: “Identification of a key player in the interface between immune function and cognitive ability may help us to gain a better understanding of the disease processes involved in Alzheimer’s disease and related disorders.”

Alzheimer’s disease is the most common form of dementia and affects around 496,000 people in the UK.

Source: Neuroscience News

May 26, 2012 · 12 notes
#science #neuroscience #brain #psychology #alzheimer
Math Predicts Size of Clot-Forming Cells

ScienceDaily (May 25, 2012) — UC Davis mathematicians have helped biologists figure out why platelets, the cells that form blood clots, are the size and shape that they are. Because platelets are important both for healing wounds and in strokes and other conditions, a better understanding of how they form and behave could have wide implications.

"Platelet size has to be very specific for blood clotting," said Alex Mogilner, professor of mathematics, and neurobiology, physiology and behavior at UC Davis and a co-author of the paper, published this week in the journal Nature Communications. “It’s a longstanding puzzle in platelet formation, and this is the first quantitative solution.”

Mogilner and UC Davis postdoctoral scholars Jie Zhu and Kun-Chun Lee developed a mathematical model of the forces inside the cells that turn into platelets, accurately predicting their final size and shape.

They were collaborating with a team led by Joseph Italiano and Jonathon Thon at Harvard Medical School and Brigham and Women’s Hospital, Boston.

Platelets are made by bone marrow cells called megakaryocytes. They bud off first as large, circular pre-platelets, then form dumbbell-shaped pro-platelets, and finally divide into standard-sized, disc-shaped platelets. A typical person has about a trillion platelets in circulation at a time, and makes about 100 billion new platelets a day, each living for 8 to 10 days.
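The production and lifespan figures above are roughly self-consistent under a steady-state assumption: the circulating count should be about the daily production times the lifespan. A quick back-of-envelope check, using the article's round numbers:

```python
# Back-of-envelope steady-state check using the article's round numbers.
daily_production = 100e9        # ~100 billion new platelets per day
lifespan_days = 9               # midpoint of the 8-10 day range
in_circulation = daily_production * lifespan_days
# ~9e11, consistent with the ~1 trillion platelets the article cites
```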

Inside the pre- and pro-platelets is a ring of protein microtubules, which exerts pressure to straighten and broaden the nascent cells. But overlying the ring is a rigid cortex of proteins that prevents the platelets from expanding.

By tweaking the number of microtubules in the bundles, Mogilner, Zhu and Lee found that they could correctly predict how pro-platelets would flip into a dumbbell shape, as well as the size and shape of mature platelets.

Source: Science Daily

May 26, 2012 · 7 notes
#science #neuroscience
Of mice and mental models: Neuroscientific implications of risk-optimized behavior in the mouse

May 25, 2012 by Stuart Mason Dambrot

(Medical Xpress) — Regardless of an organism’s biological complexity, every encephalized animal continuously makes under-informed behavioral choices that can have serious consequences. Despite the ubiquity of such decision-making, its neurological basis remains a long-standing question – namely, whether these choices are made through probabilistic world models constructed by the brain, or by reinforcement of learned associations. Recently, scientists in the Department of Psychology at Rutgers University found that reinforcement cannot account for the rapidity with which mice modify their behavior when the probability of a given event changes. The researchers say this indicates that mice may have primordially-evolved neural capabilities to represent likelihood and perform calculations that optimize their resulting behavior – and therefore that such mechanisms can be investigated and manipulated by genetic and other procedures.

image

The experimental environment. In the switch task, a trial proceeds as follows: 1: Light in the Trial-Initiation Hopper signals that the mouse may initiate a trial. 2: The mouse approaches and pokes into the trial-initiation hopper, extinguishing the light there and turning on the lights in the two feeding hoppers (trial onset). 3: The mouse goes to the short-latency hopper and pokes into it. 4: If, after 3 s have elapsed since the trial onset, poking in the short-latency hopper does not deliver a pellet, the mouse switches to the long-latency hopper, where it gets a pellet there in response to the first poke at or after 9 s since the trial onset. Lights in both feeding hoppers extinguish either at pellet delivery or when an erroneously timed poke occurs. Short trials last about 3 s and long trials about 9 s, whether reinforced or not: if the mouse is poking in the short hopper at the end of a 3-s trial, it gets a pellet and the trial ends; if it is poking in the 9-s hopper, it does not get a pellet and the trial ends at 3 s. Similarly, long trials end at 9 s: if the mouse is poking in the 9-s hopper, it gets a pellet; if in the 3-s hopper, it does not. A switch latency is the latency of the last poke in the short hopper before the mouse switches to the long hopper. Only the switch latencies from long trials are analyzed. Copyright © PNAS, doi: 10.1073/pnas.1205131109
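The payoff structure of the switch task lends itself to a small simulation. The sketch below is hypothetical Python, not the authors' analysis; the 15% scalar-timing noise coefficient, the trial count, and the function names are assumptions. It estimates the fraction of rewarded trials for a mouse that aims to switch hoppers at a given target time, and shows that the reward-maximizing target lies between the 3 s and 9 s payoff times and depends on the probability of a short trial – the stochastic parameter the mice are argued to track.

```python
import random

def expected_reward(target, p_short, cv=0.15, n=5000, seed=1):
    """Estimate the fraction of rewarded trials for a mouse aiming to
    switch from the short to the long hopper at `target` seconds.
    Short trials pay off only if the mouse is still poking the short
    hopper at 3 s; long trials pay off only if the switch is completed
    before 9 s (timings from the task description). `cv` is an assumed
    coefficient of variation for scalar timing noise."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(n):
        actual = rng.gauss(target, cv * target)  # noisy switch time
        if rng.random() < p_short:               # short trial
            wins += actual > 3.0
        else:                                    # long trial
            wins += actual <= 9.0
    return wins / n

# The reward-maximizing switch target sits strictly between the two
# payoff times; rerunning with different p_short values shifts it.
targets = [t / 10 for t in range(31, 90)]
best = max(targets, key=lambda t: expected_reward(t, p_short=0.5))
```

Rerunning `expected_reward` across targets for different values of `p_short` reproduces the qualitative point: the payoff times, plus timing noise, jointly determine an optimal switch latency, which is why an abrupt behavioral shift when the trial probabilities change is evidence for an internal estimate of those probabilities.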

In conducting their research, Prof. Randy Gallistel and doctoral student Aaron Kheifets first had to address a key challenge: distinguishing estimates of stochastic parameters from reinforcement-driven processes as the behavior-optimizing mechanism in the laboratory mice studied (the c57bl/6j strain of Mus musculus, the common house mouse, from Jackson Labs). “Because both processes can lead to approximately optimal behavior in the long run,” Gallistel tells Medical Xpress, “one has to focus on the short run – that is, on the course of the transition in behavior. The problem in this case is that the transition is a change in the distribution of switch latencies.” A distribution of switch latencies is composed of a great many temporal discriminations on the part of the subject, observed over a long sequence of trials, which makes it difficult to show that the process generating the distribution changed abruptly.

“Fortunately,” Gallistel continues, “it was obvious from simple inspection of the raw data that there was an abrupt change. The challenge was to develop a mathematical analysis that confirmed this. Meeting this challenge required the use of Bayesian methods, which are just now beginning to be applied to behavioral data. In addition, we had to develop analyses showing that differential reinforcement could not explain the transition.” The team therefore applied Bayesian methods of analysis to the determination of the parameters of a transition function for a 4-parameter mixture distribution.

“Also,” Gallistel adds, “a graphical means of displaying the raw data in such a way as to make the basic phenomenon visually apparent was required. To this end, we devised a figure with a huge number of bits per square centimeter – that is, it shows an enormous amount of readily graspable information in a small space.”


May 26, 2012 · 6 notes
#science #neuroscience #brain #psychology
Synchronized Brains: Feeling Strong Emotions Makes People's Brains 'Tick Together'

ScienceDaily (May 24, 2012) — Experiencing strong emotions synchronizes brain activity across individuals, a research team at Aalto University and Turku PET Centre in Finland has revealed.

image

Experiencing strong emotions synchronizes brain activity across individuals. (Credit: Image courtesy of Aalto University)

Human emotions are highly contagious. Seeing others’ emotional expressions, such as smiles, often triggers the corresponding emotional response in the observer. Such synchronization of emotional states across individuals may support social interaction: when all group members share a common emotional state, their brains and bodies process the environment in a similar fashion.

Researchers at Aalto University and Turku PET Centre have now found that feeling strong emotions makes different individuals’ brain activity literally synchronous.

The results revealed that feeling strong unpleasant emotions in particular synchronized the brain’s emotion-processing networks in the frontal and midline regions. In contrast, experiencing highly arousing events synchronized activity in the networks supporting vision, attention and the sense of touch.

"Sharing others’ emotional states provides the observers a somatosensory and neural framework that facilitates understanding others’ intentions and actions and allows to ‘tune in’ or ‘sync’ with them. Such automatic tuning facilitates social interaction and group processes," says Adjunct Professor Lauri Nummenmaa from the Aalto University, Finland.

"The results have major implications for current neural models of human emotions and group behavior. It also deepens our understanding of mental disorders involving abnormal socioemotional processing," Nummenmaa says.

Participants’ brain activity was measured with functional magnetic resonance imaging while they were viewing short pleasant, neutral and unpleasant movies.

Source: Science Daily

May 25, 2012 · 668 notes
#science #neuroscience #brain #psychology
Protein Necessary for Behavioral Flexibility Discovered

ScienceDaily (May 24, 2012) — Researchers have identified a protein necessary to maintain behavioral flexibility, which allows us to modify our behaviors to adjust to circumstances that are similar, but not identical, to previous experiences. Their findings, which appear in the journal Cell Reports, may offer new insights into addressing autism and schizophrenia — afflictions marked by impaired behavioral flexibility.

Our stored memories from previous experiences allow us to repeat certain tasks. For instance, after driving to a particular location, we recall the route the next time we make that trip. However, sometimes circumstances change — one road on the route is temporarily closed — and we need to make adjustments to reach our destination. Our behavioral flexibility allows us to make such changes and, then, successfully complete our task. It is driven, in part, by protein synthesis, which produces experience-dependent changes in neural function and behavior.

However, this process is impaired for many, preventing an adjustment in behavior when faced with different circumstances. In the Cell Reports study, the researchers sought to understand how protein synthesis is regulated during behavioral flexibility.

To do so, they focused on the kinase PERK, an enzyme that regulates protein synthesis. PERK is known to modify eIF2α, a factor that is required for proper protein synthesis. Their experiments involved comparing normal lab mice, which possessed the enzyme, with those that lacked it.

In their study, the mice were asked to navigate a water maze, which required them to climb onto a platform to get out of the water. Both normal mice and those lacking PERK learned to complete this task.

However, in a second step, the researchers tested the mice’s behavioral flexibility by moving the maze’s platform to another location, thereby requiring them to respond to a change in the terrain. Here, the normal mice located the platform, but those lacking PERK were unable to do so or took significantly more time to complete the task.

A second experiment offered a different test of the role of PERK in aiding behavioral flexibility. In this measure, both normal and mutant mice heard an audible tone that was followed by a mild foot shock. At this stage, all of the mice developed a normal fear response — freezing at the tone in anticipation of the foot shock. However, the researchers subsequently removed the foot shock from the procedure and the mice heard only the tone. Eventually, the normal mice adjusted their responses so they did not freeze after hearing the tone. However, the mutant mice continued to respond as if they expected a foot shock to follow.

The researchers sought additional support for their conclusion that the absence of PERK may contribute to impaired behavioral flexibility in human neurological disorders. To do so, they conducted postmortem analyses of human frontal cortex samples from patients afflicted with schizophrenia, who often exhibit behavioral inflexibility, and unaffected individuals. The samples from the control group showed normal levels of PERK while those from the schizophrenic patients had significantly reduced levels of the protein.

"A rapidly expanding list of neurological disorders and neurodegenerative diseases, including Alzheimer’s disease, Parkinson’s disease, and Fragile X syndrome, have already been linked to aberrant protein synthesis," explained Eric Klann, a professor in NYU’s Center for Neural Science and one of the study’s co-authors. "Our results show the significance of PERK in maintaining behavioral flexibility and how its absence might be associated with schizophrenia. Further studies clarifying the specific role of PERK-regulated protein synthesis in the brain may provide new avenues to tackle such widespread and often debilitating neurological disorders."

Source: Science Daily

May 25, 2012 · 19 notes
#science #neuroscience #brain #psychology
Boundary stops molecule right where it needs to be

May 24, 2012

A molecule responsible for the proper formation of a key portion of the nervous system finds its way to the proper place not because it is actively recruited, but instead because it can’t go anywhere else.

Researchers at Baylor College of Medicine have identified a distal axonal cytoskeleton as the boundary that makes sure AnkyrinG clusters where it needs to so it can perform properly.

The findings appear in the current edition of Cell.

"It has been known that AnkyrinG is needed for the axon initial segment to form. Without the axon initial segment there would be no output of information within the nervous system,” said Dr. Matthew Rasband, associate professor of neuroscience at BCM. “Every known protein found at the axon initial segment depends on AnkyrinG, so if it is eliminated then the axon initial segment doesn’t form and the neuron doesn’t fire.”

To answer the question of how AnkyrinG gets to where it needs to be for proper function, Rasband, along with first author Dr. Mauricio Galiano, postdoctoral associate in neuroscience at BCM, and colleagues, began by analyzing how the axon initial segment forms. They found that AnkyrinG always appeared in exactly the same spot during development.

"It would start to enter into the axon and then it was almost as if it hit a wall and couldn’t go any further," Rasband said. "We would see it stop very close to the cell body and then it would backfill. This showed us that there was some type of boundary or barrier marking that area."

To further study the properties of the boundary they began to look at ways they could disrupt or move it to test the effects of AnkyrinG clustering in different areas.

In cell culture and mouse models they were able to move the boundary to different distances along the axon, which allowed the researchers to change the length of the axon initial segment. If the boundary was farther from the cell body, the segment was longer; if it was closer, the segment was shorter.

When researchers removed the boundary altogether, AnkyrinG would not cluster in the appropriate area and the axon initial segment would not form.

"We had anticipated there was a kind of molecule that recruited AnkyrinG but instead we found a barrier that excludes it," Rasband said. "These results have important implications because they imply a similar exclusion mechanism might be in play or functioning not only at the axon initial segment, but all of the places where AnkyrinG is found."

Rasband said that in many disorders, such as autism and epilepsy, proteins that depend on AnkyrinG are disrupted. Understanding how this molecule comes to function properly could therefore one day help identify treatment targets for these diseases.

Provided by Baylor College of Medicine

Source: medicalxpress.com

May 24, 2012 · 7 notes
#science #neuroscience #brain #psychology
Locating ground zero: How the brain's emergency workers find the disaster area

May 24, 2012

Like emergency workers rushing to a disaster scene, cells called microglia speed to places where the brain has been injured, to contain the damage by ‘eating up’ any cellular debris and dead or dying neurons. Scientists at the European Molecular Biology Laboratory (EMBL) in Heidelberg, Germany, have now discovered exactly how microglia detect the site of injury, thanks to a relay of molecular signals. Their work, published today in Developmental Cell, paves the way for new medical approaches to conditions where microglia’s ability to locate hazardous cells and material within the brain is compromised.

image

Microglia (green) move to the site of injury (arrow) to clear up debris. Credit: Copyright EMBL/Peri

"Considering that they help keep our brain healthy, we know surprisingly little about microglia," says Francesca Peri, who led the work. "Now, for the first time, we’ve identified the mechanism that allows microglia to detect brain injury, and how that emergency call is transmitted from neuron to neuron.”

image

When microglia (green) cannot detect ATP (bottom), they don’t move to the injury site as they usually would (top). Credit: Copyright EMBL/Peri

When an emergency occurs, cries can alert bystanders, who will dial the emergency number. A call will go out over the radio, and ambulances, police or fire engines in the area will respond as needed. In the brain, Peri and colleagues found, injured neurons send out their own distress cry: they release a molecule called glutamate. Neighbouring neurons sense that glutamate and respond by taking up calcium. As glutamate spreads out from the injury site, it creates a travelling wave of calcium uptake. Along that wave, as neurons take up calcium they release a third molecule, called ATP. When the wave comes within reach, a microglial cell detects that ATP and takes it as a call to action, moving in that direction – essentially tracing the wave backwards until it reaches the injury.

Scientists knew already that microglia can detect ATP, but this molecule doesn’t last long outside of cells, so there were doubts about how ATP alone could be a signal that carried far enough to reach microglia located far from the site of injury. The trick, as Peri and colleagues discovered, is the long-lasting glutamate-driven calcium wave that can travel the length of the brain. Thanks to this wave, the ATP signal is not just emitted by the injured cells, but is repeatedly sent out by the neurons along the way, until it reaches microglia.

Dirk Sieger and Christian Moritz in Peri’s lab took advantage of the fact that zebrafish have transparent heads, which allow scientists to peer down a microscope straight into the fish’s brain. They used a laser to injure a few of the fish’s brain cells, and watched fluorescently-labelled microglia move in on the injury. When they genetically engineered zebrafish to make neurons’ calcium levels traceable under the microscope, too, the scientists were able to confirm that when the calcium wave reached microglia, these cells immediately started moving toward the injury.

Knowing all the steps in this process, and how they feed into each other, could help in designing treatments to improve microglia’s detection ability, which goes awry in conditions such as Alzheimer’s and Parkinson’s diseases.

Provided by European Molecular Biology Laboratory

Source: medicalxpress.com

May 24, 2012 · 12 notes
#science #neuroscience #brain #psychology
Persistent sensory experience is good for aging brain

May 24, 2012

Despite a long-held scientific belief that much of the wiring of the brain is fixed by the time of adolescence, a new study shows that changes in sensory experience can cause massive rewiring of the brain, even as one ages. In addition, the study found that this rewiring involves fibers that supply the primary input to the cerebral cortex, the part of the brain that is responsible for sensory perception, motor control and cognition. These findings promise to open new avenues of research on brain remodeling and aging.

Published in the May 24, 2012 issue of Neuron, the study was conducted by researchers at the Max Planck Florida Institute (MPFI) and at Columbia University in New York.

"This study overturns decades-old beliefs that most of the brain is hard-wired before a critical period that ends when one is a young adult," said MPFI neuroscientist Marcel Oberlaender, PhD, first author on the paper. "By changing the nature of sensory experience, we were able to demonstrate that the brain can rewire, even at an advanced age. This may suggest that if one stops learning and experiencing new things as one ages, a substantial amount of connections within the brain may be lost."

The researchers conducted their study by examining the brains of older rats, focusing on an area of the brain known as the thalamus, which processes and delivers information obtained from sensory organs to the cerebral cortex. Connections between the thalamus and the cortex have been thought to stop changing by early adulthood, but this was not found to be the case in the rodents studied.

Being nocturnal animals, rats mainly rely on their whiskers as active sensory organs to explore and navigate their environment. For this reason, the whisker system is an ideal model for studying whether the brain can be remodeled by changing sensory experience. By simply trimming the whiskers, and preventing the rats from receiving this important and frequent form of sensory input, the scientists sought to determine whether extensive rewiring of the connections between the thalamus and cortex would occur.

On examination, they found that the animals with trimmed whiskers had altered axons, nerve fibers along which information is conveyed from one neuron (nerve cell) to many others; those whose whiskers were not trimmed had no axonal changes. Their findings were particularly striking as the rats were considered relatively old – meaning that this rewiring can still take place at an age not previously thought possible. Also notable was that the rewiring happened rapidly – in as little as a few days.

"We’ve shown that the structure of the rodent brain is in constant flux, and that this rewiring is shaped by sensory experience and interaction with the environment," said Dr. Oberlaender. "These changes seem to be life-long and may pertain to other sensory systems and species, including people. Our findings open the possibility of new avenues of research on development of the aging brain using quantitative anatomical studies combined with noninvasive imaging technologies suitable for humans, such as functional MRI (fMRI)."

The study was possible due to recent advances in high-resolution imaging and reconstruction techniques, developed in part by Dr. Oberlaender at MPFI. These novel methods enable researchers to automatically and reliably trace the fine and complex branching patterns of individual axons, with typical diameters less than a thousandth of a millimeter, throughout the entire brain.

Provided by Tartaglia Communications

Source: medicalxpress.com

May 24, 2012 · 16 notes
#science #neuroscience #brain #psychology
The auditory cortex adapts agilely with concentration

May 24, 2012

How sensory perception arises in the human cerebral cortex has yet to be fully explained. The different cortical areas function in cooperation; no perception is the outcome of only one area working alone. In his doctoral dissertation for the Department of Biomedical Engineering and Computational Science at Aalto University, Jaakko Kauramäki shows that the auditory cortex is not left to its own devices.

Kauramäki’s dissertation in the field of cognitive neuroscience studied neural top-down processes, that is, the ways the brain as a system handles sounds arriving at the auditory cortex in the temporal lobes.

Moving from parts towards a whole, bottom-up processes analyse a sound hierarchically, assembling small acoustic features step by step into a unified auditory sensation.

"The operation of the system as a whole can be affected by focusing on a specific task or sound. In my research I focused precisely on how the top-down effects manifest themselves on the auditory cortex," Kauramäki explains.

Right kind of noise promotes concentration and reinforces perception?

Kauramäki studied the auditory cortex in two separate tasks: reactions caused by selective attention during sound recognition and by lipreading. Kauramäki recorded the electrical and magnetic activity on the cortex using electroencephalography (EEG) and magnetoencephalography (MEG) respectively.

"Forty years ago a so-called ‘gain effect’ was formulated: focusing attention enhances responses on the auditory cortex, which means that attention helps us perceive audio stimuli better," Kauramäki says.

In the attention tests Kauramäki masked the sounds played for the test subjects with different frequencies of noise – and made a discovery. During periods of selective attention, the enhanced responses on the auditory cortex depended on the type of noise used: the frequency content of the noise affected the prominence of the responses. The responses are not only enhanced, but feature- and task-specific.

"Similar results have not been obtained earlier because the stimuli used in the experiments have been too simple. The noise mask added a combinatory effect that brought the specificity and selectivity of the responses to the fore."

"Focusing attention may then be easier in a rich sound environment. Complete silence is of course an extreme case, but in total silence the auditory cortex begins to create connections out of thin air, to make up sensory perceptions."

"Then again, the more stimuli there are in the environment, the harder it becomes to focus. In attention disorders such as ADHD, it may be precisely this top-down ability to filter sounds that is lacking," Kauramäki suspects.

In the lipreading tasks Kauramäki did not encounter such a dependency on frequency. Instead, lipreading suppressed the auditory cortex’s ability to react. The reason for this is the neural response of the speech production system.

"The suppressing effect is caused by the adaptation of the areas on the auditory cortex that specialise in speech. Suppressing occurs even when the speech is inaudible – the articulatory gestures of the mouth alone activate parts of the auditory cortex."

For Kauramäki the result suggests that the neural responses of the speech production system can reach the auditory cortex and thus reinforce perception.

"In noisy meetings, for example, it pays off to concentrate on the face of whoever is speaking: lipreading helps in the processing. It may suppress the reaction of the auditory cortex, but the big picture becomes clearer."

Provided by Aalto University

Source: medicalxpress.com

May 24, 2012 · 10 notes
#science #neuroscience #brain #psychology
World's biggest stroke clot-buster trial reveals patient benefits

May 24, 2012

(Medical Xpress) — Patients given a clot-busting drug within six hours of a stroke are more likely to make a better recovery than those who do not receive the treatment, new research has found.

The trial was set up in 2000 by the University of Sydney’s Professor Richard Lindley, while he was employed at the University of Edinburgh.

The study of more than 3000 patients is the world’s largest trial of the drug rt-PA and was coordinated at the University of Edinburgh. Since coming to Sydney Medical School in 2003, Professor Lindley has continued as the co-principal investigator of the research.

The findings of the study are published today in The Lancet, alongside an analysis of all other trials of the drug carried out in the past 20 years.

The trial found that following treatment with the drug rt-PA, which is given intravenously to patients who have suffered an acute ischaemic stroke, more patients were able to look after themselves.

"The trial results, together with the updated review, mean that rt-PA can now be offered to a much wider group of patients presenting with stroke", Professor Lindley said.

A patient’s chances of making a complete recovery within six months of a stroke were also increased.

An ischaemic stroke happens when the brain’s blood supply is interrupted by a blood clot. The damage caused can be permanent or fatal.

Researchers now know that for every 1000 patients given rt-PA within three hours of stroke, 80 more will survive and live without help from others than if they had not been given the drug.

The benefits of using rt-PA do come at a price, say researchers. Patients are at risk of death within seven days of treatment because the drug can cause a secondary bleed in the brain. The research team concluded that the benefits were seen in a wide variety of patients, despite the risks.

Stroke experts stress that these mortality figures need to be viewed in the context of deaths from stroke. Without treatment, one third of people who suffer a stroke die, with another third left permanently dependent and disabled.

Researchers say the threat of death and disability means many stroke patients are prepared to take the early risks of being treated with rt-PA to avoid being disabled.

The authors conclude that for those who do not experience bleeding, the drug improves patients’ longer term recovery.

About half of those who took part in the trial were over 80.

"The trial underlines the benefits of treating patients with the drug as soon as possible and provides the first reliable evidence that treatment is effective for those aged 80 and over," Professor Lindley said.

The study also found no reason to restrict use of rt-PA - also known as alteplase - on the basis of how severe a patient’s stroke has been.

Chief investigator Professor Peter Sandercock of the University of Edinburgh’s Centre for Clinical Brain Sciences said: “Our trial shows that it is crucial that treatment is given as fast as possible to all suitable patients.”

Provided by University of Sydney

Source: medicalxpress.com

May 24, 2012 · 7 notes
#science #neuroscience #brain #psychology #stroke
Genetic 'reset switch' enables signaling pathway to induce multiple developmental outcomes for olfactory neurons

May 24, 2012

Within the nervous system, a handful of signaling pathways modulate development of a cornucopia of different neuronal subtypes. “Even small alterations in neuron differentiation pathways can disrupt subsequent circuit organization and catalyze the genesis of neurological disorders,” explains Adrian Moore of the RIKEN Brain Science Institute in Wako.

image

Figure 1: Interplay between Notch signaling and Hamlet activity gives rise to diverse olfactory receptor neurons (ORNs), each with distinct structures and subsets of olfactory receptors (left). The precursor cell (right) divides to yield two daughter cells, one of which undergoes Notch (N)-mediated gene activation. Hamlet (Ham) subsequently resets Notch’s genetic effects, and the absence or subsequent restoration of Notch signaling determines which type of ORN (Naa or Nab) will result from differentiation. Credit: 2012 Adrian Moore, RIKEN Brain Science Institute

Recent work from Moore’s team, which includes Keita Endo of the University of Tokyo, has revealed mechanisms governing this complexity in the fruit fly olfactory system. Within the antennae—the fly equivalent of the nose—it was known that cells called neuronal precursors undergo multiple rounds of ‘asymmetric division’, wherein each resulting daughter cell follows a distinct developmental path, yielding different combinations of olfactory receptor neurons (ORNs). Moore’s team showed specifically that ORN precursors undergo two rounds of division, yielding four different cellular subtypes, three of which will typically mature into ORNs.

Earlier work from Endo showed that the activation or suppression of signaling by the Notch protein helps differentiate these cellular fates, but other factors were clearly involved. Their joint research demonstrated that a second protein, Hamlet, modulates the effects of Notch. 

“This [process] provides an important foundation for all future studies of odorant receptor expression and axon targeting control in the olfactory system,” says Moore. The researchers found that the presence or absence of Notch and Hamlet activity plays a central role in establishing the identity of these subtypes, and this in turn determines both the connections formed by the resulting ORNs and the subset of olfactory receptor proteins that will be expressed (Fig. 1).

Moore and Endo’s study also revealed a surprising mode of action for Hamlet. Chromosomal DNA is wrapped around clusters of protein, and chemical changes to those proteins profoundly alter local gene activity—a mechanism called ‘epigenetic regulation’. They found that Hamlet selectively deactivates genes activated by Notch by triggering such changes. This means that immature ORNs produced by division of a Notch-activated cell can essentially be ‘reset’ by Hamlet. The ultimate developmental fate of those cells is then determined, in part, by whether or not they subsequently undergo a new round of Notch activation. 

Moore and colleagues also observed that, beyond simply switching off active Notch genes, Hamlet may define subsets of target genes that can subsequently be reactivated by Notch signaling. “The modifications induced by Hamlet may help establish cell fate by marking gene promoters for use later during differentiation,” says Moore. “This could prove fundamental to understanding the process of neuronal diversification.”

Provided by RIKEN

Source: medicalxpress.com

May 24, 2012 · 4 notes
#science #neuroscience #brain #psychology #neuron
No new neurons in the human olfactory bulb

May 24, 2012

(Medical Xpress) — Research from Karolinska Institutet shows that the human olfactory bulb - a structure in the brain that processes sensory input from the nose - differs from that of other mammals in that no new neurons are formed in this area after birth. The discovery, which is published in the scientific journal Neuron, is based on the age-determination of the cells using the carbon-14 method, and might explain why the human sense of smell is normally much worse than that of other animals.

"I’ve never been so astonished by a scientific discovery," says lead investigator Jonas Frisén, Tobias Foundation Professor of stem cell research at Karolinska Institutet. "What you would normally expect is for humans to be like other animals, particularly apes, in this respect."

It was long thought that all brain neurons were formed up to the time of birth, after which production stopped. A paradigm shift occurred when scientists found that nerve cells were being continually formed from stem cells in the mammalian brain, which changed scientific views on the plasticity of the brain and raised hopes of being able to replace neurons lost during some types of neurological disease.

In the adult mammal, new nerve cells are formed in two regions of the brain: the hippocampus and the olfactory bulb. While the former has an important part to play in memory, the latter is essential to the interpretation of smells. However, owing to the difficulty of studying the formation of new neurons in humans, the extent to which this phenomenon also occurs in the human brain has remained unclear. In this present study, researchers at Karolinska Institutet and their Austrian and French colleagues made use of the sharp rise in atmospheric carbon-14 caused by Cold War nuclear tests to find an answer to this question.

Carbon-14 is incorporated in DNA, making it possible to gauge the age of cells by measuring how much of the isotope they contain. Doing this, the team found that the olfactory bulb neurons in their adult human subjects had carbon-14 levels that matched those in the atmosphere at the time of their birth. This is a strong indication that there is no significant generation of new neurons in this part of the brain, something that sets humans apart from all other mammals.
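The bomb-pulse logic described above can be sketched in a few lines. This is a toy illustration, assuming a handful of invented atmospheric Δ14C reference points rather than the calibration data the researchers used:

```python
# Hypothetical atmospheric Delta-14C reference points (per mil) around the
# Cold War bomb pulse; illustrative values only, not real calibration data.
ATMOSPHERIC_D14C = {
    1950: 0, 1955: 20, 1960: 250, 1963: 700, 1970: 550,
    1980: 280, 1990: 150, 2000: 90, 2010: 40,
}

def estimated_birth_year(measured_d14c):
    """Return the reference year whose atmospheric carbon-14 level best
    matches the level measured in the cells' DNA."""
    return min(ATMOSPHERIC_D14C,
               key=lambda year: abs(ATMOSPHERIC_D14C[year] - measured_d14c))

print(estimated_birth_year(550))  # 1970
```

Because the bomb curve rises and then falls, a single measurement can in principle match two candidate years; real analyses resolve that ambiguity with the subject's known date of birth.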

"Humans are less dependent on their sense of smell for their survival than many other animals, which may be related to the loss of new cell generation in the olfactory bulb, but this is just speculation,” says Professor Frisén.

Professor Frisén and his team now plan to study the extent of neuron generation in the hippocampus, a part of the brain that is important for higher cerebral functions in humans.

Provided by Karolinska Institutet

Source: medicalxpress.com

May 24, 2012 · 12 notes
#science #neuroscience #brain #psychology #neuron
'Obesity Genes' May Influence Food Choices, Eating Patterns

ScienceDaily (May 23, 2012) — Blame it on your genes? Researchers from The Miriam Hospital’s Weight Control and Diabetes Research Center say individuals with variations in certain “obesity genes” tend to eat more meals and snacks, consume more calories per day and often choose the same types of high fat, sugary foods.

image

Blame it on your genes? Researchers say individuals with variations in certain “obesity genes” tend to eat more meals and snacks and consume more calories per day. (Credit: © Gennadiy Poznyakov / Fotolia)

Their study, published online by the American Journal of Clinical Nutrition and appearing in the June issue, reveals that certain variations within the FTO and BDNF genes — which have been previously linked to obesity — may play a role in eating habits that can cause obesity.

The findings suggest it may be possible to minimize genetic risk by changing one’s eating patterns and being vigilant about food choices, in addition to adopting other healthy lifestyle habits, like regular physical activity.

"Understanding how our genes influence obesity is critical in trying to understand the current obesity epidemic, yet it’s important to remember that genetic traits alone do not mean obesity is inevitable," said lead author Jeanne M. McCaffery, Ph.D., of The Miriam Hospital’s Weight Control and Diabetes Research Center.

"Our lifestyle choices are critical when it comes to determining how thin or heavy we are, regardless of your genetic traits," she added. "However, uncovering genetic markers can possibly pinpoint future interventions to control obesity in those who are genetically predisposed."

Previous research has shown that individuals who carry variants of the fat mass and obesity-associated gene (FTO) or the brain-derived neurotrophic factor gene (BDNF) are at increased risk for obesity. The genes have also been linked with overeating in children, and this is one of the first studies to extend that finding to adults. Both FTO and BDNF are expressed in the part of the brain that controls eating and appetite, although the mechanisms by which these gene variations influence obesity are still unknown.

As part of the Look AHEAD (Action for Health in Diabetes) trial, more than 2,000 participants completed a questionnaire about their eating habits over the past six months and also underwent genotyping. Researchers focused on nearly a dozen genes that have been previously associated with obesity. They then examined whether these genetic markers influenced the pattern or content of the participants’ diet.

Variations in the FTO gene specifically were significantly associated with a greater number of meals and snacks per day, greater percentage of energy from fat and more servings of fats, oils and sweets. The findings are largely consistent with previous research in children.

Researchers also discovered that individuals with BDNF variations consumed more servings from the dairy and the meat, eggs, nuts and beans food groups. They also consumed approximately 100 more calories per day, which McCaffery notes could have a substantial influence on one’s weight.

"We show that at least some of the genetic influence on obesity may occur through patterns of dietary intake," she said. "The good news is that eating habits can be modified, so we may be able to reduce one’s genetic risk for obesity by changing these eating patterns."

McCaffery says that while this research greatly expands their knowledge on how genetics may influence obesity, the data must be replicated before the findings can be translated into possible clinical measures.

Source: Science Daily

May 24, 2012 · 7 notes
#science #neuroscience #brain #psychology #obesity
Antioxidant Urate Could Protect Against Parkinson’s Disease

May 23rd, 2012

Study supports urate protection against Parkinson’s disease, hints at novel mechanism

In vitro study indicates urate protection extends beyond antioxidant effect

Use of the antioxidant urate to protect against the neurodegeneration caused by Parkinson’s disease appears to rely on more than urate’s ability to protect against oxidative damage. In the May issue of the open-access journal PLoS One, researchers from the MassGeneral Institute for Neurodegenerative Diseases (MGH-MIND) describe experiments suggesting the involvement of a novel mechanism in urate’s protection of cultured brain cells against Parkinson’s-like damage.

“Our experiments showed, unexpectedly, that urate’s ability to protect neurons requires the presence of neighboring cells called astrocytes,” says Michael Schwarzschild, MD, PhD, of MGH-MIND, the study’s senior author. “The results suggest there may be multiple ways that raising urate could help protect against neurodegeneration in diseases like Parkinson’s and further support the development of treatments designed to elevate urate in the brain.” Schwarzschild and colleagues in the Parkinson’s Study Group currently are conducting a clinical trial investigating one approach to that strategy.

Characterized by tremors, rigidity, difficulty walking and other symptoms, Parkinson’s disease is caused by destruction of brain cells that produce the neurotransmitter dopamine. Several epidemiological studies suggested that healthy people with elevated levels of urate, a normal component of the blood, may have a reduced risk of developing Parkinson’s disease, and investigations by Schwarzschild’s team found that Parkinson’s patients with higher naturally occurring urate levels had slower progression of their symptoms.

The current study was designed to investigate whether both added urate and urate already present within the cells protect cultured dopamine-producing neurons against Parkinson’s-like degeneration. In addition, since previous studies suggested that urate’s protective effects depended on the presence of astrocytes (star-shaped cells of the central nervous system that provide both structural and metabolic support to neurons), the MGH-MIND team explored how the presence of astrocytes affects the ability of urate to protect against damage induced by MPP+, a toxic molecule that produces the same kind of neurodegeneration seen in Parkinson’s and is widely used in research studies.

image

Raising urate levels could help to protect against neurodegenerative diseases like Parkinson’s. Image adapted from Flickr user Niels_Olson.

The experiments showed that, while added urate reduced MPP+-induced cell death by about 50 percent in cultured dopamine-producing mouse neurons, urate treatment virtually eliminated neuronal death in cultures containing both neurons and astrocytes. They also showed that reducing intracellular urate levels by induced expression of the enzyme that breaks it down increased neuronal vulnerability to MPP+ toxicity significantly in cultures that included astrocytes but only slightly in neuron-rich cultures. The fact that the presence of astrocytes greatly increases the protection afforded by both externally applied urate and urate produced within cells indicates that the effect depends on more than urate’s ability to directly protect neurons against oxidative stress.

“A valuable next step will be determining whether endogenous urate is protective in live animal models of Parkinson’s disease,” says Schwarzschild. “It also will be important to determine whether we can selectively increase urate levels in brain cells by targeting urate transporter molecules. The approach now in early clinical trials examines whether treatment with the urate precursor inosine, which increases urate levels throughout the body, can slow the progression of the disease. If we could raise urate levels in brain cells without changing them in the rest of the body, we could avoid the risks of excessive urate, which, when it accumulates in the joints, can cause gout.”

Source: Neuroscience News

May 24, 2012 · 10 notes
#science #neuroscience #brain #psychology #parkinson
Study shows how immune cells change wiring of the developing mouse brain

May 23, 2012

Researchers have shown in mice how immune cells in the brain target and remove unused connections between brain cells during normal development. This research, supported by the National Institutes of Health, sheds light on how brain activity influences brain development, and highlights the newly found importance of the immune system in how the brain is wired, as well as how the brain forms new connections throughout life in response to change.

Disease-fighting cells in the brain, known as microglia, can prune the billions of tiny connections (or synapses) between neurons, the brain cells that transmit information through electric and chemical signals. This new research demonstrates that microglia respond to neuronal activity to select synapses to prune, and shows how this pruning relies on an immune response pathway – the complement system – to eliminate synapses in the way that bacterial cells or other pathogenic debris are eliminated. The study was led by Beth Stevens, Ph.D., assistant professor of neurology at Boston Children’s Hospital and Harvard Medical School.

The brain is created with many more synapses than it retains into adulthood. As the brain develops, it goes through dynamic changes to refine its circuitry, trimming away the synaptic connections that do not have a lot of activity and preserving the stronger, more active synapses. This process, known as synaptic pruning, is a key part of normal brain development.

Scientists do not have a clear understanding of how these synapses are selected, targeted and then pruned. However, the precise elimination of unused synapses and the strengthening of those that are most needed are essential for normal brain function. Many childhood disorders, such as amblyopia (a loss of vision in one eye that can occur when the eyes are misaligned), various forms of mental retardation, epilepsy and autism are thought to be due to abnormal brain development.

Microglia originate in the bone marrow and transform into an activated state to defend the body against infections. Activated microglia are also found in other disease states, ranging from stroke to Alzheimer’s disease. It is not always clear, however, whether these cells cause degeneration of brain cells or are part of the brain’s recovery process. In recent years, several research groups have reported that activated microglia are also present in the normal brain. Additionally, during the most robust synaptic pruning periods, increased numbers of activated microglia are present and clustered around synapses.

As reported in the May 24 issue of Neuron, scientists in Dr. Stevens’s lab used the visual system in mice to study synaptic pruning, a model that undergoes robust change and remodeling during development and which has circuitry that is well-defined and easy to manipulate. Researchers labeled neurons that project from the eye into an area of the brain called the lateral geniculate nucleus, or LGN, and found that reactive microglia contained portions of the synapses from the labeled neurons. They also saw that these labeled pieces of synaptic material were specifically found inside the microglia’s lysosomes – compartments responsible for digesting foreign particles.

The researchers then investigated whether the amount of neuronal activity at a synapse determines whether microglia target it for removal. They used a drug to increase activity in the neurons projecting from one eye and saw less pruning of synapses in the corresponding brain region, as compared to the untreated eye. When they used a drug to reduce activity, this resulted in more pruning compared to the untreated eye. The researchers think microglia select a synapse for removal based on the synapse’s level of activity. This may be directly relevant to amblyopia, a loss of vision in one eye that can occur when the eyes are misaligned. Children with amblyopia will preferentially use one eye, and vision in the less used eye deteriorates due to loss of synapses and cells in the LGN.
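The activity dependence the researchers describe can be caricatured in a toy model. Everything below is invented for illustration (the linear pruning probability in particular is an assumption, not the paper's model); it only shows how activity-biased removal enriches a population for strong synapses:

```python
import random

def prune(activities, rng):
    """One round of pruning: a synapse with activity a (in [0, 1]) is
    engulfed with probability 1 - a, so quiet synapses are removed more often."""
    return [a for a in activities if rng.random() >= 1.0 - a]

rng = random.Random(0)                              # fixed seed for reproducibility
synapses = [rng.random() for _ in range(10_000)]    # random initial activity levels
survivors = prune(synapses, rng)

mean = lambda xs: sum(xs) / len(xs)
print(mean(survivors) > mean(synapses))  # True: survivors skew toward high activity
```

Raising every synapse's activity in this model (as the drug did in the treated eye) would lower the pruning probability across the board, mirroring the reduced pruning the researchers observed.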

Earlier research revealed that proteins involved in the complement system are found near synapses during development and are necessary for pruning. To see if these same proteins are used by microglia to shape neuronal connections, the researchers disrupted complement pathway proteins that are found only in the brain’s immune cells. Their results indicate that these complement proteins signal the microglia to trim away synapses, and suggest that immune system pathways are key to proper synaptic pruning.

“The concept that microglia prune synapses using immune system pathways has been difficult to prove,” said Edmund Talley, Ph.D., program director at the National Institute of Neurological Disorders and Stroke. “This exquisitely careful and meticulous research confirms the role of microglia in brain development, plasticity and learning.”

Dr. Stevens said the study sheds light on the role of microglia in the normal brain, and supports further investigations into the role of microglia in brain disease. “Almost every neurodegenerative brain disease involves several interesting common denominators,” she said. “It’s becoming increasingly recognized that early synapse loss is a hallmark of many neurodegenerative diseases.”

Provided by NIH/National Institute of Neurological Disorders and Stroke

Source: medicalxpress.com

May 24, 2012 · 7 notes
#science #neuroscience #brain #psychology
Brain research shows visual perception system unconsciously affects our preferences

May 23, 2012

When grabbing a coffee mug out of a cluttered cabinet or choosing a pen to quickly sign a document, what brain processes guide your choices?

New research from Carnegie Mellon University’s Center for the Neural Basis of Cognition (CNBC) shows that the brain’s visual perception system automatically and unconsciously guides decision-making through valence perception. Published in the journal Frontiers in Psychology, the review hypothesizes that valence, the positive or negative value automatically perceived in most visual input, integrates visual features with associations from experience with similar objects or features. In other words, valence is the process that allows our brains to rapidly make choices between similar objects.

The findings offer important insights into consumer behavior in ways that traditional consumer marketing focus groups cannot address. For example, asking individuals to react to package designs, ads or logos is simply ineffective. Instead, companies can use this type of brain science to more effectively assess how unconscious visual valence perception contributes to consumer behavior.

To transfer the research’s scientific application to the online video market, the CMU research team is in the process of founding the start-up company neonlabs through the support of the National Science Foundation (NSF) Innovation Corps (I-Corps).

"This basic research into how visual object recognition interacts with and is influenced by affect paints a much richer picture of how we see objects," said Michael J. Tarr, the George A. and Helen Dunham Cowan Professor of Cognitive Neuroscience and co-director of the CNBC. “What we now know is that common, household objects carry subtle positive or negative valences and that these valences have an impact on our day-to-day behavior.”

Tarr added that the NSF I-Corps program has been instrumental in helping the neonlabs team take this basic idea and turn it into a viable company. “The I-Corps program gave us unprecedented access to highly successful, experienced entrepreneurs and venture capitalists who provided incredibly valuable feedback throughout the development process,” he said.

NSF established I-Corps to assess how ready new scientific opportunities are to be transformed into valuable products through a public-private partnership. The CMU team of Tarr; Sophie Lebrecht, a CNBC and Tepper School of Business postdoctoral fellow; Babs Carryer, an embedded entrepreneur at CMU’s Project Olympus; and Thomas Kubilius, president of Pittsburgh-based Bright Innovation and adjunct professor of design at CMU, was awarded a $50,000, six-month grant to investigate how understanding valence perception could be used to make better consumer marketing decisions. They are launching neonlabs to apply their model of visual preference to increase click rates on online videos by identifying the most visually appealing thumbnail from a stream of video. The web-based software product selects a thumbnail based on neuroimaging data on object perception and valence, crowd-sourced behavioral data and proprietary computational analyses of large numbers of video streams.

"Everything you see, you automatically dislike or like, prefer or don’t prefer, in part, because of valence perception," said Lebrecht, lead author of the study and the entrepreneurial lead for the I-Corps grant. "Valence links what we see in the world to how we make decisions."

Lebrecht continued, “Talking with companies such as YouTube and Hulu, we realized that they are looking for ways to keep users on their sites longer by clicking to watch more videos. Thumbnails are a huge problem for any online video publisher, and our research fits perfectly with this problem. Our approach streamlines the process and chooses the screenshot that is the most visually appealing based on science, which will in the end result in more user clicks.”

Today (May 23), Lebrecht will join the other 23 I-Corps project teams in Palo Alto, Calif., for the final presentation of each team’s I-Corps journey from basic science idea to real-world business application. She will present neonlabs’ solution, outlining the customer landscape, competition and business model.

Carnegie Mellon is well known for its entrepreneurial culture. The university’s Greenlighting Startups initiative, a portfolio of five business incubators, is designed to speed company creation at CMU. In the past 15 years, Carnegie Mellon faculty and students have helped to create more than 300 companies and 9,000 jobs; the university averages 15 to 20 new startups each year.

"CMU has been an amazing place to build neonlabs," Lebrecht said. "There’s a great intellectual community and facilities here as well as people unbelievably experienced in tech transfer and startups who have been so incredibly generous with their time."

Provided by Carnegie Mellon University

Source: medicalxpress.com

May 24, 2012 · 19 notes
#science #neuroscience #brain #psychology #vision
Robust White Matter Helps Keep Us Smart As We Age

May 23rd, 2012

Well-connected brains make you smarter in older age

Brains that maintain healthy nerve connections as we age help keep us sharp in later life, new research funded by the charity Age UK has found.

Older people with robust brain ‘wiring’, that is, the nerve fibres that connect distant brain areas, can process information more quickly, and this speed makes them generally smarter, the study suggests.

According to the findings, joining distant parts of the brain together with better wiring improves mental performance, suggesting that intelligence is not found in a single part of the brain.

However, a loss of condition in this wiring, or ‘white matter’, the billions of nerve fibres that transmit signals around the brain, can negatively affect our intelligence by altering these networks and slowing down our processing speed.

The research by the University of Edinburgh shows for the first time that the deterioration of white matter with age is likely to be a significant cause of age-related cognitive decline.

The research team used three different brain imaging techniques in compiling the results, including two that have never been used before in the study of intelligence.

image

Healthy nerve connections in the brain help to reduce mental decline and dementia in older people. Image by Flickr user Brian Auer. See below for attribution.

These techniques measure the amount of water in brain tissue, indicate structural loss in the brain, and show how well the nerve fibres are insulated.

The researchers examined scans and the results of thinking and reaction-time tests from 420 people in the Lothian Birth Cohort 1936, a group of nearly 1,100 people whose intelligence and general health have been tracked since they were 11 years old.

The research was part of the Disconnected Mind Project, a large study of the causes of people’s differences in cognitive ageing, led by Professor Ian Deary.

Study author Doctor Lars Penke said: “Our results suggest a first plausible way in which brain structure differences lead to higher intelligence. The results are exciting for our understanding of human intelligence differences at all ages.”

“They also suggest a clear target for seeking treatment for mental difficulties, be they pathological or age-related. That the brain’s nerve connections tend to stay the same throughout the brain means we can now look at factors that affect the overall condition of the brain, like its blood supply.”

Professor Deary said that uncovering the secrets of good thinking skills in old age is a high priority. “The research team is now looking at what keeps the brain’s connections healthy. We value our thinking skills, and research should address how we might retain them or slow their decline with age.”

Doctor Mark Bastin, who co-authored the study, said “These findings are exciting as they show how quantitative brain imaging can provide novel insights into the links between brain structure and cognitive ability. This is a key research area given the importance of identifying strategies for retaining good mental ability into older age.”

Professor James Goodwin, Head of Research at Age UK, said: “This research is very exciting as it could have a real impact on tackling mental decline in later life, including dementia. With new understanding on how the brain functions we can work out why mental faculties decline with age in some people and not others and look at what can be done to improve our minds’ chances of ageing better.”

Source: Neuroscience News

May 24, 2012 · 9 notes
#science #neuroscience #brain #psychology
Researchers uncover new ways sleep-wake patterns are like clockwork

May 23, 2012

Researchers at New York University and Albert Einstein College of Medicine of Yeshiva University have discovered new ways neurons work together to ease the transition between sleep and wakefulness. Their findings, which appear in the journal Neuron, provide additional insights into sleep-wake patterns and offer methods to explore what may disrupt them.

Their study explored the biological, or circadian, clocks of Drosophila fruit flies, which are commonly used for research in this area. This is because it is relatively easy to find mutants with malfunctioning biological clocks and then to identify the genes underlying the altered behavior. Such studies in fruit flies have allowed the identification of similar “clock genes” in mammals, which function in largely the same manner as they do in a fly’s clock.

In the Neuron study, the researchers moved up a level to study how pacemaker clock neurons—which express clock genes—interact with each other. Specifically, they looked at the relationship between master pacemaker neurons, which control the overall pace of the circadian system, and non-master pacemaker neurons, whose role in circadian rhythms has been less clear.

To do so, they examined flies with normally functioning master and non-master clock neurons and compared them with mutant flies in which the signaling of these neurons was either increased or decreased. These comparisons allowed the researchers to isolate the individual roles of these neurons and, in particular, to understand how master and non-master pacemaker neurons work together to control circadian rhythms.

Their results revealed a previously unknown role for non-master pacemaker neurons. Specifically, these neurons employ a neurotransmitter, glutamate, which suppresses signaling of the master pacemaker neurons during the evening. Artificially increasing this suppression by the non-master clock neurons in the morning made it much harder for flies to wake up. So in normal flies, these non-master pacemaker neurons have to stand aside at dawn, allowing the master pacemaker neurons to fire to wake up the fly. The authors concluded that the balance between signaling of these two groups of clock neurons helps to set the precise time of the transition between sleep and wakefulness.

"Our work shifts the emphasis away from clock genes and starts to address how clock neurons function in a neural network to regulate behavior," explained Justin Blau, an associate professor in NYU’s Department of Biology and one of the study’s co-authors. "And it shows the importance of studying individual groups of clock neurons, since different subsets can have opposite effects on animal behavior.”

"This work helps to elucidate the neurotransmitters and receptors that facilitate communication between specific groups of nerve cells that regulate circadian rhythm," said co-author Myles Akabas, professor of Physiology & Biophysics and of Neuroscience at Albert Einstein College of Medicine. "It demonstrates the power of collaborative interdisciplinary research to address the molecular and cellular basis for behavior."

Provided by New York University

Source: medicalxpress.com

May 23, 2012 · 16 notes
#science #neuroscience #brain #psychology
Reverse engineering epilepsy's 'miracle' diet

May 23, 2012 by R. Alan Leo

For decades, neurologists have known that a diet high in fat and extremely low in carbohydrates can reduce epileptic seizures that resist drug therapy. But how the diet worked, and why, was a mystery—so much so that in 2010, The New York Times Magazine called it “Epilepsy’s Big, Fat Miracle.”

Now, researchers at Dana-Farber Cancer Institute and Harvard Medical School have proposed an answer, linking resistance to seizures to a protein that modifies cellular metabolism in the brain. The research, to be published in the May 24th issue of the journal Neuron, may lead to the development of new treatments for epilepsy.

The research was led jointly by Nika Danial, HMS assistant professor of cell biology at Dana-Farber Cancer Institute, and Gary Yellen, professor of neurobiology at Harvard Medical School. The first author was Alfredo Giménez-Cassina, a research fellow in Danial’s lab.

Epilepsy is a neurological disorder characterized by repeated seizures, an electrical storm in the brain that can manifest as convulsions, loss of motor control, or loss of consciousness. Some cases of epilepsy can be improved by a diet that drastically reduces sugar intake, triggering neurons to switch from their customary fuel of glucose to fat byproducts called ketone bodies. The so-called ketogenic diet, which mimics effects of starvation, was described more than 80 years ago and received renewed interest in the 1990s. Recent studies corroborate that it works, but shed little light on how.

"The connection between metabolism and epilepsy has been such a puzzle," said Yellen, who was introduced to the ketogenic diet through his wife, Elizabeth Thiele, HMS professor of neurology, who directs the Pediatric Epilepsy Program at MassGeneral Hospital for Children, but was not directly involved in the study. "I’ve met a lot of kids whose lives are completely changed by this diet," Yellen said. "It’s amazingly effective, and it works for many kids for whom drugs don’t work."

"We knew we needed to come at this link between metabolism and epilepsy from a new angle," said Danial, who had previously discovered a surprising double duty for a protein known for its role in apoptosis: The protein, BCL-2-associated Agonist of Cell Death, or BAD, also regulated glucose metabolism.

Giménez-Cassina further discovered that certain modifications in BAD switched metabolism in brain cells from glucose to ketone bodies. “It was then that we realized we had come upon a metabolic switch to do what the ketogenic diet does to the brain without any actual dietary therapy,” said Giménez-Cassina, who went on to show that these same BAD modifications protect against seizures in experimental models of epilepsy. Still, it wasn’t clear exactly how.

Yellen suspected the solution involved potassium ion channels. While sodium and calcium ion channels tend to excite cells, including neurons, potassium channels tend to suppress cell electrical activity. His lab had previously linked ketone bodies to the activation of ATP-sensitive potassium (KATP) channels in neurons. Yellen had hypothesized that the ketogenic diet worked because ketone bodies provide neurons enough fuel for normal function, but when the electrical and energy storm of an epileptic seizure threatens, the activated KATP channels can shut the storm down. But the effects of diets are broad and complex, so it was impossible to say for sure.

The effects that Danial’s lab had discovered—BAD’s ability to alter metabolism and seizures—offered a new avenue for studying the therapeutic effects of altered metabolism. Together, the researchers decided to investigate whether Danial’s switch governed Yellen’s pathway, and whether they could reverse engineer the seizure protection of a ketogenic diet.

They could. Working in genetically altered mice, the researchers modified the BAD protein to reduce glucose metabolism and increase ketone body metabolism in the brain. Seizures decreased, but the benefit was erased when they knocked out the KATP channel—strong evidence that a BAD-KATP pathway conferred resistance to epileptic seizures. Further experiments suggested that it was indeed BAD’s role in metabolism, not cell death, that mattered. The findings make the BAD protein a promising target for new epilepsy drugs.

"Diet sounds like this wholesome way to treat seizures, but it’s very hard. I mean, diets in general are hard, and this diet is really hard," said Yellen, whose wife’s Center for Dietary Therapy in Epilepsy hosts a candy-free Halloween party for its many patients on the ketogenic diet. “So finding a pharmacological substitute for this would make lots of people really happy.”

Provided by Harvard Medical School

Source: medicalxpress.com

May 23, 2012 · 10 notes
#science #neuroscience #brain #psychology #epilepsy
Treating pain with transplants

May 23, 2012

A new study finds that transplanting embryonic cells into adult mouse spinal cord can alleviate persistent pain. The research, published by Cell Press in the May 24th issue of the journal Neuron, suggests that reduced pain results from successful integration of the embryonic cells into the host spinal cord. The findings open avenues for clinical strategies aimed not just at treating the symptoms of chronic debilitating pain, but correcting the underlying disease pathology.

There are two major classes of chronic pain: inflammatory pain that results from injury to tissue, such as muscle and bone, and neuropathic pain from injury to nerves, for example, in the limbs or face. Damage to nerves can occur after physical trauma and from chemotherapy drugs. With neuropathic pain, pain occurs in the absence of stimulation, and there is hypersensitivity and exacerbated pain in response to stimuli that would not normally cause pain. Neuropathic pain is thought to involve the loss of neurons that release GABA, an inhibitory neurotransmitter that controls the excitability of neurons, including those that transmit pain information.

"Pharmacological approaches to managing neuropathic pain enhance GABA-mediated inhibition. However, some patients do not respond to these therapies and there are significant adverse side effects," explains senior study author, Dr. Allan Basbaum from the University of California, San Francisco. "Therefore, new therapeutic approaches for neuropathic pain are essential." Dr. Basbaum and colleagues explored whether replacement of the damaged inhibitory neurons might be useful for reducing neuropathic pain.

The researchers transplanted immature GABA neurons from mouse fetal brain into the spinal cord of mice with nerve injury-induced pain, a model for human neuropathic pain. The transplanted cells not only survived, but made connections with appropriate targets and integrated into the host spinal cord circuitry. This resulted in an almost complete reversal of the mechanical hypersensitivity generated in a nerve injury model of neuropathic pain. In contrast, the transplant procedure was not effective at reducing pain in a mouse model of inflammatory pain, which is induced by tissue injury.

Taken together, the findings have exciting implications for a cell-based treatment of neuropathic pain in humans. “Our strategy not only ameliorates the symptoms of neuropathic pain but, importantly, is also potentially disease modifying,” concludes Dr. Basbaum. “It is worth considering whether transplants such as these might have clinical utility in humans, a great advantage being that the adverse side effects associated with drug administration can be avoided.”

Provided by Cell Press

Source: medicalxpress.com

May 23, 2012 · 4 notes
#science #neuroscience #psychology #pain #brain
Dementia patients reveal how we construct a picture of the future

May 23, 2012

(Medical Xpress) — Our ability to imagine and plan our future depends on brain regions that store general knowledge, new research shows.

Dr. Muireann Irish from Neuroscience Research Australia (NeuRA) found that dementia patients who can no longer recall general knowledge – for example, the names of famous people or popular songs – are also unable to imagine themselves in the future.

"We already know that if memory of past events is compromised, as is the case in Alzheimer’s disease, then the ability to imagine future scenarios is also impaired,” says Dr. Irish.

"We have now discovered that damage to parts of the brain that store knowledge of facts and meanings can also produce the same effect," she says.

Thinking about the future is an important ability because it helps us to plan and anticipate the consequences of our actions.

"For example, a person with dementia who may leave the oven on, partly because they forget the appropriate action, but also because they cannot project forward in time to anticipate the dangerous consequences this might have," says Dr. Irish.

Dr. Irish and colleagues used MRI to study people with Alzheimer’s disease, in whom memories of past experiences are lost, as well as patients with semantic dementia, who have lost the ability to remember facts (semantic memory) but have little problem remembering past experiences.

Surprisingly, she found that the semantic dementia group was as impaired as the Alzheimer’s group when imagining future events, even though their memory of past experiences was relatively intact.

"This is an important finding, as it points to multiple regions in the brain that are responsible for our ability to imagine and plan for the future,” she says.

Provided by Neuroscience Research Australia

Source: medicalxpress.com

May 23, 2012 · 10 notes
#science #neuroscience #brain #psychology
Discoveries Into Perception Via Popular Magic Tricks

ScienceDaily (May 22, 2012) — Researchers at Barrow Neurological Institute at St. Joseph’s Hospital and Medical Center have unveiled how and why the public perceives some magic tricks in recent studies that could have real-world implications in military tactics, marketing and sports.

image

A professional magician believed that if he moved his hand in a straight line while performing a trick the audience would focus on the beginning and end points of the motion, but not in between. In contrast, he believed if he moved his hand in a curved motion the audience would follow his hand’s trajectory from beginning to end. (Credit: © luzitanija / Fotolia)

Susana Martinez-Conde, PhD, of Barrow’s Laboratory of Visual Neuroscience, and Stephen Macknik, PhD, of Barrow’s Laboratory of Behavioral Neurophysiology, are well known for their research into magic and illusions. Their most recent original research projects, published in Frontiers in Human Neuroscience, offer additional insight into perception and cognition.

One of the studies was initiated by professional magician Apollo Robbins, who believed that audience members directed their attention differently depending on the type of hand motion used. Robbins believed that if he moved his hand in a straight line while performing a trick the audience would focus on the beginning and end points of the motion, but not in between. In contrast, he believed if he moved his hand in a curved motion the audience would follow his hand’s trajectory from beginning to end.

By studying the eye movements of individuals as they watched Robbins perform, Barrow researchers confirmed Robbins’ theory. Perhaps more importantly, they also found that the different types of hand motion triggered two different types of eye movement. The researchers discovered that curved motion engaged smooth pursuit eye movements (in which the eye follows a moving object smoothly), whereas straight motion led to saccadic eye movements (in which the eye jumps from one point of interest to another).

"Not only is this discovery important for magicians, but the knowledge that curved motion attracts attention differently from straight motion could have wide-reaching implications — for example, in predator-prey evasion techniques in the natural world, military tactics, sports strategies and marketing," says Martinez-Conde. This finding is believed to be the first discovery in the neuroscientific literature initiated by a magician, rather than a scientist.

In another study, the researchers worked with professional magician Mac King to investigate magicians’ use of social cues — like the position of their gaze — to misdirect observers.

They studied a popular coin-vanishing trick, in which King tosses a coin up and down in his right hand before “tossing” it to his left hand, where it subsequently disappears. In reality, the magician only simulates tossing the coin to the left hand, an implied motion that essentially tricks the neurons into responding as they would have if the coin had actually been thrown.

The Barrow researchers discovered that social misdirection does not always help magic. By presenting two different videos of King — one in which the audience could see his face and another in which his face was hidden — they found that social misdirection did not play a role in this particular trick.

"We wondered if the observer’s perception of magic was going to be different if they could see the magician’s head and eye position. To our surprise, it didn’t matter," says Martinez-Conde. "This indicates that social misdirection in magic is more complicated than previously believed, and not necessary for the perception of all magic tricks."

Source: Science Daily

May 23, 2012 · 4 notes
#science #neuroscience #brain #psychology #perception
Neuron-Nourishing Cells Appear to Retaliate in Alzheimer's

ScienceDaily (May 22, 2012) — When brain cells start oozing too much of the amyloid protein that is the hallmark of Alzheimer’s disease, the astrocytes that normally nourish and protect them deliver a suicide package instead, researchers report.

image

Drs. Michael Dinkins (from left), Guanghu Wang and Erhard Bieberich. (Credit: Image courtesy of Georgia Health Sciences University)

Amyloid is excreted by all neurons, but rates increase with aging and dramatically accelerate in Alzheimer’s. Astrocytes, which deliver blood, oxygen and nutrients to neurons in addition to hauling off some of their garbage, get activated and inflamed by excessive amyloid.

Now researchers have shown another way astrocytes respond is by packaging the lipid ceramide with the protein PAR-4, which independently can do damage but together are a more “deadly duo,” said Dr. Erhard Bieberich, biochemist at the Medical College of Georgia at Georgia Health Sciences University.

"If the neuron makes something toxic and dumps it at your door, what would you do?" said Bieberich, corresponding author of the study published in the Journal of Biological Chemistry. “You would probably do something to defend yourself.”

The researchers hypothesize that this lipid-coated package ultimately kills them both, which could help explain the brain-cell death and shrinkage that occurs in Alzheimer’s. “If the astrocytes die, the neurons die,” Bieberich said, noting studies suggest that excess amyloid alone does not kill brain cells. “There must be a secondary process toxifying the amyloid; otherwise the neuron would self-intoxicate before it made a big plaque,” he said. “The neuron would die first.”

One of many avenues for future pursuit is whether a ceramide antibody could be a viable Alzheimer’s treatment. In the researchers’ studies of brain cells of humans with Alzheimer’s as well as an animal model of the disease, antibodies to ceramide and Par-4 prevented astrocytes’ amyloid-induced death.

Ceramide and Par-4 get packaged in lipid-coated vesicles called exosomes; all cells secrete thousands of these vesicles but scientists are only beginning to understand their normal function. When exosomes become deadly, they are called apoxosomes.

Ceramide and Par-4 are typically not found together in a vesicle but rather reside in two distinct parts of the cell. Ceramide appears to take the lead in bringing the two together when confronted with amyloid. Bieberich and colleagues at the University of Georgia reported in 2003 that the deadly duo helps eliminate duplicate brain cells that arise early in brain development, when their survival could result in a malformed brain. They suspected then that the duo might also have a role in Alzheimer’s.

Risk factors for Alzheimer’s include aging, family history and genetics, according to the Alzheimer’s Association. Increasing evidence suggests that Alzheimer’s also shares many of the same risk factors for cardiovascular disease, such as high cholesterol, high blood pressure and inactivity.

Source: Science Daily

May 22, 2012 · 5 notes
#science #neuroscience #brain #psychology #alzheimer
Learning and memory: The role of neo-neurons revealed

May 22, 2012

(Medical Xpress) — Researchers at the Institut Pasteur and the CNRS have recently identified in mice the role played by neo-neurons formed in the adult brain. By using selective stimulation, the researchers were able to show that these neo-neurons increase the ability to learn and memorize difficult cognitive tasks. This newly discovered capacity of neo-neurons to assimilate complex information could open up new avenues in the treatment of some neurodegenerative diseases. The study is available online on the Nature Neuroscience journal’s website.

image

Section of a mouse brain observed using a fluorescence microscope. The green filaments represent neo-neurons in an organized network. Credit: Institut Pasteur

The discovery that new neurons could be formed in the adult brain created quite a stir in 2003 by debunking the age-old belief that a person is born with a set number of neurons and that any loss of neurons is irreversible. This discovery was all the more incredible considering that the function of these new neurons remained undetermined. That is, until today.

Using mouse models, the team working under Pierre-Marie Lledo, head of the Laboratory for Perception and Memory (Institut Pasteur/CNRS), recently revealed the role of these neo-neurons formed in the adult brain with respect to learning and memory. With the help of an experimental approach using optogenetics, developed by this same team and published in December 2010, the researchers were able to show that when stimulated by a brief flash of light these neo-neurons facilitate both learning and the memorization of complex tasks. As a result, the mice memorized information presented during the learning activity more quickly and remembered exercises even 50 days after experimentation had ended. The study also shows that neo-neurons generated just after birth hold no added advantage for either learning or memory. In this respect, it is only the neurons produced by the adult brain that have any considerable significance.

“This study shows that the activity of just a few neurons produced in the adult brain can still have considerable effects on cognitive processes and behavior. Moreover, this work helps to illustrate how the brain assimilates new stimulations, since the electrical activity we mimicked using flashes of light is normally produced within the brain’s attention centers,” explains the study’s director, Pierre-Marie Lledo.

Beyond simply discovering the functional contribution of these neo-neurons, the study has also reaffirmed the clear link between “mood” (defined here by a specific pattern of stimulation) and cerebral activity. It has been shown that curiosity, attentiveness and pleasure all promote the formation of neo-neurons and consequently the acquisition of new cognitive abilities. Conversely, a state of depression is detrimental to the production of new neurons and triggers a vicious cycle which prolongs this state of despondency. These results, and the optogenetics technologies that enabled this study, may prove very useful for devising therapeutic protocols which aim to counter the development of neurologic or psychiatric diseases.

Provided by CNRS

Source: medicalxpress.com

May 22, 2012 · 19 notes
#science #neuroscience #brain #psychology #memory