Neuroscience

Articles and news from the latest research reports.



Noninvasive brain stimulation shown to impact walking patterns

June 1, 2012

In a step towards improving rehabilitation for patients with walking impairments, researchers from the Kennedy Krieger Institute found that non-invasive stimulation of the cerebellum, an area of the brain known to be essential in adaptive learning, helped healthy individuals learn a new walking pattern more rapidly. The findings suggest that cerebellar transcranial direct current stimulation (tDCS) may be a valuable therapy tool to aid people relearning how to walk following a stroke or other brain injury.

Previous studies in the lab of Amy Bastian, PhD, PT, director of the Motion Analysis Laboratory at Kennedy Krieger Institute, have shown that the cerebellum, a part of the brain involved in movement coordination, is essential for walking adaptation. In this new study, Dr. Bastian and her colleagues explored the impact of stimulation over the cerebellum on adaptive learning of a new walking pattern. Specifically, her team tested how anodal (positive), cathodal (negative) or sham (no) stimulation affected this learning process.

"We’ve known that the cerebellum is essential to adaptive learning mechanisms like reaching, walking, balance and eye movements,” says Dr. Bastian. “In this study, we wanted to examine the effects of direct stimulation of the cerebellum on locomotor learning utilizing a split-belt treadmill that separately controls the legs.”

The study, published today in the Journal of Neurophysiology, found that by placing electrodes on the scalp over the cerebellum and applying very low levels of current, the rate of walking adaptation could be increased or decreased. Dr. Bastian’s team studied 53 healthy adults in a series of split-belt treadmill walking tests. Rather than a single belt, a split-belt treadmill consists of two belts that can move at different speeds. During split-belt walking, one leg is set to move faster than the other. This initially disrupts coordination between the legs so the user is not walking symmetrically; over time, however, the user learns to adapt to the disturbance.

The main experiment consisted of a two-minute baseline period of walking with both belts at the same slow speed, followed by a 15-minute period with the belts at two separate speeds. While people were on the treadmill, researchers stimulated one side of the cerebellum to assess the impact on the rate of re-adjustment to a symmetric walking pattern.
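The adaptation process described above can be pictured with a toy model: step-length asymmetry is large when the belts first split, then decays toward symmetry as the walker learns. The sketch below is purely illustrative; the exponential form, rates, and stimulation multipliers are assumptions for demonstration, not values measured in the study. It only shows how a change in learning rate would appear in such a curve.

```python
import math

def adaptation_curve(n_strides, base_rate=0.05, stim_factor=1.0, initial_error=1.0):
    """Toy exponential model of split-belt adaptation.

    Step-length asymmetry decays toward zero as the walker adapts.
    stim_factor > 1 mimics faster learning (as reported for anodal tDCS);
    stim_factor < 1 mimics slower learning (cathodal). All numbers are
    illustrative only, not fit to the study's data.
    """
    rate = base_rate * stim_factor
    return [initial_error * math.exp(-rate * i) for i in range(n_strides)]

sham = adaptation_curve(300)
anodal = adaptation_curve(300, stim_factor=1.5)
cathodal = adaptation_curve(300, stim_factor=0.5)
# In this toy model, the anodal curve reaches any given symmetry
# threshold in fewer strides than sham, and the cathodal curve in more.
```
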

Dr. Bastian’s team found not only that cerebellar tDCS can change the rate of cerebellum-dependent locomotor learning, but specifically that anodal stimulation speeds up learning and cathodal stimulation slows it down. It was also surprising that the side of the cerebellum that was stimulated mattered; only stimulation of the side that controls the leg walking on the faster treadmill belt changed the adaptation rate.

"It is important to demonstrate that we can make learning faster or slower, as it suggests that we are not merely interfering with brain function," says Dr. Bastian. "Our findings also suggest that tDCS can be selectively used to assess and understand motor learning."

The results from this study present an exciting opportunity to test cerebellar tDCS as a rehabilitation tool. Dr. Bastian says, “If anodal tDCS prompts faster learning, this may help reduce the amount of time needed for stroke patients to relearn to walk evenly. It may also be possible to use tDCS to help sustain gains made in therapy, so patients can retain and practice improved walking patterns for a longer period of time. We are currently testing these ideas in individuals who have had a stroke.”

Provided by Kennedy Krieger Institute

Source: medicalxpress.com

Filed under science neuroscience brain psychology


Flies With Restless Legs Syndrome Point to a Genetic Cause

ScienceDaily (May 31, 2012) — When flies are made to lose a gene with links to Restless Legs Syndrome (RLS), they suffer the same sleep disturbances and restlessness that human patients do. The findings reported online on May 31 in Current Biology, a Cell Press publication, strongly suggest a genetic basis for RLS, a condition in which patients complain of an irresistible urge to move that gets worse as they try to rest.

"Although widely prevalent, RLS is a disorder whose pathophysiological basis remains very poorly understood," said Subhabrata Sanyal of Emory University School of Medicine. "The major significance of our study is to highlight the fact that there might be a genetic basis for RLS. Understanding the function of these genes also helps to understand and diagnose the disease and may offer more focused therapeutic options that are currently limited to very general approaches."

Sanyal’s team recognized that a number of genome-wide association studies in humans had suggested connections between RLS and variation in a single gene (BTBD9).

"BTBD9 function or its relationship to RLS and sleep were a complete mystery," Sanyal said.

His team realized that there might be a way to shed some light on that mystery in fruit flies. Flies have a single, highly conserved version of the human BTBD9. They decided to test whether the gene that had turned up in those human studies would have any effect on sleep in the insects. In fact, flies need sleep just like humans do, and their sleep patterns are influenced by the same kinds of brain chemistry.

The researchers now report that flies lacking their version of the RLS-associated gene lose sleep and move about more, just as human patients do. When those flies were treated with a drug used to treat RLS, their sleep improved.

The studies also yielded evidence about how the RLS gene works by controlling dopamine levels in the brain as well as iron balance in cells. Sanyal said his team will continue to explore other RLS-related genes that have been identified in human studies in search of more details of their interaction and function.

"Our results support the idea that genetic regulation of dopamine and iron metabolism constitute the core pathophysiology of at least some forms of RLS," the researchers write.

More broadly, they say, the study emphasizes the utility of simple animals such as fruit flies in unraveling the genetics of sleep and sleep disorders.

Source: Science Daily

Filed under neuroscience psychology science RLS genes


Walking and Running Again After Spinal Cord Injury

ScienceDaily (May 31, 2012) — Rats with spinal cord injuries and severe paralysis are now walking (and running) thanks to researchers at EPFL. Published in the June 1, 2012 issue of Science, the results show that a severed section of the spinal cord can make a comeback when its own innate intelligence and regenerative capacity are awakened. The study, begun five years ago at the University of Zurich, points to a profound change in our understanding of the central nervous system. According to lead author Grégoire Courtine, it is as yet unclear whether similar rehabilitation techniques could work for humans, but the observed nerve growth hints at new methods for treating paralysis.

Test subject takes first steps up stairs after neurorehabilitation with a combination of robotic harness and electrical-chemical stimulation. (Credit: EPFL/Grégoire Courtine)

"After a couple of weeks of neurorehabilitation with a combination of a robotic harness and electrical-chemical stimulation, our rats are not only voluntarily initiating a walking gait, but they are soon sprinting, climbing up stairs and avoiding obstacles when stimulated," explains Courtine, who holds the International Paraplegic Foundation (IRP) Chair in Spinal Cord Repair at EPFL.

Waking up the spinal cord

It is well known that the brain and spinal cord can adapt and recover from moderate injury, a quality known as neuroplasticity. But until now the spinal cord expressed so little plasticity after severe injury that recovery was impossible. Courtine’s research proves that, under certain conditions, plasticity and recovery can take place in these severe cases — but only if the dormant spinal column is first woken up.

To do this, Courtine and his team injected a chemical solution of monoamine agonists into the rats. These chemicals trigger cell responses by binding to specific dopamine, adrenaline, and serotonin receptors located on the spinal neurons. This cocktail replaces neurotransmitters released by brainstem pathways in healthy subjects and acts to excite neurons and ready them to coordinate lower body movement when the time is right.

Five to 10 minutes after the injection, the scientists electrically stimulated the spinal cord with electrodes implanted in the outermost layer of the spinal canal, called the epidural space. “This localized epidural stimulation sends continuous electrical signals through nerve fibers to the chemically excited neurons that control leg movement. All that was left was to initiate that movement,” explains Rubia van den Brand, contributing author to the study.

The innate intelligence of the spinal column

In 2009, Courtine had already reported on restoring movement, albeit involuntary. He discovered that a stimulated rat spinal column — physically isolated from the brain from the lesion down — developed in a surprising way: It started taking over the task of modulating leg movement, allowing previously paralyzed animals to walk on a treadmill. These experiments revealed that the movement of the treadmill created sensory feedback that initiated walking — the innate intelligence of the spinal column took over, and walking essentially occurred without any input from the rat’s actual brain. This surprised the researchers and led them to believe that only a very weak signal from the brain was needed for the animals to initiate movement of their own volition.

To test this theory, Courtine replaced the treadmill with a device that vertically supported the rats; this mechanical harness did not facilitate forward movement and only came into play when the animals lost balance, giving them the impression of having a healthy, working spinal column. This encouraged the rats to will themselves toward a chocolate reward on the other end of the platform. “What they deemed willpower-based training translated into a fourfold increase in nerve fibers throughout the brain and spine — a regrowth that proves the tremendous potential for neuroplasticity even after severe central nervous system injury,” says Janine Heutschi, co-author of the study.

First human rehabilitation on the horizon

Courtine calls this regrowth “new ontogeny,” a sort of duplication of an infant’s growth phase. The researchers found that the newly formed fibers bypassed the original spinal lesion and allowed signals from the brain to reach the electrochemically-awakened spine. And the signal was sufficiently strong to initiate movement over ground — without the treadmill — meaning the rats began to walk voluntarily towards the reward, entirely supporting their own weight with their hind legs.

"This is the World Cup of neurorehabilitation," exclaims Courtine. "Our rats have become athletes when just weeks before they were completely paralyzed. I am talking about 100% recuperation of voluntary movement."

In principle, the radical reaction of the rat spinal cord to treatment offers reason to believe that people with spinal cord injury will soon have some options on the horizon. Courtine is optimistic that human, phase-two trials will begin in a year or two at Balgrist University Hospital Spinal Cord Injury Centre in Zurich, Switzerland. Meanwhile, researchers at EPFL are coordinating a nine-million-euro project called NeuWalk that aims to design a fully operative spinal neuroprosthetic system, much like the one used here with rats, for implantation in humans.

Source: Science Daily

Filed under science neuroscience CNS psychology


Alzheimer’s Protein Structure Suggests New Treatment Directions

ScienceDaily (May 31, 2012) — The molecular structure of a protein involved in Alzheimer’s disease — and the surprising discovery that it binds cholesterol — could lead to new therapeutics for the disease, Vanderbilt University investigators report in the June 1 issue of the journal Science.

Vanderbilt Center for Structural Biology investigators determined the structure of the C99 protein (shown in green and blue), which participates in triggering Alzheimer’s disease. Their discovery that C99 binds to cholesterol (shown in black, white and red) suggests a mechanism for cholesterol’s recognized role in promoting the memory-robbing disease and may lead to new therapeutics. (Credit: Charles Sanders and colleagues/Vanderbilt University)

Charles Sanders, Ph.D., professor of Biochemistry, and colleagues in the Center for Structural Biology determined the structure of part of the amyloid precursor protein (APP) — the source of amyloid-beta, which is believed to trigger Alzheimer’s disease. Amyloid-beta clumps together into oligomers that kill neurons, causing dementia and memory loss. The amyloid-beta oligomers eventually form plaques in the brain — one of the hallmarks of the disease.

"Anything that lowers amyloid-beta production should help prevent, or possibly treat, Alzheimer’s disease," Sanders said.

Amyloid-beta production requires two “cuts” of the APP protein. The first cut, by the enzyme beta-secretase, generates the C99 protein, which is then cut by gamma-secretase to release amyloid-beta. The Vanderbilt researchers used nuclear magnetic resonance and electron paramagnetic resonance spectroscopy to determine the structure of C99, which has one membrane-spanning region.

They were surprised to discover what appeared to be a “binding” domain in the protein. Based on previously reported evidence that cholesterol promotes Alzheimer’s disease, they suspected that cholesterol might be the binding partner. The researchers used a model membrane system called “bicelles” (that Sanders developed as a postdoctoral fellow) to demonstrate that C99 binds cholesterol.

"It has long been thought that cholesterol somehow promotes Alzheimer’s disease, but the mechanisms haven’t been clear," Sanders said. "Cholesterol binding to APP and its C99 fragment is probably one of the ways it makes the disease more likely."

Sanders and his team propose that cholesterol binding moves APP to special regions of the cell membrane called “lipid rafts,” which contain “cliques of molecules that like to hang out together,” he said.

Beta- and gamma-secretase are part of the lipid raft clique.

"We think that when APP doesn’t have cholesterol around, it doesn’t care what part of the membrane it’s in," Sanders said. "But when it binds cholesterol, that drives it to lipid rafts, where these ‘bad’ secretases are waiting to clip it and produce amyloid-beta."

The findings suggest a new therapeutic strategy to reduce amyloid-beta production, he said.

"If you could develop a drug that blocks cholesterol from binding to APP, then you would keep the protein from going to lipid rafts. Instead it would be cleaved by alpha-secretase — a ‘good’ secretase that isn’t in rafts and doesn’t generate amyloid-beta."

Drugs that inhibit beta- or gamma-secretase — to directly limit amyloid-beta production — have been developed and tested, but they have toxic side effects. A drug that blocks cholesterol binding to APP may be more specific and effective in reducing amyloid-beta levels and in preventing, or treating, Alzheimer’s disease.

The C99 structure had some other interesting details, Sanders said.

The membrane domain of C99 is curved, which was unexpected but fits perfectly into the predicted active site of gamma-secretase. Also, a certain sequence of amino acids (GXXXG) that usually promotes membrane protein dimerization (two of the same proteins interacting with each other) turned out to be central to the cholesterol-binding domain. This is a completely new function for GXXXG motifs, Sanders said.

"This revealing new information on the structure of the amyloid precursor protein and its interaction with cholesterol is a perfect example of the power of team science," said Janna Wehrle, Ph.D., who oversees grants focused on the biophysical properties of proteins at the National Institutes of Health’s National Institute of General Medical Sciences (NIGMS), which partially funded the work. "The researchers at Vanderbilt brought together biological and medical insight, cutting-edge physical techniques and powerful instruments, each providing a valuable tool for piecing together the puzzle."

"When we were developing bicelles 20 years ago, no one was saying, ‘someday these things are going to lead to discoveries in Alzheimer’s disease,’" Sanders said. "It was interesting basic science research that is now paying off."

Source: Science Daily

Filed under science neuroscience brain psychology alzheimer


Memory Training Unlikely to Help in Treating ADHD, Boosting IQ

ScienceDaily (May 31, 2012) — Working memory training is unlikely to be an effective treatment for children suffering from disorders such as attention-deficit/hyperactivity or dyslexia, according to a research analysis published by the American Psychological Association. In addition, memory training tasks appear to have limited effect on healthy adults and children looking to do better in school or improve their cognitive skills.

"The success of working memory training programs is often based on the idea that you can train your brain to perform better, using repetitive memory trials, much like lifting weights builds muscle mass," said the study’s lead author, Monica Melby-Lervåg, PhD, of the University of Oslo. "However, this analysis shows that simply loading up the brain with training exercises will not lead to better performance outside of the tasks presented within these tests." The article was published online in Developmental Psychology.

Working memory enables people to complete tasks at hand by allowing the brain to retain pertinent information temporarily. Working memory enhancing tasks usually involve trying to get people to remember information presented to them while they are performing distracting activities. For example, participants may be presented with a series of numbers one at a time on a computer screen. The computer presents a new digit and then prompts participants to recall the number immediately preceding. More difficult versions might ask participants to recall what number appeared two, three or four digits ago.
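The recall task described here can be sketched in a few lines of code. The function below generates prompt/answer pairs for an n-back style digit task; the function name and structure are hypothetical, chosen only to illustrate the idea, and are not taken from any actual training program.

```python
import random

def nback_prompts(sequence, n):
    """Prompt/answer pairs for an n-back style recall task.

    Given the digits shown so far, the correct answer at position i is
    the digit presented n steps earlier. Returns (position, target)
    pairs a testing program would check responses against.
    Illustrative sketch only; real training software adapts difficulty.
    """
    return [(i, sequence[i - n]) for i in range(n, len(sequence))]

random.seed(0)
digits = [random.randint(0, 9) for _ in range(10)]  # the presented digits
prompts = nback_prompts(digits, 2)  # "what appeared two digits ago?"
```

Harder versions of the task simply increase `n`, shrinking the window of usable rehearsal time while the distracting stream continues.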

In this meta-analysis, researchers from the University of Oslo and University College London examined 23 peer-reviewed studies, yielding 30 different group comparisons, that met their criteria. The studies were randomized controlled trials or experiments that included some form of working memory training and a control group. The studies comprised a wide range of participants, including young children, children with cognitive impairments such as ADHD, and healthy adults. Most of the studies had been published within the last 10 years.

Overall, working memory training improved performance on tasks related to the training itself but did not have an impact on more general cognitive performance such as verbal skills, attention, reading or arithmetic. “In other words, the training may help you improve your short-term memory when it’s related to the task implemented in training but it won’t improve reading difficulties or help you pay more attention in school,” said Melby-Lervåg.
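A meta-analysis of this kind combines per-study effect sizes into a pooled estimate, weighting more precise studies more heavily. The sketch below shows standard inverse-variance (fixed-effect) pooling; the effect sizes and variances are invented for illustration and are not the values reported by Melby-Lervåg and colleagues.

```python
# Illustrative sketch of the pooling step at the heart of a meta-analysis.
# All numbers below are made up, not taken from the published analysis.

def pooled_effect(effects, variances):
    """Inverse-variance weighted mean effect size across studies."""
    weights = [1.0 / v for v in variances]
    return sum(w * e for w, e in zip(weights, effects)) / sum(weights)

# Trained-task ("near transfer") effects tend to pool large, while
# effects on untrained skills like reading ("far transfer") pool near zero.
near = pooled_effect([0.8, 0.6, 0.9], [0.04, 0.05, 0.03])
far = pooled_effect([0.05, 0.10, 0.00], [0.04, 0.05, 0.03])
```

The contrast between the two pooled values is the pattern the authors report: gains on the trained tasks themselves, but little generalization beyond them.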

In recent years, several commercial, computer-based working memory training programs have been developed and purport to benefit students suffering from ADHD, dyslexia, language disorders, poor academic performance or other issues. Some even claim to boost people’s IQs. These programs are widely used around the world in schools and clinics, and most involve tasks in which participants are given many memory tests that are designed to be challenging, the study said.

"In the light of such evidence, it seems very difficult to justify the use of working memory training programs in relation to the treatment of reading and language disorders," said Melby-Lervåg. "Our findings also cast strong doubt on claims that working memory training is effective in improving cognitive ability and scholastic attainment."

Source: Science Daily

Filed under science neuroscience brain psychology memory


Fantasizing About Your Dream Vacation Could Lead to Poor Decision-Making

ScienceDaily (May 31, 2012) — Summer vacation time is upon us. If you have been saving up for your dream vacation for years, you may want to make sure your dream spot is still the best place to go. A new study has found that when we fantasize about such trips before they are possible, we tend to overlook the negatives — thus influencing our decision-making down the line.

(Credit: © XtravaganT / Fotolia)

"We were interested in the effects of positive fantasies — what happens when people imagine an idealized, best-case-scenario version of the future, compared to when they imagine a less idealized version," says Heather Barry Kappes of New York University, co-author of the study published online this week in Personality and Social Psychology Bulletin. “This is one of the first papers to examine selective information acquisition at this early stage, before people are seriously considering a possibility.”

Say, for example, that you would like to take a trip to Australia this year but think you are very unlikely to do so — you have no more vacation time left, cannot afford it, or would rather save up for a new car. But you still daydream about how nice it would be to see the Australian Outback and lie on the white sand beaches, perhaps without thinking about the long plane ride there or the poisonous animals. Those daydreams, Kappes says, have powerful effects.

To test those effects, Kappes and co-author Gabriele Oettingen asked people to imagine a particular future about one of three topics: wearing glamorous high-heeled shoes, making money in the stock market, or taking a vacation. To induce positive fantasies for each topic, the study participants were prompted to think about how great it would be to do each activity. In the control condition, participants also imagined experiencing the future, but were prompted to think about the negatives as well, with questions like “Would it really be so great?” In both conditions, participants wrote down what they were thinking, for the researchers to ensure they were engaged in the imagery.

After that exercise, the researchers offered the participants a choice of different types of information. For example, participants could browse a website describing the positive and negative health consequences of wearing high heels, and researchers noted how much more time they spent reading about positive versus negative consequences. Or, they could choose which of five (fictitious) tripadvisor.com reviews they wanted to read, and researchers recorded whether they chose one that was more pro-trip (i.e., five stars) or con-trip (i.e., one star).

Kappes’ team found that for each topic, imagining the idealized version made people prefer to learn about the pros rather than the cons of the future event. “These effects are pronounced when people are not seriously considering pursuing a given future,” Kappes says.

The work has important implications for even the most deliberate of decision-makers. “When people are seriously considering implementing a decision like taking a trip, they often engage in careful deliberations about the pros versus cons,” Kappes says. “Our work suggests that before getting to this point, positive fantasies might lead people to acquire biased information — to learn more about the pros rather than the cons. Thus, even if people deliberate very carefully on the information they’ve acquired, they could still make poor decisions.”

People need to be aware of these effects to ensure that they acquire balanced information before it is time to make a decision, she says. The study also contributes to a larger body of research about the powerful consequences of mental imagery — and shows that positive thinking may not always be best. “Although there are benefits to imagining a positive future, there are also drawbacks, and it’s important to recognize them in order to most effectively pursue our goals.”

Source: Science Daily

Filed under science neuroscience psychology brain


The Special Scent of Age: Body Odor Gives Away Age

ScienceDaily (May 30, 2012) — New findings from the Monell Center reveal that humans can identify the age of other humans based on differences in body odor. Much of this ability is based on the capacity to identify odors of elderly individuals, and contrary to popular supposition, the so-called ‘old-person smell’ is rated as less intense and less unpleasant than body odors of middle-aged and young individuals.

Baby-smell. Humans can identify the age of other humans based on differences in body odor. (Credit: © S.Kobold / Fotolia)

"Similar to other animals, humans can extract signals from body odors that allow us to identify biological age, avoid sick individuals, pick a suitable partner, and distinguish kin from non-kin," said senior author Johan Lundström, a sensory neuroscientist at Monell.

Like non-human animals, human body odors contain a rich array of chemical components that can transmit various types of social information. The perceptual characteristics of these odors are reported to change across the lifespan, as are concentrations of the underlying chemicals.

Scientists theorize that age-related odors may help animals select suitable mates: older males might be desirable because they contribute genes that enable offspring to live longer, while older females might be avoided because their reproductive systems are more fragile.

In humans, a unique ‘old person smell’ is recognized across cultures. The phenomenon is so widely acknowledged in Japan that there is a special word for the odor, kareishū.

Because studies with non-human animals at Monell and other institutions have demonstrated the ability to identify age via body odor, Lundström’s team examined whether humans are able to do the same.

In the study, published in the journal PLoS ONE, body odors were collected from three age groups, with 12-16 individuals in each group: Young (20-30 years old), Middle-age (45-55), and Old-age (75-95). Each donor slept for five nights in unscented t-shirts containing underarm pads, which were then cut into quadrants and placed in glass jars.

Odors were assessed by 41 young (20-30 years old) evaluators, who were presented with pairs of body odor jars in nine different combinations and asked to identify which jar came from the older donor. Evaluators also rated the intensity and pleasantness of each odor. Finally, evaluators were asked to estimate the donor’s age for each odor sample.

Evaluators were able to discriminate the three donor age categories based on odor cues. Statistical analyses revealed that odors from the old-age group were driving the ability to differentiate age. Interestingly, evaluators rated body odors from the old-age group as less intense and less unpleasant than odors from the other two age groups.

"Elderly people have a discernible underarm odor that younger people consider to be fairly neutral and not very unpleasant," said Lundström. "This was surprising given the popular conception of old age odor as disagreeable. However, it is possible that other sources of body odors, such as skin or breath, may have different qualities."

Future studies will both attempt to identify the underlying biomarkers that evaluators use to identify age-related odors and also determine how the brain is able to identify and evaluate this information.

Source: Science Daily

Filed under science neuroscience brain psychology


Despite Less Play, Children’s Use of Imagination Increases Over Two Decades

ScienceDaily (May 30, 2012) — Children today may be busier than ever, but Case Western Reserve University psychologists have found that their imagination hasn’t suffered — in fact, it appears to have increased.

(Credit: © BeTa-Artworks / Fotolia)

Psychologists Jessica Dillon and Sandra Russ expected the opposite outcome when they analyzed 14 play studies that Russ conducted between 1985 and 2008.

But as they report in “Changes in Children’s Play Over Two Decades,” an article in the Creativity Research Journal, the data told a story contrary to common assumptions. First, children’s use of imagination in play and their overall comfort and engagement with play activities actually increased over time. In addition, the results suggested that children today expressed fewer negative feelings in play. Finally, their capacity to express a wide range of positive emotions, to tell stories and to organize thoughts stayed consistent.

Dillon, a fifth-year doctoral student, and Russ, a professor in psychological sciences at Case Western Reserve, decided to revisit the play data after a 2007 report from the American Academy of Pediatrics showed children played less.

They set out to see if having less time for unstructured play affected the processes in play that influence cognition and emotional development, a focus of the play research.

The pretend play studies focused on children between the ages of 6 and 10. The children’s play was measured for comfort, imagination, the range and amount of positive to negative emotions used and expressed, and the quality of storytelling by using Russ’ Affect in Play Scale (APS).

The APS is a five-minute, unstructured play session. Children are asked to play freely with three wooden blocks and two human hand puppets. The play is videotaped, and later reviewed and scored for imagination, expression of emotions, actions and storytelling.

Russ explains that children who exhibit good play skills with imaginative and emotional play situations have shown better skills at coping, creativity and problem solving. She stresses there is no link between being a good player and intelligence.

The APS data provided a consistent measurement and research structure over the 23-year period. Russ said the consistency of having the same tool to measure play provided this unique opportunity to track changes in play.

"We were surprised that outside of imagination and comfort, play was consistent over time," said Dillon.

Russ did voice concern about the decrease in displayed negative emotions and actions. “Past studies have linked negative emotions in play with creativity,” she said.

But even with the lack of time to play, Russ said, children, like some other forms of higher mammals, have a drive to play and always will find ways to do it.

As new stimuli, like video games and the Internet, have crept into everyday life, Russ explains that children might gain cognitive skills from using technology where they once got them from acting out situations in play. Skills might also develop from daydreaming.

Russ said future research will need to focus on whether acting out emotions and creating stories in play is as important as it once was in helping children to be creative.

Even though children have less time these days for play, Russ still advises giving children time for it, adding that it helps children develop emotional and cognitive abilities.

Video: Studying imagination in children’s play

Source: Science Daily

Filed under science neuroscience brain psychology

2 notes

Could Sarcastic Computers Be in Our Future? New Math Model Can Help Computers Understand Inference

ScienceDaily (May 30, 2012) — In a new paper, the researchers describe a mathematical model they created that helps predict pragmatic reasoning and may eventually lead to the manufacture of machines that can better understand inference, context and social rules.

Noah Goodman, right, and Michael Frank, both assistant professors of psychology, discuss their research at the white board that covers the wall in Goodman’s office. (Credit: L.A. Cicero)

Language is so much more than a string of words. To understand what someone means, you need context.

Consider the phrase, “Man on first.” It doesn’t make much sense unless you’re at a baseball game. Or imagine a sign outside a children’s boutique that reads, “Baby sale — One week only!” You easily infer from the situation that the store isn’t selling babies but advertising bargains on gear for them.

Present these widely quoted scenarios to a computer, however, and there would likely be a communication breakdown. Computers aren’t very good at pragmatics — how language is used in social situations.

But a pair of Stanford psychologists has taken the first steps toward changing that.

In a new paper published recently in the journal Science, Assistant Professors Michael Frank and Noah Goodman describe a quantitative theory of pragmatics that promises to help open the door to more human-like computer systems, ones that use language as flexibly as we do.

The mathematical model they created helps predict pragmatic reasoning and may eventually lead to the manufacture of machines that can better understand inference, context and social rules. The work could help researchers understand language better and treat people with language disorders.

It also could make speaking to a computerized customer service attendant a little less frustrating.

"If you’ve ever called an airline, you know the computer voice recognizes words but it doesn’t necessarily understand what you mean," Frank said. "That’s the key feature of human language. In some sense it’s all about what the other person is trying to tell you, not what they’re actually saying."

Frank and Goodman’s work is part of a broader trend to try to understand language using mathematical tools. That trend has led to technologies like Siri, the iPhone’s speech recognition personal assistant.

But turning speech and language into numbers has its obstacles, mainly the difficulty of formalizing notions such as “common knowledge” or “informativeness.”

That is what Frank and Goodman sought to address.

The researchers recruited 745 participants for an online experiment. The participants saw a set of objects and were asked to bet which one was being referred to by a particular word.

For example, one group of participants saw a blue square, a blue circle and a red square. The question for that group was: Imagine you are talking to someone and you want to refer to the middle object. Which word would you use, “blue” or “circle”?

The other group was asked: Imagine someone is talking to you and uses the word “blue” to refer to one of these objects. Which object are they talking about?

"We modeled how a listener understands a speaker and how a speaker decides what to say," Goodman explained.

The results allowed Frank and Goodman to create a mathematical equation to predict human behavior and determine the likelihood of referring to a particular object.
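The equation itself isn't printed in this article, but the style of model it describes is a Bayesian one: a listener reasons about which object a speaker most likely meant, given that the speaker prefers more informative (more specific) words. As an illustration only, here is a minimal Python sketch of that kind of reasoning applied to the blue-square/blue-circle/red-square example above; the function names and the uniform-prior assumption are ours, not taken from the published paper:

```python
# A sketch of rational-speaker pragmatic reasoning for the example above:
# three objects, and a listener who hears the word "blue".

objects = {
    "blue square": {"blue", "square"},
    "blue circle": {"blue", "circle"},
    "red square": {"red", "square"},
}

def extension(word):
    """All objects the word literally applies to."""
    return [o for o, feats in objects.items() if word in feats]

def speaker(word, obj):
    """P(word | object): among words true of the object, prefer
    more specific words (those that pick out fewer objects)."""
    true_words = objects[obj]
    weights = {w: 1.0 / len(extension(w)) for w in true_words}
    return weights.get(word, 0.0) / sum(weights.values())

def listener(word):
    """P(object | word) by Bayes' rule, assuming a uniform prior."""
    scores = {o: speaker(word, o) for o in objects}
    total = sum(scores.values())
    return {o: s / total for o, s in scores.items()}

# "circle" is true only of the blue circle, so a speaker who meant the
# blue circle would probably have said "circle"; hearing "blue", the
# listener therefore leans toward the blue square (0.6 vs. 0.4).
print(listener("blue"))
```

The point of the sketch is the direction of inference: the listener's bet depends not just on what the word means literally, but on what the speaker *would have said* had they meant something else.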

"Before, you couldn’t take these informal theories of linguistics and put them into a computer. Now we’re starting to be able to do that," Goodman said.

The researchers are already applying the model to studies on hyperbole, sarcasm and other aspects of language.

"It will take years of work but the dream is of a computer that really is thinking about what you want and what you mean rather than just what you said," Frank said.

Source: Science Daily

Filed under science neuroscience brain psychology

9 notes

Genes Predict If Medication Can Help You Quit Smoking

ScienceDaily (May 30, 2012) — The same gene variations that make it difficult to stop smoking also increase the likelihood that heavy smokers will respond to nicotine-replacement therapy and drugs that thwart cravings, a new study shows.

High-risk genetic variations can increase the risk for nicotine dependence, but the same gene variants predict a more robust response to anti-smoking medications. (Credit: Li-Shiun Chen)

The research, led by investigators at Washington University School of Medicine in St. Louis, will appear online May 30 in the American Journal of Psychiatry.

The study suggests it may one day be possible to predict which patients are most likely to benefit from drug treatments for nicotine addiction.

"Smokers whose genetic makeup puts them at the greatest risk for heavy smoking, nicotine addiction and problems kicking the habit also appear to be the same people who respond most robustly to pharmacologic therapy for smoking cessation," says senior investigator Laura Jean Bierut, MD, professor of psychiatry. "Our research suggests that a person’s genetic makeup can help us better predict who is most likely to respond to drug therapy so we can make sure those individuals are treated with medication in addition to counseling or other interventions."

For the new study, the researchers analyzed data from more than 5,000 smokers who participated in community-based studies and more than 1,000 smokers in a clinical treatment study. The scientists focused on the relationship between the participants' ability to quit smoking successfully and genetic variations that have been associated with risk for heavy smoking and nicotine dependence.

"People with the high-risk genetic markers smoked an average of two years longer than those without these high-risk genes, and they were less likely to quit smoking without medication," says first author Li-Shiun Chen, MD, assistant professor of psychiatry at Washington University. "The same gene variants can predict a person’s response to smoking-cessation medication, and those with the high-risk genes are more likely to respond to the medication."

In the clinical treatment trial, individuals with the high-risk variants were three times more likely to respond to drug therapy, such as nicotine gum, nicotine patches, the antidepressant bupropion and other drugs used to help people quit.

Tobacco use is the leading cause of preventable illness and death in the United States and a major public health problem worldwide. Cigarette smoking contributes to the deaths of an estimated 443,000 Americans each year. Although lung cancer is the leading cause of smoking-related cancer death among both men and women, tobacco also contributes to other lung problems, many other cancers and heart attacks.

Bierut and Chen say that the gene variations they studied are not the only ones involved in whether a person smokes, becomes addicted to nicotine or has difficulty quitting. But they contend that because the same genes can predict both heavy smoking and enhanced response to drug treatment, the genetic variants are important to the addiction puzzle.

"It’s almost like we have a ‘corner piece’ here," Bierut says. "It’s a key piece of the puzzle, and now we can build on it. Clearly these genes aren’t the entire story — other genes play a role, and environmental factors also are important. But we’ve identified a group that’s responding to pharmacologic treatment and a group that’s not responding, and that’s a key step in improving, and eventually tailoring, treatments to help people quit smoking."

Since people without the risky genetic variants aren’t as likely to respond to drugs, Bierut says they should get counseling or other non-drug therapies.

"This is an actionable genetic finding," Chen says. "Scientific journals publish genetic findings every day, but this one is actionable because treatment could be based on a person’s genetic makeup. I think this study is moving us closer to personalized medicine, which is where we want to go."

And Bierut says that although earlier studies suggested the genes had only a modest influence on smoking and addiction, the new clinical findings indicate the genetic variations are having a big effect on treatment response.

"These variants make a very modest contribution to the development of nicotine addiction, but they have a much greater effect on the response to treatment. That’s a huge finding," she says.

Source: Science Daily

Filed under science neuroscience brain psychology genes
