Neuroscience

Articles and news from the latest research reports.

Posts tagged neuroscience

17 notes

Risk-Sensitivity in Bayesian Sensorimotor Integration
Information processing in the nervous system during sensorimotor tasks with inherent uncertainty has been shown to be consistent with Bayesian integration. Bayes optimal decision-makers are, however, risk-neutral in the sense that they weigh all possibilities based on prior expectation and sensory evidence when they choose the action with highest expected value. In contrast, risk-sensitive decision-makers are sensitive to model uncertainty and bias their decision-making processes when they do inference over unobserved variables. In particular, they allow deviations from their probabilistic model in cases where this model makes imprecise predictions. Here we test for risk-sensitivity in a sensorimotor integration task where subjects exhibit Bayesian information integration when they infer the position of a target from noisy sensory feedback. When introducing a cost associated with subjects’ response, we found that subjects exhibited a characteristic bias towards low cost responses when their uncertainty was high. This result is in accordance with risk-sensitive decision-making processes that allow for deviations from Bayes optimal decision-making in the face of uncertainty. Our results suggest that both Bayesian integration and risk-sensitivity are important factors to understand sensorimotor integration in a quantitative fashion.
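As a rough illustration of the contrast the abstract draws, here is a minimal numerical sketch of precision-weighted (Bayes-optimal) integration and a risk-sensitive variant. The parameter values and the form of the uncertainty-dependent bias are illustrative assumptions, not the paper's fitted model:

```python
# Risk-neutral vs. risk-sensitive estimation of a target position
# (all numbers are made up for illustration).

prior_mean, prior_var = 0.0, 4.0   # prior belief about target position
obs, obs_var = 2.0, 1.0            # noisy sensory feedback

# Bayes-optimal (risk-neutral) estimate: weight prior and evidence
# by their precisions (inverse variances)
w = (1 / obs_var) / (1 / obs_var + 1 / prior_var)
post_mean = w * obs + (1 - w) * prior_mean      # posterior mean
post_var = 1 / (1 / obs_var + 1 / prior_var)    # posterior variance

# Risk-sensitive estimate: bias the response toward a low-cost
# position, with the size of the bias growing with posterior
# uncertainty (theta > 0 means risk-averse)
low_cost_pos, theta = -1.0, 0.5
biased_mean = post_mean + theta * post_var * (low_cost_pos - post_mean)
```

With these numbers the risk-neutral estimate is 1.6, while the biased estimate shifts toward the cheap response; shrinking `obs_var` (more reliable feedback) shrinks the bias, matching the finding that the low-cost bias appears when uncertainty is high.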


Filed under brain bayesian integration nervous system sensorimotor system decision-making neuroscience psychology science

425 notes


When Your Eyes Tell Your Hands What to Think: You’re Far Less in Control of Your Brain Than You Think
You’ve probably never given much thought to the fact that picking up your cup of morning coffee presents your brain with a set of complex decisions. You need to decide how to aim your hand, grasp the handle and raise the cup to your mouth, all without spilling the contents on your lap.
A new Northwestern University study shows that, not only does your brain handle such complex decisions for you, it also hides information from you about how those decisions are made.
"Our study gives a salient example," said Yangqing ‘Lucie’ Xu, lead author of the study and a doctoral candidate in psychology at Northwestern. "When you pick up an object, your brain automatically decides how to control your muscles based on what your eyes provide about the object’s shape. When you pick up a mug by the handle with your right hand, you need to add a clockwise twist to your grip to compensate for the extra weight that you see on the left side of the mug.
"We showed that the use of this visual information is so powerful and automatic that we cannot turn it off. When people see an object weighted in one direction, they actually can’t help but ‘feel’ the weight in that direction, even when they know that we’re tricking them," Xu said.


Filed under brain decision-making neuroscience psychology vision perception science

33 notes


The Marketplace in Your Brain: Neuroscientists have found brain cells that compute value. Why are economists ignoring them?
In 2003, amid the coastal greenery of the Winnetu Oceanside Resort, on Martha’s Vineyard, a group of about 20 scholars gathered to kick-start a new discipline. They fell, broadly, into two groups: neuroscientists and economists. What they came to talk about was a collaboration between the two fields, which a few researchers had started to call “neuroeconomics.” Insights about brain anatomy, combined with economic models of neurons in action, could produce new insights into how people make decisions about money and life.
A photo taken during one of those sun-dappled days captures the group posed and smiling around a giant chess set on the resort lawn. Pawns were about two feet tall, kings and queens about four feet. Informally, the neuroscientists began to play the black pieces. The economists began to play white.
Today, nearly a decade later, a few black pawns have moved down the board. But the white pieces have stayed put. “I would say that neuroeconomics is about 90 percent neuroscience and 10 percent economists,” says Colin F. Camerer, a professor of behavioral finance and economics at the California Institute of Technology and one of the prime movers in the new field. “We’ve taken a lot of mathematical models from economics to help describe what we see happening in the brain. But economists have been a lot slower to use any of our ideas.”



Filed under brain brain cells neuroscience neuroeconomics decision-making psychology science

38 notes


Scientists announce new treatment for type II diabetes
According to the World Health Organization, there are currently 347 million diabetics worldwide, with 90 percent of those people having type II diabetes specifically. It occurs when fat accumulates in places such as muscles, blood vessels and the heart, causing the cells in those areas to no longer be sufficiently responsive to insulin. This insulin resistance, in turn, causes blood glucose levels to rise to dangerous levels. Ultimately, it can result in things such as heart disease, strokes, blindness, kidney failure, and amputations. Fortunately, however, an international team of scientists has just announced a new way of treating the disease.
Currently, one of the main ways of treating type II diabetes involves switching the patient to a healthier diet and increasing the amount of exercise they get – the disease is most often caused by obesity. Additionally, oral medication can be used to increase insulin production and the body’s sensitivity to it, or to decrease glucose production. For approximately 30 percent of patients, however, such medication ceases to be effective after a few years, and they end up having to receive regular insulin injections.
The new treatment focuses on VEGF-B, a protein within the body that affects how fat is transported and stored. Using an antibody/drug known as 2H10, the scientists were able to block the signaling of VEGF-B in mice and rats, which subsequently kept fat from accumulating in the “wrong” areas of the animals – namely their muscles, blood vessels and hearts.


Filed under diabetes type II diabetes insulin glucose VEGF-B protein neuroscience science

38 notes

Deaf girl fitted with bionic ear speaks her first word
Evie was born profoundly deaf but it was not until she was 16 months old that tests revealed she had no hearing nerves, meaning an auditory brainstem implant - or bionic ear - was her only chance of ever hearing.
The 23-month-old has Oculo-Auriculo-Vertebral Syndrome (OAV), a very rare condition with no known cause, which affects the eyes, ears and spine.


Filed under OAV bionic ear deafness hearing implants neuroscience auditory brainstem implantation science

32 notes


Robots paint hotel guests’ sleep patterns
Global hotel chain Ibis is transforming the nightly tosses and turns of its guests into works of modern art, painted by robots.
"Our masterpiece is to make your sleep a true work of art," the promotional video gushes, after putting a far more interesting question to viewers: "What does sleep look like?" To find out, the budget chain is installing thin grids covered in 80 heat, pressure and sound sensors on mattresses in select guestrooms, kicking off on 13 October in Paris. Data gathered by the sensors will be transmitted wirelessly throughout the night to the studio, where an algorithm converts information on a guest’s movement, sound and temperature into colour and motion.
This video shows the robot, much like an assembly line arm, reacting in sequence, tracing acrylic paints onto a black canvas in a visual and physical interpretation of sleep cycles and patterns.
Only 40 participants can take part — anyone who wants to try it out can enter a competition on the Ibis Facebook page. When the project wraps up in November, there will be an online gallery of the artworks and guests will get an original to take home.
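The sensor-to-paint pipeline described above might be sketched as follows. Ibis has not published its algorithm, so the function name, value ranges and formulas here are purely hypothetical:

```python
# Hypothetical mapping from one mattress-sensor sample to paint
# parameters; all ranges and formulas are illustrative assumptions.
def sample_to_stroke(movement, sound_db, temp_c):
    """Map one sensor sample to a (hue, stroke_length) pair.

    movement: 0..N activity units; sound_db: sound level in dB;
    temp_c: skin/mattress temperature in Celsius.
    """
    # Temperature sets the colour: cool sleep -> 0.0, warm sleep -> 1.0
    hue = min(max((temp_c - 15.0) / 25.0, 0.0), 1.0)
    # Restless, noisy moments produce longer brush strokes
    stroke_length = movement * (1.0 + sound_db / 100.0)
    return hue, stroke_length
```

The robot arm would then trace one stroke per sample, so a restless night yields a busier canvas than a calm one.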


Filed under brain sleep sleep patterns robots art neuroscience robotics technology science

22 notes

Making headway on beta-blockers and sleep

Researchers at Brigham and Women’s Hospital have found that melatonin supplementation significantly improved sleep in hypertensive patients taking beta-blockers

Over 20 million people in the United States take beta-blockers, a medication commonly prescribed for cardiovascular issues, anxiety, hypertension and more. Many of these same people also have trouble sleeping, a side effect possibly related to the fact that these medications suppress night-time melatonin production. Researchers at Brigham and Women’s Hospital (BWH) have found that melatonin supplementation significantly improved sleep in hypertensive patients taking beta-blockers.

The study will be electronically published on September 28, 2012 and will be published in the October print issue of SLEEP (Title: A mechanism for upper airway stability during slow wave sleep).

"Beta-blockers have long been associated with sleep disturbances, yet until now, there have been no clinical studies that tested whether melatonin supplementation can improve sleep in these patients," explained Frank Scheer, PhD, MSc, an associate neuroscientist at BWH, and principal investigator on this study. "We found that melatonin supplements significantly improved sleep."

The research team analyzed 16 hypertensive patients who regularly took beta-blockers as treatment for their hypertension. The study participants were given either a melatonin supplement or a placebo to take each night before bed. To avoid bias, neither the participants nor the researchers knew which pill each person was taking. During the three-week study, the participants spent two separate four-day visits in the lab. There, the researchers assessed the participants’ sleep patterns and found a 37-minute increase in the amount of sleep in the participants who received the melatonin supplement compared to those who received the placebo. They also found an eight percent improvement in sleep efficiency and a 41-minute increase in the time spent in Stage 2 sleep, without a decrease in slow-wave sleep or REM sleep.

"Over the course of three weeks, none of the study participants taking the melatonin showed any of the adverse effects that are often observed with other, classic sleep aids. There were also no signs of ‘rebound insomnia’ after the participants stopped taking the drug," explained Scheer, who is also an assistant professor of Medicine at Harvard Medical School. "In fact, melatonin had a positive carry-over effect on sleep even after the participants had stopped taking the drug."

The researchers caution that while this data is promising for hypertensive patients taking beta-blockers, more research is needed to determine whether patients taking beta-blockers for causes other than hypertension could also benefit from melatonin supplementation.

(Source: eurekalert.org)

Filed under brain sleep melatonin beta-blockers neuroscience science

41 notes

New research published in Psychological Science, a journal of the Association for Psychological Science, examines the nuanced relationship between language and different types of perception.
Bilingual Infants Can Tell Unfamiliar Languages Apart
Speaking more than one language can improve our ability to control our behavior and focus our attention, recent research has shown. But are there any advantages for bilingual children before they can speak in full sentences? We know that bilingual children can tell if a person is speaking one of their native languages or the other, even when there is no sound, by watching the speaker’s mouth for visual cues. But Núria Sebastián-Gallés of Universitat Pompeu Fabra and colleagues wanted to know whether bilingual infants could also do this with two unfamiliar languages. They studied 8-month-old infants, half of whom lived in either Spanish- or Catalan-speaking households and half of whom lived in Spanish-Catalan bilingual households. The researchers looked at whether the infants could discriminate between English and French, two unfamiliar languages, using only visual cues. They found that the bilingual infants could tell the difference between the two languages, while the infants who lived in single-language households could not. These findings suggest that infants who are immersed in bilingual environments are more sensitive to the differences in visual cues associated with the sounds of various languages.
Lead author: Núria Sebastián-Gallés
Skilled Deaf Readers Have an Enhanced Perceptual Span in Reading
Though people born deaf are better able to use information from peripheral vision than those who can hear, they have a harder time learning to read. Researchers have proposed that the extra information coming in could distract from, rather than enhance, the process of reading. But no research has actually compared visual attention in reading between hearing and deaf readers.  In a new study, Nathalie Bélanger of the University of California, San Diego and colleagues investigated this issue by measuring the perceptual span, or the number of letter spaces used when reading, of skilled deaf readers, less-skilled deaf readers, and hearing readers. The experimenters manipulated the number of letter spaces that the participants saw while reading text on a screen. They found that, compared to the other two groups, skilled deaf readers read fastest when they were given the largest number of letter spaces, showing that they had the largest perceptual span. Regardless, they were able to read just as fast as skilled hearing readers. Contrary to previous hypotheses, these findings suggest that enhanced visual attention and perceptual span are not the cause of reading difficulties common among deaf individuals.
Lead author: Nathalie N. Bélanger
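The letter-space manipulation described above is usually implemented as a gaze-contingent "moving window": everything outside a window around the current fixation is masked. Here is a simplified, static sketch of the idea (the actual paradigm updates the window in real time from an eye tracker, and masking details vary by study):

```python
def moving_window(text, fixation, span):
    """Mask letters outside the perceptual span with 'x'.

    fixation: index of the currently fixated character;
    span: number of character positions visible on each side.
    Spaces are left intact so word boundaries stay visible,
    a common simplification of the paradigm.
    """
    out = []
    for i, ch in enumerate(text):
        if abs(i - fixation) <= span or ch == " ":
            out.append(ch)
        else:
            out.append("x")
    return "".join(out)
```

Varying `span` across trials and measuring where reading speed stops improving gives an estimate of a reader's perceptual span, which is how the study could compare skilled deaf, less-skilled deaf, and hearing readers.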


Filed under brain language auditory perception deafness psychology neuroscience science

20 notes

Detection of Appearing and Disappearing Objects in Complex Acoustic Scenes
The ability to detect sudden changes in the environment is critical for survival. Hearing is hypothesized to play a major role in this process by serving as an “early warning device,” rapidly directing attention to new events. Here, we investigate listeners’ sensitivity to changes in complex acoustic scenes—what makes certain events “pop-out” and grab attention while others remain unnoticed? We use artificial “scenes” populated by multiple pure-tone components, each with a unique frequency and amplitude modulation rate. Importantly, these scenes lack semantic attributes, which may have confounded previous studies, thus allowing us to probe low-level processes involved in auditory change perception. Our results reveal a striking difference between “appear” and “disappear” events. Listeners are remarkably tuned to object appearance: change detection and identification performance are at ceiling; response times are short, with little effect of scene-size, suggesting a pop-out process. In contrast, listeners have difficulty detecting disappearing objects, even in small scenes: performance rapidly deteriorates with growing scene-size; response times are slow, and even when change is detected, the changed component is rarely successfully identified. We also measured change detection performance when a noise or silent gap was inserted at the time of change or when the scene was interrupted by a distractor that occurred at the time of change but did not mask any scene elements. Gaps adversely affected the processing of item appearance but not disappearance. However, distractors reduced both appearance and disappearance detection. Together, our results suggest a role for neural adaptation and sensitivity to transients in the process of auditory change detection, similar to what has been demonstrated for visual change detection. 
Importantly, listeners consistently performed better for item addition (relative to deletion) across all scene interruptions used, suggesting a robust perceptual representation of item appearance.
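The artificial scenes the abstract describes can be approximated as a sum of amplitude-modulated pure tones, one per "object". The parameter ranges below are illustrative assumptions, not the study's exact values:

```python
import numpy as np

def make_scene(n_objects, dur=2.0, sr=16000, seed=0):
    """Build an artificial acoustic 'scene': a sum of pure tones,
    each with its own carrier frequency and amplitude-modulation
    rate, as in the stimuli described in the abstract."""
    rng = np.random.default_rng(seed)
    t = np.arange(int(dur * sr)) / sr
    freqs = rng.uniform(200, 4000, n_objects)   # one carrier per object
    am_rates = rng.uniform(2, 12, n_objects)    # unique AM rate per object
    scene = sum(
        (0.5 + 0.5 * np.sin(2 * np.pi * r * t)) * np.sin(2 * np.pi * f * t)
        for f, r in zip(freqs, am_rates)
    )
    return scene / n_objects                    # keep amplitude bounded

scene = make_scene(4)
```

An "appear" event would add one such component partway through the scene, and a "disappear" event would silence one; the study's finding is that listeners detect the former far more readily than the latter.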


Filed under brain hearing auditory perception perception attention psychology neuroscience science

105 notes

New research methods reveal that babies and young children learn by rationally testing hypotheses, analyzing statistics and doing experiments much as scientists do
Very young children’s learning and thinking is strikingly similar to much learning and thinking in science, according to Alison Gopnik, professor of psychology and affiliate professor of philosophy at the University of California, Berkeley. Gopnik’s findings are described in the Sept 28 issue of the journal Science. She spoke about her work in a video briefing with NSF.


Filed under brain children's play cognition learning neuroscience psychology scientific thinking science
