Neuroscience

Articles and news from the latest research reports.

Posts tagged perception

47 notes

What You Hear Could Depend on What Your Hands are Doing

A new finding could lead to strategies for treating speech loss after a stroke and helping children with dyslexia.

New research links motor skills and perception, and it does so by way of a second finding: a new understanding of what the left and right brain hemispheres “hear.” Georgetown University Medical Center researchers say these findings may eventually point to strategies to help stroke patients recover their language abilities, and to improve speech recognition in children with dyslexia.

The study, presented at Neuroscience 2012, the annual meeting of the Society for Neuroscience, is the first to match human behavior with left brain/right brain auditory processing tasks. Before this research, neuroimaging tests had hinted at differences in such processing.

“Language is processed mainly in the left hemisphere, and some have suggested that this is because the left hemisphere specializes in analyzing very rapidly changing sounds,” says the study’s senior investigator, Peter E. Turkeltaub, M.D., Ph.D., a neurologist in the Center for Brain Plasticity and Recovery. This newly created center is a joint program of Georgetown University and MedStar National Rehabilitation Network.

Turkeltaub and his team hid rapidly and slowly changing sounds in background noise and asked 24 volunteers to simply indicate whether they heard the sounds by pressing a button.

“We asked the subjects to respond to sounds hidden in background noise,” Turkeltaub explained. “Each subject was told to use his or her right hand to respond during the first 20 sounds, then the left hand for the next 20 sounds, then right, then left, and so on.”
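The alternating-block procedure Turkeltaub describes can be sketched in code. The 20-trials-per-hand schedule is from the article; everything else here (function names, the shape of the trial data, the tallying) is an illustrative reconstruction, not the team's actual analysis.

```python
# Sketch of the alternating-hand design: 20 sounds per hand, right
# hand first, then alternating blocks for the rest of the session.
from collections import defaultdict

def response_hand(trial, block_size=20):
    """Hand used on a given 0-indexed trial: right for the first
    block of 20 sounds, left for the next 20, and so on."""
    return "right" if (trial // block_size) % 2 == 0 else "left"

def detection_rates(trials):
    """trials: iterable of (hand, sound_type, detected) tuples.
    Returns the detection rate for each (hand, sound_type) condition,
    which is the comparison the study turns on."""
    hits, totals = defaultdict(int), defaultdict(int)
    for hand, sound_type, detected in trials:
        totals[(hand, sound_type)] += 1
        hits[(hand, sound_type)] += int(detected)
    return {cond: hits[cond] / totals[cond] for cond in totals}

print(response_hand(0), response_hand(19), response_hand(20))  # right right left
```

The study's key result is then simply that the rate for ("right", "fast") exceeds ("left", "fast"), and the reverse for slowly changing sounds.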

He says that when subjects responded with the right hand, they detected the rapidly changing sounds more often than when they used the left hand, and vice versa for the slowly changing sounds.

“Since the left hemisphere controls the right hand and vice versa, these results demonstrate that the two hemispheres specialize in different kinds of sounds—the left hemisphere likes rapidly changing sounds, such as consonants, and the right hemisphere likes slowly changing sounds, such as syllables or intonation,” Turkeltaub explains.

“These results also demonstrate the interaction between motor systems and perception. It’s really pretty amazing. Imagine you’re waving an American flag while listening to one of the presidential candidates. The speech will actually sound slightly different to you depending on whether the flag is in your left hand or your right hand.”

Ultimately, Turkeltaub hopes that understanding the basic organization of auditory systems and how they interact with motor systems will help explain why language resides in the left hemisphere of the brain, and will lead to new treatments for language disorders, like aphasia (language difficulties after stroke or brain injury) or dyslexia.

“If we can understand the basic brain organization for audition, this might ultimately lead to new treatments for people who have speech recognition problems due to stroke or other brain injury. Understanding better the specific roles of the two hemispheres in auditory processing will be a big step in that direction. If we find that people with aphasia, who typically have injuries to the left hemisphere, have difficulty recognizing speech because of problems with low-level auditory perception of rapidly changing sounds, maybe training the specific auditory processing deficits will improve their ability to recognize speech,” Turkeltaub concludes.

(Source: explore.georgetown.edu)

Filed under brain language motor skills stroke neuroscience psychology perception science

26 notes

Reducing visual clutter may help Alzheimer’s patients

It’s a finding that could help Alzheimer’s patients better cope with their condition.

Psychologists at the University of Toronto and the Georgia Institute of Technology (Georgia Tech) have shown that the inability to recognize once-familiar faces and objects may have as much to do with difficulty perceiving their distinct features as it does with the capacity to recall from memory.

A study published in the October issue of Hippocampus suggests that memory impairments for people diagnosed with early stage Alzheimer’s disease may in part be due to problems with determining the differences between similar objects.

The research contributes to growing evidence that a part of the brain once believed to support memory exclusively – the medial temporal lobe – also plays a role in object perception.

Filed under alzheimer alzheimer's disease memory perception object perception neuroscience psychology science

143 notes

The Marshmallow Study Revisited

For the past four decades, the “marshmallow test” has served as a classic experimental measure of children’s self-control: will a preschooler eat one of the fluffy white confections now or hold out for two later?

Now a new study demonstrates that being able to delay gratification is influenced as much by the environment as by innate ability. Children who experienced reliable interactions immediately before the marshmallow task waited on average four times longer—12 versus three minutes—than youngsters in similar but unreliable situations. [Video]

"Our results definitely temper the popular perception that marshmallow-like tasks are very powerful diagnostics for self-control capacity," says Celeste Kidd, a doctoral candidate in brain and cognitive sciences at the University of Rochester and lead author on the study to be published online October 11 in the journal Cognition.

Filed under brain self-control children marshmallow study marshmallow test perception psychology neuroscience science

88 notes

Is the afterlife full of fluffy clouds and angels?

What does the neuroscientist Colin Blakemore make of an American neurosurgeon’s account of the afterlife?

Have you ever noticed that more people come back from Heaven than from Hell? We have all read those astonishing reports of near-death experiences (NDEs, as the aficionados call them) – the things that people say have happened to them when they almost, but don’t quite, shuffle off the coil.

They are nearly always pleasant and deeply reassuring in a saccharine-soaked way. Lots of spinning down warm, dark tunnels to the sound of celestial music; lots of trips along country lanes lined with hedges, towards the light of a welcoming cottage at the end of the road; lots of tumbling down Alice-in-Wonderland rabbit holes, but without the damaging effects of gravity.

True, Dr Maurice S Rawlings Jr, MD, heart surgeon in Chattanooga, Tennessee, and author of To Hell and Back, did have patients who reported very nasty NDEs after they came back on his operating table. Booming noises; licking flames and all that Mephistophelian stuff. But perhaps that tells us more about the challenges of living in Chattanooga, Tennessee, than about the metaphysics of life after death.

Predictably, the amazingly consistent, remarkably heaven-like experiences recounted by the majority of NDE-ers (yes, that really is what the experts call them) have been summarily dismissed by materialist sceptics – like me. Of course the brain does funny things when it’s running out of oxygen. The odd perceptions are just the consequences of confused activity in the temporal lobes.

But NDEs have taken on a new cloak of respectability with a book by a Harvard doctor. Proof of Heaven, by Eben Alexander, will make your toes wiggle or curl, depending on your prejudices. What’s special about his account of being dead is that he’s a neurosurgeon. At least that’s what the publicity is telling us. It’s a cover story in Newsweek magazine, with a screaming headline: “Heaven is Real: a doctor’s account of the afterlife”.

Read more …

Filed under near-death experiences metaphysics life death neuroscience brain perception afterlife science

285 notes


How Do Blind People Picture Reality?

Paul Gabias has never seen a table. He was born prematurely and went blind shortly thereafter, most likely because of overexposure to oxygen in his incubator. And yet, Gabias, 60, has no trouble perceiving the table next to him. “My image of the table is exactly the same as a table,” he said. “It has height, depth, width, texture; I can picture the whole thing all at once. It just has no color.”

If you have trouble constructing a mental picture of a table that has no color — not even black or white — that’s probably because you’re blinded by your ability to see. Sighted people visualize the surrounding world by detecting borders between areas rich in different wavelengths of light, which we see as different colors. Gabias, like many blind people, builds pictures using his sense of touch, and by listening to the echoes of clicks of his tongue and taps of his cane as these sounds bounce off objects in his surroundings, a technique called echolocation.
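The timing cue that makes echolocation work is easy to put in numbers: an echo's round-trip delay encodes an object's distance. The sketch below is a back-of-the-envelope illustration; the 343 m/s speed of sound is a standard room-temperature value and an assumption on my part, as the article gives no figures.

```python
# Round-trip echo delay for a tongue click or cane tap: the sound
# travels out to the object and back, so the path length is doubled.
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def echo_delay_ms(distance_m):
    """Delay, in milliseconds, between emitting a click and hearing
    its echo from an object distance_m metres away."""
    return 2 * distance_m / SPEED_OF_SOUND * 1000

print(round(echo_delay_ms(2.0), 1))  # an object 2 m away → ~11.7 ms
```

Delays this short sit near the limits of auditory temporal resolution, which is part of what makes practiced echolocators' skill so striking.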

Read more

Filed under brain vision blindness reality mental representation perception neuroscience psychology science

85 notes

Whether you like someone can affect how your brain processes their actions, according to new research from the Brain and Creativity Institute at the USC Dornsife College of Letters, Arts and Sciences.

Most of the time, watching someone else move causes a “mirroring” effect — that is, the parts of our brains responsible for motor skills are activated by watching someone else in action.

But a study by USC researchers appearing in PLOS ONE shows that whether you like the person you’re watching can actually have an effect on brain activity related to motor actions and lead to “differential processing” — for example, thinking the person you dislike is moving more slowly than they actually are.

“We address the basic question of whether social factors influence our perception of simple actions,” said Lisa Aziz-Zadeh, assistant professor with the Brain and Creativity Institute and the Division of Occupational Science. “These results indicate that an abstract sense of group membership, and not only differences in physical appearance, can affect basic sensory-motor processing.”

Filed under brain brain activity motor actions mirroring effect perception neuroscience psychology science

203 notes

What is reality?

WHEN you woke up this morning, you found the world largely as you left it. You were still you; the room in which you awoke was the same one you went to sleep in. The outside world had not been rearranged. History was unchanged and the future remained unknowable. In other words, you woke up to reality. But what is reality? The more we probe it, the harder it becomes to comprehend. In the eight articles on this page we take a tour of our fundamental understanding of the world around us, starting with an attempt to define reality and ending with the idea that whatever reality is, it isn’t what it seems. Hold on to your hats.

Filed under brain perception reality consciousness neuroscience psychology science

425 notes


When Your Eyes Tell Your Hands What to Think: You’re Far Less in Control of Your Brain Than You Think

You’ve probably never given much thought to the fact that picking up your cup of morning coffee presents your brain with a set of complex decisions. You need to decide how to aim your hand, grasp the handle and raise the cup to your mouth, all without spilling the contents on your lap.

A new Northwestern University study shows that, not only does your brain handle such complex decisions for you, it also hides information from you about how those decisions are made.

"Our study gives a salient example," said Yangqing ‘Lucie’ Xu, lead author of the study and a doctoral candidate in psychology at Northwestern. "When you pick up an object, your brain automatically decides how to control your muscles based on what your eyes provide about the object’s shape. When you pick up a mug by the handle with your right hand, you need to add a clockwise twist to your grip to compensate for the extra weight that you see on the left side of the mug.

"We showed that the use of this visual information is so powerful and automatic that we cannot turn it off. When people see an object weighted in one direction, they actually can’t help but ‘feel’ the weight in that direction, even when they know that we’re tricking them," Xu said.
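The "clockwise twist" in Xu's mug example is ordinary torque arithmetic: mass acting at a horizontal distance from the grip axis must be balanced by an equal and opposite twist from the wrist. The numbers below are illustrative assumptions, not values from the study.

```python
# Torque the grip must supply to keep a mug level when its centre of
# mass sits off to one side of the handle (torque = m * g * lever arm).
G = 9.81  # gravitational acceleration, m/s^2

def counter_torque(mass_kg, lever_arm_m):
    """Torque (N·m) the wrist must apply to cancel the mug's tilt."""
    return mass_kg * G * lever_arm_m

# A 0.3 kg mug whose centre of mass sits 4 cm left of the grip axis
print(round(counter_torque(0.3, 0.04), 3))  # → 0.118
```

The study's point is that the brain estimates this correction from vision alone, before the hand has felt anything.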

Filed under brain decision-making neuroscience psychology vision perception science

20 notes

Detection of Appearing and Disappearing Objects in Complex Acoustic Scenes

The ability to detect sudden changes in the environment is critical for survival. Hearing is hypothesized to play a major role in this process by serving as an “early warning device,” rapidly directing attention to new events. Here, we investigate listeners’ sensitivity to changes in complex acoustic scenes—what makes certain events “pop-out” and grab attention while others remain unnoticed? We use artificial “scenes” populated by multiple pure-tone components, each with a unique frequency and amplitude modulation rate. Importantly, these scenes lack semantic attributes, which may have confounded previous studies, thus allowing us to probe low-level processes involved in auditory change perception. Our results reveal a striking difference between “appear” and “disappear” events. Listeners are remarkably tuned to object appearance: change detection and identification performance are at ceiling; response times are short, with little effect of scene-size, suggesting a pop-out process. In contrast, listeners have difficulty detecting disappearing objects, even in small scenes: performance rapidly deteriorates with growing scene-size; response times are slow, and even when change is detected, the changed component is rarely successfully identified. We also measured change detection performance when a noise or silent gap was inserted at the time of change or when the scene was interrupted by a distractor that occurred at the time of change but did not mask any scene elements. Gaps adversely affected the processing of item appearance but not disappearance. However, distractors reduced both appearance and disappearance detection. Together, our results suggest a role for neural adaptation and sensitivity to transients in the process of auditory change detection, similar to what has been demonstrated for visual change detection. 
Importantly, listeners consistently performed better for item addition (relative to deletion) across all scene interruptions used, suggesting a robust perceptual representation of item appearance.
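The artificial "scenes" the abstract describes — multiple pure tones, each with its own carrier frequency and amplitude-modulation rate — are straightforward to synthesize. This is a minimal sketch of that stimulus construction; all specific frequencies, rates, and durations are illustrative assumptions, not the authors' parameters.

```python
# Build a scene of amplitude-modulated pure tones, then append a
# version with one extra component to mimic an "appear" event.
import numpy as np

def make_scene(components, duration=1.0, sr=16000):
    """components: list of (carrier_hz, am_rate_hz) pairs.
    Each tone gets a slow sinusoidal amplitude envelope at its own rate."""
    t = np.arange(int(duration * sr)) / sr
    scene = np.zeros_like(t)
    for carrier, am_rate in components:
        envelope = 0.5 * (1 + np.sin(2 * np.pi * am_rate * t))  # in [0, 1]
        scene += envelope * np.sin(2 * np.pi * carrier * t)
    return scene / max(len(components), 1)  # keep amplitude within [-1, 1]

base = [(500, 3), (1200, 7)]     # the ongoing scene
appear = base + [(2500, 11)]     # a new component "appears"
scene = np.concatenate([make_scene(base), make_scene(appear)])
```

A "disappear" trial is the mirror image: concatenate the larger scene first, then the reduced one. The asymmetry the paper reports is that listeners catch the first ordering far more reliably than the second.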

Filed under brain hearing auditory perception perception attention psychology neuroscience science
