Neuroscience

Articles and news from the latest research reports.

Posts tagged perception

47 notes

Try this exercise: Put this book down and go look in a mirror. Now move your eyes back and forth, so that you’re looking at your left eye, then at your right eye, then at your left eye again. When your eyes shift from one position to the other, they take time to move and land on the other location. But here’s the kicker: you never see your eyes move. What is happening to the time gaps during which your eyes are moving? Why do you feel as though there is no break in time while you’re changing your eye position? (Remember that it’s easy to detect someone else’s eyes moving, so the answer cannot be that eye movements are too fast to see.)

All these illusions and distortions are consequences of the way your brain builds a representation of time. When we examine the problem closely, we find that “time” is not the unitary phenomenon we may have supposed it to be. This can be illustrated with some simple experiments: for example, when a stream of images is shown over and over in succession, an oddball image thrown into the series appears to last for a longer period, although presented for the same physical duration. In the neuroscientific literature, this effect was originally termed a subjective “expansion of time,” but that description raises an important question about time representation: when durations dilate or contract, does time in general slow down or speed up during that moment? If a friend, say, spoke to you during the oddball presentation, would her voice seem lower in pitch, like a slowed-down record?
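
The oddball paradigm described above has a very simple structure: a stream of identical "standard" stimuli, all with the same physical duration, with a single deviant dropped in. As a rough sketch (invented stimulus names and timings, not any lab's actual code), in Python:

```python
import random

def build_oddball_stream(n_trials=20, standard="circle", oddball="star",
                         duration_ms=500, oddball_position=None, rng=None):
    """Build a stream of identical 'standard' images with one oddball.

    Every item has the same physical duration; only the oddball's
    *perceived* duration is expected to dilate.
    """
    rng = rng or random.Random(0)
    if oddball_position is None:
        # Avoid the first few positions so the standard is established.
        oddball_position = rng.randrange(3, n_trials)
    stream = []
    for i in range(n_trials):
        stimulus = oddball if i == oddball_position else standard
        stream.append({"stimulus": stimulus, "duration_ms": duration_ms})
    return stream, oddball_position

stream, pos = build_oddball_stream()
# All items share one physical duration, so any reported "expansion"
# of the oddball is subjective, not physical.
assert len({item["duration_ms"] for item in stream}) == 1
```

The point the sketch makes concrete is that the dilation is entirely in the observer: nothing in the stimulus timing distinguishes the oddball.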

If our perception works like a movie camera, then when one aspect of a scene slows down, everything should slow down. In the movies, if a police car launching off a ramp is filmed in slow motion, not only will it stay in the air longer but its siren will blare at a lower pitch and its lights will flash at a lower frequency. An alternative hypothesis suggests that different temporal judgments are generated by different neural mechanisms—and while they often agree, they are not required to. The police car may seem suspended longer, while the frequencies of its siren and its flashing lights remain unchanged.

Read more: Brain Time

Filed under science neuroscience brain psychology time perception perception

122 notes


Disney researchers add sense of touch to augmented reality applications 

Technology developed by Disney Research, Pittsburgh, makes it possible to change the feel of real-world surfaces and objects, including touch screens, walls, furniture, and wooden or plastic objects, without requiring users to wear special gloves or use force-feedback devices. The surfaces themselves are not fitted with actuators and need little, if any, instrumentation.

Instead, Disney researchers employ a newly discovered physical phenomenon called reverse electrovibration to create the illusion of changing textures as the user’s fingers sweep across a surface. A weak electrical signal, which can be applied imperceptibly anywhere on the user’s body, creates an oscillating electrical field around the user’s fingers that is responsible for the tactile feedback.
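
The article gives no parameters, but the general electrovibration idea (a virtual texture of spatial frequency f_s, swept at finger speed v, corresponds to a temporal drive frequency f_t = f_s * v) can be sketched as follows. Every constant here is an illustrative assumption, not Disney's design:

```python
import math

def drive_signal(texture_cycles_per_mm, finger_speed_mm_s, t):
    """Instantaneous value of an oscillating drive voltage (arbitrary units).

    In electrovibration-style haptics, a virtual grating of spatial
    frequency f_s (cycles/mm) swept at speed v (mm/s) yields a temporal
    frequency f_t = f_s * v (Hz).  Amplitude and coupling here are
    placeholders, not REVEL's actual parameters.
    """
    f_t = texture_cycles_per_mm * finger_speed_mm_s  # Hz
    return math.sin(2 * math.pi * f_t * t)

# A 1 cycle/mm grating swept at 80 mm/s modulates at 80 Hz;
# the signal peaks a quarter-period into the cycle.
f_t = 1.0 * 80.0
assert abs(drive_signal(1.0, 80.0, 1.0 / (4 * f_t)) - 1.0) < 1e-9
```

The "reverse" twist REVEL adds, per the article, is that the signal is injected into the user's body rather than the surface, so any conductive object can be augmented.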

The technology, called REVEL, could be used to create “please touch” museum displays, add haptic feedback to games, apply texture to projected images on surfaces of any size and shape, provide customized directions on walls for people with visual disabilities and enhance other applications of augmented reality.

Filed under brain illusions neuroscience perception psychology science touch vision tactile technology tech

28 notes


Opinion: Scientists’ Intuitive Failures

Much of what researchers believe about the public and effective communication is wrong.

Scientists in the United States and Europe have long been concerned with how well the public understands science, whether the media adequately cover science, and how the public reaches decisions on complex science-related policy issues. Given the norms of our profession, however, it is ironic that many of these debates about how best to communicate science with lay populations are driven by intuitive assumptions on the part of scientists rather than the growing body of social science research on the topic that has developed over the past two decades.

In May, more than 500 researchers, journalists, and policy professionals gathered at the National Academies in Washington, DC, for a 2-day forum on the “Science of Science Communication” to dispel some of these intuitive but persistent myths about science, the media, and the public.

1. Americans no longer trust scientists.  Prominent scientists warn that we have entered a new “dark age,” where the public no longer trusts scientific expertise. 

2. Science journalism is dead.  Though scientists are often critical of the news media, calling attention to perceived bias on the part of journalists, they also fear that budget cuts at news organizations have meant the death of science journalism.  

3. Entertainment media promote a culture of anti-science.  Since the 1970s, scientists have feared that entertainment TV and film undermine public trust in science.  

4. The problem is the public, not scientists or policymakers. Scientists have long believed that when the public disagreed with them on matters of policy, public ignorance was to blame.  

5. Political views don’t influence the judgments of scientists. In debating science-related policy matters, we tend to assume that scientists are not influenced by their own political views. Yet in a recent study co-authored by one of us (Scheufele), we find that even after controlling for their scientific judgments, scientists’ political ideologies significantly influence their preferences for potential regulatory policies.

Filed under communication media neuroscience perception psychology public science scientists society social sciences professionals laymen myths politics

41 notes

The longer you’re awake, the slower you get

July 27, 2012

Anyone who has ever had trouble sleeping can attest to the difficulties at work the following day. Experts recommend eight hours of sleep per night for ideal health and productivity, but what if five to six hours of sleep is your norm? Is your work still negatively affected? A team of researchers at Brigham and Women’s Hospital (BWH) has discovered that regardless of how tired you perceive yourself to be, lack of sleep can influence the way you perform certain tasks.

This finding is published in the July 26, 2012 online edition of The Journal of Vision.

"Our team decided to look at how sleep might affect complex visual search tasks, because they are common in safety-sensitive activities, such as air-traffic control, baggage screening, and monitoring power plant operations," explained Jeanne F. Duffy, PhD, MBA, senior author on this study and associate neuroscientist at BWH. "These types of jobs involve processes that require repeated, quick memory encoding and retrieval of visual information, in combination with decision making about the information."

Researchers collected and analyzed data from visual search tasks from 12 participants over a one month study. In the first week, all participants were scheduled to sleep 10-12 hours per night to make sure they were well-rested. For the following three weeks, the participants were scheduled to sleep the equivalent of 5.6 hours per night, and also had their sleep times scheduled on a 28-hour cycle, mirroring chronic jet lag. The research team gave the participants computer tests that involved visual search tasks and recorded how quickly the participants could find important information, and also how accurate they were in identifying it. The researchers report that the longer the participants were awake, the more slowly they identified the important information in the test. Additionally, during the biological night time, 12 a.m. -6 a.m., participants (who were unaware of the time throughout the study) also performed the tasks more slowly than they did during the daytime.
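
A toy version of the kind of analysis described, grouping search reaction times by biological night (midnight to 6 a.m.) versus day, might look like this. Field names and values are invented, not the study's:

```python
from statistics import mean

def summarize_search_times(trials):
    """Group visual-search reaction times by biological night vs. day.

    `trials` is a list of dicts with keys 'hours_awake', 'clock_hour',
    'rt_ms', and 'correct'.  Field names are illustrative only.
    """
    by_phase = {"night": [], "day": []}
    for t in trials:
        key = "night" if 0 <= t["clock_hour"] < 6 else "day"
        by_phase[key].append(t["rt_ms"])
    return {k: mean(v) if v else None for k, v in by_phase.items()}

trials = [
    {"hours_awake": 2,  "clock_hour": 10, "rt_ms": 900,  "correct": True},
    {"hours_awake": 18, "clock_hour": 3,  "rt_ms": 1400, "correct": True},
    {"hours_awake": 20, "clock_hour": 5,  "rt_ms": 1500, "correct": True},
]
summary = summarize_search_times(trials)
# Slower responses during the biological night, as the study reports.
assert summary["night"] > summary["day"]
```

The same grouping could be repeated over hours awake to reproduce the study's other axis: reaction time climbing the longer a participant has been up.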

"This research provides valuable information for workers, and their employers, who perform these types of visual search tasks during the night shift, because they will do it much more slowly than when they are working during the day," said Duffy. "The longer someone is awake, the more the ability to perform a task, in this case a visual search, is hindered, and this impact of being awake is even stronger at night."

While the participants’ accuracy stayed fairly constant, they were slower to identify the relevant information as the weeks went on. Their self-ratings of sleepiness got only slightly worse during the second and third weeks on the study schedule, yet the data show that they were performing the visual search tasks significantly more slowly than in the first week. This finding suggests that people’s perceptions of how tired they are do not always match their performance, explains Duffy.

Provided by Brigham and Women’s Hospital

Source: medicalxpress.com

Filed under science neuroscience brain psychology sleep vision perception memory decision making

43 notes


Measuring the Evolution of Contemporary Western Popular Music

Popular music is a key cultural expression that has captured listeners’ attention for ages. Many of the structural regularities underlying musical discourse are yet to be discovered and, accordingly, their historical evolution remains formally unknown. Here we unveil a number of patterns and metrics characterizing the generic usage of primary musical facets such as pitch, timbre, and loudness in contemporary western popular music. Many of these patterns and metrics have been consistently stable for a period of more than fifty years. However, we find important changes, or trends, related to the restriction of pitch transitions, the homogenization of the timbral palette, and growing loudness levels. This suggests that our perception of the new may be rooted in these changing characteristics. Hence, an old tune could well sound novel and fashionable, provided that it consisted of common harmonic progressions, changed the instrumentation, and increased the average loudness.
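
The "growing loudness" claim boils down to fitting a trend to loudness across the years. A minimal sketch with made-up numbers (the paper's dataset and units are not reproduced here):

```python
def loudness_trend(yearly_loudness):
    """Ordinary least-squares slope of mean loudness (dB) against year.

    A positive slope is the 'loudness race' the paper reports.  The
    sample data below are invented for illustration only.
    """
    years = sorted(yearly_loudness)
    xs = [y - years[0] for y in years]
    ys = [yearly_loudness[y] for y in years]
    n = len(xs)
    x_mean, y_mean = sum(xs) / n, sum(ys) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
    den = sum((x - x_mean) ** 2 for x in xs)
    return num / den  # dB per year

fake = {1960: -22.0, 1980: -18.5, 2000: -13.0}
assert loudness_trend(fake) > 0  # loudness grows over the decades
```

The paper's pitch-transition and timbre metrics follow the same logic: compute a yearly summary statistic, then test whether it drifts over time.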

Filed under science neuroscience psychology music evolution perception pitch brain timbre loudness western music

35 notes


Using Virtual Reality an Arm Up to Three or Even Four Times the Length of a Real Arm Can Be Felt as If It Was the Person’s Own Arm

The authors of the article have added another dimension to this illusion of body ownership. Using virtual reality, they have shown that a virtual body with one very long arm can be incorporated into a person’s body representation. An arm up to three, or possibly even four, times the length of the person’s real arm can be felt as if it were their own, notwithstanding the gross asymmetry such a long arm introduces in the body. An extended body space (a body with longer limbs occupies more volume than a normal body) also affects the space surrounding the body, known as peripersonal space, which, when entered by objects or other people, can be experienced as threatening or intimate, depending on the context.

In the experiment 50 people experienced virtual reality where they had a virtual body. They put on a head-mounted display so that all around themselves they saw a virtual world. When they looked down towards where their body should be, they saw a virtual body instead of their real one. They had their dominant hand resting on a table with a special textured material that they could feel with their real hand, but also see their virtual hand touching it. So as they moved their real hand over the surface of this table they would see the virtual hand doing the same.

The results of the study were analysed using three measures: a questionnaire assessing the subjective illusion that the virtual arm was part of the person’s body; a pointing task, in which the arm that had not grown in length was used to point (with eyes shut) towards where the other hand was felt to be; and a response-to-threat task, in which a saw fell towards the virtual hand (figure E, F) and the researchers measured whether people moved their real hand to avoid it.

Based on these data, the researchers found that people did have the illusion that the extended arm was their own. Even when the virtual arm was four times the length of the corresponding real arm, 40-50% of participants still showed signs of incorporating it into their body representation. Vision alone also proved a very powerful inducer of the illusion: those in the inconsistent condition, where the virtual hand did not touch the table even though the real hand felt the table top, still had a strong illusion of ownership over the virtual arm.
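
The "signs of incorporation" figure can be caricatured as combining the questionnaire and threat responses into a per-participant criterion. Thresholds and field names below are illustrative, not the study's scoring scheme:

```python
def incorporation_rate(participants, ownership_threshold=4):
    """Fraction of participants showing signs of incorporating the
    virtual arm: a high ownership rating OR a withdrawal response to
    the falling-saw threat.  Threshold and field names are invented.
    """
    def incorporated(p):
        return (p["ownership_rating"] >= ownership_threshold
                or p["withdrew_from_threat"])
    hits = sum(1 for p in participants if incorporated(p))
    return hits / len(participants)

sample = [
    {"ownership_rating": 6, "withdrew_from_threat": True},
    {"ownership_rating": 2, "withdrew_from_threat": False},
    {"ownership_rating": 3, "withdrew_from_threat": True},
    {"ownership_rating": 1, "withdrew_from_threat": False},
]
assert incorporation_rate(sample) == 0.5  # 2 of 4 show incorporation
```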

These results show how malleable our body representation is, incorporating even strong asymmetries in body shape that do not correspond at all to the average human form. This type of research will help neuroscientists understand how the brain represents the body, and may ultimately help people overcome illnesses rooted in body-image distortions.

Filed under brain illusion neuroscience perception psychology science virtual reality peripersonal space body image vision

41 notes

Why does vivid memory ‘feel so real?’

Scientists find evidence that real perceptual experience and mental replay share similar brain activation patterns

Toronto, Canada – Neuroscientists have found strong evidence that vivid memory and directly experiencing the real moment can trigger similar brain activation patterns.

The study, led by Baycrest’s Rotman Research Institute (RRI), in collaboration with the University of Texas at Dallas, is one of the most ambitious and complex yet for elucidating the brain’s ability to evoke a memory by reactivating the parts of the brain that were engaged during the original perceptual experience. Researchers found that vivid memory and real perceptual experience share “striking” similarities at the neural level, although they are not “pixel-perfect” brain pattern replications.

The study appears online this month in the Journal of Cognitive Neuroscience, ahead of print publication.

"When we mentally replay an episode we’ve experienced, it can feel like we are transported back in time and re-living that moment again," said Dr. Brad Buchsbaum, lead investigator and scientist with Baycrest’s RRI. "Our study has confirmed that complex, multi-featured memory involves a partial reinstatement of the whole pattern of brain activity that is evoked during initial perception of the experience. This helps to explain why vivid memory can feel so real."

But vivid memory rarely fools us into believing we are in the real, external world – and that in itself offers a very powerful clue that the two cognitive operations don’t work exactly the same way in the brain, he explained.

In the study, Dr. Buchsbaum’s team used functional magnetic resonance imaging (fMRI), a powerful brain scanning technology that constructs computerized images of brain areas that are active when a person is performing a specific cognitive task. A group of 20 healthy adults (aged 18 to 36) were scanned while they watched 12 video clips, each nine seconds long, sourced from YouTube.com and Vimeo.com. The clips contained a diversity of content – such as music, faces, human emotion, animals, and outdoor scenery. Participants were instructed to pay close attention to each of the videos (which were repeated 27 times) and informed they would be tested on the content of the videos after the scan.

A subset of nine participants from the original group were then selected to complete intensive and structured memory training over several weeks that required practicing over and over again the mental replaying of videos they had watched from the first session. After the training, this group was scanned again as they mentally replayed each video clip. To trigger their memory for a particular clip, they were trained to associate a particular symbolic cue with each one. Following each mental replay, participants would push a button indicating on a scale of 1 to 4 (1 = poor memory, 4 = excellent memory) how well they thought they had recalled a particular clip.

Dr. Buchsbaum’s team found “clear evidence” that patterns of distributed brain activation during vivid memory mimicked the patterns evoked during sensory perception when the videos were viewed – by a correspondence of 91% after a principal components analysis of all the fMRI imaging data.
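
The core idea here, comparing the distributed activation pattern during recall with the one evoked during perception, reduces to a pattern correlation. This toy sketch uses a plain Pearson correlation over invented "voxel" values, not the study's full PCA pipeline:

```python
def pattern_similarity(perception, recall):
    """Pearson correlation between two activation patterns (one value
    per voxel or component).  The study's actual analysis, PCA over all
    the fMRI data followed by pattern comparison, is far richer; this
    only shows the core idea of comparing distributed patterns.
    """
    n = len(perception)
    mp = sum(perception) / n
    mr = sum(recall) / n
    num = sum((p - mp) * (r - mr) for p, r in zip(perception, recall))
    dp = sum((p - mp) ** 2 for p in perception) ** 0.5
    dr = sum((r - mr) ** 2 for r in recall) ** 0.5
    return num / (dp * dr)

# A recall pattern that partially reinstates the perception pattern
# correlates highly without being a pixel-perfect copy:
perception = [1.0, 0.2, -0.5, 0.8, -0.1]
recall     = [0.9, 0.1, -0.4, 0.7,  0.0]
assert pattern_similarity(perception, recall) > 0.9
```

High but imperfect similarity is exactly the study's point: recall reinstates the perceptual pattern without duplicating it.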

The “hot spots” of greatest pattern similarity occurred in sensory and motor association areas of the cerebral cortex — regions that play a key role in memory, attention, perceptual awareness, thought, language and consciousness.

Dr. Buchsbaum suggested the imaging analysis used in his study could potentially add to the current battery of memory assessment tools available to clinicians. Brain activation patterns from fMRI data could offer an objective way of quantifying whether a patient’s self-report of their memory as “being good or vivid” is accurate or not.

Source: EurekAlert!

Filed under science neuroscience brain brain activation psychology memory perceptual experience perception

109 notes


Why Facial Disfigurements Creep Us Out

Whether we realize it or not, most of us have a knee-jerk reaction when we see someone with a facial disfigurement, such as psoriasis, a cleft lip, or a birthmark. We may sit away from them on the bus, hesitate to shake their hand, or even give a barely masked look of revulsion. A new study suggests these disgust reactions stem from an ancient disease-avoidance system that normally prevents us from catching illnesses. Essentially, we treat facial disfigurements like infectious diseases.

Filed under science neuroscience psychology disease aversion perception behavior facial disfigurements face

17 notes

How Does Fat Influence Flavor Perception?

ScienceDaily (July 19, 2012) — A joint study carried out by The University of Nottingham and the multinational food company Unilever has found for the first time that fat in food can reduce activity in several areas of the brain which are responsible for processing taste, aroma and reward.

The research, now available in the Springer journal Chemosensory Perception, provides the food industry with better understanding of how in the future it might be able to make healthier, less fatty food products without negatively affecting their overall taste and enjoyment. Unveiled in 2010, Unilever’s Sustainable Living Plan sets out its ambition to help hundreds of millions of people improve their diet around the world within a decade.

This fascinating three-year study investigated how the brains of a group of participants in their 20s responded to changes in the fat content of four different fruit emulsions they tasted while in an MRI scanner. All four samples had the same thickness and sweetness, but one contained flavour with no fat, while the other three contained fat with different flavour-release properties.

The research found that the areas of the participants’ brains responsible for the perception of flavour — such as the somatosensory cortices and the anterior, mid, and posterior insula — were significantly more activated by the non-fatty sample than by the fatty emulsions, even though the samples were perceived as equally flavoured. It is important to note that increased activation in these brain areas does not necessarily mean increased perception of flavour or reward.
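
In its simplest form, the reported effect is a contrast of response amplitudes between the non-fat and fat conditions within a flavour-processing region. This sketch uses invented values and ignores the real fMRI modelling entirely:

```python
from statistics import mean

def activation_contrast(non_fat_betas, fat_betas):
    """Mean contrast (non-fat minus fat) of response amplitudes in a
    flavour-processing region of interest.  A positive value means the
    non-fat condition drove the region harder.  Values are invented.
    """
    return mean(non_fat_betas) - mean(fat_betas)

non_fat = [0.80, 0.75, 0.90]
fat     = [0.50, 0.45, 0.55]
assert activation_contrast(non_fat, fat) > 0  # fat suppresses the response
```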

Dr Joanne Hort, Associate Professor in Sensory Science at The University of Nottingham said: “This is the first brain study to assess the effect of fat on the processing of flavour perception and it raises questions as to why fat emulsions suppress the cortical response in brain areas linked to the processing of flavour and reward. It also remains to be determined what the implications of this suppressive effect are on feelings of hunger, satiety and reward.”

Unilever food scientist Johanneke Busch, based at the company’s Research & Development laboratories in Vlaardingen, Netherlands added: “There is more to people’s enjoyment of food than the product’s flavour — like its mouthfeel, its texture and whether it satisfies hunger, so this is a very important building block for us to better understand how to innovate and manufacture healthier food products which people want to buy.”

Source: Science Daily

Filed under science neuroscience brain psychology food perception taste MRI

28 notes

Infants’ Recognition of Speech More Sophisticated Than Previously Known

ScienceDaily (July 17, 2012) — The ability of infants to recognize speech is more sophisticated than previously known, researchers in New York University’s Department of Psychology have found. Their study, which appears in the journal Developmental Psychology, showed that infants, as early as nine months old, could make distinctions between speech and non-speech sounds in both humans and animals.

"Our results show that infant speech perception is resilient and flexible," explained Athena Vouloumanos, an assistant professor at NYU and the study’s lead author. "This means that our recognition of speech is more refined at an earlier age than we’d thought."

It is well-known that adults’ speech perception is fine-tuned — they can detect speech among a range of ambiguous sounds. But much less is known about the capability of infants to make similar assessments. Understanding when these abilities become instilled would shed new light on how early in life we develop the ability to recognize speech.

To gauge the ability to perceive speech at an early age, the researchers examined the responses of infants approximately nine months old to recorded human and parrot speech and non-speech sounds. Human (an adult female voice) and parrot speech sounds included the words “truck,” “treat,” “dinner,” and “two.” The adult non-speech sounds were whistles and a clearing of the throat, while the parrot non-speech sounds were squawks and chirps. The recorded parrot speech sounds were those of Alex, an African Gray parrot that could talk and reason and whose behaviors were studied by psychology researcher Irene Pepperberg.

Since infants cannot verbally communicate their recognition of speech, the researchers employed a commonly used method to measure this process: looking longer at what they find either interesting or unusual. Under this method, looking longer at a visual paired with a sound may be interpreted as a reflection of recognition. In this study, sounds were paired with a series of visuals: a checkerboard-like image, adult female faces, and a cup.
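
In analysis terms, the looking-time method boils down to comparing mean looking durations across conditions, with longer looking read as recognition or interest. A minimal sketch with invented numbers:

```python
from statistics import mean

def looking_preference(looks_speech_ms, looks_nonspeech_ms):
    """Mean looking-time difference in milliseconds: positive values
    mean longer looking during speech sounds, interpreted here as
    recognition or interest.  The values used are invented.
    """
    return mean(looks_speech_ms) - mean(looks_nonspeech_ms)

speech = [5200, 4800, 5600]
nonspeech = [3900, 4100, 4000]
assert looking_preference(speech, nonspeech) > 0  # longer looks at speech
```

The study's contextual effects then amount to this same comparison computed separately per visual pairing (face, cup, checkerboard).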

The results showed that infants listened longer to human speech than to human non-speech sounds regardless of the visual stimulus, revealing an ability to recognize human speech independent of context.

Their findings on non-human speech were more nuanced. When paired with human-face visuals or human artifacts like cups, the infants listened to parrot speech longer than they did non-speech, such that their preference for parrot speech was similar to their preference for human speech sounds. However, this did not occur in the presence of other visual stimuli. In other words, infants were able to distinguish animal speech from non-speech, but only in some contexts.

"Parrot speech is unlike human speech, so the results show infants have the ability to detect different types of speech, even if they need visual cues to assist in this process," explained Vouloumanos.

Source: Science Daily

Filed under science neuroscience brain psychology speech recognition speech perception
