Posts tagged object perception

Don’t Underestimate Your Mind’s Eye
Take a look around, and what do you see? Much more than you think you do, thanks to your finely tuned mind’s eye, which processes images without your even knowing.
A University of Arizona study has found that objects in our visual field that we are not consciously aware of may still influence our decisions. The findings refute traditional ideas about visual perception and cognition, and they could shed light on why we sometimes make decisions — stepping into a street, choosing not to merge into a traffic lane — without really knowing why.
Laura Cacciamani, who recently earned her doctorate in psychology with a minor in neuroscience, has found supporting evidence. Cacciamani is the lead author of a study, published online in the journal Attention, Perception and Psychophysics, showing that the brain’s subconscious processing has an impact on behavior and decision-making.
This seems to make evolutionary sense, Cacciamani said. Early humans would have required keen awareness of their surroundings on a subliminal level in order to survive.
"Your brain is always monitoring for meaning in the world, to be aware of your general surroundings and potential predators," Cacciamani said. "You can be focused on a task, but your brain is assessing the meaning of everything around you – even objects that you’re not consciously perceiving."
The study builds on the findings of earlier research by Jay Sanguinetti, who also was a doctoral candidate in the UA Department of Psychology. Both studies go against conventional wisdom among vision scientists.
"According to the traditional view, the brain accesses the meaning – or the memory – of an object after you perceive it," Cacciamani said. "Against this view, we have now shown that the meaning of an object can be accessed before conscious perception.
"We’re showing that there’s more interplay between memory and perception than previously has been assumed," she said.
Cacciamani asked participants in her study to classify nouns that appeared on a computer screen as naming a natural object or artificial object by pressing one of two buttons labeled “natural” or “artificial.” For example, the word “leaf” indicates an object found in nature, while “anchor” indicates a man-made or artificial object.
But before each word appeared on the screen, the computer flashed a black silhouette that – unknown to participants – had portions of natural or artificial objects suggested along the white outside regions (called the “ground” regions) of the image. Participants were not told to look for anything in the silhouettes, and the silhouettes were flashed so quickly – 50 milliseconds – that it would have been difficult to notice the objects in the ground regions even if someone knew what to look for. Participants were never aware that the silhouettes’ grounds suggested recognizable objects.
Cacciamani measured how well study participants performed at categorizing the words as natural or artificial by recording speed and accuracy.
"We found that participants performed better on the natural/artificial word task when that word followed a silhouette whose ground contained an object of the same rather than a different category," Cacciamani said.
This indicates that the brain accessed the meaning of the objects in the silhouette’s grounds even though study participants didn’t know the objects were there, she said.
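The comparison behind this result can be sketched in a few lines of code. This is purely illustrative — the trial data, field names, and `congruency_effect` function below are hypothetical, not the study’s actual analysis — but it shows the basic logic: compare performance on trials where the silhouette’s ground category matches the word’s category (“congruent”) against trials where it does not (“incongruent”).

```python
# Illustrative sketch with hypothetical data: measuring a congruency
# effect, i.e. whether responses are faster when an unseen prime's
# category matches the category of the word being classified.

def congruency_effect(trials):
    """Mean reaction time (ms) on incongruent trials minus congruent
    trials. A positive value means congruent primes sped up responses."""
    congruent = [t["rt_ms"] for t in trials
                 if t["ground_category"] == t["word_category"]]
    incongruent = [t["rt_ms"] for t in trials
                   if t["ground_category"] != t["word_category"]]
    return (sum(incongruent) / len(incongruent)
            - sum(congruent) / len(congruent))

# Hypothetical trials: the prime's ground category, the word's
# category, and the participant's reaction time.
trials = [
    {"ground_category": "natural",    "word_category": "natural",    "rt_ms": 540},
    {"ground_category": "natural",    "word_category": "artificial", "rt_ms": 575},
    {"ground_category": "artificial", "word_category": "artificial", "rt_ms": 550},
    {"ground_category": "artificial", "word_category": "natural",    "rt_ms": 585},
]

print(congruency_effect(trials))  # 35.0 -> faster on congruent trials
```

A reliably positive effect across participants is the kind of evidence the study reports: the unseen ground objects influenced responses even though no one consciously perceived them.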
"Every day our visual systems are bombarded with more information than we can consciously be aware of," Cacciamani said. "We’re showing that your brain might still be accessing information without your conscious awareness, and that could influence your behavior."

Infants process faces long before they recognize other objects
New research from psychology Research Professor Anthony Norcia and postdoctoral fellow Faraz Farzin, both of the Stanford Vision and NeuroDevelopment Lab, suggests a physical basis for infants’ ogling. As early as four months, babies’ brains already process faces at nearly adult levels, even while other images are still being analyzed in lower levels of the visual system.
The results fit, Farzin pointed out, with the prominent role human faces play in a baby’s world.
"If anything’s going to develop earlier it’s going to be face recognition," she said.
The paper appeared in the online Journal of Vision.
The researchers noninvasively measured electrical activity generated in the infants’ brains with a net of sensors placed over the scalp – a sort of electroencephalographic skullcap.
The sensors were monitoring what are called steady-state visual evoked potentials – spikes in brain activity elicited by visual stimulation. By flashing photographs at infants and adults and measuring their brain activity at the same steady rhythm – a technique Norcia has pioneered for over three decades – the researchers were able to “ask” the participants’ brains what they perceived.
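The idea of measuring brain activity "at the same steady rhythm" as the stimulus can be illustrated with a short sketch. The signal and numbers below are synthetic, not real EEG data: a stimulus flickering at a known rate produces brain activity at that same frequency, so computing the Fourier amplitude at the tagged frequency shows how strongly the brain followed the stimulus.

```python
import math

# Illustrative sketch with a synthetic signal: frequency tagging, as in
# steady-state visual evoked potential (SSVEP) recordings. A stimulus
# flickering at a known rate drives brain activity at that frequency,
# which shows up as energy at that frequency in the recorded signal.

def amplitude_at(signal, freq_hz, sample_rate_hz):
    """Discrete Fourier amplitude of `signal` at a single frequency."""
    n = len(signal)
    re = sum(s * math.cos(2 * math.pi * freq_hz * i / sample_rate_hz)
             for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * freq_hz * i / sample_rate_hz)
             for i, s in enumerate(signal))
    return 2 * math.hypot(re, im) / n

# Synthetic one-second "EEG" trace: a pure 6 Hz response, as if the
# brain were following a stimulus flickering six times per second.
rate = 600  # samples per second
signal = [math.sin(2 * math.pi * 6 * i / rate) for i in range(rate)]

print(round(amplitude_at(signal, 6, rate), 3))   # strong response at 6 Hz
print(round(amplitude_at(signal, 11, rate), 3))  # almost nothing at 11 Hz
```

Because the analysis looks only at the known stimulation frequency, the technique can pull a small, rhythmic brain response out of otherwise noisy recordings – which is what makes it practical with infants.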
When the experiment is conducted on adults, faces and objects (like a telephone or an apple) light up similar areas of the temporal lobe – a region of the brain devoted to higher-level visual processing.
Infants’ neural responses to faces were similar to those of adults, showing activity over a part of the temporal lobe researchers think is devoted to face processing.
Reducing visual clutter may help Alzheimer’s patients
It’s a finding that could help Alzheimer’s patients better cope with their condition.
Psychologists at the University of Toronto and the Georgia Institute of Technology (Georgia Tech) have shown that the inability to recognize once-familiar faces and objects may have as much to do with difficulty perceiving their distinct features as it does with the capacity to recall from memory.
A study published in the October issue of Hippocampus suggests that memory impairments for people diagnosed with early stage Alzheimer’s disease may in part be due to problems with determining the differences between similar objects.
The research contributes to growing evidence that a part of the brain once believed to support memory exclusively – the medial temporal lobe – also plays a role in object perception.
New study suggests memory impairment tied to object perception
A new study from Georgia Tech and the University of Toronto suggests that memory impairments for people diagnosed with early stage Alzheimer’s disease may be due, in part, to problems in determining the differences between similar objects. The findings also support growing research indicating that a part of the brain once believed to support memory exclusively – the medial temporal lobe – also plays a role in object perception. The results are published in the October edition of Hippocampus.
Mild cognitive impairment (MCI) is a disorder commonly thought to be a precursor to Alzheimer’s disease. The study’s investigators, partnering with the Emory Alzheimer’s Disease Research Center, tested MCI patients on their ability to determine whether two rotated, side-by-side pictures were different or identical.