
Infants process faces long before they recognize other objects
New research from psychology Research Professor Anthony Norcia and postdoctoral fellow Faraz Farzin, both of the Stanford Vision and NeuroDevelopment Lab, suggests a physical basis for infants’ ogling. By as early as four months, babies’ brains already process faces at nearly adult levels, even while other images are still being analyzed in lower levels of the visual system.
The results fit, Farzin pointed out, with the prominent role human faces play in a baby’s world.
"If anything’s going to develop earlier, it’s going to be face recognition," she said.
The paper appeared in the Journal of Vision, an online journal.
The researchers noninvasively measured electrical activity generated in the infants’ brains with a net of sensors placed over the scalp – a sort of electroencephalographic skullcap.
The sensors monitored what are called steady-state visual evoked potentials – spikes in brain activity elicited by visual stimulation. By flashing photographs at infants and adults and measuring their brain activity at the same steady rhythm – a technique Norcia has pioneered for over three decades – the researchers were able to “ask” the participants’ brains what they perceived.
When the experiment is conducted on adults, faces and objects (like a telephone or an apple) light up similar areas of the temporal lobe – a region of the brain devoted to higher-level visual processing.
Infants’ neural responses to faces were similar to those of adults, showing activity over a part of the temporal lobe researchers think is devoted to face processing.
