Neuroscience

Articles and news from the latest research reports.

Posts tagged fusiform face area

Researchers find ‘Seeing Jesus in toast’ phenomenon perfectly normal

People who claim to see “Jesus in toast” may no longer be mocked in the future thanks to a new study by researchers at the University of Toronto and partner institutions in China.

Researchers have found that the phenomenon of “face pareidolia” — where onlookers report seeing images of Jesus, the Virgin Mary, or Elvis in objects such as toast, shrouds, and clouds — is normal and has a physical basis in the brain.

“Most people think you have to be mentally abnormal to see these types of images, so individuals reporting this phenomenon are often ridiculed,” says lead researcher Prof. Kang Lee of the University of Toronto’s Eric Jackman Institute of Child Study. “But our findings suggest that it’s common for people to see non-existent features, because human brains are uniquely wired to recognize faces; even when there’s only a slight suggestion of facial features, the brain automatically interprets it as a face.”

Although this phenomenon has been known for centuries, little is understood about the underlying neural mechanisms that cause it. In the first study of its kind, researchers examined brain scans and behavioural responses of individuals seeing faces and letters in different patterns. They discovered that face pareidolia isn’t due to a brain anomaly or imagination but is caused by the combined work of the frontal cortex, which helps generate expectations and sends signals to the posterior visual cortex to enhance its interpretation of stimuli from the outside world.

Researchers also found that people can be led to see different images — such as faces, words, or letters — depending on what they expect to see, which in turn activates specific parts of the brain that process such images. Seeing “Jesus in toast” reflects the brain’s normal functioning and the active role that the frontal cortex plays in visual perception. Instead of “seeing is believing,” the results suggest that “believing is seeing.”

(Source: media.utoronto.ca)

Filed under face pareidolia face processing fusiform face area visual perception prefrontal cortex psychology neuroscience science

How the brain pays attention

Neuroscientists identify a brain circuit that’s key to shifting our focus from one object to another.

Picking out a face in the crowd is a complicated task: Your brain has to retrieve the memory of the face you’re seeking, then hold it in place while scanning the crowd, paying special attention to finding a match.

A new study by MIT neuroscientists reveals how the brain achieves this type of focused attention on faces or other objects: A part of the prefrontal cortex known as the inferior frontal junction (IFJ) controls visual processing areas that are tuned to recognize a specific category of objects, the researchers report in the April 10 online edition of Science.

Scientists know much less about this type of attention, known as object-based attention, than spatial attention, which involves focusing on what’s happening in a particular location. However, the new findings suggest that these two types of attention have similar mechanisms involving related brain regions, says Robert Desimone, the Doris and Don Berkey Professor of Neuroscience, director of MIT’s McGovern Institute for Brain Research, and senior author of the paper.

“The interactions are surprisingly similar to those seen in spatial attention,” Desimone says. “It seems like it’s a parallel process involving different areas.”

In both cases, the prefrontal cortex — the control center for most cognitive functions — appears to take charge of the brain’s attention and control relevant parts of the visual cortex, which receives sensory input. For spatial attention, that involves regions of the visual cortex that map to a particular area within the visual field.

In the new study, the researchers found that IFJ coordinates with a brain region that processes faces, known as the fusiform face area (FFA), and a region that interprets information about places, known as the parahippocampal place area (PPA). The FFA and PPA were first identified in the human cortex by Nancy Kanwisher, the Walter A. Rosenblith Professor of Cognitive Neuroscience at MIT.  

The IFJ has previously been implicated in a cognitive ability known as working memory, which is what allows us to gather and coordinate information while performing a task — such as remembering and dialing a phone number, or doing a math problem.

For this study, the researchers used magnetoencephalography (MEG) to scan human subjects as they viewed a series of overlapping images of faces and houses. Unlike functional magnetic resonance imaging (fMRI), which is commonly used to measure brain activity, MEG can reveal the precise timing of neural activity, down to the millisecond. The researchers presented the overlapping streams at two different rhythms — two images per second and 1.5 images per second — allowing them to identify brain regions responding to those stimuli.

“We wanted to frequency-tag each stimulus with different rhythms. When you look at all of the brain activity, you can tell apart signals that are engaged in processing each stimulus,” says Daniel Baldauf, a postdoc at the McGovern Institute and the lead author of the paper.
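The frequency-tagging idea can be pictured with a toy sketch. The snippet below is not the study’s analysis pipeline — the sampling rate, durations, and amplitudes are all invented — but it shows how two overlapping streams “tagged” at 2 Hz and 1.5 Hz stay separable in a single mixed recording:

```python
import math

def power_at(signal, freq, fs):
    """Power of `signal` (samples at rate `fs`) at frequency `freq`,
    computed as a single DFT bin: project onto a sine and a cosine."""
    n = len(signal)
    c = sum(x * math.cos(2 * math.pi * freq * i / fs) for i, x in enumerate(signal))
    s = sum(x * math.sin(2 * math.pi * freq * i / fs) for i, x in enumerate(signal))
    return (c * c + s * s) / n

fs = 200                                   # hypothetical sampling rate (Hz)
t = [i / fs for i in range(4 * fs)]        # 4 seconds of samples

# Two overlapping stimulus streams, "tagged" at 2.0 Hz and 1.5 Hz
face_stream = [math.sin(2 * math.pi * 2.0 * x) for x in t]
house_stream = [0.5 * math.sin(2 * math.pi * 1.5 * x) for x in t]
recording = [a + b for a, b in zip(face_stream, house_stream)]

# The streams overlap in the recording, but each tag frequency
# picks out only its own stream:
p_face = power_at(recording, 2.0, fs)      # large: 2 Hz stream present
p_house = power_at(recording, 1.5, fs)     # smaller: weaker 1.5 Hz stream
p_other = power_at(recording, 3.0, fs)     # near zero: no 3 Hz stream
```

Because the two tag frequencies are orthogonal over the recording window, the power at each bin reflects only the matching stream — the same logic that lets the researchers attribute brain activity to one stimulus or the other.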

Each subject was told to pay attention to either faces or houses; because the houses and faces were in the same spot, the brain could not use spatial information to distinguish them. When the subjects were told to look for faces, activity in the FFA and the IFJ became synchronized, suggesting that they were communicating with each other. When the subjects paid attention to houses, the IFJ synchronized instead with the PPA.

The researchers also found that the communication was initiated by the IFJ and the activity was staggered by 20 milliseconds — about the amount of time it would take for neurons to electrically convey information from the IFJ to either the FFA or PPA. The researchers believe that the IFJ holds onto the idea of the object that the brain is looking for and directs the correct part of the brain to look for it.
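A lag like that 20-millisecond stagger is the kind of quantity one can estimate with a cross-correlation between two recorded signals. A minimal sketch, using synthetic stand-in signals rather than MEG data (the names `ifj` and `ffa`, the 1 kHz sampling rate, and the noise levels are all illustrative assumptions):

```python
import random

def best_lag(driver, follower, max_lag):
    """Lag (in samples) at which `follower` best matches a forward-shifted
    copy of `driver`, found by brute-force cross-correlation."""
    def xcorr(lag):
        return sum(driver[i] * follower[i + lag]
                   for i in range(len(driver) - max_lag))
    return max(range(max_lag + 1), key=xcorr)

fs = 1000                                         # 1 kHz: 1 sample = 1 ms
random.seed(0)
ifj = [random.gauss(0, 1) for _ in range(2000)]   # stand-in "IFJ" signal
# Stand-in "FFA" signal: the IFJ signal delayed by 20 samples (20 ms),
# plus independent noise
ffa = [0.0] * 20 + [x + random.gauss(0, 0.3) for x in ifj[:-20]]

lag_ms = best_lag(ifj, ffa, max_lag=50)           # recovers the 20 ms delay
```

The cross-correlation peaks at the true delay because only there do the two signals line up sample for sample; at every other lag the products average out to noise.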

The MEG scanner, as well as the study’s “elegant design,” were critical to discovering this relationship, says Robert Knight, a professor of psychology and neuroscience at the University of California at Berkeley who was not part of the research team.

“Functional MRI gives hints of connectivity,” Knight says, “but the time course is way too slow to show these millisecond-scale frequencies and to establish what they show, which is that the inferior frontal lobe is the prime driver.”

Further bolstering this idea, the researchers used an MRI-based method to measure the white matter that connects different brain regions and found that the IFJ is highly connected with both the FFA and PPA.

Members of Desimone’s lab are now studying how the brain shifts its focus between different types of sensory input, such as vision and hearing. They are also investigating whether it might be possible to train people to better focus their attention by controlling the brain interactions involved in this process.

“You have to identify the basic neural mechanisms and do basic research studies, which sometimes generate ideas for things that could be of practical benefit,” Desimone says. “It’s too early to say whether this training is even going to work at all, but it’s something that we’re actively pursuing.”

Filed under inferior frontal junction attention object-based attention prefrontal cortex fusiform face area neuroscience science

Study finds context is key in helping us to recognise a face

Why does it take longer to recognise a familiar face in an unfamiliar setting, such as seeing a work colleague while on holiday? A new study published today in Nature Communications has found that part of the reason comes down to the processes our brain performs when learning and recognising faces.

During the experiment, participants were shown faces of people that they had never seen before, while lying inside an MRI scanner in the Department of Psychology at Royal Holloway. They were shown some of these faces numerous times from different angles and were asked to indicate whether they had seen that person before or not.

Participants were relatively good at recognising faces once they had seen them a few times. However, using a new mathematical approach, the scientists found that people’s decisions about whether they recognised someone also depended on the context in which they encountered the face. If participants had recently seen many unfamiliar faces, they were more likely to say that the face they were looking at was unfamiliar, even if they had seen it several times before and had previously reported recognising it.
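One way to picture such a context effect is a signal-detection criterion that drifts with the faces seen recently. The sketch below is purely illustrative (it is not the authors’ model, and every threshold and weight is invented): a run of unfamiliar faces raises the criterion, so even a well-learned face can be reported as unfamiliar.

```python
def recognise(familiarity, recent_faces, base_criterion=0.5, context_weight=0.3):
    """Toy rule: report 'familiar' when the familiarity evidence exceeds
    a criterion that drifts with recent context. Thresholds and weights
    are invented for illustration only."""
    # Fraction of recently seen faces that were themselves unfamiliar
    prop_unfamiliar = sum(1 for f in recent_faces if f < base_criterion) / len(recent_faces)
    # Mostly-unfamiliar context raises the bar; mostly-familiar lowers it
    criterion = base_criterion + context_weight * (prop_unfamiliar - 0.5)
    return familiarity > criterion

well_learned = 0.6                          # a face seen several times before
mixed_context = [0.8, 0.2, 0.9, 0.7]        # mostly familiar faces recently
unfamiliar_context = [0.1, 0.2, 0.3, 0.4]   # a run of novel faces

print(recognise(well_learned, mixed_context))       # True: criterion relaxed
print(recognise(well_learned, unfamiliar_context))  # False: criterion raised
```

The same familiarity value yields opposite reports in the two contexts, mirroring the behavioural bias the study describes.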

Activity in two areas of the brain matched the way in which the mathematical model predicted people’s performance.

“Our study has characterised some of the mathematical processes that are happening in our brain as we do this,” said lead author Dr Matthew Apps. “One brain area, called the fusiform face area, seems to be involved in learning new information about faces and increasing their familiarity.

“Another area, called the superior temporal sulcus, we found to have an important role in influencing our report of whether we recognise someone’s face, regardless of whether we are actually familiar with them or not. While this seems rather counter-intuitive, it may be an important mechanism for simplifying all the information that we need to process about faces.”

“Face recognition is a fundamental social skill, but we show how error prone this process can be. To recognise someone, we become familiar with their face, by learning a little more about what it looks like,” said co-author Professor Manos Tsakiris.

“At the same time, we often see people in different contexts. The recognition biases that we measured might give us an advantage in integrating information about identity and social context, two key elements of our social world.”

(Source: rhul.ac.uk)

Filed under face recognition fusiform face area superior temporal sulcus fMRI neuroscience science
