Neuroscience

Articles and news from the latest research reports.

Posts tagged cognitive systems

Researchers Find Causality in the Eye of the Beholder
We rely on our visual system more heavily than previously thought in determining the causality of events. A team of researchers has shown that, in making judgments about causality, we don’t always need to use cognitive reasoning. In some cases, our visual brain—the brain areas that process what the eyes sense—can make these judgments rapidly and automatically.
The study appears in the latest issue of the journal Current Biology.
“Our study reveals that causality can be computed at an early level in the visual system,” said Martin Rolfs, who conducted much of the research as a post-doctoral fellow in NYU’s Department of Psychology. “This finding ends a long-standing debate over how some visual events are processed: we show that our eyes can quickly make assessments about cause-and-effect—without the help of our cognitive systems.”
Rolfs is currently a research group leader at the Bernstein Center for Computational Neuroscience and the Department of Psychology at Berlin’s Humboldt University. The study’s other co-authors were Michael Dambacher, a post-doctoral researcher at the Universities of Potsdam and Konstanz, and Patrick Cavanagh, a professor at Université Paris Descartes.
We frequently make rapid judgments of causality (“The ball knocked the glass off the table”), animacy (“Look out, that thing is alive!”), or intention (“He meant to help her”). These judgments are complex enough that many researchers believe substantial cognitive reasoning is required: we need our brains to tell us what our eyes have seen. However, some judgments are so rapid and effortless that they “feel” perceptual, as if we make them using only our visual systems, with no thinking required.
It is not yet clear which judgments require significant cognitive processing and which may be mediated solely by our visual system. In the Current Biology study, the researchers investigated one of these—causality judgments—in an effort to better understand the division of labor between visual and cognitive processes.

Filed under visual system cognitive reasoning causality cognitive systems neuroscience science

IBM: Computers Will See, Hear, Taste, Smell and Touch in 5 Years
Today’s PCs and smartphones can do a lot — from telling you the weather in Zimbabwe in milliseconds, to buying your morning coffee. But ask them to show you what a piece of fabric feels like, or to detect the odor of a great-smelling soup, and they’re lost.
That will change in the next five years, says IBM. Computers at that time will be much more aware of the world around them, and be able to understand it. The company’s annual “5 in 5” list, in which IBM predicts the five trends in computing that will arrive in five years’ time, reads exactly like a list of the five human senses — predicting computers with sight, hearing, taste, smell and touch.
The five senses are really all part of one grand concept: cognitive computing, which involves machines experiencing the world more like a human would. For example, a cognizant computer wouldn’t see a painting as merely a set of data points describing color, pigment and brush stroke; rather, it would truly see the object holistically as a painting, and be able to know what that means.

Filed under IBM cognitive systems cognitive computing cognizant computer technology science