Neuroscience

Articles and news from the latest research reports.

Posts tagged eye movements

Eye-Writing Technology: Writing in Cursive With Your Eyes Only

A new technology might allow people who have almost completely lost the ability to move their arms or legs to communicate freely, by using their eyes to write in cursive. The eye-writing technology tricks the neuromuscular machinery into doing something that is usually impossible: voluntarily producing smooth eye movements in arbitrary directions.

The technology relies on changes in contrast to trick the eyes into the perception of motion. When viewing that changing visual display, people can learn to control their eye movements smoothly and at will, the new study shows. It doesn’t take very much practice either.
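Turning raw eye-tracker samples into a legible cursive trace also requires smoothing out measurement jitter. As a hypothetical illustration (not the study's actual method), a simple moving average over the gaze coordinates already produces a much smoother trajectory:

```python
# Hypothetical sketch: smoothing noisy eye-tracker samples into a
# continuous written trace with a moving average (not the study's method).

def smooth_trace(samples, window=5):
    """samples: list of (x, y) gaze positions; returns smoothed positions."""
    smoothed = []
    for i in range(len(samples)):
        # Average each point with its neighbours inside the window.
        lo = max(0, i - window // 2)
        hi = min(len(samples), i + window // 2 + 1)
        xs = [p[0] for p in samples[lo:hi]]
        ys = [p[1] for p in samples[lo:hi]]
        smoothed.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return smoothed

raw = [(0, 0), (1, 2), (2, 1), (3, 3), (4, 2)]  # jittery gaze samples
print(smooth_trace(raw, window=3))
```

A real system would use a proper filter (e.g. Savitzky–Golay or a Kalman filter), but the idea is the same: the rendered letters come from the smoothed path, not the raw samples.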

Filed under science neuroscience brain psychology eye movements vision eye-writing technology technology

Where you look predicts what you’re going to say

 23 July 2012 by Will Heaven

Watch where you look – it can be used to predict what you’ll say. A new study shows that it is possible to guess what sentences people will use to describe a scene by tracking their eye movements.

Moreno Coco and Frank Keller at the University of Edinburgh, UK, presented 24 volunteers with a series of photo-realistic images depicting indoor scenes such as a hotel reception. They then tracked the sequence of objects that each volunteer looked at after being asked to describe what they saw.

Other than being prompted with a keyword, such as “man” or “suitcase”, participants were free to describe the scene however they liked. Some typical sentences included “the man is standing in the reception of a hotel” or “the suitcase is on the floor”.

The order in which a participant’s gaze settled on objects in each scene tended to mirror the order of nouns in the sentence used to describe it. “We were surprised there was such a close correlation,” says Keller. Given that multiple cognitive processes are involved in sentence formation, Coco says “it is remarkable to find evidence of similarity between speech and visual attention”.
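The correspondence the authors describe can be quantified by rank-correlating the fixation order with the mention order, for example with Kendall's tau. The object labels and sequences below are invented for illustration, not the study's data:

```python
# Toy illustration: rank-correlate gaze order with noun mention order.
# Object labels and sequences are invented, not the study's data.

def kendall_tau(order_a, order_b):
    """Kendall rank correlation between two orderings of the same items."""
    items = list(order_a)
    rank_b = {item: i for i, item in enumerate(order_b)}
    concordant = discordant = 0
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            # The pair is concordant if both orderings agree on it.
            if rank_b[items[i]] < rank_b[items[j]]:
                concordant += 1
            else:
                discordant += 1
    total = concordant + discordant
    return (concordant - discordant) / total

gaze_order = ["man", "reception", "suitcase", "floor"]      # fixation sequence
mention_order = ["man", "reception", "floor", "suitcase"]   # noun order in speech
print(kendall_tau(gaze_order, mention_order))  # 1 swap in 6 pairs -> 0.666...
```

A tau near 1 means the speaker mentioned objects in nearly the same order they looked at them, which is the pattern the study reports.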

Word prediction

The team used the discovery to see if they could predict what sentences would be used to describe a scene based on eye movement alone. They developed an algorithm that used the gaze sequences recorded in the previous experiment to pick the correct sentence from a choice of 576 descriptions.
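The article doesn't spell out the algorithm, but the general idea can be sketched: score each candidate description by how well the order of its nouns aligns with the recorded fixation sequence, and pick the best-scoring candidate. Everything below (the candidates, the scoring rule) is a hypothetical simplification, not the authors' method:

```python
# Hypothetical sketch: choose the description whose noun order best
# matches the recorded gaze sequence (a simplification, not the study's method).

def alignment_score(gaze_seq, nouns):
    """Count noun pairs whose order in the sentence matches the gaze order."""
    gaze_rank = {obj: i for i, obj in enumerate(gaze_seq)}
    score = 0
    for i in range(len(nouns)):
        for j in range(i + 1, len(nouns)):
            a, b = nouns[i], nouns[j]
            if a in gaze_rank and b in gaze_rank and gaze_rank[a] < gaze_rank[b]:
                score += 1
    return score

def predict(gaze_seq, candidates):
    """candidates: list of (sentence, noun_list); return the best-matching sentence."""
    return max(candidates, key=lambda c: alignment_score(gaze_seq, c[1]))[0]

gaze = ["man", "reception", "suitcase"]
candidates = [
    ("the suitcase is on the floor", ["suitcase", "floor"]),
    ("the man is standing in the reception", ["man", "reception"]),
]
print(predict(gaze, candidates))  # -> the man is standing in the reception
```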

Changsong Liu of Michigan State University’s Language and Interaction Research lab, in East Lansing, who was not involved in the study, suggests these results could motivate novel designs for human-machine interfaces that take advantage of visual cues to improve speech recognition software.

Gaze information is already used to help with disambiguation. For example, if a speech recognition system can tell that you are looking at a tree, it is less likely to guess that you just said “three”. Sentence prediction is another possible application, perhaps in combination with augmented-reality headsets that track eye movements.
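The “tree”/“three” example amounts to a re-ranking step: boost recognition hypotheses that name the object the speaker is fixating, then renormalise. The vocabulary and probabilities below are made up for illustration:

```python
# Hypothetical sketch of gaze-based disambiguation: reweight acoustically
# similar hypotheses using what the speaker is looking at.

def rerank(hypotheses, gazed_object, boost=3.0):
    """hypotheses: {word: acoustic_probability}. Boost the gazed word, renormalise."""
    weighted = {
        word: p * (boost if word == gazed_object else 1.0)
        for word, p in hypotheses.items()
    }
    total = sum(weighted.values())
    return {word: p / total for word, p in weighted.items()}

# The recognizer finds "three" slightly more likely acoustically,
# but the speaker is looking at a tree.
acoustic = {"three": 0.55, "tree": 0.45}
posterior = rerank(acoustic, gazed_object="tree")
print(max(posterior, key=posterior.get))  # -> tree
```

In a real system the boost factor would be learned rather than fixed, but the principle is the same: visual context shifts probability mass between acoustically confusable words.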

Coco and Keller are now looking into the role of coordinated visual and linguistic processes in conversations between two people. “People engaged in a dialogue use similar syntactic forms, expressions and eye-movements,” says Coco. One hypothesis is that such “coordinative mimicry” might be important for joint decision-making.

Source: NewScientist

Filed under science neuroscience brain psychology eye movements language production speech scene understanding

The Eyes Don’t Have It: New Research Into Lying and Eye Movements

ScienceDaily (July 11, 2012) — Widely held beliefs about Neuro-Linguistic Programming and lying are unfounded.

Twenty portraits of a woman with different expressions. (Credit: © gemenacom / Fotolia)

Proponents of Neuro-Linguistic Programming (NLP) have long claimed that it is possible to tell whether a person is lying from their eye movements.  Research published July 11 in the journal PLoS ONE reveals that this claim is unfounded, with the authors calling on the public and organisations to abandon this approach to lie detection.

For decades many NLP practitioners have claimed that when a person looks up to their right they are likely to be lying, whilst a glance up to their left is indicative of telling the truth.

Professor Richard Wiseman (University of Hertfordshire, UK) and Dr Caroline Watt (University of Edinburgh, UK) tested this idea by filming volunteers as they either lied or told the truth, and then carefully coded their eye movements.  In a second study another group of participants was asked to watch the films and attempt to detect the lies on the basis of the volunteers’ eye movements.

“The results of the first study revealed no relationship between lying and eye movements, and the second showed that telling people about the claims made by NLP practitioners did not improve their lie detection skills,” noted Wiseman.

A final study involved moving out of the laboratory and was conducted in collaboration with Dr Leanne ten Brinke and Professor Stephen Porter from the University of British Columbia, Canada.  The team analysed films of liars and truth tellers from high profile press conferences in which people were appealing for missing relatives or claimed to have been the victim of a crime. 

"Our previous research with these films suggests that there are significant differences in the behaviour of liars and truth tellers," noted Dr Leanne ten Brinke. "However, the alleged tell-tale pattern of eye movements failed to emerge."

“A large percentage of the public believes that certain eye movements are a sign of lying, and this idea is even taught in organisational training courses.  Our research provides no support for the idea and so suggests that it is time to abandon this approach to detecting deceit,” remarked Watt.

Source: Science Daily

Filed under science neuroscience brain psychology eye movements
