Neuroscience

Articles and news from the latest research reports.

Posts tagged eye movements

134 notes

Rats have a double view of the world
Scientists from the Max Planck Institute for Biological Cybernetics in Tübingen, using miniaturised high-speed cameras and high-speed behavioural tracking, discovered that rats move their eyes in opposite directions in both the horizontal and the vertical plane when running around. Each eye moves in a different direction, depending on the change in the animal’s head position. An analysis of both eyes’ field of view found that the eye movements exclude the possibility that rats fuse the visual information into a single image like humans do. Instead, the eyes move in such a way that enables the space above them to be permanently in view – presumably an adaptation to help them deal with the major threat from predatory birds that rodents face in their natural environment.
Like many mammals, rats have their eyes on the sides of their heads. This gives them a very wide visual field, useful for detecting predators. However, three-dimensional vision requires overlap of the two eyes’ visual fields. The visual system of these animals therefore needs to meet two conflicting demands at the same time: maximum surveillance on the one hand, and detailed binocular vision on the other.
The research team from the Max Planck Institute for Biological Cybernetics have now, for the first time, observed and characterised the eye movements of freely moving rats. They fitted minuscule cameras weighing only about one gram to the animals’ heads, which could record the lightning-fast eye movements with great precision. The scientists also used another new method to measure the position and direction of the head, enabling them to reconstruct the rats’ exact line of view at any given time.
The Max Planck scientists’ findings came as a complete surprise. Although rats process visual information from their eyes through very similar brain pathways to other mammals, their eyes evidently move in a totally different way. “Humans move their eyes in a very stereotypical way for both counteracting head movements and searching around. Both our eyes move together and always follow the same object. In rats, on the other hand, the eyes generally move in opposite directions,” explains Jason Kerr from the Max Planck Institute for Biological Cybernetics.
In a series of behavioural experiments, the neurobiologists also discovered that the eye movements largely depend on the position of the animal’s head. “When the head points downward, the eyes move back, away from the tip of the nose. When the rat lifts its head, the eyes look forward: cross-eyed, so to speak. If the animal puts its head on one side, the eye on the lower side moves up and the other eye moves down,” says Jason Kerr.
In humans, the direction in which the eyes look must be precisely aligned; otherwise an object cannot be fixated, and a deviation of less than a single degree of the visual field is enough to cause double vision. In rats, the opposing movements of the left and right eye mean that the lines of vision diverge by as much as 40 degrees in the horizontal plane and up to 60 degrees in the vertical plane. The consequence of these unusual eye movements is that, irrespective of vigorous head movements in all planes, the eyes always move in such a way that the area above the animal remains in view of both eyes simultaneously – something that does not occur in any other region of the rat’s visual field.
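Kerr’s description amounts to a simple mapping from head posture to eye position. The toy model below sketches that mapping in Python; the linear coupling and the 0.5 gain are illustrative assumptions, not values measured in the study, and only the qualitative pattern (opposing movements, bounded by the reported 40- and 60-degree divergences) follows the text.

```python
def rat_eye_positions(head_pitch_deg, head_roll_deg):
    """Toy model of opposing rat eye movements (illustrative only).

    head_pitch_deg: positive = head lifted, negative = head pointing down.
    head_roll_deg: positive = head tilted so the right side is lower.
    Returns (left_eye, right_eye) as (horizontal, vertical) offsets in
    degrees, clipped so divergence stays within the reported ~40 deg
    horizontal / ~60 deg vertical range.
    """
    # Head up -> eyes converge forward ("cross-eyed");
    # head down -> eyes move back, away from the tip of the nose.
    horizontal = max(-20.0, min(20.0, 0.5 * head_pitch_deg))
    left_h, right_h = horizontal, -horizontal  # opposite directions

    # Roll: the eye on the lower side moves up, the other moves down.
    vertical = max(-30.0, min(30.0, 0.5 * head_roll_deg))
    left_v, right_v = -vertical, vertical

    return (left_h, left_v), (right_h, right_v)
```

Feeding in a head-lift of 30 degrees moves the two eyes horizontally in opposite directions; a roll tilting the right side down raises the right eye and lowers the left, matching the pattern Kerr describes.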
These unusual eye movements that rats possess appear to be the visual system’s way of adapting to the animals’ living conditions, given that they are preyed upon by numerous species of birds. Although the observed eye movements prevent the fusion of the two visual fields, the scientists postulate that permanent visibility in the direction of potential airborne attackers dramatically increases the animals’ chances of survival.

Filed under rats eye movements binocular vision double vision visual system neuroscience science

43 notes

New subtype of ataxia identified

The finding opens the door to presymptomatic diagnosis and genetic counselling for patients, and is a first step towards identifying the cause of the disease and developing therapies

(Image: Antony Gormley)

Researchers from the Germans Trias i Pujol Health Sciences Research Institute Foundation (IGTP), the Bellvitge Biomedical Research Institute (IDIBELL) and the Sant Joan de Déu de Martorell Hospital have identified a new subtype of ataxia, a rare disease with no treatment that causes atrophy of the cerebellum and affects around 1.5 million people worldwide. The results were published online on April 29 in the journal JAMA Neurology.

Ataxia is caused by diverse genetic alterations, and for this reason it is classified into subtypes. The new subtype described by the researchers has been named SCA37. The study found this subtype in members of the same family living in Barcelona, Huelva, Madrid and Salamanca (Spain). In the medium term, the finding will allow these families, and anyone else carrying the identified genetic alteration, to receive personalised therapies and a diagnosis before the disease develops. The study was funded by the 2009 edition of La Marató de TV3 (the Catalan public television telethon), which was dedicated to rare diseases.

The cerebellum is a part of the brain, located at the back of the brain, that among other functions coordinates the movements of the body. When it atrophies, movement disorders appear; as the ataxia progresses, patients suffer frequent falls and swallowing problems, and eventually end up needing a wheelchair. To date, more than 30 different subtypes of ataxia have been identified, the first of which was described in 1993 by Dr. Antoni Matilla, head of the Neurogenetics Unit at IGTP, and Dr. Victor Volpini, head of the Center for Molecular Genetic Diagnosis at IDIBELL.

The publication of this paper was made possible by the collaboration of the Hospital de Sant Pau, Universitat Pompeu Fabra and the Pitié-Salpêtrière Hospital in Paris.

Particular eye movements

The first symptoms of ataxia may develop in childhood or adulthood, depending on the subtype. The SCA37 subtype, the first cases of which were identified by Carme Serrano, a neurologist at the Sant Joan de Déu Hospital in Martorell (Barcelona), appears at an average age of 48. One of its distinguishing features is difficulty with vertical eye movements. Besides the patients identified in Spain by Dr. Serrano and the Germans Trias and IDIBELL researchers, there is evidence of more people affected by this subtype of ataxia in France, the Netherlands and Britain, suggesting that it is a fairly prevalent subtype in Europe.

All SCA37 patients share a common genetic alteration in region 32 of the short arm of chromosome 1, a stretch that contains around a hundred genes. Researchers are currently sequencing this region with next-generation technologies to find the specific mutation that causes the ataxia. Once it is found, it will be possible to make an accurate diagnosis in family members who have not yet developed symptoms, and to investigate the biological mechanisms that cause the disease in order to develop and apply personalised therapies, whether with drugs or stem cell therapy.

(Source: eurekalert.org)

Filed under ataxia cerebellum genetic alteration SCA37 subtype eye movements neuroscience science

114 notes

A Chimp’s Point Of View: Goggles simultaneously monitor a chimpanzee’s eyes and field of view
Chimps with camera goggles on their heads are helping scientists learn how the apes literally see the world.
From a scientific perspective, the eyes are windows to the mind. What people watch is one key sign of what they might be thinking, so monitoring their gazes can help researchers learn about what is going on inside people’s heads.
Scientists have conducted eye-tracking studies on people for more than 100 years. However, comparably little work has been conducted with other primates. Such work promises to shed light on humanity’s closest living relatives, and how they might perceive the world differently.
"If we know the differences between chimpanzees and humans, we will have an insight into how human perception has evolved," said comparative psychologist Fumihiro Kano at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany.
Until recently, eye-tracking research involved desk-sized machines confined to labs. Investigators now have access to portable, wearable eye-trackers, enabling scientists to learn how people look at and interact with the world in a more natural way. This enables them to research topics such as how experts look at the world differently from novices. Now Kano and his colleagues are using these devices to study chimps.
"Everybody wants to see the world through chimpanzee eyes, right?" Kano said. "That’s one of my childhood dreams. How do chimpanzees, the closest relatives of humans, see the world?"
The researchers placed lightweight goggles on a 27-year-old female chimpanzee named Pan, with one camera monitoring her right eye and another aimed at her field of view; both sent data to a portable recorder. The mobile setup allowed the chimp to move and behave freely.
"We modified the eye-tracker goggle shape so that the chimpanzee could wear it and like it," Kano said. "If the chimpanzee felt uncomfortable wearing the goggles, she wouldn’t care about throwing it away!"
While Pan wore the eye-trackers, the scientists ran a two-minute gestural task with her, one she had practiced for several years. The researchers performed one of three gestures — touching their noses, touching their palms, or clapping their hands — and gave Pan pieces of apple from a transparent box as a reward whenever she copied the gesture. The goggles also captured the greetings Pan often gave people before tasks, such as pant-grunting or swaying.
"No researcher has been successful in recording the natural gaze of chimpanzees before," Kano said.
The researchers found out how Pan looked at the world differently depending on what she was doing. For instance, when greeting experimenters, the chimpanzee focused on their faces and feet — the latter presumably to see where they were going — but during the gestural task, she gazed at the experimenters’ faces and hands. In addition, while Pan mostly ignored the fruit reward before the gestural task, she looked at it 30 times more during the task. Kano indicated that this focus on the fruit reveals that Pan was thinking ahead to anticipate the future.
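Analyses like this typically reduce the frame-by-frame gaze record to the fraction of time spent on each region of interest. A minimal sketch of that bookkeeping, using hypothetical labels rather than Pan’s actual data:

```python
from collections import Counter

def gaze_proportions(samples):
    """Fraction of gaze samples falling on each labelled region.

    samples: sequence of region labels, one per video frame,
    e.g. "face", "hands", "fruit", "other".
    """
    counts = Counter(samples)
    total = len(samples)
    return {region: n / total for region, n in counts.items()}

# Hypothetical frame-by-frame labels during a gestural task:
task_frames = ["face"] * 50 + ["hands"] * 30 + ["fruit"] * 15 + ["other"] * 5
print(gaze_proportions(task_frames))
# {'face': 0.5, 'hands': 0.3, 'fruit': 0.15, 'other': 0.05}
```

Comparing such proportions between conditions (greeting vs. task, before vs. during) is what lets researchers say, for example, that Pan looked at the fruit 30 times more during the task than before it.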
"This work builds toward an understanding not just of how chimpanzees learn about the world, but how they want to influence it," said neuroethologist Stephen Shepherd at Rockefeller University in New York, who did not take part in this research. "We can use gaze as a readout of what chimpanzees think is important to attend and affect."
Moreover, past research with desk-mounted eye-trackers had hinted that chimps did not look at familiar faces any longer than unfamiliar ones, but these new findings suggest otherwise — Pan looked at unfamiliar experimenters longer than familiar ones.
The researchers think one reason for the difference may be that the previous studies used pictures of faces, shown for a shorter amount of time. In the new experiment, Pan also looked at familiar people longer when they were not in rooms where she was accustomed to seeing them.
The researchers plan on testing more chimpanzees with these wearable eye-trackers. They also want to compare the apes with people and other primates.
"It will be very interesting to see how humans, chimpanzees and other primates use gaze while performing the same real-world tasks," Shepherd said. "I would love to know if chimpanzees are intermediate between humans and monkeys, or if they’re just like humans."
In addition, future research will analyze how chimpanzees predict the actions of people and other chimpanzees. How the apes predict the actions of others in real-time, “that is, within a fraction of a second, is largely unknown,” Kano said.
Kano and his colleague Masaki Tomonaga detailed their findings online March 27 in the journal PLOS ONE.

Filed under primates eye-tracking eye movements visual patterns neuroscience science

167 notes

Researchers identify new vision of how we explore our world
Brain researchers at Barrow Neurological Institute have discovered that we explore the world with our eyes in a different way than previously thought. Their results advance our understanding of how healthy observers and neurological patients interact and glean critical information from the world around them.
The research team was led by Dr. Susana Martinez-Conde, Director of the Laboratory of Visual Neuroscience at Barrow, in collaboration with fellow Barrow Neurological Institute researchers Jorge Otero-Millan, Rachel Langston, and Dr. Stephen Macknik, Director of the Laboratory of Behavioral Neurophysiology. The study, titled “An oculomotor continuum from exploration to fixation”, was published in the Proceedings of the National Academy of Sciences.
Previously, scientists thought that we sample visual information from the world in two main different modes: exploration and fixation. “We used to think that we make large eye movements to search for objects of interest, and then fix our gaze to see them with high detail,” says Martinez-Conde. “But now we know that’s not quite right.”
The discovery shows that even during visual fixation, we are actually scanning visual details with small eye movements — just like we explore visual scenes with big eye movements, but on a smaller scale. This means that exploration and fixation are two ends of the same continuum of oculomotor scanning.
Subjects viewed natural images while the team measured their eye movements with high-speed eye tracking. The images ranged in size from massive ones, presented on a room-sized video monitor in the Barrow Neurological Institute’s Eller Telepresence Room (normally used by Barrow’s surgeons to collaborate on brain surgeries with colleagues around the world), to images just half the width of a thumbnail.
In all cases, the researchers found that subjects’ eyes scanned the scenes with the same general strategy, along a smooth continuum of dynamical changes. “There was no abrupt change in the characteristics of the eye movements, whether the visual scenes were huge or tiny, or even when the subjects were fixing their gaze. That means that the brain controls eye movements in the same way when we explore and when we fixate,” said Dr. Martinez-Conde.
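One way to probe such a continuum is to compare the distribution of gaze step sizes across image scales: if exploration and fixation lie on one continuum, the distributions should differ in scale but not in shape. A small self-contained sketch of that comparison (the traces are synthetic, and this is not the authors’ analysis code):

```python
def step_amplitudes(gaze_x, gaze_y):
    """Euclidean distances between successive gaze positions."""
    return [((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
            for (x1, y1), (x2, y2) in zip(zip(gaze_x, gaze_y),
                                          zip(gaze_x[1:], gaze_y[1:]))]

def median(values):
    """Median of a list, without external dependencies."""
    s = sorted(values)
    mid = len(s) // 2
    return s[mid] if len(s) % 2 else 0.5 * (s[mid - 1] + s[mid])

# Synthetic gaze traces over a small and a 10x-larger image:
small = step_amplitudes([0, 1, 3, 6, 10], [0] * 5)
large = step_amplitudes([0, 10, 30, 60, 100], [0] * 5)
# Same shape of scanning, scaled by the image size:
assert median(large) == 10 * median(small)
```

A continuum would show up as step-size distributions that slide smoothly with image size, with no abrupt break between a "fixation" regime and an "exploration" regime.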
Scientists have studied how the brain controls eye movements for over 100 years, and the idea, challenged here, that fixation and exploration are fundamentally different behaviors has been central to the field. This new perspective will affect future research and bring focus to the study of neurological diseases that impact oculomotor behavior.
(Image: Getty Images)

Filed under vision visual system visual fixation visual exploration eye movements neuroscience science

97 notes

Is it a Stroke or Benign Dizziness? A Simple Bedside Test Can Tell
A bedside electronic device that measures eye movements can successfully determine whether the cause of severe, continuous, disabling dizziness is a stroke or something benign, according to results of a small study led by Johns Hopkins Medicine researchers.
"Using this device can directly predict who has had a stroke and who has not," says David Newman-Toker, M.D., Ph.D., an associate professor of neurology and otolaryngology at the Johns Hopkins University School of Medicine and leader of the study described in the journal Stroke. “We’re spending hundreds of millions of dollars a year on expensive stroke work-ups that are unnecessary, and probably missing the chance to save tens of thousands of lives because we aren’t properly diagnosing their dizziness or vertigo as stroke symptoms.”
Newman-Toker says if additional larger studies confirm these results, the device could one day be the equivalent of an electrocardiogram (EKG), a simple noninvasive test routinely used to rule out heart attack in patients with chest pain. And, he adds, universal use of the device could “virtually eliminate deaths from misdiagnosis and save a lot of time and money.”
To distinguish stroke from a more benign condition, such as vertigo linked to an inner ear disturbance, specialists typically use three eye movement tests that are essentially a stress test for the balance system. In the hands of specialists, these bedside clinical tests (without the device) have been shown in several large research studies to be extremely accurate — “nearly perfect, and even better than immediate MRI,” says Newman-Toker. One of those tests, known as the horizontal head impulse test, is the best predictor of stroke. To perform it, doctors or technicians ask patients to look at a target on the wall and keep their eyes on the target as doctors move the patients’ heads from side to side. But, says Newman-Toker, it requires expertise to determine whether a patient is making the fast corrective eye adjustments that would indicate a benign form of dizziness as opposed to a stroke.
For the new study, researchers instead performed the same test using a small, portable device — a video-oculography machine that detects minute eye movements that are difficult for most physicians to notice. The machine includes a set of goggles, akin to swimming goggles, with a USB-connected webcam and an accelerometer in the frame. The webcam is hooked up to a laptop where a continuous picture of the eye is taken. Software interprets eye position based on movements and views of the pupil, while the accelerometer measures the speed of the movement of the head.
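In essence, the software must compare eye velocity against head velocity during the impulse, the gain of the vestibulo-ocular reflex: a healthy reflex counter-rotates the eyes at nearly the head’s speed. The sketch below illustrates that computation; it is not the device’s actual algorithm, and the 0.7 threshold is an assumed value for the example.

```python
def vor_gain(eye_velocity, head_velocity):
    """Ratio of peak eye speed to peak head speed during a head impulse.

    eye_velocity, head_velocity: per-frame angular velocities (deg/s),
    from the eye camera and the accelerometer respectively. A healthy
    vestibulo-ocular reflex gives a gain near 1.0.
    """
    peak_head = max(abs(v) for v in head_velocity)
    peak_eye = max(abs(v) for v in eye_velocity)
    return peak_eye / peak_head

def impulse_suggests_peripheral(gain, threshold=0.7):
    """Low gain (the eyes lag the head, forcing catch-up saccades)
    points to an inner-ear, i.e. benign, cause; a normal gain in a
    dizzy patient is the worrying, stroke-like result. Illustrative
    threshold only."""
    return gain < threshold
```

On this toy logic, an impulse where the eyes reach only 40% of head speed would be flagged as peripheral (benign), while eyes that keep pace with the head would not, mirroring the clinical interpretation of the head impulse test described above.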
Newman-Toker says the test could be easily employed to prevent misdiagnosis of as many as 100,000 strokes a year, leading to earlier stroke diagnosis and more efficient triage and treatment decisions for patients with disabling dizziness. Overlooked strokes mean delayed or missed treatments that lead to roughly 20,000 to 30,000 preventable deaths or disabilities a year, he says. The technology, he adds, could someday be used in a smartphone application to enable wider access to a quick and accurate diagnosis of strokes whose main symptom is dizziness, as opposed to one-sided weakness or garbled speech.
The diagnosis of stroke in patients with severe dizziness, vomiting, difficulty walking and intolerance to head motion is difficult, Newman-Toker says. He estimates there are 4 million emergency department visits annually in the United States for dizziness or vertigo, at least half a million of which involve patients at high risk for stroke. The most common causes are benign inner ear conditions, but many emergency room doctors, Newman-Toker says, find it nearly impossible to tell the difference between the benign conditions and something more serious, such as a stroke. So they often rely on brain imaging, usually a CT scan, an expensive and inaccurate technology for this particular diagnosis.
The Hopkins-led study enrolled 12 patients at The Johns Hopkins Hospital and the University of Illinois College of Medicine at Peoria, who later underwent confirmatory MRI. Six were diagnosed with stroke and six with a benign condition using video-oculography. MRI later confirmed all 12 diagnoses.

Is it a Stroke or Benign Dizziness? A Simple Bedside Test Can Tell

A bedside electronic device that measures eye movements can successfully determine whether the cause of severe, continuous, disabling dizziness is a stroke or something benign, according to results of a small study led by Johns Hopkins Medicine researchers.

"Using this device can directly predict who has had a stroke and who has not," says David Newman-Toker, M.D., Ph.D., an associate professor of neurology and otolaryngology at the Johns Hopkins University School of Medicine and leader of the study described in the journal Stroke. “We’re spending hundreds of millions of dollars a year on expensive stroke work-ups that are unnecessary, and probably missing the chance to save tens of thousands of lives because we aren’t properly diagnosing their dizziness or vertigo as stroke symptoms.”

Newman-Toker says if additional larger studies confirm these results, the device could one day be the equivalent of an electrocardiogram (EKG), a simple noninvasive test routinely used to rule out heart attack in patients with chest pain. And, he adds, universal use of the device could “virtually eliminate deaths from misdiagnosis and save a lot of time and money.”

To distinguish stroke from a more benign condition, such as vertigo linked to an inner ear disturbance, specialists typically use three eye movement tests that are essentially a stress test for the balance system. In the hands of specialists, these bedside clinical tests (without the device) have been shown in several large research studies to be extremely accurate — “nearly perfect, and even better than immediate MRI,” says Newman-Toker. One of those tests, known as the horizontal head impulse test, is the best predictor of stroke. To perform it, doctors or technicians ask patients to look at a target on the wall and keep their eyes on the target as doctors move the patients’ heads from side to side. But, says Newman-Toker, it requires expertise to determine whether a patient is making the fast corrective eye adjustments that would indicate a benign form of dizziness as opposed to a stroke.

For the new study, researchers instead performed the same test using a small, portable device — a video-oculography machine that detects minute eye movements that are difficult for most physicians to notice. The machine includes a set of goggles, akin to swimming goggles, with a USB-connected webcam and an accelerometer in the frame. The webcam is hooked up to a laptop where a continuous picture of the eye is taken. Software interprets eye position based on movements and views of the pupil, while the accelerometer measures the speed of the movement of the head.
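The quantity such a device extracts can be illustrated with a toy calculation. A standard summary measure for the head impulse test is vestibulo-ocular reflex (VOR) gain — peak eye velocity divided by peak head velocity. The sketch below (Python, with invented numbers; not the device's actual software) shows the idea: when the reflex is intact the eyes counter-rotate almost exactly as fast as the head moves, while an impaired reflex produces a low gain and forces the catch-up saccades clinicians look for.

```python
# Illustrative sketch (synthetic data, not the device's software):
# estimating vestibulo-ocular reflex (VOR) gain from eye- and
# head-velocity traces recorded during a horizontal head impulse test.
# Gain near 1.0 = eyes fully counter-rotate (reflex intact);
# low gain = weak reflex, requiring corrective catch-up saccades.

def vor_gain(eye_velocity, head_velocity):
    """Peak eye speed divided by peak head speed (deg/s samples)."""
    peak_eye = max(abs(v) for v in eye_velocity)
    peak_head = max(abs(v) for v in head_velocity)
    return peak_eye / peak_head

# Synthetic example: head rotates at up to 200 deg/s.
head = [0, 50, 120, 200, 120, 50, 0]
intact_eye = [0, -48, -115, -192, -116, -49, 0]   # eyes counter-rotate fully
impaired_eye = [0, -20, -45, -80, -50, -20, 0]    # weak reflex -> low gain

print(round(vor_gain(intact_eye, head), 2))    # → 0.96
print(round(vor_gain(impaired_eye, head), 2))  # → 0.4
```

In practice the device samples eye position at high frequency and must also detect covert catch-up saccades, but the gain computation captures the core quantity being measured.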

Newman-Toker says the test could be easily employed to prevent misdiagnosis of as many as 100,000 strokes a year, leading to earlier stroke diagnosis and more efficient triage and treatment decisions for patients with disabling dizziness. Overlooked strokes mean delayed or missed treatments that lead to roughly 20,000 to 30,000 preventable deaths or disabilities a year, he says. The technology, he adds, could someday be used in a smartphone application to enable wider access to a quick and accurate diagnosis of strokes whose main symptom is dizziness, as opposed to one-sided weakness or garbled speech.

The diagnosis of stroke in patients with severe dizziness, vomiting, difficulty walking and intolerance to head motion is difficult, Newman-Toker says. He estimates there are 4 million emergency department visits annually in the United States for dizziness or vertigo, at least half a million of which involve patients at high risk for stroke. The most common causes are benign inner ear conditions, but many emergency room doctors, Newman-Toker says, find it nearly impossible to tell the difference between the benign conditions and something more serious, such as a stroke. So they often rely on brain imaging, usually a CT scan, an expensive and inaccurate technology for this particular diagnosis.

The Hopkins-led study enrolled 12 patients at The Johns Hopkins Hospital and the University of Illinois College of Medicine at Peoria. Using video-oculography, six were diagnosed with stroke and six with a benign condition; MRI later confirmed all 12 diagnoses.

Filed under brain stroke benign dizziness eye movements electronic device medicine science

186 notes

Study reveals potential target to better treat, cure anxiety disorders

Researchers at Boston University School of Medicine (BUSM) have, for the first time, identified a specific group of cells in the brainstem whose activation during rapid eye movement (REM) sleep is critical for the regulation of emotional memory processing. The findings, published in the Journal of Neuroscience, could help lead to the development of effective behavioral and pharmacological therapies to treat anxiety disorders, such as post-traumatic stress disorder, phobias and panic attacks.

There are two main stages of sleep – REM and non-REM – and both are necessary to maintain health and to regulate multiple memory systems, including emotional memory. During non-REM sleep, the body repairs tissue, regenerates cells and improves the function of the body’s immune system. During REM sleep, the brain becomes more active and the muscles of the body become paralyzed. Dreaming generally occurs during REM sleep, along with physiological events including saccadic eye movements and rapid fluctuations of respiration, heart rate and body temperature. One particular physiological event, a hallmark sign of REM sleep, is the appearance of phasic pontine waves (P-waves). The P-wave is a unique brain wave generated by the activation of a group of glutamatergic cells in a specific region within the brainstem called the pons.

Memories of fearful experiences can lead to enduring alterations in emotion and behavior and sleep plays a natural emotional regulatory role after stressful and traumatic events. Persistence of sleep disturbances, particularly of REM sleep, is predictive of developing symptoms of anxiety disorders. A core symptom of these disorders frequently reported by patients is the persistence of fear-provoking memories that they are unable to extinguish. Presently, exposure therapy, which involves controlled re-exposure to the original fearful experience, is considered one of the most effective evidence-based treatments for anxiety disorders. Exposure therapy produces a new memory, called an extinction memory, to coexist and compete with the fearful memory when the fearful cue/context is re-encountered.

The strength of the extinction memory determines the efficacy of exposure therapy. A demonstrated prerequisite for the successful development of an extinction memory is adequate sleep, particularly REM sleep, after exposure therapy. However, adequate or increased sleep alone does not universally guarantee its therapeutic efficacy.

"Given the inconsistency and unpredictability of exposure therapy, we are working to identify which process(es) during REM sleep dictate the success or failure of exposure therapy," said Subimal Datta, PhD, director and principal investigator at the Laboratory of Sleep and Cognitive Neuroscience at BUSM who served as the study’s lead author.

The researchers used contextual fear extinction training, which works to turn off the conditioned fear, to study which brain mechanisms play a role in the success of exposure therapy. The results showed that fear extinction training increased REM sleep. Surprisingly, however, only 57 percent of subjects retained the fear extinction memory after 24 hours, meaning that they no longer experienced the fear; these subjects showed a tremendous increase in phasic P-wave activity. In the remaining 43 percent, the P-wave activity was absent and the subjects failed to retain the fear extinction memory, re-experiencing the fear.

"The study results provide direct evidence that the activation of phasic P-wave activity within the brainstem, in conjunction with exposure therapy, is critical for the development of long-term retention of fear extinction memory," said Datta, who also is a professor of psychiatry and neurology at BUSM. In addition, the study indicates the important role that the brainstem plays in regulating emotional memory.

Future research will explore how to activate this mechanism in order to help facilitate the development of new potential pharmacological treatments that will complement exposure therapy to better treat anxiety and other psychological disorders.

According to the National Institute of Mental Health, anxiety disorders affect approximately 40 million American adults each year. While anxiety can sometimes be a normal and beneficial reaction to stress, some people experience excessive anxiety that they are unable to control, which can negatively impact their day-to-day lives.

(Source: eurekalert.org)

Filed under anxiety memory eye movements saccadic eye movements brainwaves sleep fear extinction neuroscience science

38 notes

More Than Just Looking – A Role of Tiny Eye Movements Explained

Tübingen researcher learns how the brain keeps an eye on the periphery even when focusing on one object.

Have you ever wondered whether it’s possible to look at two places at once? Because our eyes have a specialized central region with high visual acuity and good color vision, we must always focus on one spot at a time in order to see our environment. As a result, our eyes constantly jump back and forth as we look around.

But what if – when you are looking at an object – your brain also allowed you to “look” somewhere else at the same time, out of the corner of your eye, as it were? Now, a scientist at the Werner Reichardt Centre for Integrative Neuroscience (CIN), which is funded by the German Excellence initiative at Tübingen University, has found a possible explanation for how this might happen.

Ziad Hafed, the leader of the Physiology of Active Vision Junior Research Group at CIN, wondered about the role of a type of tiny microscopic eye movement that occurs when we fix our gaze on something, called a microsaccade. “Microsaccades are sort of enigmatic,” Hafed says. They are movements of the eye which occur at exactly the moment when we are trying to look at something steadily – i.e., when we are trying to prevent our eyes from moving.

It was long thought that microsaccades were nothing but random, inconsequential tics, but Hafed wondered whether the mere unconscious preparation to generate these tiny eye movements can alter visual perception and effectively allow you to “see” out of the corner of your eye. He found that before generating a microsaccade, the brain reorganizes its visual processing to alter how you perceive things. “Imagine that you are the coach of a football team,” Hafed says. “You would normally ask your defenders to spread out across the field in order to provide good coverage during match play. However, in preparation for an upcoming corner kick by your opposing team, you would reorganize your defenders, assigning two of them to become temporary goalkeepers and protect the goal. What I found was evidence for a similar strategy in the visual brain before microsaccades,” says Hafed. That is, in preparation for generating a tiny microscopic eye movement, the brain – the “coach” – causes a subtle reorganization of the visual system, and thus alters how you might see out of the corner of your eyes (see diagram).

Using a series of experiments on human participants, coupled with computational modeling of the human visual system, Hafed asked participants to fix their attention on a spot that appeared on a screen in front of them, while he carefully measured their tiny microscopic eye movements. Hafed then probed the participants’ ability to look at two places at once by testing their peripheral vision. He found that in preparation to generate a tiny microsaccade, the participants demonstrated remarkable changes in their ability to process visual inputs. In the periphery, tiny microscopic eye movements effectively improved the capacity to direct visual input – from around where gaze is fixed – towards the brain. Hafed’s results, which are described in the leading science journal Neuron, thus demonstrate an important functional role for these tiny, microscopic, and “enigmatic” movements of the eye in helping us to perceive our environment.

Hafed’s results not only help us understand a previously puzzling phenomenon; there are also potentially wide-ranging applications arising from this work. In particular, this work can affect how we design computer and machine user interfaces. For example, using knowledge about the whole range of eye movements we constantly make, including microscopic ones, our future “smart user interfaces” can ensure that things likely to attract our attention are not displayed in places where they can be distracting. Conversely, if we need to locate something that should attract our attention – a warning light in a control room, for instance – this same approach will also be useful. As Hafed put it, “eye movements would essentially be a window on our minds.”

Filed under visual perception microsaccades eye movements peripheral vision neuroscience science

91 notes

Eye movements reveal impaired reading in schizophrenia

A study of eye movements in schizophrenia patients provides new evidence of impaired reading fluency in individuals with the mental illness.

The findings, by researchers at McGill University in Montreal, could open avenues to earlier detection and intervention for people with the illness.

While schizophrenia patients are known to have abnormalities in language and in eye movements, until recently reading ability was believed to be unaffected. That is because most previous studies examined reading in schizophrenia using single-word reading tests, the McGill researchers conclude. Such tests aren’t sensitive to problems in reading fluency, which is affected by the context in which words appear and by eye movements that shift attention from one word to the next.

The McGill study, led by Ph.D. candidate Veronica Whitford and psychology professors Debra Titone and Gillian A. O’Driscoll, monitored how people move their eyes as they read simple sentences. The results, which were first published online last year, appear in the February issue of the Journal of Experimental Psychology: General.

Eye movement measures provide clear and objective indicators of how hard people are working as they read. For example, when struggling with a difficult sentence, people generally make smaller eye movements, spend more time looking at each word, and spend more time re-reading words. They also have more difficulty attending to upcoming words, so they plan their eye movements less efficiently.
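The measures described above can be made concrete with a small sketch (Python, with an invented fixation format; this is not the study's analysis code). Given a reading trace as a sequence of (word index, duration) fixations, mean fixation duration, regression count, and mean saccade size fall out directly:

```python
# Illustrative sketch (hypothetical data format, not the McGill study's
# code): deriving the reading-fluency measures described above from a
# list of fixations. Each fixation is (word_index, duration_ms); a step
# backwards in word_index counts as a regression (re-reading).

def reading_measures(fixations):
    durations = [d for _, d in fixations]
    pairs = list(zip(fixations, fixations[1:]))
    regressions = sum(1 for (w0, _), (w1, _) in pairs if w1 < w0)
    saccade_sizes = [abs(w1 - w0) for (w0, _), (w1, _) in pairs]
    return {
        "mean_fixation_ms": sum(durations) / len(durations),
        "regression_count": regressions,
        "mean_saccade_words": sum(saccade_sizes) / len(saccade_sizes),
    }

# A fluent reader: mostly forward jumps, short fixations, one regression.
trace = [(0, 210), (2, 195), (4, 220), (3, 250), (5, 205)]
print(reading_measures(trace))
```

On this pattern, the deficits reported in the study would appear as longer mean fixations, more regressions, and smaller mean saccades in the patient group.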

The McGill study, which involved 20 schizophrenia outpatients and 16 non-psychiatric participants, showed that reading patterns in people with schizophrenia differed in several important ways from healthy participants matched for gender, age, and family social status. People with schizophrenia read more slowly, generated smaller eye movements, spent more time processing individual words, and spent more time re-reading. In addition, people with schizophrenia were less efficient at processing upcoming words to facilitate reading.

The researchers evaluated factors that could contribute to the problems in reading fluency among the schizophrenia outpatients – specifically, their ability to parse words into sound components and their ability to skillfully control eye movements in non-reading contexts. Both factors were found to contribute to the reading deficits.

Filed under eye movements visual attention schizophrenia neuroscience medicine science

35 notes

Why Do Age-Related Macular Degeneration Patients Have Trouble Recognizing Faces?

Abnormalities of eye movement and fixation may contribute to difficulty in perceiving and recognizing faces among older adults with age-related macular degeneration (AMD), suggests a study “Abnormal Fixation in Individuals with AMD when Viewing an Image of a Face” appearing in the January issue of Optometry and Vision Science, official journal of the American Academy of Optometry. The journal is published by Lippincott Williams & Wilkins, a part of Wolters Kluwer Health.

Unlike people with normal vision, those with AMD don’t focus on “internal features” (the eyes, nose and mouth) when looking at the image of a face, according to the study by William Seiple, PhD, and colleagues of Lighthouse International, New York.

When Viewing Famous Face, AMD Patients Focus on External Features

The researchers used a sophisticated technique called optical coherence tomography/scanning laser ophthalmoscopy (OCT-SLO) to examine the interior of the eye in nine patients with AMD. Age-related macular degeneration is the leading cause of vision loss in older adults. It causes gradual destruction of the macula, leading to blurring and loss of central vision.

Previous studies have suggested that people with AMD have difficulty perceiving faces. To evaluate the possible role of abnormal eye movements, Dr Seiple and colleagues used the OCT-SLO equipment to make microscopic movies of the interior of the eye (fundus, including the retina and macula) as the patients viewed one of the world’s most famous faces: the Mona Lisa.

This technique allowed the researchers to record eye movements and where the patients looked (fixations) while looking at the face. They compared the findings in AMD patients to a control group of subjects with normal vision.

The results showed significant differences in eye movement patterns and fixations between groups. The AMD patients had fewer fixations on the internal features of the Mona Lisa’s face—eyes, nose, and mouth. For controls, an average of 87 percent of fixations were on internal features, compared to only 13 percent on external features. In contrast, for AMD patients, 62 percent of fixations were on internal features while 38 percent were on external features.
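The internal/external split can be illustrated with a toy computation (Python, with hypothetical pixel coordinates; the actual study recorded fixations via retinal imaging, not this simplification). Each fixation is classified by whether it lands inside a bounding box covering the eyes, nose, and mouth:

```python
# Illustrative sketch (invented coordinates, not the study's software):
# classifying fixations as "internal" (eyes, nose, mouth) or "external"
# by testing whether each gaze point falls inside a bounding box drawn
# around the internal features of the face image.

def fixation_split(fixations, internal_box):
    """internal_box = (x_min, y_min, x_max, y_max) in image pixels.
    Returns (percent internal, percent external)."""
    x0, y0, x1, y1 = internal_box
    inside = sum(1 for x, y in fixations if x0 <= x <= x1 and y0 <= y <= y1)
    pct_internal = 100.0 * inside / len(fixations)
    return pct_internal, 100.0 - pct_internal

# Synthetic gaze data over a 400x500 face image; internal features
# assumed to lie in the central box (120, 150, 280, 350).
gaze = [(200, 250), (210, 260), (150, 200), (50, 400), (300, 100),
        (220, 300), (230, 180), (90, 450)]
pct_in, pct_out = fixation_split(gaze, (120, 150, 280, 350))
print(pct_in, pct_out)  # → 62.5 37.5
```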

The normal controls also tended to make fewer and shorter eye movements (called saccades) than AMD patients. The differences between groups did not appear to be related to the blurring of vision associated with AMD.

Some older adults with AMD report difficulties perceiving faces. While the problem in “processing faces” is certainly related to the overall sensory visual loss, the new evidence suggests that specific patterns of eye movement abnormalities may also play a role.

Dr Seiple and colleagues note that “abnormal scanning patterns when viewing faces” have also been found in other conditions associated with difficulties in face perception, including autism, social phobias, and schizophrenia. The authors discuss the possible mechanisms of the abnormal scanning patterns in AMD, involving the complex interplay between the eyes and brain in governing eye movement and interpreting visual information.

A previous study suggested that drawing attention to specific characteristics—such as the internal facial features—may increase fixations on internal features and improve face perception. Dr Seiple and coauthors conclude, “That report gives hope that eye movement control training and training of allocation of attention could improve face perception and eye scanning behavior in individuals with AMD.”

Filed under macular degeneration eye movements face recognition AMD vision aging optical coherence tomography science

40 notes

Video-based Test to Study Language Development in Toddlers and Children with Autism

Parents often wonder how much of the world their young children really understand. Though typically developing children are not able to speak or point to objects on command until they are between eighteen months and two years old, they do provide clues that they understand language as early as the age of one. These clues provide a point of measurement for psychologists interested in language comprehension of toddlers and young children with autism, as demonstrated in a new video-article published in JoVE (Journal of Visualized Experiments).

In the assessment, psychologists track a child’s eye movements while they are watching two side by side videos. Children who understand language are more likely to look at the video that the audio corresponds to. This way, language comprehension is tested by attention, not by asking the child to respond or point something out. Furthermore, all assessments can be conducted in the child’s home, using mobile, commercially available equipment. The technique was developed in the laboratory of Dr. Letitia Naigles, and is known as a portable intermodal preferential looking assessment (IPL).
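The scoring logic of such a trial can be sketched in a few lines (Python, with a hypothetical coding scheme; this is not the published IPL protocol). Each gaze sample is coded as looking at the matching video, the non-matching video, or away; comprehension is inferred when the proportion of looking time on the matching side clearly exceeds chance (0.5):

```python
# Illustrative sketch (invented coding scheme, not the published IPL
# protocol): scoring one preferential-looking trial. Samples are coded
# "match" (looking at the video the audio describes), "nonmatch", or
# "away"; only on-screen samples count toward the proportion.

def match_proportion(samples):
    looking = [s for s in samples if s != "away"]
    if not looking:
        return None  # child never looked at either screen
    return sum(1 for s in looking if s == "match") / len(looking)

# A trial where the child mostly watches the matching video.
trial = ["match"] * 30 + ["nonmatch"] * 10 + ["away"] * 5
print(match_proportion(trial))  # → 0.75
```

A proportion reliably above 0.5 across trials is the attentional signature of comprehension that the assessment relies on.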

"When I started working with children with autism, I realized that they have similar issues with strangers that very young typical children do," Dr. Naigles tells us. "Children with autism may understand more than they can show because they are not socially inclined and find social interaction aversive and challenging." Dr. Naigles’ approach helps make this assessment more valuable. By testing the child in the home, where they are comfortable, Dr. Naigles removes much of the anxiety associated with a new environment that may skew results.

While this technique identifies some similarities between typically developing toddlers and children with autism spectrum disorder, such as understanding some types of sentences before they produce them, this does not mean that these children are the same. “Some strategies of word learning that typical children have acquired are not demonstrated in children with autism,” Dr. Naigles says. By illuminating both strengths and weaknesses, the test is valuable for assessing language development. “JoVE is useful because in the past, I have gone to visit various labs to coach them in putting together an IPL. JoVE will enable other labs to set up the procedure more efficiently.” JoVE associate editor Allison Diamond stated, “Showing this work in a video format will allow other scientists in the field to quickly adapt Dr. Naigles’ technique, and use it to address the question of language development in autism, an extremely important field of research.”

Filed under autism language language development eye movements language comprehension psychology neuroscience science
