Neuroscience

Articles and news from the latest research reports.

Posts tagged visual impairment

146 notes

'Seeing' through Virtual Touch Is Believing

A University of Cincinnati experiment aimed at this diverse and growing population could spark development of advanced tools to help all the aging baby boomers, injured veterans, diabetics and white-cane-wielding pedestrians navigate the blurred edges of everyday life.

These tools could be based on a device called the Enactive Torch, which looks like a cross between a TV remote and Captain Kirk's weapon of choice. But it can do much greater things than change channels or stun aliens.

Luis Favela, a graduate student in philosophy and psychology, has found the torch enables the visually impaired to judge their ability to comfortably pass through narrow passages, like an open door or busy sidewalk, as well as if they were actually seeing such pathways themselves.

The handheld torch uses infra-red sensors to “see” objects in front of it. When the torch detects an object, it emits a vibration – similar to a cellphone alert – through an attached wristband. The gentle buzz increases in intensity as the torch nears the object, letting the user make judgments about where to move based on a virtual touch.
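The distance-to-vibration mapping described above can be sketched as a simple ramp: the closer the detected object, the stronger the buzz. The linear mapping and the 150 cm sensing range below are illustrative assumptions for the sketch, not the Enactive Torch's actual calibration.

```python
def vibration_intensity(distance_cm, max_range_cm=150.0):
    """Map an infrared distance reading to a vibration strength in [0, 1].

    Closer objects produce a stronger vibration; at or beyond
    max_range_cm the motor stays off. The linear ramp and the
    150 cm range are assumptions, not the device's real tuning.
    """
    if distance_cm >= max_range_cm:
        return 0.0
    return 1.0 - (distance_cm / max_range_cm)
```

An object right in front of the sensor yields full intensity (1.0), one at half range yields 0.5, and anything out of range yields 0.0.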

"Results of this experiment point in the direction of different kinds of tools or sensory augmentation devices that could help people who have visual impairment or other sorts of perceptual deficiencies. This could start a research program that could help people like that," Favela says.

Favela presented his research “Augmenting the Sensory Judgment Abilities of the Visually Impaired” at the American Psychological Association’s (APA) annual convention, held Aug. 7-10 in Washington, D.C. More than 11,000 psychology professionals, scholars and students from around the world annually attend APA’s convention.

A Growing Population in Need

Favela studies how people perceive their environment and how those perceptions inform their judgments. For this experiment, he was inspired by what he knew about the surging population of visually impaired Americans.

The Centers for Disease Control and Prevention (CDC) predicts that more than 6 million Americans age 40 and older will be affected by blindness or low vision by 2030 – double the number from 2004 – due to diabetes or other chronic diseases and the rapidly aging population. The CDC also notes that vision loss is among the top 10 causes of disability in the U.S., and vision impairment is one of the most prevalent disabilities in children.

"In my research I’ve found that there’s an emotional stigma that people who are visually impaired experience, particularly children," Favela says. "When you’re a kid in elementary school, you want to blend in and be part of the group. It’s hard to do that when you’re carrying this big, white cane."

Substituting Sight with Touch

In Favela’s experiment, 27 undergraduate students with normal or corrected-to-normal vision and no prior experience with mobility assistance devices were asked to make perceptual judgments about their ability to pass through an opening a few feet in front of them without needing to shift their normal posture. Favela tested participants’ judgments in three ways: using only their vision, using a cane while blindfolded and using the Enactive Torch while blindfolded. The idea was to compare judgments made with vision against those made by touch.

The results of the experiment were surprising. Favela figured vision-based judgments would be the most accurate because vision tends to be most people’s dominant perceptual modality. However, he found the three types of judgments were equally accurate.

"When you compare the participants’ judgments with vision, cane and Enactive Torch, there was not a significant difference, meaning that they made the same judgments," Favela says. "The three modalities are functionally equivalent. People can carry out actions just about to the same degree whether they’re using their vision or their sense of touch. I was really surprised."

Favela plans additional experiments requiring more complicated judgments, such as the ability to step over an obstacle or to climb stairs. With further study and improvements to the Enactive Torch, Favela says similar tools that augment touch-based perception could have a significant impact on the lives of the visually impaired.

"If the future version of the Enactive Torch is smaller and more compact, kids who use it wouldn’t stand out from the crowd, they might feel like they blend in more," he says, noting people can quickly adapt to using the torch. "That bodes well, say, for someone in the Marines who was injured by a roadside bomb. They could be devastated. But hope’s not lost. They will learn how to navigate the world pretty quickly."

(Source: uc.edu)

Filed under enactive torch visual impairment augmented reality perception sense of touch psychology neuroscience science

139 notes

Visual hallucinations more common than previously thought

Vivid hallucinations experienced by people with sight loss last far longer and have more serious consequences than previously thought, according to new research from King’s College London and the Macular Society. 

The study is the largest survey of the phenomenon, known as Charles Bonnet Syndrome, and documented the experiences of 492 visually impaired people who had experienced visual hallucinations. The findings, published in the British Journal of Ophthalmology, show there is a serious discrepancy between medical opinion and the realities of the condition.

Charles Bonnet Syndrome is widely considered by the medical profession to be benign and short-lived. However, the new research shows that 80% of respondents had hallucinations for five years or more and 32% found them predominantly unpleasant, distressing and negative. 

The study described this group of people as having “negative outcome Charles Bonnet Syndrome”. The group was more likely to have frequent, fear-inducing, longer-duration hallucinations, which affected daily activities. They were more likely to attribute hallucinations to serious mental illness and were less likely to have been warned about the possibility of hallucinations before they started.

Of respondents, 38% regarded their hallucinations as startling, terrifying or frightening when they first occurred and 46% said hallucinations had an effect on their ability to complete daily tasks. 36% of people who discussed the issue with a medical professional said the professional was “unsure or did not know” about the diagnosis.

Dr Dominic ffytche, who led the research at the Institute of Psychiatry at King’s, says:  “Charles Bonnet Syndrome has been traditionally thought of as benign. Indeed, it has been questioned whether it should even be considered a medical condition given it does not cause problems and goes away by itself. The results of our survey paint a very different picture.

“With no specific treatments for Charles Bonnet Syndrome, the survey highlights the importance of raising awareness to reduce the distress it causes, particularly before symptoms start. All people with Charles Bonnet Syndrome are relieved or reassured to find out about the cause of their hallucinations and our evidence shows the knowledge may help reduce negative outcome.”

People with macular disease are particularly prone to Charles Bonnet hallucinations. They are thought to be a reaction of the brain to the loss of visual stimulation. More than half of people with severe sight loss experience them but many do not tell others for fear they will be thought to have a serious mental illness. 

Age-related macular degeneration (AMD) affects the central vision and is the most common cause of sight loss in the UK. Nearly 600,000 people have late-stage AMD today and more people will become affected as our population ages. Around half will have hallucinations at some stage.

Tony Rucinski, Chief Executive, the Macular Society, said: “It is essential that people affected by sight loss are given information about Charles Bonnet Syndrome at diagnosis or as soon after as possible. 

“Losing your sight is bad enough without the fear that you have something like dementia as well. We need medical professionals to recognise the seriousness of Charles Bonnet Syndrome and ensure that people don’t suffer unnecessarily. More research is also needed to investigate Charles Bonnet Syndrome and possible ways of reducing its impact.”

Dr ffytche is also leading a large NIHR funded research programme on visual hallucinations to develop a much-needed evidence base to inform NHS practice in managing and treating the symptoms. 

Filed under hallucinations Charles Bonnet Syndrome vision visual impairment neuroscience science

583 notes

Yoga accessible for the blind with new Microsoft Kinect-based program
In a typical yoga class, students watch an instructor to learn how to properly hold a position. But for people who are blind or can’t see well, it can be frustrating to participate in these types of exercises.
Now, a team of University of Washington computer scientists has created a software program that watches a user’s movements and gives spoken feedback on what to change to accurately complete a yoga pose.
“My hope for this technology is for people who are blind or low-vision to be able to try it out, and help give a basic understanding of yoga in a more comfortable setting,” said project lead Kyle Rector, a UW doctoral student in computer science and engineering.
The program, called Eyes-Free Yoga, uses Microsoft Kinect software to track body movements and offer auditory feedback in real time for six yoga poses, including Warrior I and II, Tree and Chair poses. Rector and her collaborators published their methodology in the conference proceedings of the Association for Computing Machinery’s SIGACCESS International Conference on Computers and Accessibility in Bellevue, Wash., Oct. 21-23.
Rector wrote programming code that instructs the Kinect to read a user’s body angles, then gives verbal feedback on how to adjust his or her arms, legs, neck or back to complete the pose. For example, the program might say: “Rotate your shoulders left,” or “Lean sideways toward your left.”
The result is an accessible yoga “exergame” – a video game used for exercise – that allows people without sight to interact verbally with a simulated yoga instructor. Rector and collaborators Julie Kientz, a UW assistant professor in Human Centered Design & Engineering, and Cynthia Bennett, a research assistant in computer science and engineering, believe this can transform a typically visual activity into something that blind people can also enjoy.
“I see this as a good way of helping people who may not know much about yoga to try something on their own and feel comfortable and confident doing it,” Kientz said. “We hope this acts as a gateway to encouraging people with visual impairments to try exercise on a broader scale.”
Each of the six poses has about 30 different commands for improvement based on a dozen rules deemed essential for each yoga position. Rector worked with a number of yoga instructors to put together the criteria for reaching the correct alignment in each pose. The Kinect first checks a person’s core and suggests alignment changes, then moves to the head and neck area, and finally the arms and legs. It also gives positive feedback when a person is holding a pose correctly.
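The region-by-region checking order described above (core first, then head and neck, then arms and legs) can be sketched as a prioritized rule scan that speaks the first correction it finds. The joint names, target angles, and 10-degree tolerance here are illustrative assumptions, not the project's actual rules.

```python
def next_instruction(measured, targets, tolerance_deg=10.0):
    """Return the first spoken correction, checking body regions in
    priority order: core, then head and neck, then arms and legs.

    `measured` and `targets` map joint names to angles in degrees.
    If every checked joint is within tolerance, give positive feedback.
    """
    region_order = [
        ("core", ["spine", "hips"]),
        ("head and neck", ["neck"]),
        ("arms and legs", ["left_elbow", "right_elbow",
                           "left_knee", "right_knee"]),
    ]
    for _region, joints in region_order:
        for joint in joints:
            if joint in measured and joint in targets:
                error = measured[joint] - targets[joint]
                if abs(error) > tolerance_deg:
                    direction = "increase" if error < 0 else "decrease"
                    return f"{direction} your {joint.replace('_', ' ')} angle"
    return "Good job - hold the pose"
```

Because the core is scanned first, a misaligned spine is corrected before any limb adjustment is suggested, matching the ordering the article describes.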
Rector practiced a lot of yoga as she developed this technology. She tested and tweaked each aspect by deliberately making mistakes while performing the exercises. The result is a program that she believes is robust and useful for people who are blind.
“I tested it all on myself so I felt comfortable having someone else try it,” she said.
Rector worked with 16 blind and low-vision people around Washington to test the program and get feedback. Several of the participants had never done yoga before, while others had tried it a few times or took yoga classes regularly. Thirteen of the 16 people said they would recommend the program and nearly everyone would use it again.
The technology uses simple geometry and the law of cosines to calculate angles created during yoga. For example, in some poses a bent leg must be at a 90-degree angle, while the arm spread must form a 160-degree angle. The Kinect reads the angle of the pose using cameras and skeletal-tracking technology, then tells the user how to move to reach the desired angle.
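The law-of-cosines computation mentioned above can be shown for three tracked skeletal points: given positions a, b, c, the angle at the middle joint b follows from cos(B) = (|ab|² + |cb|² − |ac|²) / (2·|ab|·|cb|). This is a sketch of the kind of calculation described, not the project's actual code; the point coordinates are hypothetical.

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by points a-b-c
    (e.g. hip-knee-ankle positions from skeletal tracking),
    computed via the law of cosines."""
    ab = math.dist(a, b)
    cb = math.dist(c, b)
    ac = math.dist(a, c)
    cos_b = (ab**2 + cb**2 - ac**2) / (2 * ab * cb)
    # Clamp to guard against floating-point drift outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_b))))
```

For a right-angle bend such as hip (0, 1), knee (0, 0), ankle (1, 0), this returns 90 degrees; a fully straightened limb returns 180.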
Rector opted to use Kinect software because it’s open source and easily accessible on the market, but she said it does have some limitations in the level of detail with which it tracks movement.
Rector and collaborators plan to make this technology available online so users could download the program, plug in their Kinect and start doing yoga. The team also is pursuing other projects that help with fitness.

Filed under yoga eyes-free yoga health visual impairment technology science

208 notes

Scary Faces Terrify Woman with Unusual Condition
When the 67-year-old woman came to the hospital, she was deeply afraid of two things — the visions of odd-looking faces that appeared hovering before her, and that the hallucinations might mean she was losing her mind.
But this retired teacher wasn’t going crazy, and laboratory tests also ruled out two common culprits of hallucinations — infection and drug interactions.
"She was absolutely terrified by what she was seeing," said Dr. Bharat Kumar, an internal medicine resident at the University of Kentucky who treated the woman. In fact, the patient and her family were so concerned in the days before she came to the hospital, they asked a priest about performing an exorcism, Kumar said.
The woman drew a picture of what she saw. The faces had large teeth, eyes and ears, and a horizontally elongated shape, like a football.
That peculiar shape and the fact that the patient recognized that she was hallucinating (rather than believing the visions to be real) provided two important clues in making a diagnosis, Kumar said. He determined that the woman had a condition called Charles Bonnet syndrome.
Patients with the syndrome may see small people and animals, bright moving shapes or distorted faces. These hallucinations are purely visual; no sounds accompany them.
In the woman’s case, the condition developed because she had macular degeneration. Tissue within the retinas of her eyes was deteriorating, and her ability to see was declining.
Charles Bonnet syndrome results from the absence of such sensory input to the brain. “When it expects sensory input and receives nothing, it often creates its own input,” Kumar explained.
The brain isn’t a sophisticated computer that processes information objectively and efficiently, he said. “It’s more of a wibbly-wobbly, messy-guessy ball of goo.”
There is no treatment for the condition, but in many cases the hallucinations stop happening as the brain becomes used to vision loss. Patients who are very frightened might be given anti-psychotic medications, but these drugs have serious side effects and aren’t appropriate for everyone.
The woman was grateful to receive her diagnosis and learn that she was not losing her mind, Kumar said. When he followed up with her three months later, she was still having the hallucinations, but they were happening less often.
A 2010 study showed that 10 to 40 percent of elderly patients with visual impairments may have Charles Bonnet syndrome.
Kumar had never before seen a patient with the condition, although he noted that it may occur more commonly than it is diagnosed. “Patients are often hesitant to say that they see things because they are afraid that they will be called crazy,” he said.
The case report was published online Feb. 25 in the journal Age and Ageing.

Filed under visual impairment macular degeneration hallucinations Charles Bonnet syndrome neuroscience science

21 notes

Teaching the Blind to Find Their Way by Playing Video Games
Computer-based video games are receiving great interest as a means to learn and acquire new skills. As a novel approach to teaching navigation skills in the blind, we have developed the Audio-based Environment Simulator (AbES), a virtual reality environment set within the context of a video game metaphor. Despite the fact that participants were naïve to the overall purpose of the software, we found that early blind users were able to acquire relevant information regarding the spatial layout of a previously unfamiliar building using audio-based cues alone. This was confirmed by a series of behavioral performance tests designed to assess the transfer of acquired spatial information to a large-scale, real-world indoor navigation task. Furthermore, learning the spatial layout through a goal-directed gaming strategy allowed for the mental manipulation of spatial information, as evidenced by enhanced navigation performance when compared to an explicit route-learning strategy. We conclude that the immersive and highly interactive nature of the software greatly engages the blind user to actively explore the virtual environment. This in turn generates an accurate sense of a large-scale three-dimensional space and facilitates the learning and transfer of navigation skills to the physical world.

Filed under brain vision game play visual impairment blindness mental spatial representations AbES neuroscience science
