Posts tagged haptics

Brain mechanism underlying the recognition of hand gestures develops even when blind
Does a distinct mechanism operate in the brains of congenitally blind individuals when they understand and learn others’ gestures, or do they use the same mechanism as sighted individuals? Japanese researchers found that congenitally blind and sighted individuals activate overlapping brain regions when recognizing human hand gestures, indicating that part of the neural network for recognizing others’ hand gestures forms in the same way even without visual information. The findings are reported in The Journal of Neuroscience.
The brain distinguishes human bodies from inanimate objects and responds to them in a particular way. This mechanism is supported in part by a region of the visual cortex, the area that processes visual information, which is reasonable given how heavily perception relies on vision. Recently, however, the same region has been found to activate during haptic perception and during recognition of one’s own gestures, leading to the view that a body-recognition mechanism forms regardless of sensory modality.
Blind and sighted individuals participated in a study by the research group of Assistant Professor Ryo Kitada of the National Institute for Physiological Sciences, National Institutes of Natural Sciences. With their eyes closed, participants touched plastic casts of hands, teapots, and toy cars and identified each shape; blind and sighted participants did so with the same accuracy. By measuring brain activity with functional magnetic resonance imaging (fMRI), the group pinpointed a region that was activated by the hand casts, but not by the teapots or toy cars, regardless of visual experience. It also identified a separate region whose activity depended on the duration of visual experience and which appears to play a supplementary role in recognizing hand gestures.
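The logic of that analysis, finding voxels that respond to hand casts but not to objects in both groups, amounts to a conjunction across group-level contrast maps. Below is a minimal sketch of the idea in Python; the threshold, array shapes, and data are invented for illustration and are not taken from the paper.

```python
import numpy as np

# Minimal sketch of a group-conjunction analysis. Assumes each map is a
# voxel-wise statistic image for the "hands > objects" contrast, computed
# separately per group; the threshold, shapes, and data are hypothetical.
T_CRIT = 3.1  # illustrative voxel-wise significance threshold

def conjunction(blind_t: np.ndarray, sighted_t: np.ndarray) -> np.ndarray:
    """Voxels significant in BOTH groups: candidate regions that form
    regardless of visual experience."""
    return (blind_t > T_CRIT) & (sighted_t > T_CRIT)

# Toy data standing in for real statistical maps.
rng = np.random.default_rng(0)
blind_map = rng.normal(size=(16, 16, 16))
sighted_map = rng.normal(size=(16, 16, 16))
shared = conjunction(blind_map, sighted_map)
print(f"{shared.sum()} voxels significant in both groups")
```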
As Assistant Professor Ryo Kitada notes, “Many people who lost their sight in childhood are active in many parts of society, yet developmental psychology has built its theories on sighted individuals. I hope this finding helps us grasp how blind individuals understand and learn about others, and that it will be seen as an important step in supporting the development of social skills for blind individuals.”
Juggling may sound like mere entertainment, but a study led by Johns Hopkins engineers has used this circus skill to gather critical clues about how vision and the sense of touch help control the way humans and animals move their limbs in a repetitive way, such as in running. The findings eventually may aid in the treatment of people with neurological diseases and could lead to prosthetic limbs and robots that move more efficiently.

The study was published online recently by the Journal of Neurophysiology and is the cover article in the journal’s March 2014 print edition.
In their paper, the team led by Johns Hopkins researchers detailed the unusual jump from juggling for fun to serious science. Jugglers, they explained, rely on repeated rhythmic motions to keep multiple balls aloft. Similar forms of rhythmic movement are also common in the animal world, where effective locomotion is equally important to a swift-moving gazelle and to the cheetah that’s chasing it.
“It turns out that the art of juggling provides an interesting window into many of the same questions that you try to answer when you study forms of locomotion, such as walking or running,” said Noah Cowan, an associate professor of mechanical engineering who supervised the research. “In our study, we had participants stand still and use their hands in a rhythmic way. It’s very much like watching them move their feet as they run. But we used juggling as a model for rhythmic motor coordination because it’s a simpler system to study.”
New array measures vibrations across the skin, may help engineers design optimal, wearable tactile displays.

In the near future, a buzz in your belt or a pulse from your jacket may give you instructions on how to navigate your surroundings.
Think of it as tactile Morse code: vibrations from a wearable, GPS-linked device that tell you to turn right or left, or stop, depending on the pattern of pulses you feel. Such a device could free drivers from having to look at maps, and could also serve as a tactile guide for the visually and hearing impaired.
Lynette Jones, a senior research scientist in MIT’s Department of Mechanical Engineering, designs wearable tactile displays. Through her work, she’s observed that the skin is a sensitive — though largely untapped — medium for communication.
“If you compare the skin to the retina, you have about the same number of sensory receptors; you just have them over almost two square meters of space, unlike the eye, where it’s all concentrated in an extremely small area,” Jones says. “The skin is generally as useful as a very acute area. It’s just that you need to disperse the information that you’re presenting.”
Knowing just how to disperse tactile information across the skin is tricky. For instance, people may be much more sensitive to stimuli on areas like the hand, as opposed to the forearm, and may respond best to certain patterns of vibrations. Such information on skin responsiveness could help designers determine the best configuration of motors in a display, given where on the skin a device would be worn.
Now Jones has built an array that precisely tracks a motor’s vibrations through skin in three dimensions. The array consists of eight miniature accelerometers and a single pancake motor — a type of vibrating motor used in cellphones. She used the array to measure motor vibrations in three locations: the palm of the hand, the forearm and the thigh. From her studies with eight healthy participants, Jones found that a motor’s mechanical vibrations through skin drop off quickly in all three locations, within 8 millimeters from where the vibrations originated.
Jones also gauged participants’ perception of vibrations, fitting them with a 3-by-3 array of pancake motors in these three locations on the body. While skin generally stopped vibrating 8 millimeters from the source, most people continued to perceive the vibrations as far away as 24 millimeters.
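The gap between the mechanical and perceptual figures can be made concrete with a simple decay fit. In the Python sketch below, the sample readings and noise floor are invented for illustration; only the roughly 8-millimeter and 24-millimeter figures come from the study.

```python
import numpy as np

# Fit an exponential decay to accelerometer readings and estimate where the
# vibration falls below a detection floor. All data points and the floor are
# invented; the study reports a mechanical drop-off within ~8 mm, while
# participants perceived vibrations out to ~24 mm.
distances_mm = np.array([0, 2, 4, 6, 8, 12, 16, 24])
amplitudes_g = np.array([1.0, 0.55, 0.30, 0.17, 0.09, 0.03, 0.01, 0.004])

# A linear fit to log-amplitude gives the decay constant.
slope, intercept = np.polyfit(distances_mm, np.log(amplitudes_g), 1)
noise_floor_g = 0.05  # hypothetical detection limit
cutoff_mm = (np.log(noise_floor_g) - intercept) / slope
print(f"mechanical vibration reaches the floor at ~{cutoff_mm:.1f} mm")
# Perception extending well past this cutoff suggests the nervous system
# integrates tactile input beyond the skin's measurable motion.
```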
When participants were asked to identify specific locations of motors within the array, they were much more sensitive on the palm than on the forearm or thigh. But in all three locations, people were better at picking out vibrations in the four corners of the array, versus the inner motors, leading Jones to posit that perhaps people use the edges of their limbs to localize vibrations and other stimuli.
“For a lot of sensory modalities, you have to work out what it is people can process, as one of the dictates for how you design,” says Jones, whose results will appear in the journal IEEE Transactions on Haptics. “There’s no point in making things much more compact, which may be a desirable feature from an engineering point of view, but from a human-use point of view, doesn’t make a difference.”
Mapping good vibrations
In addition to measuring skin’s sensitivity to vibrations, Jones and co-author Katherine Sofia ’12 found that skin has a strong effect on motor vibrations. The researchers compared a pancake motor’s frequency of vibrations when mounted on a rigid structure versus on more compliant skin. They found that, in general, skin reduced a motor’s vibrations by 28 percent, with the forearm and thigh having a slightly stronger damping effect than the palm of the hand.
The skin’s damping of motor vibrations is significant, Jones says, if engineers plan to build tactile displays that incorporate different frequencies of vibrations. For instance, the difference between two motors — one slightly faster than the other — may be indistinguishable in certain parts of the skin. Likewise, two motors spaced a certain distance apart may be differentiable in one area but not another.
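A rough way to frame that design check in code: given site-specific damping, will a user still tell two motor frequencies apart? In the Python sketch below, the roughly 28 percent average damping comes from the article, but the per-site values and the just-noticeable difference are assumptions invented for illustration.

```python
# Will a user still tell two motor frequencies apart once skin damping is
# applied? The ~28% average damping is reported in the article; the per-site
# split and the just-noticeable difference (JND) below are hypothetical.
SKIN_DAMPING = {"palm": 0.25, "forearm": 0.30, "thigh": 0.30}
MIN_DELTA_HZ = 15.0  # assumed absolute JND for vibration frequency

def distinguishable(f1_hz: float, f2_hz: float, site: str) -> bool:
    """True if the damped frequencies still differ by more than the JND."""
    gain = 1.0 - SKIN_DAMPING[site]
    return abs(f1_hz - f2_hz) * gain > MIN_DELTA_HZ

# A pair just above threshold on the palm can fall below it elsewhere.
for site in ("palm", "forearm", "thigh"):
    print(site, distinguishable(160, 181, site))
```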
“Should I have eight motors, or is four enough that 90 percent of the time, I’ll know that when this one’s on, it’s this one and not that one?” Jones says. “We’re answering those sorts of questions in the context of what information you want to present using a device.”
Roberta Klatzky, a professor of psychology at Carnegie Mellon University, says that measurements taken by Jones’ arrays can be used to set up displays in which the location of a stimulus — for example, a pattern to convey a letter — is important.
“A major challenge is to enable people to tell the difference between patterns applied to the skin as, for example, blind people do when reading Braille,” says Klatzky, who specializes in the study of spatial cognition. “Lynette’s work sets up a methodology and potential guidelines for effective pattern displays.”
Creating a buzz
Jones sees promising applications for wearable tactile displays. In addition to helping drivers navigate, she says tactile stimuli may direct firefighters through burning buildings, or emergency workers through disaster sites. In more mundane scenarios, she says tactile displays may help joggers traverse an unfamiliar city, taking directions from a buzzing wristband, instead of having to look at a smartphone.
Using data from their mechanical and perceptual experiments, Jones’ group is designing arrays that can be worn across the back and around the wrist, and is investigating various ways to present vibrations. For example, a row of vibrations activated sequentially from left to right may tell a driver to turn right; a single motor that buzzes with increasing frequency may be a warning to slow down.
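Those two patterns are simple to express as motor schedules. The Python sketch below uses a stand-in set_motor() that prints rather than driving hardware; the motor count and timings are illustrative, not taken from Jones’ designs.

```python
import time

# Toy versions of the two patterns described above, with a stand-in
# set_motor() that prints rather than commanding PWM drivers for each
# pancake motor. Motor count and timings are illustrative.
NUM_MOTORS = 4

def set_motor(index: int, on: bool) -> None:
    print(f"motor {index}: {'ON' if on else 'off'}")  # hardware stub

def turn_right() -> None:
    """Sweep vibration across the row from left to right."""
    for i in range(NUM_MOTORS):
        set_motor(i, True)
        time.sleep(0.15)
        set_motor(i, False)

def slow_down() -> None:
    """Pulse one motor with shrinking gaps, i.e. a rising pulse rate."""
    for gap_s in (0.4, 0.3, 0.2, 0.1):
        set_motor(0, True)
        time.sleep(0.1)
        set_motor(0, False)
        time.sleep(gap_s)

turn_right()
slow_down()
```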
“There’s a lot of things you can do with these displays that are fairly intuitive in terms of how people respond,” Jones says, “which is important because no one’s going to spend hours and hours in any application, learning what a signal means.”
(Source: web.mit.edu)