Posts tagged robotics

The world’s first Brain Training Device has given new hope to stroke survivors in recovery. Developed by researchers at The Hong Kong Polytechnic University (PolyU)’s Interdisciplinary Division of Biomedical Engineering (BME), this novel device detects brainwaves and, through a sophisticated algorithm, uses them to control the movement of paralyzed limbs or even a robotic hand.
The research was led by Prof. Raymond Tong Kai-yu, Professor of PolyU’s Interdisciplinary Division of Biomedical Engineering, who is also the Principal Investigator of the award-winning Exoskeleton Hand Robotic Training Device or the “Hand of Hope”. His team members include the BME research team (Newmen Ho, Xiaoling Hu, Ching-hang Fong, Xinxin Lou, Lawrence Chong and Nathan Lam) and the Industrial Centre team of PolyU (Robert Tam, Bun Yu, Shu-to Ng and Peter Pang).
The latest breakthrough, the “Brain Training Device,” can be coupled with the “Hand of Hope” to achieve a higher degree of recovery for stroke patients. While effective motor recovery after stroke depends on an early rehabilitation program and intensive voluntary practice with the paretic limbs, current rehabilitation products have not used brainwaves to help stroke survivors identify voluntary intention and relearn how to reconnect with their paralyzed limbs.
Prof. Raymond Tong and his team therefore developed the Brain Training Device with a new coherence algorithm for hand function training. The algorithm measures frequency coherence between surface electroencephalography (EEG, brainwave) and electromyography (EMG, muscle activity) signals to identify voluntary intention and the connection between brain and muscle.
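The coherence measure at the heart of such an algorithm is a standard signal-processing quantity. As a rough sketch only (not PolyU's actual implementation; the segment length, signal lengths, and noise levels below are all assumptions), magnitude-squared coherence between an EEG and an EMG channel can be estimated Welch-style:

```python
import numpy as np

def msc(x, y, seg_len=256):
    """Magnitude-squared coherence of x and y, Welch-style:
    split into segments, FFT each, average cross- and auto-spectra."""
    n_seg = len(x) // seg_len
    Sxy = Sxx = Syy = 0.0
    for i in range(n_seg):
        xs = np.fft.rfft(x[i * seg_len:(i + 1) * seg_len])
        ys = np.fft.rfft(y[i * seg_len:(i + 1) * seg_len])
        Sxy = Sxy + xs * np.conj(ys)        # averaged cross-spectrum
        Sxx = Sxx + np.abs(xs) ** 2         # averaged auto-spectra
        Syy = Syy + np.abs(ys) ** 2
    return np.abs(Sxy) ** 2 / (Sxx * Syy)   # bounded between 0 and 1

rng = np.random.default_rng(0)
shared = rng.standard_normal(4096)               # common drive (voluntary intention)
eeg = shared + 0.5 * rng.standard_normal(4096)   # cortical signal + noise
emg = shared + 0.5 * rng.standard_normal(4096)   # muscle signal + noise
coh = msc(eeg, emg)
print(coh.mean())  # high when EEG and EMG share a common source
```

High coherence in a frequency band suggests the cortical signal and the muscle activity are driven by a common source, which is the cue the device uses to detect voluntary intention.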
"The Brain Training Device is able to guide stroke patients to relearn the reconnection between the brain and the limb, with a new design for the EEG headset and the EMG forearm brace that transmits data for controlling a hand robotic system, interfaced through a telecare software platform using an iPad app," Prof. Raymond Tong explained.
The patented Brain Training System, which looks like a cyclist’s helmet and reads brainwaves, also has new features that find the specific EEG electrode locations for each individual stroke patient and reduce the number of EEG electrodes required, cutting both the system cost and the preparation time for brain training, Prof. Tong added.
To find a minimal set of electrodes that could control the device with accuracy above 90%, the study recruited five chronic stroke patients and trained each for 20 sessions. The researchers found that, in general, 32 electrodes are needed to maintain that level of accuracy.
The high accuracy, together with the reduced number of channels needed, makes the Brain Training Device a viable tool for assistive use and rehabilitation training. The system will next be made portable and easy to use in hospital and home settings.
PolyU researchers have already filed patents for the Brain Training Device in both the United States and China. The project is funded by the HKSAR Government’s Innovation and Technology Fund (ITF). The findings on the brain-control algorithm were published as the cover story of the leading international journal IEEE Transactions on Neural Systems and Rehabilitation Engineering (December 2011).
In a study recently published in IEEE Transactions on Neural Systems and Rehabilitation Engineering, neurobiologists at the University of Chicago show how an organism can sense a tactile stimulus, in real time, through an artificial sensor in a prosthetic hand.
Scientists have made tremendous advances toward building lifelike prosthetic limbs that move and function like the real thing. These are amazing accomplishments, but an important element to creating a realistic replacement for a hand is the sense of touch. Without somatosensory feedback from the fingertips about how hard you’re squeezing something or where it’s positioned relative to the hand, grasping an object is about as accurate as using one of those skill cranes to grab a stuffed animal at an arcade. Sure, you can do it, but you have to concentrate intently while watching every movement. You’re relying on your sense of vision to compensate for the lack of touch.
Sliman Bensmaia, assistant professor of organismal biology and anatomy at the University of Chicago, studies the neural basis of the sense of touch. Now, he and his colleagues are working with a robotic hand equipped with sensors that send electrical signals to electrodes implanted in the brain to recreate the same response to touch as a real hand.
Bensmaia spoke about how important the sense of touch is to creating a lifelike experience with a prosthetic limb.
“If you lose your somatosensory system it almost looks like your motor system is impaired,” he said. “If you really want to create an arm that can actually be used dexterously without the enormous amount of concentration it takes without sensory feedback, you need to restore the somatosensory feedback.”
The researchers performed a series of experiments with rhesus macaques that were trained to respond to stimulation of the hand. In one setting, they were gently poked on the hand with a physical probe at varying levels of pressure. In a second setting, some of the animals had electrodes implanted into the area of the brain that responds to touch. These animals were given electrical pulses to simulate the sensation of touch, and their hands were hidden so they wouldn’t see that they weren’t actually being touched.
Using data from the animals’ responses to each type of stimulus, the researchers were able to create a function, or equation, that described the requisite electrical pulse to go with each physical poke of the hand. Then, they repeated the experiments with a prosthetic hand that was wired to the brain implants. They touched the prosthetic hand with the physical probe, which in turn sent electrical signals to the brain.
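The article does not reproduce the researchers' actual equivalence function. As a purely hypothetical sketch of the idea, one could fit a calibration curve mapping probe force to an equivalent stimulation amplitude from behaviorally matched pairs; every number, unit, and the log-space fitting choice below is invented for illustration:

```python
import numpy as np

# Hypothetical calibration pairs: probe force paired with the stimulation
# amplitude that evoked an equivalent behavioral response. The real paper's
# mapping, units, and values may differ entirely.
force = np.array([5.0, 10.0, 20.0, 40.0, 80.0])        # grams-force (invented)
current = np.array([12.0, 20.0, 33.0, 58.0, 105.0])    # microamps (invented)

# Fit force -> current in log space, since perceived-intensity scales are
# often closer to linear on a log axis (an assumption here, not the paper's).
slope, intercept = np.polyfit(np.log(force), np.log(current), 1)

def pulse_for_poke(f):
    """Stimulation amplitude deemed equivalent to a poke of force f."""
    return float(np.exp(intercept + slope * np.log(f)))
```

With a mapping like this in hand, any physical touch registered by the prosthetic's sensor can be translated on the fly into a stimulation pulse for the implanted electrodes.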
Bensmaia said that the animals performed identically whether poked on their own hand or on the prosthetic one.
“This is the first time as far as I know where an animal or organism actually perceives a tactile stimulus through an artificial transducer,” Bensmaia said. “It’s an engineering milestone. But from a neuroengineering standpoint, this validates this function. You can use this function to have an animal perform this very precise task, precisely identically.”
The FDA is in the process of approving similar devices for human trials, and Bensmaia said he hopes such a system is implemented within the next year. Producing a lifelike sense of touch would go a long way toward improving the dexterity and performance of prosthetic hands, but he said it would also help bridge a mental divide for amputees or people who have lost the use of a limb. Until now, prosthetics and robotic arms have felt more like tools than real replacements because they don’t produce the expected sensations.
“If every time you see your robotic arm touching something, you get a sensation that is projected to it, I think it’s very possible that in fact, you will consider this new thing as being part of your body,” he said.
(Source: newswise.com)
Using a kid-friendly robot during behavioral therapy sessions may help some children with autism gain better social skills, a preliminary study suggests.

The study, of 19 children with autism spectrum disorders (ASDs), found that kids tended to do better when their visit with a therapist included a robot “co-therapist.” On average, they made bigger gains in social skills such as asking “appropriate” questions, answering questions and making conversational comments.
So-called humanoid robots are already being marketed for this purpose, but there has been little research to back it up.
"Going into this study, we were skeptical," said lead researcher Joshua Diehl, an assistant professor of psychology at the University of Notre Dame in Indiana, who said he has no financial interest in the technology.
"We found that, to our surprise, the kids did better when the robot was added," he said.
There are still plenty of caveats, however, said Diehl, who is presenting his team’s findings Saturday at the International Meeting for Autism Research (IMFAR) in San Sebastian, Spain.
For one, the study was small. And it’s not clear that the results seen in a controlled research setting would be the same in the real world of therapists’ offices, according to Diehl.
"I’d say this is not yet ready for prime time," he said.
ASDs are a group of developmental disorders that affect a person’s ability to communicate and interact socially. The severity of those effects ranges widely: some people have mild problems socializing but normal to above-normal intelligence; others have profound difficulties relating to people and may have intellectual impairment as well.
Experts have become interested in using technology — from robots to iPads — along with standard ASD therapies because it may help bridge some of the communication issues kids have.
Human communication is complex and unpredictable, with body language, facial expressions and other subtle cues coming into the mix, explained Geraldine Dawson, chief science officer for the advocacy group Autism Speaks.
A robot or a computer game, on the other hand, can be programmed to be simple and predictable, and that may help kids with ASDs better process the information they are being given, Dawson said.
"Broadly speaking," she said, "we are very excited about the potential role for technology in diagnosing and treating ASDs." But she also agreed with Diehl that the findings are "very preliminary," and that researchers have a lot more to learn about how technology — robots or otherwise — fits into ASD therapies.
For the study, Diehl’s team used a humanoid robot manufactured by Aldebaran Robotics, which markets the NAO robot for use in education, including special education for kids with ASDs. The robot, which stands about 2 feet tall, looks like a toy but is priced more like a small car, Diehl noted.
The NAO H25 “Academic Edition” rings up at about $16,000. (Diehl said the study was funded by government and private grants, not the manufacturer.)
The researchers had 19 kids aged 6 to 13 complete 12 behavioral therapy sessions, where a therapist worked with the child on social skills. Half of the sessions involved the robot, named Kelly, which was wheeled out so the child could practice conversing with her, while the therapist stood by.
"So the child might say, ‘Hi Kelly, how are you?’" Diehl explained. "Then Kelly would say, ‘Fine. What did you do today?’" During the non-Kelly sessions, another person entered the room and carried on the same conversation with the child that the robot would have.
On average, Diehl’s team found, kids made bigger gains from the sessions that included Kelly — based on both their interactions with their therapists, and their parents’ reports.
"There was one child who, when his dad came home from work, asked him how his day was," Diehl said. "He’d never done that before."
Still, he stressed that while the robot sessions seemed more successful on average, the children varied widely in their responses to Kelly. Going forward, Diehl said, it will be important to figure out whether there are certain kids with ASDs more likely to benefit from a robot co-therapist.
Dawson agreed that there is no one-size-fits-all ASD therapy. “Any therapy for a person with an ASD has to be individualized,” she said. The idea with any technology, she added, is to give therapists and doctors extra “tools” to work with.
A separate study presented at the same meeting looked at another type of tool. Researchers had 60 “minimally verbal” children with ASDs attend two “play-based” sessions per week, aimed at boosting their ability to speak and gesture. Half of the kids were also given a “speech-generating device,” like an iPad.
Three and six months later, children who worked with the devices were able to say more words and were quicker to take up conversational skills.
Dawson said the robot and iPad studies are just part of the growing body of research into how technology can not only aid in ASD therapies, but also help doctors diagnose the disorders or help parents manage at home.
But both Diehl and Dawson stressed that no robot or iPad is intended to stand in for human connection. The idea, after all, is to enhance kids’ ability to communicate and have relationships, Dawson noted. “Technology will never take the place of people,” she said.
The data and conclusions of research presented at meetings should be viewed as preliminary until published in a peer-reviewed journal.
(Source: webmd.com)
It sounds like science fiction, but researchers are gaining ground in developing mind-controlled robotic arms that could give people with paralysis or amputated limbs more independence.

The technology, known as brain-computer (or brain-machine) interface, is in its infancy as far as human use — though scientists have been studying the concept for years. But experts say that people with paralysis or amputations could be using the technology at home within the next decade.
It basically boils down to people using their thoughts to control a robot arm that then performs a desired task, like grasping and moving a cup. That’s done via tiny electrode “grids” implanted in the brain that read the movement signals firing from individual nerve cells, then translate them to the robot arm.
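As a toy illustration of that decoding step only (real systems use richer models such as population-vector or Kalman decoders, and the unit counts, linear tuning, and noise level here are assumptions), a linear decoder can be calibrated by least squares and then run on new neural activity:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic example: 96 recorded "units" whose firing rates are noisy
# linear functions of intended 2-D hand velocity.
n_units, n_samples = 96, 500
true_tuning = rng.standard_normal((n_units, 2))     # each unit's direction tuning
velocity = rng.standard_normal((n_samples, 2))      # intended (vx, vy)
rates = velocity @ true_tuning.T + 0.1 * rng.standard_normal((n_samples, n_units))

# Calibration: least-squares fit of a decoding matrix from rates to velocity.
decoder, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# Decoding a new thought: firing rates in, commanded arm velocity out.
decoded = rates @ decoder
```

In a live system, each new window of firing rates would be multiplied by the calibrated matrix to produce the velocity command sent to the robot arm.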
"We have the ability to capture information from the brain and use it to control the robotic arm," said Dr. Elizabeth Tyler-Kabara, who presented her team’s latest findings on the technology Tuesday, at the annual meeting of the American Association of Neurological Surgeons, in New Orleans.
However, she stressed, “we still have a ton to learn.”
Right now, the robot arm is confined to the lab. After getting their electrodes implanted, study patients come to the lab to work with the robotic limb under the researchers’ supervision. So far, Tyler-Kabara and her colleagues at the University of Pittsburgh School of Medicine have tested the approach in one patient. Researchers at Brown University in Providence, R.I., have done it in a handful of others.
One of the big questions, Tyler-Kabara said, is “how much control is enough?” That is, how well does the mind-controlled arm need to work to bring real everyday benefits to people?
At the meeting on Tuesday, Tyler-Kabara presented an update on how her team’s patient is faring. The 53-year-old woman had long-standing quadriplegia due to a disease called spinocerebellar degeneration — where, for unknown reasons, the connections between the brain and muscles slowly deteriorate.
Tyler-Kabara performed the surgery, where two tiny electrode grids were placed in the area of the brain that would normally control the movement of the right hand and arm. The electrode points penetrate the brain’s surface by about one-sixteenth of an inch.
"The idea is pretty scary," Tyler-Kabara acknowledged. But her team’s patient had no complications from the surgery and left the hospital the next day. There’ve been no longer-term problems either, she said — though, in theory, there would be concerns about infection or bleeding over the long haul.
The surgery left the patient with two terminals that protrude through her skull. The researchers used those to connect the implanted electrodes to a computer, where they could see brain cells firing when the patient thought about moving her hand.
She was quickly able to master simple movements with the robotic arm, like high-fiving the researchers. And after six months, she was performing “10-degrees-of-freedom” movements, Tyler-Kabara reported at the meeting.
That includes not only moving the arm, but also flexing and rotating the wrist, grasping objects and affecting several different hand “postures.” She has accomplished feats like feeding herself chocolate.
The researchers initially used a computer in training sessions with the patient, but after that the robot arm was linked directly to the electrodes, so there is no longer any need for “computer assistance,” according to Tyler-Kabara.
Still, before the technology can ultimately be used at home, she said, researchers have to devise a “fully implanted” wireless system for controlling the robot arm.
Another expert talked about the new technology.
"This is one more encouraging step toward developing something practical that people can use in their daily lives," said Dr. Robert Grossman, a neurosurgeon at Methodist Neurological Institute in Houston, who was not involved in the research.
It’s hard to put a timeline on it all, Grossman said, since technological advances could change things. He also noted that several research groups are looking at different approaches to brain-computer interfaces.
One, Grossman said, is to do it noninvasively, through electrodes placed on the scalp.
Study author Tyler-Kabara said that noninvasive approach has met with success in helping people perform simple tasks, like moving a cursor on a computer screen. “But I don’t think it will ever be good enough for performing complicated tasks,” she said, noting that it can’t work as precisely as the implanted electrodes.
A next step, Tyler-Kabara said, is to develop a “two-way” electrode system that stimulates the brain to generate sensation — with the aim of helping people adjust the robot’s grip strength.
She said there is also much to learn about which people will ultimately be good candidates for the technology. There may, for example, be some brain injuries that prevent people from benefiting.
Because this study was presented at a medical meeting, the data and conclusions should be viewed as preliminary until published in a peer-reviewed journal.
(Source: health.usnews.com)
DARPA Looks To New Form Of Computation That Mimics The Human Brain
The next frontier for the robotics industry has always been to build machines that think like humans. Scientists have pursued that elusive goal for decades, and some now believe they are extremely close to achieving it.
Now, a Pentagon-funded team of researchers has constructed a tiny machine that might allow robots to act independently.
Compared to traditional artificial intelligence systems that rely on conventional computer programming, this one “looks and ‘thinks’ like a human brain,” said James K. Gimzewski, professor of chemistry at the University of California, Los Angeles.
Gimzewski is a member of the team that has been working, under the sponsorship of the Defense Advanced Research Projects Agency (DARPA), on a program called Physical Intelligence.
One stated objective of the program is “to develop analytical tools to support the development of human-engineered physically intelligent systems and to understand physical intelligence in the natural world.”
This technology could be the secret to making robots that are truly autonomous, Gimzewski said during a conference call hosted by Technolink, a Los Angeles-based industry group.
Gimzewski says his project does not use standard robot hardware with integrated circuitry. The device that his team constructed is capable, without being programmed like a traditional robot, of performing actions similar to humans.
What sets this new device apart from any others is that it has nano-scale interconnected wires that perform billions of connections, like a human brain, and it is capable of remembering information, Gimzewski said. Each connection is a synthetic synapse; a synapse is what allows a neuron to pass an electric or chemical signal to another cell. Because the brain’s structure is so complex, most artificial intelligence projects so far have been unable to replicate it.
“Physical Intelligence” devices would not require a human controller the way a robot does, said Gimzewski. The applications of this technology for the military would be far reaching.
An aircraft, for instance, would be able to learn and explore the terrain, working its way through the environment without human intervention, he said. Such machines would be able to process information in ways that would be unimaginable with current computers.
Artificial intelligence research over the past five decades has not been able to generate human-like reasoning or cognitive functions, said Gimzewski. DARPA’s program is the most ambitious he has seen to date. “It’s an off-the-wall approach,” he added.
Studies of the brain have shown that one of its key traits is self-organization. “That seems to be a prerequisite for autonomous behavior,” he said. “Rather than move information from memory to processor, like conventional computers, this device processes information in a totally new way.” This could represent a revolutionary breakthrough in robotic systems, said Gimzewski.
Information from the senses has an important influence on how we move. For instance, you can see and feel when a mug is filled with hot coffee, and you lift it in a different way than if the mug were empty. Neuroscientist Julian Tramper discovered that the brain uses two forms of old information in order to execute new movements well. This discovery can be useful for the field of robotics. Tramper will receive his doctorate on Thursday 24 April from Radboud University Nijmegen.
Every time you move, the brain deals with two problems. First, there is a slight delay in the sensory information needed to execute the movement. Second, the command from the brain directing the muscles to move is not entirely clear, because neuronal signals contain a certain amount of natural static interference. According to Tramper, the brain has a clever way of getting around both problems: It combines the old information from the senses with experience gained through similar movements made in the past. This means that our senses use two forms of old information in order to make new movements.
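This combination of an internal prediction built from past movements with delayed, noisy sensory feedback is often modeled as Kalman-filter-style fusion, in which each source is weighted by its reliability. A minimal scalar sketch (the numbers are illustrative, not from Tramper's work):

```python
# Reliability-weighted fusion of an internal prediction and a delayed,
# noisy sensory measurement, as in a one-step scalar Kalman update.

def combine(prediction, pred_var, measurement, meas_var):
    """Fuse prediction and measurement; lower variance means more trust."""
    k = pred_var / (pred_var + meas_var)   # Kalman gain
    estimate = prediction + k * (measurement - prediction)
    variance = (1 - k) * pred_var          # fused estimate is more certain
    return estimate, variance

# Example: the brain predicts the hand is at 10.0 cm (variance 4.0), while
# delayed vision reports 12.0 cm (variance 1.0). Vision is more reliable
# here, so the fused estimate lands much closer to 12 than to 10.
est, var = combine(10.0, 4.0, 12.0, 1.0)
print(est, var)  # 11.6 0.8
```

Note that the fused variance (0.8) is smaller than either input variance: combining the two sources always yields a more certain estimate than either alone, which is exactly why mixing prior experience with delayed feedback pays off.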
Computer versus test subject
Understanding the brain processes behind movement can be of great importance to fields like robotics. Therefore Tramper is trying to model his findings so that it will be possible to use them in robots in the future. He has already succeeded in this for certain hand-eye coordination experiments, to the extent that a computer can perform at about the same level as human test subjects. As a post-doctoral researcher within the Donders Institute, Tramper is researching how these types of models can be integrated into bio-inspired robots (robots based on biological principles).
SpaceCog
Tramper is currently working on a project called SpaceCog. The goal of this project is to develop a robot which can independently orient itself in space, something that humans do automatically. This is difficult to achieve, because each time a robot moves, it must reinterpret the information from its cameras and other sensors in order to determine whether the changes to its input are the result of its own movement or an external cause. The researchers involved in SpaceCog want to figure out how our brain has solved this problem. Tramper has three years to come up with a good computer model addressing this issue.
Looking towards the future
Tramper is studying hand-eye coordination by having test subjects play a special computer game. The subjects use a game controller to move a digital right hand and left hand on a screen. They have to move the two hands independently of one another and make them each follow a particular path in order to reach a final destination (see film 1). It turned out that the test subject’s eyes moved ahead of the digital hands. In other words, the eyes looked at a point that the hands would reach in the future (see film 2). This type of eye movement is called smooth pursuit, and before now it had only been detected in the case of external stimuli, when a subject was following an object’s movement. Tramper detected smooth pursuit eye movements at locations the hands had not yet reached, meaning these movements were triggered by internal stimuli.
Smooth pursuit
Tramper explains, ‘We’d previously demonstrated for other types of eye movement that the eye anticipates and moves in advance of external movement. To our surprise, this is also the case with smooth pursuit. It is probable that this is a compromise between where you are at a particular moment and where you want to get to. When moving, you need to keep track of your current location (which is constantly changing) and your target destination. Smooth pursuit eye movements can help you do this by letting your eye “hover” between both locations. If we can teach robots to do something like this, it will help make their movements much more natural. This will increase the number of ways in which robots can be put to work.’
(Source: ru.nl)
State science fair winner creates robot
The winner of this year’s State Science and Engineering Fair is from South Florida, and her project can someday make life easier for the physically challenged.
"It captures the brain waves of electrochemical activity. Basically, the nerve impulse produced by the brain, and it sends it over to the robot," said Daniela Rodriguez.
Steve is an award-winning robot controlled by brain waves. He was invented by 13-year-old Daniela Rodriguez, who loves math and science. “I’ve always been interested in robotics; it’s my passion,” she said.
This year, Rodriguez won first place in the Annual State Science and Engineering Fair against 900 other finalists.
Rodriguez’ goal is to help people. “If the person is disabled, they can sit in their wheelchair, and they can use their thoughts and brain waves to control its movements, so they don’t have to move,” she said.
Her science project comes from the heart. Her mother was diagnosed with multiple sclerosis in 1996, and she is trying to find a way to keep her mom independent. “I work really hard to try to stay mobile, but the fact that she wants to help patients dealing with this illness is just a Godsend” said Rodriguez’ mom Jeannie.
Rodriguez wants to one day use her technology to help paralyzed people. Steve’s technology could even give wounded veterans the ability to use their brains to move the robot. “To help them move around in their wheelchairs or move their prosthetics because usually prosthetics now is just the muscle movement, but now it can be used and be more natural. It’s moving by your brain,” said Rodriguez.
Not only is Rodriguez winning awards; prosthetic companies have also expressed interest in her program.
Artificial muscle computer performs as a universal Turing machine
In 1936, Alan Turing showed that all computers are manifestations of the same underlying logical architecture, no matter what materials they’re made of. Although most of the computers we’re familiar with are made of silicon semiconductors, other computers have been made of DNA, light, Legos, paper, and many other unconventional materials.
Now in a new study, scientists have built a computer made of artificial muscles that are themselves made of electroactive polymers. The artificial muscle computer is an example of the simplest known universal Turing machine, and as such it is capable of solving any computable problem given sufficient time and memory. By showing that artificial muscles can “think,” the study paves the way for the development of smart, lifelike prostheses and soft robots that can conform to changing environments.
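The paper realizes one of the smallest known universal machines in artificial muscle. As an illustration of the underlying computational model only (this is not the machine from the paper), here is a generic Turing machine simulator running a trivial rule table that appends a 1 to a unary number:

```python
# A Turing machine: a tape of symbols, a read/write head, and a rule table
# mapping (state, symbol read) -> (symbol to write, head move, next state).

def run_tm(rules, tape, state="A", halt="H", max_steps=10_000):
    """Simulate a Turing machine; move is -1 (left) or +1 (right)."""
    tape = dict(enumerate(tape))      # sparse tape, blank cells read as 0
    head = 0
    for _ in range(max_steps):
        if state == halt:
            break
        write, move, state = rules[(state, tape.get(head, 0))]
        tape[head] = write
        head += move
    return [tape[i] for i in sorted(tape)]

# Unary increment: scan right over the 1s, write a 1 on the first blank, halt.
rules = {
    ("A", 1): (1, +1, "A"),
    ("A", 0): (1, +1, "H"),
}
print(run_tm(rules, [1, 1, 1]))  # [1, 1, 1, 1]
```

Universality means a single fixed rule table, fed a suitable initial tape, can emulate any other machine of this kind; that is the property the artificial-muscle hardware is claimed to realize physically.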
The authors, Benjamin Marc O’Brien and Iain Alexander Anderson at the University of Auckland in New Zealand, have published their study on the artificial muscle computer in a recent issue of Applied Physics Letters.
"To the best of our knowledge, this is the first time a computer has been built out of artificial muscles," O’Brien told Phys.org. "What makes it exciting is that the technology can be directly and intimately embedded into artificial muscle devices, giving them lifelike reflexes. Even though our computer has hard bits, the technology is fundamentally soft and stretchy, something that traditional methods of computation struggle with."

Sugar Cube-Sized Robotic Ants Mimic Real Foraging Behavior
For ants, the pheromone-laden foraging trails they leave behind are like lifelines: they direct the workers toward food hubs discovered earlier and guide them back home to their nest.
These networks of trails can stretch for hundreds of feet, quite the achievement considering many worker ants are less than half an inch long. One type of harvester ant can lay down a set of trails that stretches 82 feet from the entrance of its nest. The trails of a wood ant, an insect measuring just five millimeters (one-fifth of an inch), reach 656 feet, each one branching out into more pathways at up to 10 spots along the trail. The leafcutter ant can build a network that spreads across almost two and a half acres.
Ant species such as these tend to take the shortest path between their colony’s nest and a food source, following branches that stray as little as possible from the direction in which they began their journey. The forks in their network of trails, known as bifurcations, are not symmetrical and don’t branch out into angles of the same size. But do ants use a sophisticated sense of geometry to trace their path, measuring the angles of the roads before picking one?
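One simple alternative to explicit angle measurement, common in simulations of trail following, is a stochastic rule whose branch preference decays with deviation from the walker's current heading. A generic sketch of that idea (not the published model; the decay constant `kappa` is an invented parameter):

```python
import math, random

random.seed(42)

def choose_branch(heading, branch_angles, kappa=3.0):
    """Pick a branch with probability proportional to exp(-kappa * |deviation|),
    so branches close to the current heading are strongly preferred."""
    weights = [math.exp(-kappa * abs(a - heading)) for a in branch_angles]
    r = random.random() * sum(weights)
    for angle, w in zip(branch_angles, weights):
        r -= w
        if r <= 0:
            return angle
    return branch_angles[-1]

# At an asymmetric fork (deviations of 10 vs 60 degrees from the heading),
# the low-deviation branch wins the large majority of the time.
picks = [choose_branch(0.0, [math.radians(10), math.radians(60)])
         for _ in range(1000)]
frac_low = picks.count(math.radians(10)) / 1000
print(frac_low)
```

A rule this simple needs no geometry beyond the walker's own heading, which is the kind of minimal mechanism the robot experiments below were designed to test.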
To learn more, researchers at the New Jersey Institute of Technology (NJIT) and the Research Centre on Animal Cognition in France used miniature robots to replicate the behavior of a colony of Argentine ants on the move, reported today in the journal PLOS Computational Biology. This ant species has extremely poor eyesight and darts around at high speeds, yet it can maneuver through corridor after corridor, from home to food and vice versa.

Robot-Delivered Speech and Physical Therapy
In one of the earliest experiments using a humanoid robot to deliver speech and physical therapy to a stroke patient, researchers at the University of Massachusetts Amherst saw notable speech and physical therapy gains and significant improvement in quality of life.
Regarding the overall outcome, speech language pathologist and study leader Yu-kyong Choe says, “It’s clear from our study of a 72-year-old male stroke client that a personal humanoid robot can help people recover by delivering therapy such as word-retrieval games and arm movement tasks in an enjoyable and engaging way.”
A major focus of this case study was to assess how therapy interventions in one domain, speech, affected interventions in another, physical therapy, in two different delivery scenarios. Despite the importance of working with other professionals, the authors point out, until now it has been “largely unknown how interventions by one type of therapy affects progress in others.”
The client, who has aphasia and physical disability on one side, completed a robot-mediated program under two schedules: in the “sole” condition, speech therapy alone for five weeks followed by physical therapy alone for five weeks; in the “sequential” condition, back-to-back speech and physical therapy sessions for five weeks.
Over the course of the experiment, the client made “notable gains in the frequency and range of the upper-limb movements,” the authors say. He also made positive gains in verbal expression. Interestingly, his improvements in speech and physical function were much greater when he engaged in only one therapy than when the two therapies were paired in sessions immediately following each other. The authors summarize that in such a sequential schedule “speech and physical functions seemed to compete for limited resources” in the brain. Their work is described in the current issue of the journal Aphasiology.
Choe and computer science researcher and robot expert Rod Grupen, director of the Laboratory for Perceptual Robotics at UMass Amherst, are in the second year of a $109,251 grant from the American Heart Association to investigate the effect of stroke rehabilitation delivered by a humanoid robot, uBot-5. It is a child-sized unit with arms and a computer screen through which therapists interact with the client.
Choe, Grupen and colleagues are seeking ways to bring more and longer-term therapy and social contact to people recovering from stroke. An estimated 3 million Americans experience the debilitating effects of stroke daily. But even after years, they can recover significant function with intensive rehabilitation, says Choe. The bad news is that this is rarely available or accessible, due to a shortage of therapists and a lack of coverage for long-term treatment. Many people are left with chronic low function, which can lead to social isolation and depression.
While some may object to robots delivering therapy, the need is great and is definitely not being met now, especially in rural areas, Grupen and Choe point out. They hope to supplement human-to-human interaction, with a robot temporarily taking the therapist’s place. Grupen says, “In addition to improving quality of life, if we can support a client in the home so they can delay institutionalization, we can improve outcomes and make a huge impact on the cost of elder care. There are 70 million baby boomers beginning to retire now.”
“Stroke rehabilitation is such a monumental financial problem everywhere in the world, that’s where it can pay for itself,” he adds. “A personal robot could save billions of dollars in elder care while letting people stay in their own homes and communities. We’re hoping for a win-win where our elders live better, more independent and productive lives and our overtaxed healthcare resources are used more effectively.”