Neuroscience

Articles and news from the latest research reports.

Posts tagged robots

86 notes

Kelly the Robot Helps Kids Tackle Autism

Using a kid-friendly robot during behavioral therapy sessions may help some children with autism gain better social skills, a preliminary study suggests.

The study, of 19 children with autism spectrum disorders (ASDs), found that kids tended to do better when their visit with a therapist included a robot “co-therapist.” On average, they made bigger gains in social skills such as asking “appropriate” questions, answering questions and making conversational comments.

So-called humanoid robots are already being marketed for this purpose, but there has been little research to back it up.

"Going into this study, we were skeptical," said lead researcher Joshua Diehl, an assistant professor of psychology at the University of Notre Dame in Indiana, who said he has no financial interest in the technology.

"We found that, to our surprise, the kids did better when the robot was added," he said.

There are still plenty of caveats, however, said Diehl, who is presenting his team’s findings Saturday at the International Meeting for Autism Research (IMFAR) in San Sebastian, Spain.

For one, the study was small. And it’s not clear that the results seen in a controlled research setting would be the same in the real world of therapists’ offices, according to Diehl.

"I’d say this is not yet ready for prime time," he said.

ASDs are a group of developmental disorders that affect a person’s ability to communicate and interact socially. The severity of those effects ranges widely: Some people have mild problems socializing but normal to above-normal intelligence; others have profound difficulties relating to people and may have intellectual impairment as well.

Experts have become interested in using technology — from robots to iPads — along with standard ASD therapies because it may help bridge some of the communication issues kids have.

Human communication is complex and unpredictable, with body language, facial expressions and other subtle cues coming into the mix, explained Geraldine Dawson, chief science officer for the advocacy group Autism Speaks.

A robot or a computer game, on the other hand, can be programmed to be simple and predictable, and that may help kids with ASDs better process the information they are being given, Dawson said.

"Broadly speaking," she said, "we are very excited about the potential role for technology in diagnosing and treating ASDs." But she also agreed with Diehl that the findings are "very preliminary," and that researchers have a lot more to learn about how technology — robots or otherwise — fits into ASD therapies.

For the study, Diehl’s team used a humanoid robot manufactured by Aldebaran Robotics, which markets the NAO robot for use in education, including special education for kids with ASDs. The robot, which stands about 2 feet tall, looks like a toy but is priced more like a small car, Diehl noted.

The NAO H25 “Academic Edition” rings up at about $16,000. (Diehl said the study was funded by government and private grants, not the manufacturer.)

The researchers had 19 kids aged 6 to 13 complete 12 behavioral therapy sessions, where a therapist worked with the child on social skills. Half of the sessions involved the robot, named Kelly, which was wheeled out so the child could practice conversing with her, while the therapist stood by.

"So the child might say, ‘Hi Kelly, how are you?’" Diehl explained. "Then Kelly would say, ‘Fine. What did you do today?’" During the non-Kelly sessions, another person entered the room and carried on the same conversation with the child that the robot would have.
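The predictability that makes a robot co-therapist useful can be illustrated with a toy scripted-dialogue table. The prompts and replies below are invented for illustration; the study's actual scripts are not described in the article:

```python
# Toy scripted dialogue: every prompt maps to one fixed, predictable
# reply, which is what makes the robot's side of the conversation easy
# for a child to anticipate. All lines here are invented examples.
SCRIPT = {
    "hi kelly, how are you?": "Fine. What did you do today?",
    "i played outside.": "That sounds fun! What is your favorite game?",
}

def kelly_reply(child_says: str) -> str:
    """Return the scripted reply, or a gentle fallback prompt."""
    return SCRIPT.get(child_says.strip().lower(), "Can you say that another way?")

print(kelly_reply("Hi Kelly, how are you?"))
```

Unlike a human conversation partner, the same input always produces the same output, which is the point Dawson makes about simple, predictable interaction.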

On average, Diehl’s team found, kids made bigger gains from the sessions that included Kelly — based on both their interactions with their therapists, and their parents’ reports.

"There was one child who, when his dad came home from work, asked him how his day was," Diehl said. "He’d never done that before."

Still, he stressed that while the robot sessions seemed more successful on average, the children varied widely in their responses to Kelly. Going forward, Diehl said, it will be important to figure out whether there are certain kids with ASDs more likely to benefit from a robot co-therapist.

Dawson agreed that there is no one-size-fits-all ASD therapy. “Any therapy for a person with an ASD has to be individualized,” she said. The idea with any technology, she added, is to give therapists and doctors extra “tools” to work with.

A separate study presented at the same meeting looked at another type of tool. Researchers had 60 “minimally verbal” children with ASDs attend two “play-based” sessions per week, aimed at boosting their ability to speak and gesture. Half of the kids were also given a “speech-generating device,” like an iPad.

Three and six months later, children who worked with the devices were able to say more words and were quicker to take up conversational skills.

Dawson said the robot and iPad studies are just part of the growing body of research into how technology can not only aid in ASD therapies, but also help doctors diagnose the disorders or help parents manage at home.

But both Diehl and Dawson stressed that no robot or iPad is intended to stand in for human connection. The idea, after all, is to enhance kids’ ability to communicate and have relationships, Dawson noted. “Technology will never take the place of people,” she said.

The data and conclusions of research presented at meetings should be viewed as preliminary until published in a peer-reviewed journal.

(Source: webmd.com)

Filed under ASD autism humanoid robots robots robotics communication social skills neuroscience psychology science

264 notes

Paralyzed Patient Moves Prosthetic Arm With Her Mind

It sounds like science fiction, but researchers are gaining ground in developing mind-controlled robotic arms that could give people with paralysis or amputated limbs more independence.

The technology, known as a brain-computer (or brain-machine) interface, is still in its infancy as far as human use goes — though scientists have been studying the concept for years. But experts say that people with paralysis or amputations could be using the technology at home within the next decade.

It basically boils down to people using their thoughts to control a robot arm that then performs a desired task, like grasping and moving a cup. That’s done via tiny electrode “grids” implanted in the brain that read the movement signals firing from individual nerve cells, then translate them to the robot arm.
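The translation step described above is often modeled in the brain-computer interface literature as a linear decoder: each channel's firing rate contributes a weighted amount to the arm's commanded velocity. A minimal sketch, with invented weights and rates (the article gives no implementation details):

```python
import numpy as np

# Hypothetical linear decoder: 96 electrode channels each contribute
# linearly to a 3-D arm velocity. Real systems calibrate the weights
# while the patient imagines prescribed movements.
rng = np.random.default_rng(0)
n_channels, n_dims = 96, 3

weights = rng.normal(scale=0.01, size=(n_dims, n_channels))  # calibrated mapping
baseline = rng.uniform(5, 20, size=n_channels)               # resting rates (Hz)

def decode_velocity(firing_rates):
    """Map per-channel firing rates (Hz) to a commanded arm velocity."""
    return weights @ (firing_rates - baseline)

# Imagining a movement raises firing on some channels above baseline:
rates = baseline + rng.normal(scale=2.0, size=n_channels)
velocity = decode_velocity(rates)
```

Baseline activity decodes to zero velocity; only deviations from rest drive the arm, which is one reason calibration sessions matter.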

"We have the ability to capture information from the brain and use it to control the robotic arm," said Dr. Elizabeth Tyler-Kabara, who presented her team’s latest findings on the technology Tuesday, at the annual meeting of the American Association of Neurological Surgeons, in New Orleans.

However, she stressed, “we still have a ton to learn.”

Right now, the robot arm is confined to the lab. After getting their electrodes implanted, study patients come to the lab to work with the robotic limb under the researchers’ supervision. So far, Tyler-Kabara and her colleagues at the University of Pittsburgh School of Medicine have tested the approach in one patient. Researchers at Brown University in Providence, R.I., have done it in a handful of others.

One of the big questions, Tyler-Kabara said, is “how much control is enough?” That is, how well does the mind-controlled arm need to work to bring real everyday benefits to people?

At the meeting on Tuesday, Tyler-Kabara presented an update on how her team’s patient is faring. The 53-year-old woman had long-standing quadriplegia due to a disease called spinocerebellar degeneration — where, for unknown reasons, the connections between the brain and muscles slowly deteriorate.

Tyler-Kabara performed the surgery, where two tiny electrode grids were placed in the area of the brain that would normally control the movement of the right hand and arm. The electrode points penetrate the brain’s surface by about one-sixteenth of an inch.

"The idea is pretty scary," Tyler-Kabara acknowledged. But her team’s patient had no complications from the surgery and left the hospital the next day. There’ve been no longer-term problems either, she said — though, in theory, there would be concerns about infection or bleeding over the long haul.

The surgery left the patient with two terminals that protrude through her skull. The researchers used those to connect the implanted electrodes to a computer, where they could see brain cells firing when the patient thought about moving her hand.

She was quickly able to master simple movements with the robotic arm, like high-fiving the researchers. And after six months, she was performing “10-degrees-of-freedom” movements, Tyler-Kabara reported at the meeting.

That includes not only moving the arm, but also flexing and rotating the wrist, grasping objects and affecting several different hand “postures.” She has accomplished feats like feeding herself chocolate.

The researchers initially used a computer in training sessions with the patient, but after that phase the robot arm was directly linked to the electrodes — so there was no need for “computer assistance,” according to Tyler-Kabara.

Still, before the technology can ultimately be used at home, she said, researchers have to devise a “fully implanted” wireless system for controlling the robot arm.

Another expert talked about the new technology.

"This is one more encouraging step toward developing something practical that people can use in their daily lives," said Dr. Robert Grossman, a neurosurgeon at Methodist Neurological Institute in Houston, who was not involved in the research.

It’s hard to put a timeline on it all, Grossman said, since technological advances could change things. He also noted that several research groups are looking at different approaches to brain-computer interfaces.

One, Grossman said, is to do it noninvasively, through electrodes placed on the scalp.

Study author Tyler-Kabara said that noninvasive approach has met with success in helping people perform simple tasks, like moving a cursor on a computer screen. “But I don’t think it will ever be good enough for performing complicated tasks,” she said, noting that it can’t work as precisely as the implanted electrodes.

A next step, Tyler-Kabara said, is to develop a “two-way” electrode system that stimulates the brain to generate sensation — with the aim of helping people adjust the robot’s grip strength.

She said there is also much to learn about which people will ultimately be good candidates for the technology. There may, for example, be some brain injuries that prevent people from benefiting.

Because this study was presented at a medical meeting, the data and conclusions should be viewed as preliminary until published in a peer-reviewed journal.

(Source: health.usnews.com)

Filed under BCI robots robotics prosthetic limbs prosthetic arm neuroscience science

180 notes

Brain Scans Reveal That Humans Definitely Feel Empathy For Robots

While creating an empathetic robot is a long-held dream, understanding whether humans genuinely empathise with robots should — in theory — be easier. Now, a team of scientists have analysed fMRI brain scans to reveal that humans show similar brain function when watching affection and violence being inflicted on humans and on robots.

The experiments, conducted at the University of Duisburg-Essen, had 40 participants sit and watch videos of a small dinosaur-shaped robot. The robot was treated in either an affectionate or a violent way while researchers measured physiological arousal, and the researchers found an overwhelmingly strong reaction to the scenes of violence. A second study used functional magnetic-resonance imaging, and showed that affectionate interaction towards both robots and humans resulted in similar neural activation patterns in the brain.

That suggests that those actions elicit similar reactions for interactions with both humans and robots. The problem with most experiments on this subject is that participants generally choose not to report emotional reactions to robots — an fMRI scan gets around that problem. Rosenthal-von der Pütten, one of the researchers, explains the implications of the findings:

“One goal of current robotics research is to develop robotic companions that establish a long-term relationship with a human user, because robot companions can be useful and beneficial tools. They could assist elderly people in daily tasks and enable them to live longer autonomously in their homes, help disabled people in their environments, or keep patients engaged during the rehabilitation process. A common problem is that a new technology is exciting at the beginning, but this effect wears off especially when it comes to tasks like boring and repetitive exercise in rehabilitation. The development and implementation of uniquely humanlike abilities in robots like theory of mind, emotion and empathy is considered to have the potential to solve this dilemma.”

The scientists present their findings at the 63rd Annual International Communication Association conference in London in June.

Filed under robots empathy brain scans fMRI human-robot interaction neuroscience science

158 notes

DARPA Looks To New Form Of Computation That Mimics The Human Brain

The next frontier for the robotics industry has always been to build machines that think like humans. Scientists have pursued that elusive goal for decades, and some now believe they are extremely close to achieving it.

Now, a Pentagon-funded team of researchers has constructed a tiny machine that might allow robots to act independently.

Compared to traditional artificial intelligence systems that rely on conventional computer programming, this one “looks and ‘thinks’ like a human brain,” said James K. Gimzewski, professor of chemistry at the University of California, Los Angeles.

Gimzewski is a member of the team that has been working under sponsorship of the Defense Advanced Research Projects Agency (DARPA) on a program called Physical Intelligence.

The stated objective of the program is: “The analysis domain is to develop analytical tools to support the development of human-engineered physically intelligent systems and to understand physical intelligence in the natural world”.

This technology could be the secret to making robots that are truly autonomous, Gimzewski said during a conference call hosted by Technolink, a Los Angeles-based industry group.

Gimzewski says his project does not use standard robot hardware with integrated circuitry. The device that his team constructed is capable, without being programmed like a traditional robot, of performing actions similar to humans.

What sets this new device apart from any others is that it has nano-scale interconnected wires that perform billions of connections like a human brain, and is capable of remembering information, Gimzewski said. Each connection is a synthetic synapse. A synapse is what allows a neuron to pass an electric or chemical signal to another cell. Because its structure is so complex, most artificial intelligence projects so far have been unable to replicate it.

“Physical Intelligence” devices would not require a human controller the way a robot does, said Gimzewski. The applications of this technology for the military would be far reaching.

An aircraft, for example, would be able to learn and explore the terrain and work its way through the environment without human intervention, he said. These machines would be able to process information in ways that would be unimaginable with current computers.

Artificial intelligence research over the past five decades has not been able to generate human-like reasoning or cognitive functions, said Gimzewski. DARPA’s program is the most ambitious he has seen to date. “It’s an off-the-wall approach,” he added.

Studies of the brain have shown that one of its key traits is self-organization. “That seems to be a prerequisite for autonomous behavior,” he said. “Rather than move information from memory to processor, like conventional computers, this device processes information in a totally new way.” This could represent a revolutionary breakthrough in robotic systems, said Gimzewski.
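Gimzewski's description of connections that strengthen with use and retain information resembles, at a very high level, a Hebbian learning rule. The sketch below is an illustrative analogy with invented constants, not a model of the nanowire device's actual mechanism:

```python
# Hebbian-style synapse: the connection strengthens when the units on
# both sides are active together, and slowly decays otherwise. This is
# an analogy for "learning by use", not the DARPA device itself.
LEARNING_RATE = 0.1
DECAY = 0.01

def update_synapse(weight, pre_active, post_active):
    """One update step for a synthetic synapse weight in [0, 1]."""
    if pre_active and post_active:
        return weight + LEARNING_RATE * (1.0 - weight)  # strengthen toward 1
    return weight * (1.0 - DECAY)                       # passive decay

w = 0.5
for _ in range(10):   # repeated co-activation strengthens the link
    w = update_synapse(w, True, True)
```

The key contrast with conventional computing is that the "memory" (the weight) and the "processing" (the update) live in the same element, which is the property Gimzewski highlights.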

Filed under brain robotics robots autonomous robots AI physical intelligence neuroscience science

64 notes

Helpful for robotics: brain uses old information for new movements

Information from the senses has an important influence on how we move. For instance, you can see and feel when a mug is filled with hot coffee, and you lift it in a different way than if the mug were empty. Neuroscientist Julian Tramper discovered that the brain uses two forms of old information in order to execute new movements well. This discovery can be useful for the field of robotics. Tramper will receive his doctorate on Thursday 24 April from Radboud University Nijmegen.

Every time you move, the brain deals with two problems. First, there is a slight delay in the sensory information needed to execute the movement. Second, the command from the brain directing the muscles to move is not entirely clear, because neuronal signals contain a certain amount of natural static interference. According to Tramper, the brain has a clever way of getting around both problems: It combines the old information from the senses with experience gained through similar movements made in the past. This means that the brain uses two forms of old information in order to make new movements.
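Combining a delayed, noisy sensory reading with a prediction built from past experience is commonly modeled as a predict-and-correct step, as in a Kalman filter. A minimal one-dimensional sketch (the gains and noise levels are illustrative, not taken from Tramper's thesis):

```python
# One-dimensional predict/correct step: an internal model (built from
# past movements) predicts the hand's position, and delayed, noisy
# sensory feedback corrects it in proportion to its reliability.
def fuse(prediction, pred_var, measurement, meas_var):
    """Reliability-weighted combination of prediction and measurement."""
    gain = pred_var / (pred_var + meas_var)   # how much to trust the senses
    estimate = prediction + gain * (measurement - prediction)
    variance = (1.0 - gain) * pred_var
    return estimate, variance

# Internal model: hand at 10.0 cm (variance 1.0); delayed vision says
# 12.0 cm but is noisier (variance 3.0):
estimate, variance = fuse(10.0, 1.0, 12.0, 3.0)
print(estimate)   # 10.5: pulled only a quarter of the way toward vision
```

The noisier the sensory channel, the smaller the gain, so the estimate leans on experience; that is the trade-off the thesis attributes to the brain.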

Computer versus test subject
Understanding the brain processes behind movement can be of great importance to fields like robotics. Therefore Tramper is trying to model his findings so that it will be possible to use them in robots in the future. He has already succeeded in this for certain hand-eye coordination experiments, to the extent that a computer can perform at about the same level as human test subjects. As a post-doctoral researcher within the Donders Institute, Tramper is researching how these types of models can be integrated into bio-inspired robots (robots based on biological principles).

SpaceCog
Tramper is currently working on a project called SpaceCog. The goal of this project is to develop a robot which can independently orient itself in space, something that humans do automatically. This is difficult to achieve, because each time a robot moves, it must reinterpret the information from its cameras and other sensors in order to determine whether the changes to its input are the result of its own movement or an external cause. The researchers involved in SpaceCog want to figure out how our brain has solved this problem. Tramper has three years to come up with a good computer model addressing this issue.
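One standard way to attack the self-versus-external problem SpaceCog describes is a forward model: predict the sensory change your own motor command should cause, subtract it, and attribute any residual to the outside world. A toy one-dimensional sketch (all numbers invented):

```python
# Forward model for self/external attribution: panning the camera right
# should shift the visual scene left by the same amount; any residual
# shift is attributed to something moving in the world.
def classify_shift(motor_command, observed_shift, tolerance=0.1):
    """Label a sensed scene shift as self-caused or externally caused."""
    predicted_shift = -motor_command            # self-motion prediction
    residual = observed_shift - predicted_shift
    return "self" if abs(residual) <= tolerance else "external"

print(classify_shift(motor_command=2.0, observed_shift=-2.0))  # self
print(classify_shift(motor_command=2.0, observed_shift=-1.2))  # external
```

The tolerance stands in for sensor noise; a real system would have to set it from the reliability of its cameras and odometry.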

Looking towards the future
Tramper is studying hand-eye coordination by having test subjects play a special computer game. The subjects use a game controller to move a digital right hand and left hand on a screen. They have to move the two hands independently of one another and make them each follow a particular path in order to reach a final destination (see film 1). It turned out that the test subject’s eyes moved ahead of the digital hands. In other words, the eyes looked at a point that the hands would reach in the future (see film 2). This type of eye movement is called smooth pursuit, and before now it had only been detected in the case of external stimuli, when a subject was following an object’s movement. Tramper detected smooth pursuit eye movements at locations the hands had not yet reached, meaning these movements were triggered by internal stimuli.

Smooth pursuit
Tramper explains, ‘We’d previously demonstrated for other types of eye movement that the eye anticipates and moves in advance of external movement. To our surprise, this is also the case with smooth pursuit. It is probable that this is a compromise between where you are at a particular moment and where you want to get to. When moving, you need to keep track of your current location (which is constantly changing) and your target destination. Smooth pursuit eye movements can help you do this by letting your eye “hover” between both locations. If we can teach robots to do something like this, it will help make their movements much more natural. This will increase the number of ways in which robots can be put to work.’

(Source: ru.nl)

Filed under sensory information robots robotics motor movements hand-eye coordination SpaceCog neuroscience science

263 notes

State science fair winner creates robot

The winner of this year’s State Science and Engineering Fair is from South Florida, and her project could someday make life easier for the physically challenged.

"It captures the brain waves of electrochemical activity. Basically, the nerve impulse produced by the brain, and it sends it over to the robot," said Daniela Rodriguez.

Steve is an award-winning robot controlled by brain waves. He was invented by 13-year-old Daniela Rodriguez, who loves math and science. “I’ve always been interested in robotics; it’s my passion,” she said.

This year, Rodriguez won first place in the Annual State Science and Engineering Fair against 900 other finalists.

Rodriguez’ goal is to help people. “If the person is disabled, they can sit in their wheelchair, and they can use their thoughts and brain waves to control its movements, so they don’t have to move,” she said.

Her science project comes from the heart. Her mother was diagnosed with multiple sclerosis in 1996, and she is trying to find a way to keep her mom independent. “I work really hard to try to stay mobile, but the fact that she wants to help patients dealing with this illness is just a Godsend,” said Rodriguez’ mom, Jeannie.

Rodriguez wants to one day use her technology to help paralyzed people. Steve’s technology could even give wounded veterans the ability to use their brains to move the robot. “To help them move around in their wheelchairs or move their prosthetics, because usually prosthetics now is just the muscle movement, but now it can be used and be more natural. It’s moving by your brain,” said Rodriguez.

Not only is Rodriguez winning awards; prosthetic companies have also expressed interest in her program.
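The article does not describe how Steve's control pipeline works. One common hobbyist pattern with consumer EEG headsets is to threshold a single derived signal, sketched here purely as a hypothetical illustration; none of these details come from Rodriguez's project:

```python
# Hypothetical brain-wave control loop: a consumer headset reports a
# normalized "attention" value, and crossing a threshold drives the
# robot. The signal name and threshold are invented for illustration.
def brainwave_to_command(attention, threshold=0.6):
    """Turn a 0-to-1 attention reading into a simple robot command."""
    return "forward" if attention >= threshold else "stop"

print(brainwave_to_command(0.8))  # forward
print(brainwave_to_command(0.3))  # stop
```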

Filed under brain brainwaves robots robotics Steve prosthetics neuroscience science

77 notes

Sugar Cube-Sized Robotic Ants Mimic Real Foraging Behavior

For ants, the pheromone-laden foraging trails they leave behind are like lifelines: they direct the workers toward food hubs discovered earlier and help guide them back home to their nest.

These networks of trails can stretch for hundreds of feet, quite an achievement considering many worker ants are less than half an inch in length. One type of harvester ant can lay down a set of trails that stretch 82 feet from the entrance of its nest. The trails of a wood ant, an insect measuring just five millimeters (one-fifth of an inch), reach 656 feet, each one branching out into more pathways at up to 10 spots on each trail. The leafcutter ant can build a network that spreads for almost two and a half acres.

Ant species such as these tend to take the shortest path between their colony’s nest and a food source, following branches that stray as little as possible from the direction in which they began their journey. The forks in their network of trails, known as bifurcations, are not symmetrical and don’t branch out into angles of the same size. But do ants use a sophisticated sense of geometry to trace their path, measuring the angles of the roads before picking one?

To learn more, researchers at the New Jersey Institute of Technology (NJIT) and the Research Centre on Animal Cognition in France used miniature robots to replicate the behavior of a colony of Argentine ants on the move, reported today in the journal PLOS Computational Biology. This ant species has extremely poor eyesight and darts around at high speeds, yet it can maneuver through corridor after corridor, from home to food and vice versa.
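The branch-choice behavior described above, following the fork that strays least from the incoming direction, can be written as a one-line selection rule. The angles below are invented for illustration and are not from the PLOS Computational Biology study:

```python
# Pick the branch of a bifurcation that deviates least from the
# traveler's current heading. Angles are in degrees, measured on a
# standard 0-360 compass.
def choose_branch(heading_deg, branch_angles_deg):
    """Return the branch direction closest to the current heading."""
    def deviation(angle):
        d = abs(angle - heading_deg) % 360
        return min(d, 360 - d)   # smallest rotation between directions

    return min(branch_angles_deg, key=deviation)

# Heading "north" (90 degrees) into an asymmetric bifurcation:
print(choose_branch(90, [60, 150]))   # 60: only 30 degrees off course
```

Note that no explicit angle measurement by the animal is required if the trail geometry itself biases which branch a fast, nearly blind walker falls into; that is the kind of question the robot colony can separate from true geometric reasoning.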

Sugar Cube-Sized Robotic Ants Mimic Real Foraging Behavior

For ants, the pheromone-laden foraging trails they leave behind are like lifelines: they direct the workers toward food hubs discovered earlier and help guide them home back to their nest.

These networks of trails can stretch for hundreds of feet, quite the achievement considering many worker ants are less than half an inch in length. One type of harvester ant can lay down a set of trails (PDF) that stretch 82 feet from the entrance of its nest. The trails of a wood ant, an insect measuring just five millimeters (that’s one-fifth of an inch), reach 656 feet, each one branching out into more pathways at up to 10 spots on each trail. The leafcutter ant can build a network that spreads for almost two and a half acres.

Ant species such as these tend to take the shortest path between their colony’s nest and a food source, following branches that stray as little as possible from the direction in which they began their journey. The forks in their network of trails, known as bifurcations, are not symmetrical and don’t branch out into angles of the same size. But do ants use a sophisticated sense of geometry to trace their path, measuring the angles of the roads before picking one?

To learn more, researchers at the New Jersey Institute of Technology (NJIT) and the Research Centre on Animal Cognition in France used miniature robots to replicate the behavior of a colony of Argentine ants on the move, in a study reported today in the journal PLOS Computational Biology. This ant species has extremely poor eyesight and darts around at high speed, yet it can maneuver through corridor after corridor, from home to food and back.
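A trail-following agent of the kind described above can be sketched as a simple heuristic: at each bifurcation, probabilistically favor the branch that deviates least from the current heading. The snippet below is a hypothetical illustration; the exponential weighting and the `sharpness` parameter are assumptions made for the sketch, not the study's actual controller.

```python
import math
import random

def choose_branch(heading, branch_angles, sharpness=4.0):
    """At a bifurcation, pick a branch, favoring the one whose direction
    deviates least from the agent's current heading (angles in radians)."""
    def deviation(angle):
        # smallest angular difference on the circle
        d = abs(angle - heading) % (2 * math.pi)
        return min(d, 2 * math.pi - d)
    weights = [math.exp(-sharpness * deviation(a)) for a in branch_angles]
    return random.choices(branch_angles, weights=weights, k=1)[0]

# An asymmetric fork: one branch nearly straight ahead, one sharply off-axis.
straight, sharp = 0.1, 1.2
picks = [choose_branch(0.0, [straight, sharp]) for _ in range(10_000)]
print(picks.count(straight) / len(picks))  # the straighter branch wins most of the time
```

No explicit angle measurement is required for the shortest route to emerge: if each agent merely biases its choice toward "straight ahead" at every fork, traffic between nest and food concentrates on the least-deviating path, which is the behavior the article describes.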

Filed under robots robotics foraging trail networks ants colony behavior navigation skills alice neuroscience science

244 notes

Robot-Delivered Speech and Physical Therapy

In one of the earliest experiments using a humanoid robot to deliver speech and physical therapy to a stroke patient, researchers at the University of Massachusetts Amherst saw notable speech and physical therapy gains and significant improvement in quality of life.

Regarding the overall outcome, speech-language pathologist and study leader Yu-kyong Choe says, “It’s clear from our study of a 72-year-old male stroke client that a personal humanoid robot can help people recover by delivering therapy such as word-retrieval games and arm movement tasks in an enjoyable and engaging way.”

A major focus of this case study was to assess how therapy interventions in one domain, speech, affected interventions in another, physical therapy, in two different delivery scenarios. Despite the importance of working with other professionals, the authors point out, until now it has been “largely unknown how interventions by one type of therapy affects progress in others.”

The client, who has aphasia and physical disability on one side, completed two robot-mediated programs: in the sole condition, five weeks of speech therapy alone followed by five weeks of physical therapy alone; in the sequential condition, five weeks of back-to-back speech and physical therapy sessions.

Over the course of the experiment, the client made “notable gains in the frequency and range of the upper-limb movements,” the authors say. He also made positive gains in verbal expression. Interestingly, his improvements in speech and physical function were much greater when he engaged in only one therapy than when the two therapies were paired in sessions immediately following each other. The authors summarize that in such a sequential schedule “speech and physical functions seemed to compete for limited resources” in the brain. Their work is described in the current issue of the journal Aphasiology.

Choe and computer science researcher and robot expert Rod Grupen, director of the Laboratory for Perceptual Robotics at UMass Amherst, are in the second year of a $109,251 grant from the American Heart Association to investigate the effect of stroke rehabilitation delivered by a humanoid robot, uBot-5. It is a child-sized unit with arms and a computer screen through which therapists interact with the client.

Choe, Grupen and colleagues are seeking ways to bring more and longer-term therapy and social contact to people recovering from stroke. An estimated 3 million Americans experience the debilitating effects of stroke every day. Yet even after years, stroke survivors can recover significant function with intensive rehabilitation, says Choe. The bad news is that such rehabilitation is rarely available or accessible, owing to a shortage of therapists and a lack of coverage for long-term treatment. Many people are left with chronic low function, which can lead to social isolation and depression.

While some may object to robots delivering therapy, the need is great and far from being met today, especially in rural areas, Grupen and Choe point out. They hope to supplement human-to-human interaction, with a robot temporarily taking the therapist’s place. Grupen says, “In addition to improving quality of life, if we can support a client in the home so they can delay institutionalization, we can improve outcomes and make a huge impact on the cost of elder care. There are 70 million baby boomers beginning to retire now.”

“Stroke rehabilitation is such a monumental financial problem everywhere in the world, that’s where it can pay for itself,” he adds. “A personal robot could save billions of dollars in elder care while letting people stay in their own homes and communities. We’re hoping for a win-win where our elders live better, more independent and productive lives and our overtaxed healthcare resources are used more effectively.”

Filed under robots robotics humanoids stroke speech therapy aphasia neuroscience science

253 notes

Humanoid robot helps train children with autism

“Aiden, look!” piped NAO, a two-foot tall humanoid robot, as it pointed to a flat-panel display on a far wall. As the cartoon dog Scooby Doo flashed on the screen, Aiden, a young boy with an unruly thatch of straw-colored hair, looked in the direction the robot was pointing.

Aiden, who is three and a half years old, has been diagnosed with autism spectrum disorder (ASD). NAO (pronounced “now”) is the diminutive “front man” for an elaborate system of cameras, sensors and computers designed specifically to help children like Aiden learn how to coordinate their attention with other people and objects in their environment. This basic social skill is called joint attention. Typically developing children learn it naturally. Children with autism, however, have difficulty mastering it and that inability can compound into a variety of learning difficulties as they age.
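A system like the one described can be pictured as a closed loop: prompt the child, check where the child is looking, and escalate the prompt if there is no response. The sketch below is hypothetical; the prompt levels and the `run_trial` helper are loose inventions based on the description above, not Vanderbilt's actual software.

```python
# Escalating prompt levels, from least to most supportive (hypothetical).
PROMPTS = [
    "verbal cue",
    "verbal cue + pointing gesture",
    "verbal cue + pointing + screen flash",
]

def run_trial(child_responds_at):
    """Escalate prompts until the (simulated) child orients to the target.
    `child_responds_at` is the prompt level that succeeds, or None if the
    child never orients during this trial."""
    used = []
    for level, prompt in enumerate(PROMPTS):
        used.append(prompt)
        if child_responds_at is not None and level >= child_responds_at:
            return {"success": True, "prompts_used": used}
    return {"success": False, "prompts_used": used}

print(run_trial(1))     # orients once the pointing gesture is added (2 prompts used)
print(run_trial(None))  # never orients; all three prompt levels are exhausted
```

In the study's setup, the "did the child orient" check would come from the cameras and sensors tracking where the child looks; adaptivity means a child who responds to subtle cues keeps receiving subtle cues, while a child who does not gets progressively stronger support.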

An interdisciplinary team of mechanical engineers and autism experts at Vanderbilt University has developed the system and used it to demonstrate that robotic systems may be powerful tools for enhancing the basic social learning skills of children with ASD. Writing in the March issue of the IEEE Transactions on Neural Systems and Rehabilitation Engineering, the researchers report that children with ASD paid more attention to the robot and followed its instructions almost as well as they did those of a human therapist in standard exercises used to develop joint attention skills.

The finding indicates that robots could play a crucial role in responding to the “public health emergency” created by the rapid growth in the number of children being diagnosed with ASD. Today, one in 88 children (one in 54 boys) is diagnosed with ASD, a 78 percent increase in just four years. The trend has major implications for the nation’s healthcare budget, because estimates of the lifetime cost of treating ASD patients range from four to six times greater than for patients without autism.
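Those prevalence figures can be sanity-checked with a little arithmetic: if 1 in 88 represents a 78 percent increase over four years, the implied rate four years earlier was roughly 1 in 157.

```python
current_rate = 1 / 88        # 1 in 88 children diagnosed with ASD today
increase = 0.78              # a 78 percent rise over four years
previous_rate = current_rate / (1 + increase)
print(round(1 / previous_rate))  # prints 157: about 1 in 157 four years earlier
```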

“This is the first real world test of whether intelligent adaptive systems can make an impact on autism,” said team member Zachary Warren, who directs the Treatment and Research Institute for Autism Spectrum Disorders (TRIAD) at Vanderbilt’s Kennedy Center.

Filed under robots robotics humanoids ASD autism NAO joint attention neuroscience science

336 notes

Brainless robots swarm just like animals

Swarming patterns and herding behaviours have been observed throughout the animal kingdom. Scientists and mathematicians have long pondered the complex relationships and group dynamics that allow schools of fish, such as herring, and flocks of birds, such as starlings, to move together in apparent unity. Now, in an interesting twist to the discussion, a team of engineers from Harvard University has observed apparent collective behaviour in brainless robots.

The robot research team was looking for a way to investigate the transition that swarming groups make from random behaviour into collective motion. In order to observe a randomly moving collective, they built the simplest of “self-propelled automatons”, the charmingly named Bristle-Bots (BBots).
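The BBots coordinate mechanically, through collisions and interactions with the arena walls, with no programmed rule at all. As a loose illustration of how purely local interactions can tip a group from random motion into collective motion, here is a minimal self-propelled-particle simulation in the spirit of the Vicsek model. This is an assumption chosen for illustration, not the Harvard team's model: every parameter below is invented, and alignment here is an explicit rule rather than a physical collision.

```python
import math
import random

def order_parameter(angles):
    """Mean alignment of headings: near 1 = collective, near 0 = random."""
    n = len(angles)
    vx = sum(math.cos(a) for a in angles) / n
    vy = sum(math.sin(a) for a in angles) / n
    return math.hypot(vx, vy)

def simulate(noise, n=100, steps=150, box=5.0, radius=1.0, speed=0.1, seed=0):
    """Vicsek-style update: each particle adopts the average heading of its
    neighbors (within `radius`, in a periodic box) plus uniform noise,
    then moves one step forward."""
    rng = random.Random(seed)
    xs = [rng.uniform(0, box) for _ in range(n)]
    ys = [rng.uniform(0, box) for _ in range(n)]
    th = [rng.uniform(-math.pi, math.pi) for _ in range(n)]
    for _ in range(steps):
        new_th = []
        for i in range(n):
            sx = sy = 0.0
            for j in range(n):
                # minimal-image distance in the periodic box
                dx = (xs[j] - xs[i] + box / 2) % box - box / 2
                dy = (ys[j] - ys[i] + box / 2) % box - box / 2
                if dx * dx + dy * dy < radius * radius:
                    sx += math.cos(th[j])
                    sy += math.sin(th[j])
            new_th.append(math.atan2(sy, sx) + rng.uniform(-noise, noise))
        th = new_th
        for i in range(n):
            xs[i] = (xs[i] + speed * math.cos(th[i])) % box
            ys[i] = (ys[i] + speed * math.sin(th[i])) % box
    return order_parameter(th)

low = simulate(noise=0.1)   # weak noise: headings lock together
high = simulate(noise=3.0)  # strong noise: motion stays nearly random
print(low, high)
```

Sweeping `noise` (or particle density) reveals the same kind of random-to-collective transition the team set out to study, here driven by an explicit alignment rule rather than by the BBots' physical collisions.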


Filed under swarming bristle-bots robots robotics animal cognition technology neuroscience science
