Posts tagged humanoids

Softbank’s Pepper Robot Makes Emotional Debut in Japan
Japanese telecommunications giant Softbank Corp. on Thursday unveiled a new humanoid robot named Pepper, which the company claimed can identify human emotions and respond to them.

Robot-Delivered Speech and Physical Therapy
In one of the earliest experiments using a humanoid robot to deliver speech and physical therapy to a stroke patient, researchers at the University of Massachusetts Amherst saw notable speech and physical therapy gains and significant improvement in quality of life.
Regarding the overall outcome, speech language pathologist and study leader Yu-kyong Choe says, “It’s clear from our study of a 72-year-old male stroke client that a personal humanoid robot can help people recover by delivering therapy such as word-retrieval games and arm movement tasks in an enjoyable and engaging way.”
A major focus of this case study was to assess how therapy interventions in one domain, speech, affected interventions in another, physical therapy, in two different delivery scenarios. Despite the importance of working with other professionals, the authors point out, until now it has been “largely unknown how interventions by one type of therapy affects progress in others.”
The client, who had aphasia and physical disability on one side, completed a robot-mediated program under two conditions: in the sole condition, five weeks of speech therapy alone followed by five weeks of physical therapy alone; in the sequential condition, five weeks of back-to-back speech and physical therapy sessions.
Over the course of the experiment, the client made “notable gains in the frequency and range of the upper-limb movements,” the authors say. He also made positive gains in verbal expression. Interestingly, his improvements in speech and physical function were much greater when he engaged in only one therapy than when the two therapies were paired in sessions immediately following each other. The authors summarize that in such a sequential schedule “speech and physical functions seemed to compete for limited resources” in the brain. Their work is described in the current issue of the journal Aphasiology.
Choe and computer science researcher and robot expert Rod Grupen, director of the Laboratory for Perceptual Robotics at UMass Amherst, are in the second year of a $109,251 grant from the American Heart Association to investigate the effect of stroke rehabilitation delivered by a humanoid robot, uBot-5. It is a child-sized unit with arms and a computer screen through which therapists interact with the client.
Choe, Grupen and colleagues are seeking ways to bring more and longer-term therapy and social contact to people recovering from stroke. An estimated 3 million Americans live every day with the debilitating effects of stroke. But even after years, they can recover significant function with intensive rehabilitation, says Choe. The bad news is that such rehabilitation is rarely available or accessible, due to a shortage of therapists and lack of coverage for long-term treatment. Many people are left with chronic low function, which can lead to social isolation and depression.
While some may object to robots delivering therapy, the need is great and definitely not being met now, especially in rural areas, Grupen and Choe point out. They hope to aid human-to-human interaction, so a robot can temporarily take the therapist’s place. Grupen says, “In addition to improving quality of life, if we can support a client in the home so they can delay institutionalization, we can improve outcomes and make a huge impact on the cost of elder care. There are 70 million baby boomers beginning to retire now.”
“Stroke rehabilitation is such a monumental financial problem everywhere in the world, that’s where it can pay for itself,” he adds. “A personal robot could save billions of dollars in elder care while letting people stay in their own homes and communities. We’re hoping for a win-win where our elders live better, more independent and productive lives and our overtaxed healthcare resources are used more effectively.”
Humanoid robot helps train children with autism
“Aiden, look!” piped NAO, a two-foot tall humanoid robot, as it pointed to a flat-panel display on a far wall. As the cartoon dog Scooby Doo flashed on the screen, Aiden, a young boy with an unruly thatch of straw-colored hair, looked in the direction the robot was pointing.
Aiden, who is three and a half years old, has been diagnosed with autism spectrum disorder (ASD). NAO (pronounced “now”) is the diminutive “front man” for an elaborate system of cameras, sensors and computers designed specifically to help children like Aiden learn how to coordinate their attention with other people and objects in their environment. This basic social skill is called joint attention. Typically developing children learn it naturally. Children with autism, however, have difficulty mastering it and that inability can compound into a variety of learning difficulties as they age.
An interdisciplinary team of mechanical engineers and autism experts at Vanderbilt University has developed the system and used it to demonstrate that robotic systems may be powerful tools for enhancing the basic social learning skills of children with ASD. Writing in the March issue of the IEEE Transactions on Neural Systems and Rehabilitation Engineering, the researchers report that children with ASD paid more attention to the robot and followed its instructions almost as well as they did those of a human therapist in standard exercises used to develop joint attention skills.
The finding indicates that robots could play a crucial role in responding to the “public health emergency” that has been created by the rapid growth in the number of children being diagnosed with ASD. Today, one in 88 children (one in 54 boys) is diagnosed with ASD, a 78 percent increase in just four years. The trend has major implications for the nation’s healthcare budget, because the estimated lifetime cost of treating a patient with ASD is four to six times greater than for a patient without autism.
“This is the first real world test of whether intelligent adaptive systems can make an impact on autism,” said team member Zachary Warren, who directs the Treatment and Research Institute for Autism Spectrum Disorders (TRIAD) at Vanderbilt’s Kennedy Center.
![The iCub humanoid robot](http://41.media.tumblr.com/c403540193bd571984867d237b9495ef/tumblr_miitc4e0oJ1rog5d1o1_500.jpg)
“Simplified” brain lets the iCub robot learn language
The iCub humanoid robot on which the team directed by Peter Ford Dominey, CNRS Director of Research at Inserm Unit 846, the “Institut pour les cellules souches et cerveau de Lyon” [Lyon Institute for Stem Cell and Brain Research] (Inserm, CNRS, Université Claude Bernard Lyon 1), has been working for many years can now understand what is being said to it and even anticipate the end of a sentence. This feat was made possible by the development of a “simplified artificial brain” that reproduces certain types of so-called “recurrent” connections observed in the human brain. The artificial brain system enables the robot to learn, and subsequently understand, new sentences containing a new grammatical structure. It can link two sentences together and even predict how a sentence will end before it is finished. This research has been published in the journal PLOS ONE.
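The article does not give the model’s details, but networks with fixed “recurrent” connections and a trained readout are often called reservoir computing. The toy sketch below illustrates that general idea only, and is not the authors’ actual model: a fixed random recurrent layer provides memory of the word sequence, and only a linear readout is trained to predict the next word. The sentences, sizes and weights are all invented for illustration.

```python
import numpy as np

# Toy "reservoir": fixed random recurrent weights give the network memory
# of past words; only a linear readout is trained. (Illustration only.)
rng = np.random.default_rng(0)

sentences = [
    "the robot sees the ball".split(),
    "the robot grasps the ball".split(),
]
vocab = sorted({w for s in sentences for w in s})
idx = {w: i for i, w in enumerate(vocab)}

n_in, n_res = len(vocab), 50
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W_res = rng.uniform(-0.5, 0.5, (n_res, n_res))
W_res *= 0.9 / max(abs(np.linalg.eigvals(W_res)))  # keep dynamics stable

def run(sentence):
    """Feed words one at a time; collect the reservoir state after each."""
    x, states = np.zeros(n_res), []
    for w in sentence:
        u = np.zeros(n_in)
        u[idx[w]] = 1.0
        x = np.tanh(W_in @ u + W_res @ x)
        states.append(x.copy())
    return states

# Train the readout by least squares: state at word t -> one-hot of word t+1.
X, Y = [], []
for s in sentences:
    st = run(s)
    for t in range(len(s) - 1):
        X.append(st[t])
        y = np.zeros(n_in)
        y[idx[s[t + 1]]] = 1.0
        Y.append(y)
W_out = np.linalg.lstsq(np.array(X), np.array(Y), rcond=None)[0]

# Anticipate the word that completes "the robot sees the ..."
state = run("the robot sees the".split())[-1]
pred = vocab[int(np.argmax(state @ W_out))]
print(pred)
```

The key design point mirrored from the research summary is that the recurrent connections themselves are never trained; they simply echo the sentence history, which is what lets the readout anticipate how a sentence will end.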
Robovie talking robot joins science class at Higashihikari Elementary School in Japan
Robovie, a 1.2-meter robot developed by ATR, joined the science class at Higashihikari Elementary School in Japan on Feb. 5 for the start of a 14-month experiment. Data will be gathered to improve the robot’s ability to interact naturally with multiple people. The robot has been given facial photos and voiceprints of 119 fifth graders and teachers. On the first day of class, Robovie greeted the students and was asked by a teacher to explain what a “wound-up copper wire” was. It answered, “A copper coil. It’s part of the motors that move my body.” During class Robovie waited at the back of the room, recognizing the students’ faces and recording their movements. After class it shook hands with sixth graders and answered their questions.
As part of research into the co-existence of humans and robots, the experiment is being carried out at a school because the environment allows for the acquisition of large amounts of data from the movements of the children. Robovie’s everyday conversation is at the level of a five-year-old child, but it has been programmed with the entire contents of a fifth-grade science textbook. This is the first experiment using a robot at a school to last over a year.
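The article does not describe how Robovie matches what it sees and hears against the enrolled photos and voiceprints. A common approach, sketched below purely as an assumption, is to compare feature embeddings against an enrolled gallery with a nearest-neighbor search; the names and random vectors here are made-up stand-ins for the output of a real face or voice feature extractor.

```python
import numpy as np

# Hypothetical identification sketch: compare a query embedding against
# enrolled embeddings (e.g. from face photos or voiceprints) by cosine
# similarity. Vectors are made-up stand-ins for real extractor output.
rng = np.random.default_rng(1)

enrolled = {name: rng.normal(size=128)
            for name in ["student_01", "student_02", "teacher_A"]}

def identify(query, gallery, threshold=0.8):
    """Return the best-matching enrolled name, or None if below threshold."""
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    best = max(gallery, key=lambda n: cos(query, gallery[n]))
    return best if cos(query, gallery[best]) >= threshold else None

# A slightly noisy re-observation of an enrolled person should still match.
noisy = enrolled["student_02"] + rng.normal(scale=0.1, size=128)
print(identify(noisy, enrolled))
```

The rejection threshold is what keeps such a system from mislabeling a visitor who was never enrolled: an unrelated embedding has near-zero similarity to every gallery entry, so the function returns None rather than the closest wrong name.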
'Bionic man' goes on show at British museum
A “bionic man” costing one million dollars went on display on Tuesday at Britain’s Science Museum, complete with artificial organs, synthetic blood and robot limbs.
Named Rex, which is short for “Robotic Exoskeleton”, the six foot six inch (two metre) humanoid with its uncannily life-like face was assembled by leading roboticists for a television programme.
Although cheaper than the “Six Million Dollar Man” made famous by the cult 1970s television series starring Lee Majors, the technology is far advanced from the fictional bionics on show back then.
The creation includes key advances in prosthetic technology, as well as an artificial pancreas, kidney, spleen and trachea and a functional blood circulatory system.
Welcoming Rex to the museum in London on Tuesday was Swiss social psychologist Bertolt Meyer, who was himself born without a left hand and has a sophisticated bionic replacement.
“I’ve looked around for new bionic technologies, out of personal interest, for a very long time and I think that until five or six years ago nothing much was happening,” Meyer said.
“Then suddenly we are now at a point where we can build a body that is great and beautiful in its own special way.”
The museum exhibit, which opens to the public on Thursday, will explore changing perceptions of human identity against the background of rapid progress in bionics—although Rex is not strictly bionic as he does not include living tissue.
Swiss aim to birth advanced humanoid in 9 months
Here’s a robotics challenge for you: create an advanced humanoid robot in only nine months.
That’s what engineers at the University of Zurich’s Artificial Intelligence Lab are trying to do with Roboy, a kid-style bot that’s designed to help people in everyday environments.
Researchers around the world are trying to create useful humanoids. One interesting aspect of Roboy is its tendon-driven locomotion system.
Like Japan’s Kenshiro humanoid, Roboy relies on artificial muscles to move; in the future, it will be covered with a soft skin.
Roboy could become a prototype for service robots that will help elderly people remain independent for as long as possible.
It’s based on an earlier, one-eyed machine called Ecce, which looks something like a cyclops version of Skeletor. It was designed to be “the first truly anthropomimetic robot.” Except for the eye, of course.
Already well along in its development (check out the video), Roboy is expected to be born in March 2013, when it will be unveiled at the Robots on Tour event in Zurich. The lab is seeking donations to fund the work, including branding opportunities.
If you have 50,000 Swiss francs ($55,000) lying around, you can get your logo on Roboy, and strike terror into the hearts of your enemies.
Japanese researchers build robot with most humanlike muscle-skeleton structure yet
Researchers at the University of Tokyo have taken another step towards creating a robot with a faithfully recreated human skeleton and muscle structure. Called Kenshiro, the robot has been demonstrated at the recent Humanoids 2012 conference in Osaka, Japan.
Artificially intelligent virtual gamer wins BotPrize
An artificially intelligent virtual gamer created by computer scientists at The University of Texas at Austin has won the BotPrize by convincing a panel of judges that it was more human-like than half the humans it competed against.
How non-verbal cues can predict a person’s (and a robot’s) trustworthiness
People face this predicament all the time—can you determine a person’s character in a single interaction? Can you judge whether someone you just met can be trusted when you have only a few minutes together? And if you can, how do you do it? Using a robot named Nexi, Northeastern University psychology professor David DeSteno and collaborators Cynthia Breazeal from MIT’s Media Lab and Robert Frank and David Pizarro from Cornell University have figured out the answer. The findings were recently published in Psychological Science, a journal of the Association for Psychological Science.
It’s What You’re Not Saying…
In the absence of reliable information about a person’s reputation, nonverbal cues can offer a look into a person’s likely actions. This concept has been known for years, but the cues that convey trustworthiness or untrustworthiness have remained a mystery. Collecting data from face-to-face conversations with research participants where money was on the line, DeSteno and his team realized that it’s not one single non-verbal movement or cue that determines a person’s trustworthiness, but rather sets of cues. When participants expressed these cues, they cheated their partners more, and, at a gut level, their partners expected it. “Scientists haven’t been able to unlock the cues to trust because they’ve been going about it the wrong way,” DeSteno said. “There’s no one golden cue. Context and coordination of movements is what matters.”
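The claim that no single cue, but rather a co-occurring set of cues, carries the signal can be illustrated with a toy example. The data below are invented, not the study’s: each individual cue only weakly predicts cheating, while the full cluster of cues together predicts it perfectly.

```python
import numpy as np

# Invented toy data (not from the study): each row is a conversation,
# columns are hypothetical binary cues: [leaned_back, crossed_arms, face_touch].
cues = np.array([
    [1, 0, 1],
    [0, 1, 1],
    [1, 1, 0],
    [0, 0, 0],
    [1, 1, 1],   # only when the whole cluster of cues co-occurs
    [1, 1, 1],   # does the interaction turn out untrustworthy
])
cheated = np.array([0, 0, 0, 0, 1, 1])

# Each single cue is only a weak predictor of cheating...
single_accs = []
for j, name in enumerate(["leaned_back", "crossed_arms", "face_touch"]):
    acc = float(np.mean(cues[:, j] == cheated))
    single_accs.append(acc)
    print(f"{name} alone: {acc:.2f} accuracy")

# ...but the conjunction of all three cues predicts perfectly.
combo_acc = float(np.mean(cues.all(axis=1).astype(int) == cheated))
print("all cues together:", combo_acc)
```

This is why, as DeSteno notes, hunting for one “golden cue” fails: the signal lives in the coordination of movements, which only a joint test over the set of cues can detect.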
Robots Have Feelings, Too
People are fidgety – they’re moving all the time. So how could the team truly zero in on the cues that mattered? This is where Nexi comes in. Nexi is a humanoid social robot that afforded the team an important benefit – they could control all its movements perfectly. In a second experiment, the team had research participants converse with Nexi for 10 minutes, much like they did with another person in the first experiment. While conversing with the participants, Nexi — operated remotely by researchers — either expressed cues that were considered less than trustworthy or expressed similar, but non-trust-related cues. Confirming their theory, the team found that participants exposed to Nexi’s untrustworthy cues intuited that Nexi was likely to cheat them and adjusted their financial decisions accordingly. “Certain nonverbal gestures trigger emotional reactions we’re not consciously aware of, and these reactions are enormously important for understanding how interpersonal relationships develop,” said Frank. “The fact that a robot can trigger the same reactions confirms the mechanistic nature of many of the forces that influence human interaction.”
Real-Life Application
This discovery has led the research team not only to answer enduring questions about if and how people are able to assess the trustworthiness of an unknown person, but also to show the human mind’s willingness to ascribe trust-related intentions to technological entities based on the same movements. “This is a very exciting result that showcases how social robots can be used to gain important insights about human behavior,” said Cynthia Breazeal of MIT’s Media Lab. “This also has fascinating implications for the design of future robots that interact and work alongside people as partners.” Accordingly, these findings hold important insights not only for security and financial endeavors but also for the evolving design of robots and computer-based agents: the subconscious mind is ready to see these entities as social beings.