Posts tagged robots

Brave New Machines
Robots are here to stay. They will be smarter, more versatile, more autonomous, and more like us in many ways. We humans will need to adapt to keep up.
The word “robot” was first used only about 80 years ago, in the play “R.U.R.” by the Czech author Karel Čapek. The robots in that play were artificial humans, chemically synthesized using appropriate formulas. Present and future robots will be made largely of inorganic materials, both mechanical and electronic, although some form of hybridization between electromechanical and biological subsystems is possible and will occur. I believe that the major developments in robotics over the next 100 years will be in the following areas:
Robot intelligence: The ability of a robot to solve problems, to learn, to interact with humans and other robots, and related skills are all measures of intelligence. Robots will indeed be increasingly intelligent, because:
- High-speed memory, long-term storage capacity, and the speed of on-board computers will continue to increase. Futurist Ray Kurzweil has predicted that the capacity of robot brains will exceed that of human brains within the next 20 years.
- Neuroscience is rapidly obtaining better and better models of the information processing ability of the human brain. These models will lead to the development of software to enable robot brains to emulate more and more of the features of the human brain.
- Research in learning will enable robots to learn by imitating humans, as well as from their own mistakes and successes.
Human-robot interaction: This is an area of significant research activity at the present time. I believe that during the coming decades robots will be able to interact with humans (and with each other) in increasingly human-like ways, including speech and gestures. Robots will be able to understand the semantic as well as the emotional aspects of speech, so that they will understand the significance of increasing loudness, irritation, affection, and other emotional aspects in spoken utterances, and they will be able to include these aspects in their own speech as well.
Gets stroke patients back on their feet
A robot is now being built to help stroke patients with training, motivation and walking.
In Europe, strokes are the most common cause of physical disability among the elderly. A stroke often results in paralysis of one side of the body, and many patients are left with greatly reduced mobility, often unable to walk on their own. These are the hard facts the EU project CORBYS has taken seriously. Researchers in six countries are currently developing a robotic system designed to help stroke patients re-train their bodies. The concept is a system consisting of a powered orthosis that helps the patient move his or her legs, mounted on a mobile platform that gives the patient mobility.
The CORBYS researchers are also working with the cognitive aspects. The aim is to enable the robot to interpret data from the patient and adapt the training programme to his or her capabilities and intention. This will bring rehabilitation robots to the next level.
Back to walking normally
It is vital to get stroke patients up on their feet as soon as possible. They must have frequent training sessions and re-learn how to walk so that they can function as well as possible on their own.
Why a robot? “Absolutely, because it is difficult to meet these requirements using today’s work-intensive manual method, in which two therapists assist the patient by lifting one leg after the other”, says ICT researcher Anders Liverud at SINTEF, one of the CORBYS project partners.
Robot-patient learning
CORBYS involves the use of physiological data such as heart rate, temperature and muscle activity measurements to provide feedback to the therapist and help control the robot. Do the patient’s legs always go where the patient wants? Is the patient getting tired or stressed?
“The walking robot has several settings, and the therapist selects the correct mode based on how far the patient has come in his or her rehabilitation”, says Liverud. “The first step is to attach sensors to the patient’s body and let them walk on a treadmill. A therapist manually corrects the walking pattern and, with the help of the sensors, creates a model of the patient’s walking pattern”, he says.
In the next mode, the system adjusts the walking pattern to the defined model. New adjustments are made and used to further optimise the walking pattern.
“The patient wears an EEG cap which measures brain activity”, says Liverud. “By using these signals combined with input from other physiological and system sensors, the robotic system registers whether the patient wants to stop, change speed or turn, and can adapt immediately”, he says. “The robot continues to correct any walking pattern errors. However, since it also allows the patient the freedom to decide where and how he or she walks, the patient experiences control and keeps motivation to continue with the training”, says Liverud.
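The loop Liverud describes — compare the measured gait against the patient’s reference model, apply gentle corrections, and let intent decoded from the EEG override everything — can be sketched very roughly. All names, gains and intent labels below are invented for illustration; the actual CORBYS controller is far more sophisticated.

```python
# Illustrative sketch only: a hypothetical gait-correction cycle in the
# spirit of the system described above.

def correct_gait(reference, measured, gain=0.5):
    """Nudge each joint toward the reference walking pattern."""
    return {joint: gain * (reference[joint] - measured[joint])
            for joint in reference}

def control_step(reference, measured, eeg_intent):
    # Intent decoded from the EEG cap overrides pattern correction.
    if eeg_intent == "stop":
        return {joint: 0.0 for joint in reference}
    return correct_gait(reference, measured)

reference = {"hip": 30.0, "knee": 60.0}   # target joint angles (degrees)
measured = {"hip": 28.0, "knee": 65.0}    # sensor readings this cycle
print(control_step(reference, measured, eeg_intent="walk"))
# {'hip': 1.0, 'knee': -2.5}
```

The point of the sketch is only the priority ordering: decoded intent is honoured immediately, while pattern correction runs continuously underneath it.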
Working with Europe
The European researchers have now completed specification of the system and its components, and construction of the robot is underway.
Construction involves a large team. The University of Bremen is heading the project and developing the architecture to integrate all system modules, and German wheelchair, orthosis and robotics experts are constructing the mechanical components, while two UK universities are working with cognitive aspects. Spanish specialists are addressing brain activity measurements and the University of Brussels is looking into robot control. SINTEF is working with the sensors and the final functional integration of the system. In a year’s time construction will be completed and the robot will be tested on stroke patients at rehabilitation institutes in Slovenia and Germany. The CORBYS project has a total budget of EUR 8.7 million.

Japan’s Robot Suit Gets Global Safety Certificate
A robot suit that can help the elderly or disabled get around was given its global safety certificate in Japan on Wednesday, paving the way for its worldwide rollout.
The Hybrid Assistive Limb, or HAL, is a power-assisted pair of legs developed by Japanese robot maker Cyberdyne, which has also developed similar robot arms.
A quality assurance body issued the certificate based on a draft version of an international safety standard for personal robots that is expected to be approved later this year, the ministry for the economy, trade and industry said.
The metal-and-plastic exoskeleton has become the first nursing-care robot certified under the draft standard, a ministry official said.
Battery-powered HAL, which detects muscle impulses to anticipate and support the user’s body movements, is designed to help the elderly with mobility or help hospital or nursing carers to lift patients.
Cyberdyne, based in Tsukuba, northeast of Tokyo, has so far leased some 330 suits to 150 hospitals, welfare and other facilities in Japan since 2010, at 178,000 yen ($1,950) per suit per year.
"It is very significant that Japan has obtained this certification before others in the world," said Yoshiyuki Sankai, the head of Cyberdyne.
The company is unrelated to the firm of the same name responsible for the cyborg assassin played by Arnold Schwarzenegger in the 1984 film “The Terminator”.
"This is a first step forward for Japan, the great robot nation, to send our message to the world about robots of the future," said Sankai, who is also a professor at Tsukuba University.
A different version of HAL — coincidentally the name of the evil supercomputer in Stanley Kubrick’s “2001: A Space Odyssey” — has been developed for workers who need to wear heavy radiation protection as part of the clean-up at the crippled Fukushima nuclear plant.
Industrial robots have long been used in Japan, and robo-suits are gradually making inroads into hospitals and retirement homes.
But critics say the government has been slow in creating a safety framework for such robots in a country whose rapidly-ageing population is expected to enjoy ever longer lives.
Researchers build robotic bat wing
Researchers at Brown University have developed a robotic bat wing that is providing valuable new information about dynamics of flapping flight in real bats.
The robot, which mimics the wing shape and motion of the lesser dog-faced fruit bat, is designed to flap while attached to a force transducer in a wind tunnel. As the lifelike wing flaps, the force transducer records the aerodynamic forces generated by the moving wing. By measuring the power output of the three servo motors that control the robot’s seven movable joints, researchers can evaluate the energy required to execute wing movements.
Testing showed the robot can match the basic flight parameters of bats, producing enough thrust to overcome drag and enough lift to carry the weight of the model species.
A paper describing the robot and presenting results from preliminary experiments is published in the journal Bioinspiration and Biomimetics. The work was done in labs of Brown professors Kenneth Breuer and Sharon Swartz, who are the senior authors on the paper. Breuer, an engineer, and Swartz, a biologist, have studied bat flight and anatomy for years.
The faux flapper generates data that could never be collected directly from live animals, said Joseph Bahlman, a graduate student at Brown who led the project. Bats can’t fly when connected to instruments that record aerodynamic forces directly, so that isn’t an option — and bats don’t take requests.
“We can’t ask a bat to flap at a frequency of eight hertz then raise it to nine hertz so we can see what difference that makes,” Bahlman said. “They don’t really cooperate that way.”
But the model does exactly what the researchers want it to do. They can control each of its movement capabilities — kinematic parameters — individually. That way they can adjust one parameter while keeping the rest constant to isolate the effects.
“We can answer questions like, ‘Does increasing wing beat frequency improve lift and what’s the energetic cost of doing that?’” Bahlman said. “We can directly measure the relationship between these kinematic parameters, aerodynamic forces, and energetics.”
Detailed experimental results from the robot will be described in future research papers, but this first paper includes some preliminary results from a few case studies.
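The kind of question Bahlman poses — does a higher wingbeat frequency buy more lift, and at what energetic cost — reduces to simple bookkeeping once the force transducer and servo-motor power data are in hand. The numbers below are invented; only the calculation mirrors the setup described above.

```python
# Invented example data: mean lift from the force transducer, power summed
# over the three servo motors, at two wingbeat frequencies.

trials = {
    8.0: {"mean_lift_N": 0.28, "motor_power_W": 2.1},   # 8 Hz wingbeat
    9.0: {"mean_lift_N": 0.34, "motor_power_W": 2.9},   # 9 Hz wingbeat
}

def cost_of_lift(trial):
    """Watts of motor power spent per newton of lift produced."""
    return trial["motor_power_W"] / trial["mean_lift_N"]

for freq in sorted(trials):
    print(f"{freq} Hz: {cost_of_lift(trials[freq]):.2f} W/N")
```

With real measurements, sweeping one kinematic parameter at a time while holding the rest fixed yields exactly the lift-versus-cost curves the live animals cannot provide.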

Real Angry Birds Flip ‘the Bird’ Before a Fight
Male sparrows are capable of fighting to the death. But a new study shows that they often wave their wings wildly first in an attempt to avoid a dangerous brawl.
"For birds, wing waves are like flipping the bird or saying ‘put up your dukes. I’m ready to fight,’" said Duke biologist Rindy Anderson.
Male swamp sparrows use wing waves as an aggressive signal to defend their territories and mates from intruding males, Anderson said. The findings also are a first step toward understanding how the birds use a combination of visual displays and songs to communicate with other males.
Anderson and her colleagues published the results online Jan. 28 in the journal Behavioral Ecology and Sociobiology.
Scientists had assumed the sparrows’ wing-waving behavior was a signal intended for other males, but testing the observations was difficult, Anderson said. So she and her co-author, former Duke engineering undergraduate student David Piech (‘12), built a miniature computer and some robotics, which the team then stuffed into the body cavity of a deceased bird. The result was a ‘robosparrow’ that looked just like a male swamp sparrow, which could flip its wings just like a live male.
Anderson took the wing-waving robosparrow to a swamp sparrow breeding ground in Pennsylvania and placed it in the territories of live males. The robotic bird “sang” swamp sparrow songs using a nearby sound system to let the birds know he was intruding, while Anderson and her colleagues crouched in the swampy grasses and watched the live birds’ responses. She also performed the tests with a stuffed sparrow that stayed stationary and one that twisted from side to side. These tests showed that wing waves combined with song are more potent than song on its own, and that wing waves in particular, not just any movement, evoked aggression from live birds.
The live birds responded most aggressively to the invading, wing-waving robotic sparrow, which Anderson said she expected. “What I didn’t expect to see was that the birds would give strikingly similar aggressive wing-wave signals to the three types of invaders,” she said. That means that if a bird wing-waved five times to the stationary stuffed bird, he would also wing-wave five times to the wing-waving robot.
Anderson had hypothesized that the defending birds would match the signals of the intruding robots, but her team’s results suggest that the males are more individualistic and consistent in the level of aggressiveness that they want to signal, she said.
"That response makes sense, in retrospect, since attacks can be devastating," Anderson said. Because of the risk, the real males may only want to signal a certain level of aggression to see if they could scare off an intruder without the conflict coming to a fight and possible death.

Insects inspiring new technology
Scientists from the University of Lincoln and Newcastle University have created a computerised system which allows for autonomous navigation of mobile robots based on the locust’s unique visual system.
The work could provide the blueprint for the development of highly accurate vehicle collision sensors and surveillance technology, and could even aid video game programming, according to the research published today.
Locusts have a distinctive way of processing information through electrical and chemical signals, giving them an extremely fast and accurate warning system for impending collisions.
The insect has incredibly powerful data processing systems built into its biology, which can in theory be recreated in robotics.
Inspired by the visual processing power built into these insects’ biology, Professor Shigang Yue from the University of Lincoln’s School of Computer Science and Dr Claire Rind from Newcastle University’s Institute of Neuroscience created the computerised system.
Their findings are published in the International Journal of Advanced Mechatronic Systems.
The research started by understanding the anatomy, responses and development of the circuits in the locust brain that allow it to detect approaching objects and avoid them when in flight or on the ground.
A visually stimulated motor control (VSMC) system was then created which consists of two movement detector types and a simple motor command generator. Each detector processes images and extracts relevant visual clues which are then converted into motor commands.
Prof Yue said: “We were inspired by the way the locusts’ visual system works when interacting with the outside world and the potential to simulate such complex systems in software and hardware for various applications. We created a system inspired by the locusts’ motion sensitive interneuron – the lobula giant movement detector. This system was then used in a robot to enable it to explore paths or interact with objects, effectively using visual input only.”
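The core looming-detection idea behind the LGMD can be sketched in a few lines: excitation grows as an object expands in the visual field, and a sustained rise past a threshold triggers avoidance. This toy version omits the lateral inhibition and temporal filtering of real LGMD models, and the frames and threshold are invented.

```python
# Minimal sketch of an LGMD-style looming detector, assuming grayscale
# frames given as 2-D lists of pixel intensities.

def excitation(prev_frame, frame):
    """Total absolute pixel change between consecutive frames."""
    return sum(abs(a - b)
               for row_a, row_b in zip(prev_frame, frame)
               for a, b in zip(row_a, row_b))

def looming(frames, threshold=10):
    """Yield 'avoid' while excitation keeps growing past the threshold."""
    prev_e = 0
    for prev, cur in zip(frames, frames[1:]):
        e = excitation(prev, cur)
        # An approaching object covers ever more pixels, so excitation rises.
        yield "avoid" if e > threshold and e > prev_e else "forward"
        prev_e = e
```

Feeding in frames where a dark blob expands produces a switch from "forward" to "avoid" as the apparent size accelerates, which is the behaviour the robot's motor command generator acts on.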
Funded by the European Union’s Seventh Framework Programme (FP7), the research was carried out as part of a collaborative project with the University of Hamburg in Germany and Tsinghua University and Xi’an Jiaotong University, China.
Lessons from cockroaches could inform robotics
Running cockroaches start to recover from being shoved sideways before their dawdling nervous system kicks in to tell their legs what to do, researchers have found. These new insights on how biological systems stabilize could one day help engineers design steadier robots and improve doctors’ understanding of human gait abnormalities.
In experiments, the roaches were able to maintain their footing mechanically, using their momentum and the spring-like architecture of their legs, rather than neurologically, relying on impulses sent from their central nervous system to their muscles.
"The response time we observed is more than three times longer than you’d expect," said Shai Revzen, an assistant professor of electrical engineering and computer science, as well as ecology and evolutionary biology, at the University of Michigan. Revzen is the lead author of a paper on the findings published online in Biological Cybernetics. It will appear in a forthcoming print edition.
"What we see is that the animals’ nervous system is working at a substantial delay," he said. "It could potentially act a lot sooner, within about a thirtieth of a second, but instead, it kicks in after about a step and a half or two steps—about a tenth of a second. For some reason, the nervous system is waiting and seeing how it shapes out."
Revzen said the new findings might imply that the biological brain, at least in cockroaches, adjusts the gait only at whole-step intervals rather than at any point in a step. Periodic, rather than continuous, feedback systems might lead to more stable (not to mention energy-efficient) walking robots—whether they travel on two feet or six.
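The periodic-feedback idea can be illustrated with a toy controller that samples its sensors continuously but only issues a corrective command at whole-step boundaries, leaving the legs’ passive mechanics to absorb disturbances in between. This is purely illustrative, not a model of the cockroach data.

```python
# Toy whole-step feedback: corrections are withheld until a step completes.

def periodic_controller(lateral_offsets, samples_per_step=3):
    """Return the corrective command at each sensor sample."""
    commands = []
    for i, offset in enumerate(lateral_offsets):
        at_step_boundary = (i + 1) % samples_per_step == 0
        # Mid-step samples get no neural correction at all.
        commands.append(-offset if at_step_boundary else 0.0)
    return commands

# A sideways shove appears mid-step; the correction waits for the boundary.
print(periodic_controller([0.0, 0.5, 0.5, 0.2, 0.1, 0.1]))
# [0.0, 0.0, -0.5, 0.0, 0.0, -0.1]
```

A continuous controller would react at every sample; the periodic one trades reaction time for fewer, cheaper control updates, which is the design trade-off Revzen suggests nature may have made.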
Robot makers often look to nature for inspiration. As animals move through the world, they have to respond to unexpected disturbances like rocky, uneven ground or damaged limbs. Revzen and his team believe that patterns in how they move as they adjust could give away how their machinery and neurology work together.
"The fundamental question is, ‘What can you do with a mechanical suspension versus one that requires electronic feedback?" Revzen said. "The animals obviously have much better mechanical designs than anything we know how to build. But if we could learn how they do it, we might be able to reproduce it."
![The iCub humanoid robot](http://41.media.tumblr.com/c403540193bd571984867d237b9495ef/tumblr_miitc4e0oJ1rog5d1o1_500.jpg)
“Simplified” brain lets the iCub robot learn language
The iCub humanoid robot, on which the team directed by Peter Ford Dominey, CNRS Director of Research at Inserm Unit 846, the “Institut pour les cellules souches et cerveau de Lyon” [Lyon Institute for Stem Cell and Brain Research] (Inserm, CNRS, Université Claude Bernard Lyon 1), has been working for many years, can now understand what is being said to it and even anticipate the end of a sentence. This technological feat was made possible by the development of a “simplified artificial brain” that reproduces certain so-called “recurrent” connections observed in the human brain. The artificial brain system enables the robot to learn, and subsequently understand, new sentences containing a new grammatical structure. It can link two sentences together and even predict how a sentence will end before it is uttered. This research has been published in the journal PLoS ONE.

Japan to field test rehabilitation robots
Ten hospitals in Japan are set to begin testing the use of a robot known as “Robot Suit HAL” starting next month. The purpose of the test will be to determine whether use of the robot is beneficial to patients needing physical therapy to regain normal use of their legs.
When people experience nerve or muscle damage to their lower backs or legs due to illness, stroke or injury, the normal course of treatment involves physical therapy, which causes the body to slowly repair the damage. For it to work, however, the parts of the body that function properly have to coax the parts that do not into action, a laborious and often painful process. For this reason, professional physical therapists assist patients with the process, ensuring that all of the affected body parts are exercised and offering emotional support. But such experts can only do so much, and robots have been developed to assist. The thinking is that because they are sensor-based and lack emotional involvement in the process, robots are likely to do a better job.
The Robot Suit HAL (Hybrid Assistive Limb) has been designed and built by Cyberdyne Inc. with assistance from researchers around the country. It’s described by its makers as a cyborg-type robot meant to supplement human muscles or to assist in their rehabilitation. It’s part handrail, part sensor array and part hydraulically controlled machinery. A patient stands between two handrails, holding on, while sensors are affixed to the skin of the legs. The sensors pick up nerve signals, which are sent to an onboard computer. Those signals are then converted into action by small motors and power units that work the muscle the same way it would move were the person able to move it on their own. The end result is a direct connection between nerve signals and movement, which, the researchers believe, will result in faster and perhaps better recovery for the patient.
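The signal chain described — skin sensors detect muscle activation, an onboard computer maps it to assistive motor output — can be caricatured in a few lines. The gain and noise threshold below are invented for illustration; HAL’s actual control system is proprietary and far more involved.

```python
# Caricature of a HAL-style assist loop: normalized muscle signal in,
# assistive torque out.

def assist_torque(emg_signal, gain=2.0, noise_floor=0.1):
    """Map a normalized muscle activation (0..1) to assistive torque (Nm)."""
    if emg_signal < noise_floor:   # below the noise floor: no intent to move
        return 0.0
    return gain * emg_signal

samples = [0.05, 0.3, 0.8]         # successive sensor readings
print([assist_torque(s) for s in samples])
# [0.0, 0.6, 1.6]
```

The essential property, mirrored here, is that assistance scales with the patient’s own nerve signal rather than following a pre-programmed trajectory.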
Initial testing will involve 30 volunteer patients. Representatives for Cyberdyne have also announced that the company is in the process of making arrangements for testing the robot in hospitals in Europe as well.
This Robotic Mouse Was Designed to Stress Out Real Mice
Lab rats have a new companion, but it’s not friendly. Researchers at Waseda University in Tokyo, Japan, have developed a robotic rat called WR-3 whose job is to induce stress and depression in lab animals, creating models of psychological conditions on which new drugs can be tested.
Animals are used throughout medicine as models for testing treatments for human conditions, including mental disorders like depression. Rats and mice have their sense of smell surgically severed to induce something like depression, or are forced to swim for long periods, for instance. Other methods rely on genetic modification and environmental stress, but none entirely satisfactorily recreates a human-like depression to treat. Hiroyuki Ishii and his team aim to do better with WR-3.
The researchers tested WR-3’s ability to depress two groups of 12 rats, measured by the somewhat crude assumption that a depressed rat moves around less. Rats in group A were constantly harassed by their robot counterpart, while the other rats were attacked intermittently and automatically by WR-3, whenever they moved. Ishii’s team found that the deepest depression was triggered by intermittent attacks on a mature rat that had been constantly harassed in its youth.
The team say they plan to test their new model of depression against more conventional systems, like forced swimming.
The robot arrives just as new research by Junhee Seok of Stanford University in Palo Alto, California, and colleagues shows that mouse models of human conditions have led astray researchers seeking treatments for sepsis, burns and trauma, at a cost of billions of tax dollars.