Posts tagged robotics

World premiere of muscle and nerve controlled arm prosthesis
For the first time, an operation has been conducted at Sahlgrenska University Hospital in which electrodes have been permanently implanted in the nerves and muscles of an amputee to directly control an arm prosthesis. The result is natural control of an advanced robotic prosthesis, much like the motion of a natural limb.
A surgical team led by Dr Rickard Brånemark, Sahlgrenska University Hospital, has carried out the first operation of its kind, where neuromuscular electrodes have been permanently implanted in an amputee. The operation was possible thanks to new advanced technology developed by Max Ortiz Catalan, supervised by Rickard Brånemark at Sahlgrenska University Hospital and Bo Håkansson at Chalmers University of Technology.
“The new technology is a major breakthrough that has many advantages over current technology, which provides very limited functionality to patients with missing limbs,” says Rickard Brånemark.
Big challenges
Two major issues have held back the advancement of robotic prostheses: 1) how to firmly attach an artificial limb to the human body; and 2) how to control the prosthesis intuitively and efficiently enough for it to be truly useful and restore lost functionality.
“This technology solves both these problems by combining a bone anchored prosthesis with implanted electrodes,” said Rickard Brånemark, who along with his team has developed a pioneering implant system called Opra, Osseointegrated Prostheses for the Rehabilitation of Amputees.
A titanium screw, a so-called osseointegrated implant, is used to anchor the prosthesis directly to the stump, which offers many advantages over the traditionally used socket prosthesis.
“It allows a complete range of motion for the patient, fewer skin-related problems and a more natural feeling that the prosthesis is part of the body. Overall, it brings a better quality of life to people who are amputees,” says Rickard Brånemark.
How it works
Presently, robotic prostheses rely on electrodes placed over the skin to pick up the muscles’ electrical activity and drive a few actions by the prosthesis. The problem with this approach is that normally only two functions are regained out of the tens of different movements an able-bodied person is capable of. By using implanted electrodes, more signals can be retrieved, and therefore control of more movements is possible. Furthermore, it is also possible to provide the patient with natural perception, or “feeling”, through neural stimulation.
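The control principle described above can be pictured in a few lines: each electrode channel is reduced to a feature (here, root-mean-square amplitude), and the feature vector is matched against reference patterns for known movements. This is a minimal illustrative sketch, not the Sahlgrenska/Chalmers system; the channel values, centroids and movement names are invented for the example.

```python
import math

def rms(window):
    """Root-mean-square amplitude of one window of raw EMG samples."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def classify_movement(channels, centroids):
    """Nearest-centroid match of a multi-channel RMS feature vector.

    channels  -- one window of raw samples per electrode
    centroids -- movement name -> reference RMS vector (same length)
    """
    features = [rms(w) for w in channels]
    def dist(ref):
        return sum((f - r) ** 2 for f, r in zip(features, ref))
    return min(centroids, key=lambda name: dist(centroids[name]))

# With two surface electrodes only a couple of states are separable;
# implanted electrodes would add channels, and with them more
# distinguishable centroids, i.e. more movements.
centroids = {
    "open_hand":  [0.8, 0.05],
    "close_hand": [0.05, 0.8],
    "rest":       [0.05, 0.05],
}
window_a = [0.7, -0.9, 0.8, -0.75]     # strongly active channel
window_b = [0.05, -0.04, 0.06, -0.05]  # quiet channel
print(classify_movement([window_a, window_b], centroids))  # → open_hand
```

With surface electrodes the two channels above are roughly all the information available; the point of implanting electrodes is that each additional channel widens the set of separable movement classes.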
“We believe that implanted electrodes, together with a long-term stable human-machine interface provided by the osseointegrated implant, is a breakthrough that will pave the way for a new era in limb replacement,” says Rickard Brånemark.
The patient
The first patient has recently been treated with this technology, and the first tests gave excellent results. The patient, a previous user of a robotic hand, reported major difficulties in operating that device in cold and hot environments and interference from shoulder muscles. These issues have now disappeared, thanks to the new system, and the patient has now reported that almost no effort is required to generate control signals. Moreover, tests have shown that more movements may be performed in a coordinated way, and that several movements can be performed simultaneously.
“The next step will be to test electrical stimulation of nerves to see if the patient can sense environmental stimuli, that is, get an artificial sensation. The ultimate goal is to make a more natural way to replace a lost limb, to improve the quality of life for people with amputations,” says Rickard Brånemark.
![The iCub humanoid robot](http://41.media.tumblr.com/c403540193bd571984867d237b9495ef/tumblr_miitc4e0oJ1rog5d1o1_500.jpg)
“Simplified” brain lets the iCub robot learn language
The iCub humanoid robot on which the team directed by Peter Ford Dominey, CNRS Director of Research at Inserm Unit 846, the “Institut pour les cellules souches et cerveau de Lyon” [Lyon Institute for Stem Cell and Brain Research] (Inserm, CNRS, Université Claude Bernard Lyon 1), has been working for many years can now understand what is being said to it and even anticipate the end of a sentence. This feat was made possible by the development of a “simplified artificial brain” that reproduces certain types of so-called “recurrent” connections observed in the human brain. The artificial brain enables the robot to learn, and subsequently understand, new sentences containing a new grammatical structure. It can link two sentences together and even predict how a sentence will end before it is uttered. This research has been published in the journal PLoS ONE.
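The “recurrent” connections mentioned here are the hallmark of reservoir-style recurrent networks, in which a fixed, randomly connected recurrent layer keeps a running trace of the input sequence, and only a simple linear readout is trained. Below is a toy sketch of such a reservoir; the size, weight ranges and inputs are arbitrary, and the trained readout is omitted, so this illustrates the idea rather than the team’s actual model.

```python
import math
import random

random.seed(0)
N = 50  # reservoir size (arbitrary)

# Fixed random weights -- the recurrent layer itself is never trained.
W = [[random.uniform(-0.1, 0.1) for _ in range(N)] for _ in range(N)]
W_in = [random.uniform(-1.0, 1.0) for _ in range(N)]

def step(state, u):
    """One reservoir update: tanh of recurrent drive plus scalar input."""
    return [
        math.tanh(sum(W[i][j] * state[j] for j in range(N)) + W_in[i] * u)
        for i in range(N)
    ]

def run(sequence):
    """Feed a sequence of scalar inputs; return the final reservoir state."""
    state = [0.0] * N
    for u in sequence:
        state = step(state, u)
    return state

# The same inputs in a different order leave a different trace in the
# reservoir, which is what would let a trained linear readout tell
# grammatical structures apart.
print(run([1.0, 0.5, -0.5]) != run([-0.5, 0.5, 1.0]))  # → True
```

The design choice that makes this attractive for language is that word order, the essence of grammar, is implicitly encoded in the reservoir state without any task-specific wiring.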
A sensational breakthrough: the first bionic hand that can feel
The first bionic hand that allows an amputee to feel what they are touching will be transplanted later this year in a pioneering operation that could introduce a new generation of artificial limbs with sensory perception.
The patient is an unnamed man in his 20s living in Rome who lost the lower part of his arm following an accident, said Silvestro Micera of the Ecole Polytechnique Federale de Lausanne in Switzerland.
The wiring of his new bionic hand will be connected to the patient’s nervous system, with the hope that the man will be able to control the movements of the hand as well as receive touch signals from the hand’s skin sensors.
Dr Micera said that the hand will be attached directly to the patient’s nervous system via electrodes clipped onto two of the arm’s main nerves, the median and the ulnar nerves.
This should allow the man to control the hand by his thoughts, as well as receiving sensory signals to his brain from the hand’s sensors. It will effectively provide a fast, bidirectional flow of information between the man’s nervous system and the prosthetic hand.
“This is real progress, real hope for amputees. It will be the first prosthetic that will provide real-time sensory feedback for grasping,” Dr Micera said.
“It is clear that the more sensory feeling an amputee has, the more likely you will get full acceptance of that limb,” he told the American Association for the Advancement of Science meeting in Boston.
“We could be on the cusp of providing new and more effective clinical solutions to amputees in the next year,” he said.
Nano-machines for “Bionic Proteins”
Physicists of the University of Vienna together with researchers from the University of Natural Resources and Life Sciences Vienna developed nano-machines which recreate principal activities of proteins. They present the first versatile and modular example of a fully artificial protein-mimetic model system, thanks to the Vienna Scientific Cluster (VSC), a high performance computing infrastructure. These “bionic proteins” could play an important role in innovating pharmaceutical research. The results have now been published in the renowned journal “Physical Review Letters”.
Proteins are the fundamental building blocks of all living organisms we currently know. Because of the large number and complexity of bio-molecular processes they carry out, proteins are often referred to as “molecular machines”. Take, for instance, the proteins in your muscles: at each contraction stimulated by the brain, countless proteins change their structures to create the collective motion of the contraction. This extraordinary process is performed by molecules only about a nanometer, a billionth of a meter, in size. Muscle contraction is just one of the numerous activities of proteins: there are proteins that transport cargo in the cells, proteins that construct other proteins, and there are even cages in which proteins that “misbehave” can be trapped for correction; the list goes on and on. “Imitating these astonishing bio-mechanical properties of proteins and transferring them to a fully artificial system is our long-term objective”, says Ivan Coluzza from the Faculty of Physics of the University of Vienna, who works on this project together with colleagues at the University of Natural Resources and Life Sciences Vienna.
Simulations thanks to Vienna Scientific Cluster (VSC)
In a recent paper in Physical Review Letters, the team presented the first example of a fully artificial bio-mimetic model system capable of spontaneously self-knotting into a target structure. Using computer simulations, they reverse engineered proteins by focusing on the key elements that give them the ability to execute the program written in the genetic code. The computationally very intensive simulations have been made possible by access to the powerful Vienna Scientific Cluster (VSC), a high performance computing infrastructure operated jointly by the University of Vienna, the Vienna University of Technology and the University of Natural Resources and Life Sciences Vienna.
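Simulations of this kind typically rest on Metropolis-style Monte Carlo: propose a small change to the chain’s conformation and accept it with the Boltzmann probability, cooling the system as it folds. Below is a generic sketch of that loop with a deliberately trivial toy energy (beads on a line relaxing toward designed target positions); it is not the model from the paper, whose patchy-particle interactions are far richer, and all numbers are invented.

```python
import math
import random

random.seed(1)

def metropolis_accept(delta_e, temperature):
    """Accept downhill moves always, uphill moves with Boltzmann probability."""
    if delta_e <= 0:
        return True
    return random.random() < math.exp(-delta_e / temperature)

def anneal(energy, propose, state, steps=2000, t_start=2.0, t_end=0.05):
    """Generic simulated-annealing loop over a conformation `state`."""
    e = energy(state)
    for k in range(steps):
        t = t_start + (t_end - t_start) * k / steps  # linear cooling
        candidate = propose(state)
        e_new = energy(candidate)
        if metropolis_accept(e_new - e, t):
            state, e = candidate, e_new
    return state, e

# Toy "folding": beads on a line relax toward designed target positions,
# standing in for a chain folding into its native structure.
target = [0.0, 1.0, 2.0, 3.0]

def energy(conf):
    return sum((x - t) ** 2 for x, t in zip(conf, target))

def propose(conf):
    c = list(conf)
    c[random.randrange(len(c))] += random.uniform(-0.2, 0.2)
    return c

final, e = anneal(energy, propose, [3.0, 0.0, 1.0, 2.5])
print(e < energy([3.0, 0.0, 1.0, 2.5]))  # the chain relaxed downhill → True
```

The expensive part in the real system is the energy evaluation over many interacting particles, repeated millions of times, which is why the VSC’s computing power was needed.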
Artificial proteins in the laboratory
The team now works on realizing such artificial proteins in the laboratory using specially functionalized nanoparticles. The particles will then be connected into chains following the sequence determined by the computer simulations, such that the artificial proteins fold into the desired shapes. Such knotted nanostructures could be used as new stable drug delivery vehicles and as enzyme-like, but more stable, catalysts.

Japan to field test rehabilitation robots
Ten hospitals in Japan are set to begin testing the use of a robot known as “Robot Suit HAL” starting next month. The purpose of the test will be to determine whether use of the robot is beneficial to patients needing physical therapy to regain normal use of their legs.
When people experience nerve or muscle damage to their lower backs or legs due to illness, stroke or injury, the normal course of treatment involves physical therapy, which coaxes the body into slowly repairing the damage. In order for it to work, however, the parts of the body that function properly have to coax the parts that do not into action, a laborious and quite often painful process. For this reason, professional physical therapists assist patients with the process to ensure that all of the body parts are exercised, and to offer emotional support. But such experts can only help so much, and for that reason robots have been developed to assist. The thinking is that because they are sensor-based and lack emotional involvement in the process, robots are likely to do a better job.
The Robot Suit HAL (Hybrid Assistive Limb) has been designed and built by Cyberdyne Inc. with assistance from researchers around the country. It’s described by its makers as a cyborg-type robot meant to supplement human muscles or to assist in their rehabilitation. It’s part handrail, part sensor and part hydraulically controlled machinery. A patient stands between two handrails, holding on, while sensors are affixed to the skin of the legs. The sensors pick up nerve signals, which are sent to an onboard computer. Those signals are then converted to action by small motors and power units that work the muscle in the same way it would be worked were the person able to move it on their own. The end result is a direct connection between nerve signals and movement, which, the researchers believe, will result in faster and perhaps better recovery for the patient.
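The signal path described above, skin sensor to onboard computer to motor, amounts to a control loop that maps a nerve reading to an assistive torque. A hypothetical proportional version is sketched below; the gain, noise threshold and torque cap are invented numbers for illustration, not Cyberdyne’s design.

```python
def clamp(x, lo, hi):
    return max(lo, min(hi, x))

def assist_torque(signal, gain=60.0, threshold=0.1, max_torque=40.0):
    """Map one normalized nerve/EMG reading to an assistive motor torque.

    Readings below `threshold` are treated as rest-state noise; above it
    the torque grows proportionally and is capped at `max_torque`.
    All numbers are hypothetical, chosen only for illustration.
    """
    if abs(signal) < threshold:
        return 0.0  # do nothing while the wearer is at rest
    return clamp(gain * signal, -max_torque, max_torque)

# Simulated readings over one step of gait: rest, light effort, strong effort.
for s in [0.02, 0.3, 0.9]:
    print(assist_torque(s))  # → 0.0, 18.0, 40.0
```

The threshold matters clinically: a patient with weak signals should still see the motor respond to genuine effort without the suit twitching on noise.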
Initial testing will involve 30 volunteer patients. Representatives for Cyberdyne have also announced that the company is in the process of making arrangements for testing the robot in hospitals in Europe as well.
This Robotic Mouse Was Designed to Stress Out Real Mice
Lab rats have a new companion, but it’s not friendly. Researchers at Waseda University in Tokyo, Japan, have developed a robotic rat called WR-3 whose job is to induce stress and depression in lab animals, creating models of psychological conditions on which new drugs can be tested.
Animals are used throughout medicine as models to test treatments for human conditions, including mental disorders like depression. Rats and mice have their olfactory nerves severed to induce something like depression, or are forced to swim for long periods, for instance. Other methods rely on genetic modification and environmental stress, but none is entirely satisfactory at recreating a human-like version of depression for treatment. Hiroyuki Ishii and his team aim to do better with WR-3.
The researchers tested WR-3’s ability to depress two groups of 12 rats, measured by the somewhat crude assumption that a depressed rat moves around less. Rats in group A were constantly harassed by their robot counterpart, while the other rats were attacked intermittently and automatically by WR-3 whenever they moved. Ishii’s team found that the deepest depression was triggered by intermittent attacks on a mature rat that had been constantly harassed in its youth.
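The intermittent, movement-triggered attacks can be pictured as a simple rule over tracked positions: fire whenever the displacement between consecutive frames exceeds a threshold. A sketch with invented coordinates and an invented threshold (the study’s actual tracking setup is not described here):

```python
def attack_events(positions, move_threshold=1.0):
    """Time indices at which the robot would launch an attack: whenever the
    tracked rat moves more than `move_threshold` between consecutive frames.
    Threshold and coordinates are illustrative, not from the study."""
    events = []
    for t in range(1, len(positions)):
        (x0, y0), (x1, y1) = positions[t - 1], positions[t]
        if ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 > move_threshold:
            events.append(t)
    return events

# A short invented trajectory: mostly still, with two bursts of movement.
track = [(0, 0), (0.2, 0.1), (3.0, 0.5), (3.1, 0.5), (6.0, 4.0)]
print(attack_events(track))  # → [2, 4]
```

The same movement measure doubles as the experiment’s outcome variable, since a rat that triggers fewer attacks is, by the researchers’ assumption, more depressed.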
The team say they plan to test their new model of depression against more conventional systems, like forced swimming.
The robot has been developed just as new research by Junhee Seok of Stanford University in Palo Alto, California, and colleagues shows that the use of mouse models for human conditions has led astray researchers trying to find treatments for sepsis, burns and trauma, at a cost of billions of tax dollars.
Cyborg Possibilities – The Arms and Legs
The most recent advancements in bionic arms are embodied in the BeBionic prosthetic arm. The arm detects signals from the nerves in whatever portion of the arm remains and uses those signals to drive the prosthetic’s functions. Essentially, operation ought to work much like the user’s original arm did: the person thinks about moving their arm in a certain way, and the arm responds.
Despite looking cooler, the BeBionic hand still falls well short of a human hand. Yet the improvements are impressive. Grip strength has improved from about 17 pounds to about 31. It can hold about 100 pounds of weight, up from about 70. It also comes in a range of designs. The hand isn’t exorbitantly expensive, but at $25,000 to $35,000 it isn’t exactly cheap either. At that price range, concerns that future human-enhancement technology will be possible only for the well-to-do seem warranted.
Humans and robots work better together following cross-training
Spending a day in someone else’s shoes can help us to learn what makes them tick. Now the same approach is being used to develop a better understanding between humans and robots, to enable them to work together as a team.
Robots are increasingly being used in the manufacturing industry to perform tasks that bring them into closer contact with humans. But while a great deal of work is being done to ensure robots and humans can operate safely side-by-side, more effort is needed to make robots smart enough to work effectively with people, says Julie Shah, an assistant professor of aeronautics and astronautics at MIT and head of the Interactive Robotics Group in the Computer Science and Artificial Intelligence Laboratory (CSAIL).
“People aren’t robots, they don’t do things the same way every single time,” Shah says. “And so there is a mismatch between the way we program robots to perform tasks in exactly the same way each time and what we need them to do if they are going to work in concert with people.”
Most existing research into making robots better team players is based on the concept of interactive reward, in which a human trainer gives a positive or negative response each time a robot performs a task.
However, human studies carried out by the military have shown that simply telling people they have done well or badly at a task is a very inefficient method of encouraging them to work well as a team.
So Shah and PhD student Stefanos Nikolaidis began to investigate whether techniques that have been shown to work well in training people could also be applied to mixed teams of humans and robots. One such technique, known as cross-training, sees team members swap roles with each other on given days. “This allows people to form a better idea of how their role affects their partner and how their partner’s role affects them,” Shah says.
In a paper to be presented at the International Conference on Human-Robot Interaction in Tokyo in March, Shah and Nikolaidis will present the results of experiments they carried out with a mixed group of humans and robots, demonstrating that cross-training is an extremely effective team-building tool.
Robovie talking robot joins science class at Higashihikari Elementary School in Japan
Robovie, a 1.2-meter robot developed by ATR, joined the science class at Higashihikari Elementary School in Japan on Feb. 5 for the start of a 14-month experiment. Data will be gathered to improve the robot’s ability to interact naturally with multiple people. The robot has been given facial photos and voiceprints of 119 fifth graders and teachers. On the first day of class, Robovie greeted the students and was asked by a teacher what a “wound-up copper wire” was. It answered, “A copper coil. It’s part of the motors that move my body.” During class Robovie waited at the back of the room, recognizing the faces of the students and recording their movements. After class it shook hands with sixth graders and answered their questions.
As part of research into the co-existence of humans and robots, the experiment with Robovie is being carried out at a school because the environment allows for the acquisition of large amounts of data from the movements of the children. Robovie’s daily conversation level is equivalent to that of a five-year-old human, but it has been programmed with the entire contents of a fifth-grade science textbook. This is the first experiment using a robot at a school to last over a year.
Paralyzed man controls cursor and robot arm with his thoughts
Researchers at the University of Pittsburgh School of Medicine and UPMC describe in PLoS ONE how an electrode array sitting on top of the brain enabled a 30-year-old paralyzed man to control the movement of a character on a computer screen in three dimensions with just his thoughts. It also enabled him to move a robot arm to touch a friend’s hand for the first time in the seven years since he was injured in a motorcycle accident.
With brain-computer interface (BCI) technology, the thoughts of Tim Hemmes, who sustained a spinal cord injury that left him unable to move his body below the shoulders, were interpreted by computer algorithms and translated into intended movement of a computer cursor and, later, a robot arm, explained lead investigator Wei Wang, Ph.D., assistant professor, Department of Physical Medicine and Rehabilitation, Pitt School of Medicine.
“When Tim reached out to high-five me with the robotic arm, we knew this technology had the potential to help people who cannot move their own arms achieve greater independence,” said Dr. Wang, reflecting on a memorable scene from September 2011 that was re-told in stories around the world. “It’s very important that we continue this effort to fulfill the promise we saw that day.”
Six weeks before the implantation surgery, the team conducted functional magnetic resonance imaging (fMRI) of Mr. Hemmes’ brain while he watched videos of arm movement. They used that information to place a postage-stamp-size electrocorticography (ECoG) grid of 28 recording electrodes on the surface of the brain region that fMRI showed controlled right arm and hand movement. Wires from the device were tunneled under the skin of his neck to emerge from his chest, where they could be connected to computer cables as necessary.
For 12 days at his home and nine days in the research lab, Mr. Hemmes began the testing protocol by watching a virtual arm move, which triggered neural signals that were sensed by the electrodes. Distinct signal patterns for particular observed movements were used to guide the up-and-down motion of a ball on a computer screen. Soon after mastering movement of the ball in two dimensions, namely up/down and right/left, he was also able to move it in/out with accuracy on a 3-dimensional display.
“During the learning process, the computer helped Tim hit his target smoothly by restricting how far off course the ball could wander,” Dr. Wang said. “We gradually took off the ‘training wheels,’ as we called it, and he was soon doing the tasks by himself with 100 percent brain control.”
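One simple way to implement such “training wheels” is to split the decoded velocity into a component toward the target and an off-course remainder, then shrink the remainder by an assistance factor that is gradually reduced to zero. This is a hypothetical reconstruction for illustration; the study’s actual assistance algorithm is not described in this article.

```python
import math

def training_wheels(velocity, to_target, assist=0.7):
    """Blend a decoded cursor velocity with the straight-to-target direction.

    assist=1.0 removes all off-course motion; assist=0.0 leaves the decoded
    velocity untouched. Hypothetical sketch, not the study's code.
    """
    norm = math.sqrt(sum(c * c for c in to_target)) or 1.0
    unit = [c / norm for c in to_target]
    along = sum(v * u for v, u in zip(velocity, unit))  # on-target component
    parallel = [along * u for u in unit]
    off_course = [v - p for v, p in zip(velocity, parallel)]
    return [p + (1.0 - assist) * q for p, q in zip(parallel, off_course)]

# Fully assisted: sideways wander is removed, forward progress is kept.
print(training_wheels([1.0, 1.0, 0.0], [1.0, 0.0, 0.0], assist=1.0))
# → [1.0, 0.0, 0.0]
```

Lowering `assist` over successive sessions corresponds to “taking off the training wheels”: the cursor increasingly reflects raw brain control alone.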
The robot arm was developed by Johns Hopkins University’s Applied Physics Laboratory. Currently, Jan Scheuermann, of Whitehall, Pa., is testing another BCI technology at Pitt/UPMC.