Posts tagged robotics

More than half a century has passed since the concept of artificial intelligence first emerged. In the United States, a computer has been built that became a TV quiz show champion, and once-exotic research spin-offs such as robotic vacuum cleaners and smartphones that talk back have become commonplace. We take a look at the evolution of machine intellect.
Perceive first, act afterwards. The architecture of most of today’s robots is underpinned by this control strategy. The eSMCs project has set itself the aim of changing the paradigm and generating more dynamic computer models in which action is not a mere consequence of perception but an integral part of the perception process. It is about improving robot behaviour by means of perception models closer to those of humans.
"The concept of how science understands the mind when it comes to building a robot or looking at the brain is that you take a photo, which is then processed as if the mind were a computer, and pattern recognition is carried out. There are various types of algorithms and techniques for identifying an object, a scene, etc. However, organic perception, that of human beings, is much more active. The eye, for example, carries out a whole host of saccadic movements (small, rapid ocular movements) that we do not notice. Seeing is establishing and recognising objects through this visual action, knowing how the relationship and sensation of my body changes with respect to movement," explains Xabier Barandiaran, who holds a PhD in Philosophy and is a researcher at IAS-Research (UPV/EHU), which, under the leadership of Ikerbasque researcher Ezequiel di Paolo, forms part of the European project eSMCs (Extending Sensorimotor Contingencies to Cognition).
Until now, the belief has been that sensations were processed, perception was created, and this in turn led to reasoning and action. As Barandiaran sees it, action is an integral part of perception: "Our basic idea is that when we perceive, what is there is active exploration, a particular co-ordination with the surroundings, like a kind of invisible dance that makes vision possible."
The eSMCs project aims to apply this idea to the computer models used in robots, improve their behaviour and thus understand the nature of the animal and human mind. For this purpose, the researchers are working on sensorimotor contingencies: regular relationships existing between actions and changes in the sensory variations associated with these actions.
An example of this kind of contingency is drinking water while speaking, almost without realising it. Interaction with the surroundings takes place "without any need to internally represent that this is a glass and then compute needs and plan an action," explains Barandiaran. "Seeing the glass draws one's attention; it is coordinated with thirst, while the presence of the water itself on the table is enough for me to coordinate the visual-motor cycle that ends with the glass at my lips." The same thing happens in the robots of the eSMCs project: "they are moving the whole time, they don't stop to think; they think in the act, using the body and the surroundings," he adds.
The researchers in the eSMCs project maintain that actions play a key role not only in perception, but also in the development of more complex cognitive capacities. That is why they believe that sensorimotor contingencies can be used to specify habits, intentions, tendencies and mental structures, thus providing the robot with a more complex, fluid behaviour.
So one of the experiments involves a robot simulation (developed by Thomas Buhrmann, who is also a member of this team at the UPV/EHU) in which an agent has to discriminate between what we could call an acne pimple and a bite or lump on the skin. "The acne has a tip, the bite doesn't. Just as people do, our agent stops at the tip and recognises the acne, and when it goes on to touch the lump, it ignores it. What we are seeking to model and explain is that moment of perception that is built through the active exploration of the skin, when you feel 'ah! I've found the acne pimple' and you go on sliding your finger across it," says Barandiaran. The model tries to identify what kind of relationship is established between the movement and sensation cycles and the neurodynamic patterns that are simulated in the robot's "mini brain".
In another robot, built at the Artificial Intelligence Laboratory of Zürich University, Puppy, a robot dog, is capable of adapting and “feeling” the texture of the terrain on which it is moving (slippery, viscous, rough, etc.) by exploring the sensorimotor contingencies that take place when walking.
The work of the UPV/EHU's research team focuses on the theoretical part of the models to be developed. "As philosophers, what we mostly do is define concepts. Our main aim is to be able to define technical concepts like the sensorimotor habitat, or the pattern of sensorimotor co-ordination, as well as habit or mental life as a whole." Defining concepts and giving them a mathematical form is essential so that scientists can apply them to specific experiments, not only with robots but also with human beings. The partners at the University Medical Centre Hamburg-Eppendorf, for example, are studying, in dialogue with the theoretical development of the UPV/EHU team, how the perception of time and space changes in Parkinson's patients.
(Source: basqueresearch.com)
Robot Suit HAL
"Robot Suit HAL" is a cyborg-type robot that can supplement, expand or improve physical capability.
When a person attempts to move, nerve signals are sent from the brain to the muscles via motoneurons, moving the musculoskeletal system as a consequence. At this moment, very weak biosignals can be detected on the surface of the skin. HAL catches these signals through a sensor attached to the skin of the wearer. Based on the signals obtained, the power unit is controlled to move the joint in unison with the wearer's muscle movement, enabling HAL to support the wearer's daily activities. This is what we call a 'voluntary control system', which provides movement by interpreting the wearer's intention from the biosignals in advance of the actual movement. HAL has not only this 'voluntary control system' but also a 'robotic autonomous control system' that provides human-like movement; the two systems work together as an integrated whole. HAL is the world's first cyborg-type robot controlled by this unique hybrid system.
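As a loose sketch of the 'voluntary control' idea — a weak skin-surface biosignal mapped to an assistive joint torque — the control loop might look like the toy function below. The threshold, gain and torque cap here are invented for illustration and are not Cyberdyne's actual parameters:

```python
def assist_torque(biosignal, threshold=0.1, gain=2.0, max_torque=5.0):
    """Map a skin-surface biosignal reading to a supporting joint torque.

    Below the detection threshold the suit stays passive; above it,
    the power unit adds torque in proportion to the signal, so the
    joint moves in unison with the wearer's own muscle effort.
    All units and constants are hypothetical.
    """
    if biosignal < threshold:
        return 0.0
    # Proportional assistance, capped for safety.
    return min(gain * (biosignal - threshold), max_torque)

# A weak signal produces no assistance; a strong one is capped.
weak = assist_torque(0.05)
strong = assist_torque(5.0)
```

In a real exoskeleton this mapping would of course run inside a fast sensing-and-actuation loop, but the core idea — intention read from the biosignal before the movement itself — is captured by the threshold test.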
"HAL" is expected to be applied in various fields such as rehabilitation support and physical training support in the medical field, ADL (activities of daily living) support for disabled people, heavy labour support at factories, and rescue support at disaster sites, as well as in the entertainment field.
(Source: cyberdyne.jp)
Theresa Klein talks about Achilles, the first machine to move in a biologically accurate way.
"Our robot, named Achilles, is the first to walk in a biologically accurate way. That means it doesn’t just move like a person, but also sends commands to the legs like the human nervous system does.
Each leg has eight muscles—Kevlar straps attached to a motor on one end and to the plastic skeleton on the other. As the motor turns, it pulls the strap, mimicking the way our muscles contract. Some of Achilles’ muscles extend from the hip or thigh to the lower leg so they can project forces all the way down the limb. This allows us to put most of the motors in the hips and thighs. Placing them up high keeps the lower leg light, so that it can swing quickly like a human’s lower leg.
In people, neurons in the spinal column send out rhythmic signals that control our legs. It’s like a metronome, and sensory feedback from the legs alters the pace. Your brain can step in to make corrections, but it doesn’t explicitly control every muscle, which is essentially why you can walk without thinking about it. For our robot, a computer program running off an external PC controls movement in a similar way. With each step, the computer sends a signal to flex one hip muscle and extend the other. The computer changes the timing of those signals based on feedback from the legs’ load and angle sensors. A similar control system handles the lower muscles.
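The rhythmic controller Klein describes resembles what roboticists call a central pattern generator: an oscillator sets the pace, and sensory feedback nudges its timing. As a rough illustration (not the team's actual code — the frequency, gain and time step are invented), one step of such an oscillator might look like this:

```python
import math

def cpg_step(phase, dt, base_freq, load_feedback, gain=0.5):
    """Advance a walking-rhythm oscillator by one time step.

    The oscillator is the 'metronome'; feedback from the leg's
    load sensors speeds up or slows down its pace. All constants
    are hypothetical.
    """
    phase += 2 * math.pi * (base_freq + gain * load_feedback) * dt
    phase %= 2 * math.pi
    # Antagonist drive: flex one hip muscle while extending the other.
    flexor = max(0.0, math.sin(phase))
    extensor = max(0.0, -math.sin(phase))
    return phase, flexor, extensor

# Run one second of walking rhythm with neutral feedback.
phase = 0.0
for _ in range(100):
    phase, flexor, extensor = cpg_step(phase, dt=0.01,
                                       base_freq=1.0, load_feedback=0.0)
```

Note how the two muscle commands are never active at once — flexing one hip muscle while extending the other, as in the article — and how a nonzero `load_feedback` would change the stepping tempo without the "brain" micromanaging each muscle.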
Modeling human movement has applications outside of robotics. It could also help us understand how people recover after spinal-cord injuries, for example. But our robot is still a very simplified model—it has no torso and can’t handle complex terrain. Initially, we also had a problem with its feet slipping. We thought about different types of rubber to give its feet more grip but eventually realized a solution already exists. Now, the robot wears a pair of Keds.”
Worldwide patent for a Spanish stroke rehabilitation robot
Robotherapist 3D, a robot which aids stroke patients’ recovery, is to be brought to market by its worldwide patent holder, a spin-off company from the Miguel Hernández University of Elche (Alicante, Spain). It is the first robot to enable patients to start doing exercises while supine, allowing them to begin shortly after the stroke and expediting recovery.
The company, a leader in this field in Spain, already has two robots: Robotherapist 2D and Robotherapist 3D. For the latter, it has a worldwide patent. Both are actuated by pneumatic technology and have been designed to improve arm movement in stroke patients.
According to the researcher, Robotherapist 2D is a planar robot which allows movement in two dimensions and includes sensors to determine the patient’s condition and a sound feedback system. “With this robot, certain tasks are carried out. The patient’s arm is moved parallel to the table: to the right, to the left and in a straight line. They are exercises to improve coordination,” he says.
The £90,000 ‘robolegs’ that got me out of my wheelchair: How one woman stood on her own feet nine years after she was paralysed
It is an extraordinary sight. From the waist up, 27-year-old Sophie Morgan is every inch the pretty blonde girl-next-door. But from the waist down, with her legs encased in £90,000 of motorised carbon-fibre, she is RoboCop.
Sophie’s thumb manipulates a joystick built into the armrests of her suit, causing the legs to hiss and whirr into life, before she takes three slow but sure steps. Her face breaks into a broad grin.
Five minutes earlier, Sophie was in her wheelchair. She was left paralysed from the chest down in a car crash nine years ago that shattered her spine. Over the years, Sophie, an aspiring television presenter who appeared in Channel 4’s Paralympics coverage, had come to accept that she would never walk again.
The New Medicine: Hacking Our Biology is part of the series “Engineers of the New Millennium” from IEEE Spectrum magazine and the Directorate for Engineering of the National Science Foundation. These stories explore technological advances in medical inventions to enhance and extend life.
Toy Cars Offer Mobility to Children with Disabilities
Children born with severe mobility impairments, such as those associated with cerebral palsy, are at increased risk for mobility-related developmental delays in cognition, language and socialization. Providing daily mobility between the ages of 1 and 5 is critical, given that significant learning, brain and behavioral development is dependent on mobility during this time.
The NSF-funded project, affectionately termed “Babies Driving Robots and Racecars,” began at the University of Delaware when Sunil Agrawal, a professor in the Department of Mechanical Engineering, approached Cole Galloway, a professor in the Department of Physical Therapy.
"Dr. Agrawal told me, ‘We have small robots, and you have small infants, do you think we can do something together?’" Galloway explained.
Galloway was hesitant at first; he could not envision babies and robots in the same room, much less interacting with each other. However, after visiting the lab and seeing Agrawal's robots in action, Galloway began to see the possibilities.

You, robot?
Technology and regulation: A research project considers how the law should deal with technologies that blur man and machine
SPEAKING at a conference organised by The Economist earlier this year, Hugh Herr, a roboticist at the Massachusetts Institute of Technology, described disabilities as conditions that persist "because of poor technology" and made the bold claim that during the 21st century disability would be largely eliminated. What gave his words added force was that halfway through his speech, after ten minutes of strolling around the stage, he unexpectedly pulled up his trouser legs to reveal his bionic legs, and then danced a little jig. In future, he suggested, people might choose to replace an arthritic, painful limb with a fully functional robotic one. "Why wouldn't you replace it?" he asked. "We're going to see a lot of unusual situations like that."
Robots paint hotel guests’ sleep patterns
Global hotel chain Ibis is transforming the nightly tosses and turns of its guests into works of modern art, painted by robots.
"Our masterpiece is to make your sleep a true work of art," the promotional video gushes, after putting a far more interesting point to viewers: "What does sleep look like?" To find out, the budget chain is installing thin grids covered in 80 heat, pressure and sound sensors on mattresses in select guestrooms, kicking off on 13 October in Paris. Data gathered by the sensors will be sent wirelessly throughout the night to the studio, where it is fed through an algorithm that converts information on a guest's movement, sound and temperature into colour and movement.
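The article doesn't say how the conversion algorithm works, but as a toy illustration of the idea — three normalised sensor channels mapped to a colour, with all ranges and the channel-to-hue assignment invented — it could be as simple as:

```python
def sleep_to_colour(movement, sound, temperature):
    """Map normalised sensor readings (0.0-1.0) to an RGB triple.

    In this made-up scheme, restless movement pushes the colour
    toward red, noise toward green, and body heat toward blue.
    """
    def clamp(x):
        return max(0.0, min(1.0, x))
    return tuple(round(255 * clamp(v)) for v in (movement, sound, temperature))

# A calm, quiet, cool sleeper paints in dark tones;
# a restless, noisy, warm one paints in bright ones.
dark = sleep_to_colour(0.1, 0.0, 0.2)
bright = sleep_to_colour(0.9, 0.8, 0.9)
```

The real installation presumably also maps the data to brush movement over time, but the essence — sensor streams becoming paint — is a mapping of this general shape.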
This video shows the robot, much like an assembly line arm, reacting in sequence, tracing acrylic paints onto a black canvas in a visual and physical interpretation of sleep cycles and patterns.
Only 40 participants can take part — anyone who wants to try it out can enter a competition on the Ibis Facebook page. When the project is wrapped up in November there will be an online gallery of the artworks, and guests will get an original to take home.