Neuroscience

Articles and news from the latest research reports.

Posts tagged robotics

42 notes

Science fiction comes to life in Italian lab

At Italy’s Sant’Anna university, a bionic arm commanded by the human brain or a limb extension that allows rescuers to lift rubble after earthquakes are just some of the futuristic innovations in the pipeline.

“The idea is to get robots out of factories where they have shown their worth and to transform them into household machines which can live together with humans,” says Professor Paolo Dario, director of the college’s bio-robotics department.

The university in the historic town of Pisa in Tuscany is a veritable factory of ideas.

Researchers here are working on projects ranging from a robot that can come to your door to collect your recycling to tomatoes that slow the effects of ageing and plants that survive underwater to help flood-prone regions of the world.

Filed under AI bionics natural disasters neuroscience robotics robots science science fiction technology tech

15 notes

Brain scanner, not joystick, is in human-robot future

July 6, 2012 by Nancy Owano

(Phys.org) — Talk of fMRI may not be familiar to many people, but that could change as new efforts to link up humans and machines gain attention. fMRI (functional magnetic resonance imaging) is a promising technology that could help humans move beyond joysticks and control robots via brain scanners instead. Now a research project exploring ways to develop robot surrogates with which humans can interact has turned a corner: a university student successfully made his robot surrogate move around using fMRI. The experiment linked Israeli student Tirosh Shapira, in a lab at Bar-Ilan University, Israel, with a small robot in a distant lab at the Beziers Technology Institute in France.

Shapira merely had to think about moving his arms or legs, and the robot, whose head-mounted camera streamed an image back to him, would do the same. If he thought about moving forward or backward, the robot responded accordingly.

fMRI monitors blood flowing through the brain and can spot when areas associated with certain actions, such as movement, are in use. The scanner read the student's thoughts, which were translated by computer into commands relayed across the Internet to the robot in France.
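The article does not describe the decoding software, but the basic idea of turning an activation pattern into a movement command can be sketched as a toy classifier. Everything below is invented for illustration: the "templates" stand in for activation patterns learned during a calibration session, and real fMRI decoding uses far richer statistical models.

```python
import numpy as np

# Toy "template" activation patterns for three imagined movements,
# as might be recorded during a calibration session (invented data).
TEMPLATES = {
    "forward": np.array([1.0, 0.1, 0.1]),
    "left":    np.array([0.1, 1.0, 0.1]),
    "right":   np.array([0.1, 0.1, 1.0]),
}

def decode_command(activation):
    """Return the command whose template is closest to the activation vector."""
    return min(TEMPLATES, key=lambda c: np.linalg.norm(activation - TEMPLATES[c]))

scan = np.array([0.9, 0.2, 0.05])   # one simulated activation sample
print(decode_command(scan))          # -> "forward"
```

In a live system, each decoded command would then be serialized and sent over the network to the robot's motion controller, closing the loop the article describes.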

There is much more work to be done to advance this approach, however. The researchers seek to devise a different type of scanning. An fMRI scanner is an expensive piece of equipment but the scientists believe that improvements in software might allow for a head-mounted device. Another research goal is to see if they can get humans to speak via the robot. The size of the robot will need modification, closer to the size and movement of a human, and engineered with a wider range of movement that would include hand gestures. In sum, according to the researchers, this experiment is only one of many steps ahead.

Medical applications for this technology are seen as promising, especially as scientists explore how patients with paralysis can interface with robots so that the patients can reconnect to the world. Another suggested application has been in the military, where robot surrogates rather than soldiers would be sent into battle.

Source: PHYS.ORG

Filed under science neuroscience brain fMRI robotics

128 notes

Using piezoelectric materials, researchers have replicated the muscle motion of the human eye to control camera systems in a way designed to improve the operation of robots. This new muscle-like action could help make robotic tools safer and more effective for MRI-guided surgery and robotic rehabilitation.

Read more: Robot vision: Muscle-like action allows camera to mimic human eye movement

Filed under science neuroscience brain psychology AI robotics vision

11 notes

'Hallucinating' robots arrange objects for human use

June 18, 2012 By Bill Steele

(Phys.org) — If you hire a robot to help you move into your new apartment, you won’t have to send out for pizza. But you will have to give the robot a system for figuring out where things go. The best approach, according to Cornell researchers, is to ask “How will humans use this?”

A robot populates a room with imaginary human stick figures in order to decide where objects should go to suit the needs of humans.

Researchers in the Personal Robotics Lab of Ashutosh Saxena, assistant professor of computer science, have already taught robots to identify common objects, pick them up and place them stably in appropriate locations. Now they’ve added the human element by teaching robots to “hallucinate” where and how humans might stand, sit or work in a room, and place objects in their usual relationship to those imaginary people.

Their work will be reported at the International Symposium on Experimental Robotics, June 21 in Quebec, and the International Conference on Machine Learning, June 29 in Edinburgh, Scotland.

Previous work on robotic placement, the researchers note, has relied on modeling relationships between objects. A keyboard goes in front of a monitor, and a mouse goes next to the keyboard. But that doesn’t help if the robot puts the monitor, keyboard and mouse at the back of the desk, facing the wall.

Above left, random placing of objects in a scene puts food on the floor, shoes on the desk and a laptop teetering on the top of the fridge. Considering the relationships between objects (upper right) is better, but the laptop is facing away from a potential user and the food is higher than most humans would like. Adding human context (lower left) makes things more accessible. Lower right: how an actual robot carried it out. (Personal Robotics Lab)

Relating objects to humans not only avoids such mistakes but also makes computation easier, the researchers said, because each object is described in terms of its relationship to a small set of human poses, rather than to the long list of other objects in a scene. A computer learns these relationships by observing 3-D images of rooms with objects in them, into which it imagines human figures, placing them in practical relationships with the objects and furniture. You don't put a sitting person where there is no chair. You could put a sitting person on top of a bookcase, but there are no objects there for the person to use, so that placement is ignored. The computer calculates the distance of objects from various parts of the imagined human figures, and notes the orientation of the objects.
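The distance computation described above can be sketched very simply: describe a candidate object placement by its distance to each joint of a sampled human pose. This is only an illustration of the idea, not the lab's code; the pose coordinates and object position below are invented.

```python
import numpy as np

def pose_object_features(pose_joints, obj_pos):
    """Distances from an object to each joint of one imagined human pose."""
    return np.array([np.linalg.norm(obj_pos - j) for j in pose_joints])

# A crude "sitting" pose: head, reaching hand and feet in room coordinates.
sitting = [np.array([2.0, 2.0, 1.2]),   # head
           np.array([2.3, 2.0, 0.8]),   # reaching hand
           np.array([2.0, 2.2, 0.0])]   # feet

remote = np.array([2.4, 2.0, 0.8])      # candidate placement for a TV remote
print(pose_object_features(sitting, remote))
```

A learned model would then score placements by how well these distance (and orientation) features match those observed in training scenes; here the remote comes out closest to the reaching hand, as the article's remote-control example suggests it should.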

Eventually it learns commonalities: There are lots of imaginary people sitting on the sofa facing the TV, and the TV is always facing them. The remote is usually near a human’s reaching arm, seldom near a standing person’s feet. “It is more important for a robot to figure out how an object is to be used by humans, rather than what the object is. One key achievement in this work is using unlabeled data to figure out how humans use a space,” Saxena said.

In a new situation the robot places human figures in a 3-D image of a room, locating them in relation to objects and furniture already there. "It puts a sample of human poses in the environment, then figures out which ones are relevant and ignores the others," Saxena explained. It decides where new objects should be placed in relation to the human figures, and carries out the action.

The researchers tested their method using images of living rooms, kitchens and offices from the Google 3-D Warehouse, and later, images of local offices and apartments. Finally, they programmed a robot to carry out the predicted placements in local settings. Volunteers who were not associated with the project rated the placement of each object for correctness on a scale of 1 to 5.

Comparing various algorithms, the researchers found that placements based on human context were more accurate than those based solely on relationships between objects, but the best results of all came from combining human context with object-to-object relationships, with an average score of 4.3. Some tests were done in rooms with furniture and some objects, others in rooms where only a major piece of furniture was present. The object-only method performed significantly worse in the latter case because there was no context to use. "The difference between previous works and our [human to object] method was significantly higher in the case of empty rooms," Saxena reported.
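The winning strategy, blending the two cues, can be sketched as a weighted score. The weight and the candidate scores below are invented for illustration; the paper's actual model combines the cues through learning, not a fixed weight.

```python
def placement_score(human_context, object_context, w=0.6):
    """Weighted blend of the two cues; higher means a better placement."""
    return w * human_context + (1 - w) * object_context

# Candidate spots for a TV remote, scored by each cue on a 0-1 scale
# (made-up numbers).
candidates = {
    "sofa armrest": (0.9, 0.7),
    "back of desk": (0.2, 0.6),
    "floor":        (0.1, 0.1),
}
best = max(candidates, key=lambda c: placement_score(*candidates[c]))
print(best)   # -> "sofa armrest"
```

Note how the blend also explains the empty-room result: when there are no other objects to relate to, the object-context term carries no information and the human-context term has to do all the work.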

Provided by Cornell University

Source: phys.org

Filed under science neuroscience robotics

13 notes

Robots Get a Feel for the World

June 18th, 2012

Robots equipped with tactile sensors are able to identify materials through touch, paving the way for more useful prostheses.

What does a robot feel when it touches something? Little or nothing until now. But with the right sensors, actuators and software, robots can be given the sense of feel, or at least the ability to identify different materials by touch.

Researchers at the University of Southern California’s Viterbi School of Engineering published a study today in Frontiers in Neurorobotics showing that a specially designed robot can outperform humans in identifying a wide range of natural materials according to their textures, paving the way for advancements in prostheses, personal assistive robots and consumer product testing.

The robot was equipped with a new type of tactile sensor built to mimic the human fingertip. It also used a newly designed algorithm to make decisions about how to explore the outside world by imitating human strategies. The sensor is capable of other human-like sensations as well: it can tell where and in which direction forces are applied to the fingertip, and can even sense the thermal properties of an object being touched.

Like the human finger, the group’s BioTac® sensor has a soft, flexible skin over a liquid filling. The skin even has fingerprints on its surface, greatly enhancing its sensitivity to vibration. As the finger slides over a textured surface, the skin vibrates in characteristic ways. These vibrations are detected by a hydrophone inside the bone-like core of the finger. The human finger uses similar vibrations to identify textures, but the robot finger is even more sensitive.

[Video: Robots Get a Feel for the World]
Researchers at the USC Viterbi School of Engineering published a study in Frontiers in Neurorobotics showing that specially designed robots can be taught to feel even more acutely than humans. Vimeo video by USC Viterbi.

When humans try to identify an object by touch, they use a wide range of exploratory movements based on their prior experience with similar objects. A famous theorem by 18th century mathematician Thomas Bayes describes how decisions might be made from the information obtained during these movements. Until now, however, there was no principled way to decide which exploratory movement to make next. The article, authored by Professor of Biomedical Engineering Gerald Loeb and recently graduated doctoral student Jeremy Fishel, describes their new method for solving this general problem, which they call "Bayesian Exploration."
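The core idea can be sketched as follows: keep a Bayesian belief over candidate materials and pick the next movement expected to shrink that belief's uncertainty the most. This is a minimal toy version, not Fishel and Loeb's implementation; the materials, movements and likelihood table are all invented.

```python
import numpy as np

MATERIALS = ["silk", "denim", "sandpaper"]
MOVES = ["light slide", "hard press"]
# P(observation = "rough" | material, movement) -- made-up numbers.
P_ROUGH = {
    ("silk", "light slide"): 0.05, ("silk", "hard press"): 0.10,
    ("denim", "light slide"): 0.50, ("denim", "hard press"): 0.60,
    ("sandpaper", "light slide"): 0.95, ("sandpaper", "hard press"): 0.90,
}

def entropy(p):
    """Shannon entropy (bits) of a probability vector."""
    p = np.clip(p, 1e-12, 1.0)
    return float(-(p * np.log2(p)).sum())

def expected_entropy(belief, move):
    """Average posterior entropy over the two possible observations."""
    total = 0.0
    for obs_rough in (True, False):
        like = np.array([P_ROUGH[(m, move)] if obs_rough
                         else 1 - P_ROUGH[(m, move)] for m in MATERIALS])
        joint = like * belief          # Bayes' rule, unnormalized
        p_obs = joint.sum()
        total += p_obs * entropy(joint / p_obs)
    return total

belief = np.ones(len(MATERIALS)) / len(MATERIALS)   # uniform prior
best_move = min(MOVES, key=lambda mv: expected_entropy(belief, mv))
print(best_move)
```

After each real observation, the belief would be updated by the same Bayes'-rule step and a fresh movement chosen, which is how the robot converges on an answer in only a handful of exploratory movements.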

Built by Fishel, the specialized robot was trained on 117 common materials gathered from fabric, stationery and hardware stores. When confronted with one material at random, the robot could correctly identify the material 95% of the time, after intelligently selecting and making an average of five exploratory movements. It was only rarely confused by pairs of similar textures that human subjects making their own exploratory movements could not distinguish at all.

Tactile sensors that mimic fingertips enable robots to identify materials through touch better than humans. Image from press release by USC Viterbi School of Engineering.

So, is touch another task that humans will outsource to robots? Fishel and Loeb point out that while their robot is very good at identifying which textures are similar to each other, it has no way to tell what textures people will prefer. Instead, they say this robot touch technology could be used in human prostheses or to assist companies who employ experts to assess the feel of consumer products and even human skin.

Source: Neuroscience News

Filed under science neuroscience robotics
