Neuroscience

Articles and news from the latest research reports.

Posts tagged robotics

253 notes

Humanoid robot helps train children with autism
“Aiden, look!” piped NAO, a two-foot tall humanoid robot, as it pointed to a flat-panel display on a far wall. As the cartoon dog Scooby Doo flashed on the screen, Aiden, a young boy with an unruly thatch of straw-colored hair, looked in the direction the robot was pointing.
Aiden, who is three and a half years old, has been diagnosed with autism spectrum disorder (ASD). NAO (pronounced “now”) is the diminutive “front man” for an elaborate system of cameras, sensors and computers designed specifically to help children like Aiden learn how to coordinate their attention with other people and objects in their environment. This basic social skill is called joint attention. Typically developing children learn it naturally. Children with autism, however, have difficulty mastering it and that inability can compound into a variety of learning difficulties as they age.
An interdisciplinary team of mechanical engineers and autism experts at Vanderbilt University has developed the system and used it to demonstrate that robotic systems may be powerful tools for enhancing the basic social learning skills of children with ASD. Writing in the March issue of the IEEE Transactions on Neural Systems and Rehabilitation Engineering, the researchers report that children with ASD paid more attention to the robot and followed its instructions almost as well as they did those of a human therapist in standard exercises used to develop joint attention skills.
The finding indicates that robots could play a crucial role in responding to the “public health emergency” created by the rapid growth in the number of children being diagnosed with ASD. Today, one in 88 children (one in 54 boys) is diagnosed with ASD, a 78 percent increase in just four years. The trend has major implications for the nation’s healthcare budget, because the lifetime cost of treating ASD patients is estimated at four to six times that of patients without autism.
“This is the first real world test of whether intelligent adaptive systems can make an impact on autism,” said team member Zachary Warren, who directs the Treatment and Research Institute for Autism Spectrum Disorders (TRIAD) at Vanderbilt’s Kennedy Center.

Filed under robots robotics humanoids ASD autism NAO joint attention neuroscience science

336 notes

Brainless robots swarm just like animals
Swarming patterns and herding behaviours have been observed throughout the animal kingdom. Scientists and mathematicians have pondered the complex relationships and group dynamics that allow schools of fish, such as herring, and flocks of birds, such as starlings, to move together in apparent unity. Now, in an interesting twist to the discussion, a team of engineers from Harvard University has observed apparent collective behaviour in brainless robots.
The robot research team was looking for a way to investigate the transition that swarming groups make from random behaviour into collective motion. In order to observe a randomly moving collective, they built the simplest of “self-propelled automatons”, the charmingly named Bristle-Bots (BBots).
Read more

Filed under swarming bristle-bots robots robotics animal cognition technology neuroscience science

97 notes

Brave New Machines
Robots are here to stay. They will be smarter, more versatile, more autonomous, and more like us in many ways. We humans will need to adapt to keep up.
The word “robot” was used for the first time only about 80 years ago, in the play “R.U.R.” by the Czech author Karel Čapek. The robots in that play were artificial humans, chemically synthesized using appropriate formulas. Robots at present and in the future will be made largely of inorganic materials, both mechanical and electronic. However, some form of hybridization between electromechanical and biological subsystems is possible and will occur. I believe that the major developments in robotics in the next 100 years will be in the following areas:
Robot intelligence: The ability of a robot to solve problems, to learn, to interact with humans and other robots, and related skills are all measures of intelligence. Robots will indeed be increasingly intelligent, because:
- High speed memory, long term storage capacity, and speed of the on-board computers will continue to increase. Futurist Ray Kurzweil has predicted that the capacity of robot brains will exceed that of human brains within the next 20 years.
- Neuroscience is rapidly obtaining better and better models of the information processing ability of the human brain. These models will lead to the development of software to enable robot brains to emulate more and more of the features of the human brain.
- Research in learning will enable robots to learn by imitating humans, from their own mistakes and from their successes.
Human-robot interaction: This is an area of significant research activity at the present time. I believe that during the coming decades robots will be able to interact with humans (and with each other) in increasingly human-like ways, including speech and gestures. Robots will be able to understand the semantic as well as the emotional aspects of speech, so that they will understand the significance of increasing loudness, irritation, affection, and other emotional aspects in spoken utterances, and they will be able to include these aspects in their own speech as well.
Read more

Filed under robots robotics intelligence AI human-robot interaction neuroscience science

34 notes

Gets stroke patients back on their feet
A robot is now being built to help stroke patients with training, motivation and walking.

In Europe, strokes are the most common cause of physical disability among the elderly. They often result in paralysis of one side of the body, and many patients suffer much reduced physical mobility and are often unable to walk on their own. These are the hard facts the EU project CORBYS has taken seriously. Researchers in six countries are currently developing a robotic system designed to help stroke patients re-train their bodies. The concept is based on helping the patient with a system consisting of a powered orthosis that moves the patient’s legs and a mobile platform that provides mobility.

The CORBYS researchers are also working on the cognitive aspects. The aim is to enable the robot to interpret data from the patient and adapt the training programme to his or her capabilities and intentions. This will bring rehabilitation robots to the next level.

Back to walking normally
It is vital to get stroke patients up on their feet as soon as possible. They must have frequent training exercises and re-learn how to walk so that they can function as well as possible on their own.
Why a robot? “Absolutely, because it is difficult to meet these requirements using today’s work-intensive manual method, where two therapists assist the patient by lifting one leg after the other”, says ICT researcher Anders Liverud at SINTEF, one of the CORBYS project partners.

Robot-patient learning
CORBYS involves the use of physiological data such as heart rate, temperature and muscle activity measurements to provide feedback to the therapist and help control the robot. Do the patient’s legs always go where the patient wants? Is the patient getting tired and stressed?

“The walking robot has several settings, and the therapist selects the correct mode based on how far the patient has come in his or her rehabilitation”, says Liverud. “The first step is to attach sensors to the patient’s body and let them walk on a treadmill. A therapist manually corrects the walking pattern and, with the help of the sensors, creates a model of the patient’s walking pattern”, he says.

In the next mode, the system adjusts the walking pattern to the defined model. New adjustments are made continuously to further optimise the walking pattern.

“The patient wears an EEG cap which measures brain activity”, says Liverud. “By using these signals combined with input from other physiological and system sensors, the robotic system registers whether the patient wants to stop, change speed or turn, and can adapt immediately”, he says. “The robot continues to correct any walking pattern errors. However, since it also allows the patient the freedom to decide where and how he or she walks, the patient experiences control and keeps motivation to continue with the training”, says Liverud.
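
The correction loop Liverud describes can be sketched very roughly as nudging each measured joint angle toward the recorded reference pattern. This is only an illustrative toy, not the CORBYS controller; the gain and the angle values below are invented:

```python
# Toy sketch of the gait-correction idea described above (not the CORBYS
# controller): the orthosis blends each measured joint angle toward the
# reference walking pattern recorded with the therapist's help.
def corrected_angles(measured, reference, gain=0.5):
    """Blend measured joint angles toward the reference pattern (degrees)."""
    return [m + gain * (r - m) for m, r in zip(measured, reference)]

reference = [10.0, 25.0, 40.0]   # invented hip/knee/ankle reference angles
measured = [14.0, 19.0, 40.0]    # invented measurement from the patient
print(corrected_angles(measured, reference))  # [12.0, 22.0, 40.0]
```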

Working with Europe
The European researchers have now completed specification of the system and its components, and construction of the robot is underway.
Construction involves a large team. The University of Bremen is heading the project and developing the architecture to integrate all system modules, and German wheelchair, orthosis and robotics experts are constructing the mechanical components, while two UK universities are working with cognitive aspects. Spanish specialists are addressing brain activity measurements and the University of Brussels is looking into robot control. SINTEF is working with the sensors and the final functional integration of the system. In a year’s time construction will be completed and the robot will be tested on stroke patients at rehabilitation institutes in Slovenia and Germany. The CORBYS project has a total budget of EUR 8.7 million.

Filed under robots robotics stroke rehabilitation muscle activity brain activity neuroscience science

507 notes

Mind-controlled exoskeleton to help disabled people walk again

Every year thousands of people in Europe are paralysed by a spinal cord injury. Many are young adults, facing the rest of their lives confined to a wheelchair. Although no medical cure currently exists, in the future they could be able to walk again thanks to a mind-controlled robotic exoskeleton being developed by EU-funded researchers.

The system, based on innovative ‘Brain-neural-computer interface’ (BNCI) technology - combined with a light-weight exoskeleton attached to users’ legs and a virtual reality environment for training - could also find applications in the rehabilitation of stroke victims and in assisting astronauts rebuild muscle mass after prolonged periods in space.

In the United Kingdom, every eight hours someone suffers a spinal cord injury, often leading to partial or full lower-body paralysis. In the United States, more than 250,000 people are living with paralysis as a result of damage to their spinal cord, usually because of a traffic accident, fall or sporting injury. Many are under the age of 50, and with no known medical cure or way of repairing damaged spinal nerves, they face the rest of their lives in a wheelchair.

But by bypassing the spinal cord entirely and routing brain signals to a robotic exoskeleton, they should be able to get back on their feet. That is the ultimate goal of researchers working in the ‘Mind-controlled orthosis and VR-training environment for walk empowering’ (Mindwalker) project, a three-year initiative supported by EUR 2.75 million in funding from the European Commission.

'Mindwalker was proposed as a very ambitious project intended to investigate promising approaches to exploit brain signals for the purpose of controlling advanced orthosis, and to design and implement a prototype system demonstrating the potential of related technologies,' explains Michel Ilzkovitz, the project coordinator at Space Applications Services in Belgium.

The team’s approach relies on an advanced BNCI system that converts electroencephalography (EEG) signals from the brain, or electromyography (EMG) signals from shoulder muscles, into electronic commands to control the exoskeleton.

The Laboratory of Neurophysiology and Movement Biomechanics at the Université Libre de Bruxelles (ULB) focused on the exploitation of EEG and EMG signals treated by an artificial neural network, while the Foundation Santa Lucia in Italy developed techniques based on EMG signals modelled by the coupling of neural and biomechanical oscillators.

One approach for controlling the exoskeleton uses so-called ‘steady-state visually evoked potential’, a method that reads flickering visual stimuli produced at different frequencies to induce correlated EEG signals. Detection of these EEG signals is used to trigger commands such as ‘stand’, ‘walk’, ‘faster’ or ‘slower’.

A second approach is based on processing EMG signals generated by the user’s shoulders and exploits the natural arm-leg coordination in human walking: arm-swing patterns can be perceived in this way and converted into control signals commanding the exoskeleton’s legs.

A third approach, ‘ideation’, is also based on EEG-signal processing. It uses the identification and exploitation of EEG Theta cortical signals produced by the natural mental process associated with walking. The approach was investigated by the Mindwalker team but had to be dropped because of the difficulty and time needed to turn the results of early experiments into a fully exploitable system.
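
As a toy illustration of the first (SSVEP) approach, command selection can be sketched as picking the flicker frequency whose power dominates the EEG spectrum. The frequencies, command names and sampling rate below are invented for illustration, not taken from Mindwalker:

```python
import numpy as np

# Toy SSVEP classifier (illustrative only): each command is tied to a
# flicker frequency, and we select the frequency with the most spectral
# power in the EEG. Frequencies, commands and sampling rate are invented.
COMMANDS = {8.0: "stand", 10.0: "walk", 12.0: "faster", 15.0: "slower"}
FS = 256  # assumed sampling rate in Hz

def classify(eeg, fs=FS, commands=COMMANDS):
    """Return the command whose flicker frequency dominates the spectrum."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    def power_at(f):
        return spectrum[np.argmin(np.abs(freqs - f))]
    return commands[max(commands, key=power_at)]

# Simulate 2 s of EEG dominated by a 10 Hz evoked response plus noise.
t = np.arange(0, 2.0, 1.0 / FS)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10.0 * t) + 0.5 * rng.standard_normal(t.size)
print(classify(eeg))  # the 10 Hz stimulus maps to "walk"
```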

Regardless of which method is used, the BNCI signals have to be filtered and processed before they can be used to control the exoskeleton. To achieve this, the Mindwalker researchers fed the signals into a ‘Dynamic recurrent neural network’ (DRNN), a processing technique capable of learning and exploiting the dynamic character of the BNCI signals.

'This is appealing for kinematic control and allows a much more natural and fluid way of controlling an exoskeleton,' Mr Ilzkovitz says.

The team adopted a similarly practical approach for collecting EEG signals from the user’s scalp. Most BNCI systems are either invasive, requiring electrodes to be placed directly into brain tissue, or require users to wear a ‘wet’ cap on their head, necessitating lengthy fitting procedures and the use of special gels to reduce the electrical resistance at the interface between the skin and the electrodes. While such systems deliver signals of very good quality and signal-to-noise ratio, they are impractical for everyday use.

The Mindwalker team therefore turned to a ‘dry’ technology developed by Berlin-based eemagine Medical Imaging Solutions: a cap covered in electrodes that the user can fit themselves, and which uses innovative electronic components to amplify and optimise signals before sending them to the neural network.

'The dry EEG cap can be placed by the subject on their head by themselves in less than a minute, just like a swimming cap,' Mr Ilzkovitz says.

Read more …

Filed under exoskeletons BNCI spinal cord injury paralysis robotics mind control mindwalker EEG neuroscience science

2,854 notes

Japan’s Robot Suit Gets Global Safety Certificate
A robot suit that can help the elderly or disabled get around was given its global safety certificate in Japan on Wednesday, paving the way for its worldwide rollout.
The Hybrid Assistive Limb, or HAL, is a power-assisted pair of legs developed by Japanese robot maker Cyberdyne, which has also developed similar robot arms.
A quality assurance body issued the certificate based on a draft version of an international safety standard for personal robots that is expected to be approved later this year, the ministry for the economy, trade and industry said.
The metal-and-plastic exoskeleton has become the first nursing-care robot certified under the draft standard, a ministry official said.
Battery-powered HAL, which detects muscle impulses to anticipate and support the user’s body movements, is designed to help the elderly with mobility or help hospital or nursing carers to lift patients.
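
The muscle-impulse principle can be sketched as a simple threshold-and-gain rule: once the measured signal crosses a threshold, the suit adds assist torque in proportion to it. This is an illustrative toy, not Cyberdyne’s algorithm, and every number below is invented:

```python
# Toy sketch of a muscle-triggered assist rule (not Cyberdyne's algorithm):
# below the threshold the suit does nothing; above it, assist torque grows
# in proportion to the normalised muscle signal. All values are invented.
def assist_torque(emg, threshold=0.2, gain=30.0):
    """Return assist torque (N*m) for a normalised muscle signal in [0, 1]."""
    return gain * (emg - threshold) if emg > threshold else 0.0

print(assist_torque(0.1))  # below threshold, prints 0.0
print(assist_torque(0.6))  # 30 * (0.6 - 0.2) = 12.0
```
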
Cyberdyne, based in Tsukuba, northeast of Tokyo, has so far leased some 330 suits to 150 hospitals, welfare and other facilities in Japan since 2010, at 178,000 yen ($1,950) per suit per year.
"It is very significant that Japan has obtained this certification before others in the world," said Yoshiyuki Sankai, the head of Cyberdyne.
The company is unrelated to the firm of the same name responsible for the cyborg assassin played by Arnold Schwarzenegger in the 1984 film “The Terminator”.
"This is a first step forward for Japan, the great robot nation, to send our message to the world about robots of the future," said Sankai, who is also a professor at Tsukuba University.
A different version of HAL — coincidentally the name of the evil supercomputer in Stanley Kubrick’s “2001: A Space Odyssey” — has been developed for workers who need to wear heavy radiation protection as part of the clean-up at the crippled Fukushima nuclear plant.
Industrial robots have long been used in Japan, and robo-suits are gradually making inroads into hospitals and retirement homes.
But critics say the government has been slow in creating a safety framework for such robots in a country whose rapidly-ageing population is expected to enjoy ever longer lives.

Filed under robots robotics HAL robot suit rehabilitation science

167 notes

Researchers build robotic bat wing
Researchers at Brown University have developed a robotic bat wing that is providing valuable new information about the dynamics of flapping flight in real bats.
The robot, which mimics the wing shape and motion of the lesser dog-faced fruit bat, is designed to flap while attached to a force transducer in a wind tunnel. As the lifelike wing flaps, the force transducer records the aerodynamic forces generated by the moving wing. By measuring the power output of the three servo motors that control the robot’s seven movable joints, researchers can evaluate the energy required to execute wing movements.
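
The energy accounting can be sketched as integrating torque times angular velocity over one wingbeat. This is an illustrative toy with invented numbers and a made-up torque model, not the authors’ analysis:

```python
import math

# Toy sketch of the energy accounting described above (not the authors'
# code): mechanical power from a servo is torque times angular velocity,
# and integrating power over a wingbeat gives the energy cost.
def wingbeat_energy(torques, omegas, dt):
    """Energy (J) for one servo over a wingbeat, from per-sample torque
    (N*m) and angular velocity (rad/s) sampled every dt seconds."""
    return sum(tau * w * dt for tau, w in zip(torques, omegas))

# One wingbeat at 8 Hz sampled at 1 kHz, with sinusoidal flapping motion.
freq, dt = 8.0, 0.001
n = round(1 / (freq * dt))  # samples per wingbeat
omegas = [2 * math.pi * freq * math.cos(2 * math.pi * i / n) for i in range(n)]
torques = [0.02 * w for w in omegas]  # invented torque model: proportional to speed
energy = wingbeat_energy(torques, omegas, dt)
print(round(energy, 3))
```
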
Testing showed the robot can match the basic flight parameters of bats, producing enough thrust to overcome drag and enough lift to carry the weight of the model species.
A paper describing the robot and presenting results from preliminary experiments is published in the journal Bioinspiration and Biomimetics. The work was done in the labs of Brown professors Kenneth Breuer and Sharon Swartz, who are the senior authors on the paper. Breuer, an engineer, and Swartz, a biologist, have studied bat flight and anatomy for years.
The faux flapper generates data that could never be collected directly from live animals, said Joseph Bahlman, a graduate student at Brown who led the project. Bats can’t fly when connected to instruments that record aerodynamic forces directly, so that isn’t an option — and bats don’t take requests.
“We can’t ask a bat to flap at a frequency of eight hertz then raise it to nine hertz so we can see what difference that makes,” Bahlman said. “They don’t really cooperate that way.”
But the model does exactly what the researchers want it to do. They can control each of its movement capabilities — kinematic parameters — individually. That way they can adjust one parameter while keeping the rest constant to isolate the effects.
“We can answer questions like, ‘Does increasing wing beat frequency improve lift and what’s the energetic cost of doing that?’” Bahlman said. “We can directly measure the relationship between these kinematic parameters, aerodynamic forces, and energetics.”
Detailed experimental results from the robot will be described in future research papers, but this first paper includes some preliminary results from a few case studies.

Researchers build robotic bat wing

Researchers at Brown University have developed a robotic bat wing that is providing valuable new information about dynamics of flapping flight in real bats.

The robot, which mimics the wing shape and motion of the lesser dog-faced fruit bat, is designed to flap while attached to a force transducer in a wind tunnel. As the lifelike wing flaps, the force transducer records the aerodynamic forces generated by the moving wing. By measuring the power output of the three servo motors that control the robot’s seven movable joints, researchers can evaluate the energy required to execute wing movements.

Testing showed the robot can match the basic flight parameters of bats, producing enough thrust to overcome drag and enough lift to carry the weight of the model species.

A paper describing the robot and presenting results from preliminary experiments is published in the journal Bioinspiration and Biomimetics. The work was done in labs of Brown professors Kenneth Breuer and Sharon Swartz, who are the senior authors on the paper. Breuer, an engineer, and Swartz, a biologist, have studied bat flight and anatomy for years.

The faux flapper generates data that could never be collected directly from live animals, said Joseph Bahlman, a graduate student at Brown who led the project. Bats can’t fly when connected to instruments that record aerodynamic forces directly, so that isn’t an option — and bats don’t take requests.

“We can’t ask a bat to flap at a frequency of eight hertz then raise it to nine hertz so we can see what difference that makes,” Bahlman said. “They don’t really cooperate that way.”

But the model does exactly what the researchers want it to do. They can control each of its movement capabilities — kinematic parameters — individually. That way they can adjust one parameter while keeping the rest constant to isolate the effects.

“We can answer questions like, ‘Does increasing wing beat frequency improve lift and what’s the energetic cost of doing that?’” Bahlman said. “We can directly measure the relationship between these kinematic parameters, aerodynamic forces, and energetics.”
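That one-parameter-at-a-time protocol amounts to a simple sweep. The sketch below is illustrative only: `measure` is a made-up stand-in for a wind-tunnel trial, with toy scalings in place of real aerodynamics.

```python
def measure(frequency_hz, amplitude_deg=60.0):
    """Hypothetical stand-in for one wind-tunnel trial at fixed kinematics."""
    lift_n = 0.01 * amplitude_deg * frequency_hz / 8.0        # toy lift scaling
    power_w = 0.002 * amplitude_deg * frequency_hz ** 3 / 64  # cost rises steeply
    return lift_n, power_w

# Sweep wingbeat frequency while holding every other kinematic parameter
# constant, isolating its effect on lift and on the energetic cost.
for f in (6.0, 7.0, 8.0, 9.0, 10.0):
    lift_n, power_w = measure(f)
    print(f"{f:4.1f} Hz: lift {lift_n:.2f} N, power {power_w:.2f} W")
```

The point of the design is visible even in the toy numbers: lift grows roughly linearly with frequency while the power bill grows much faster, so the sweep exposes the trade-off the researchers describe.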

Detailed experimental results from the robot will be described in future research papers, but this first paper includes some preliminary results from a few case studies.

Filed under robobat bats robotics robots wing movements neuroscience technology science

23 notes

Real Angry Birds Flip ‘the Bird’ Before a Fight
Male sparrows are capable of fighting to the death. But a new study shows that they often wave their wings wildly first in an attempt to avoid a dangerous brawl.
"For birds, wing waves are like flipping the bird or saying ‘put up your dukes. I’m ready to fight,’" said Duke biologist Rindy Anderson.
Male swamp sparrows use wing waves as an aggressive signal to defend their territories and mates from intruding males, Anderson said. The findings also are a first step toward understanding how the birds use a combination of visual displays and songs to communicate with other males.
Anderson and her colleagues published the results online Jan. 28 in the journal Behavioral Ecology and Sociobiology.
Scientists had assumed the sparrows’ wing-waving behavior was a signal intended for other males, but testing the observations was difficult, Anderson said. So she and her co-author, former Duke engineering undergraduate student David Piech (‘12), built a miniature computer and some robotics, which the team then stuffed into the body cavity of a deceased bird. The result was a ‘robosparrow’ that looked just like a male swamp sparrow, which could flip its wings just like a live male.
Anderson took the wing-waving robosparrow to a swamp sparrow breeding ground in Pennsylvania and placed it in the territories of live males. The robotic bird “sang” swamp sparrow songs using a nearby sound system to let the birds know he was intruding, while Anderson and her colleagues crouched in the swampy grasses and watched the live birds’ responses. She also performed the tests with a stuffed sparrow that stayed stationary and one that twisted from side to side. These tests showed that wing waves combined with song are more potent than song on its own, and that wing waves in particular, not just any movement, evoked aggression from live birds.
The live birds responded most aggressively to the invading, wing-waving robotic sparrow, which Anderson said she expected. “What I didn’t expect to see was that the birds would give strikingly similar aggressive wing-wave signals to the three types of invaders,” she said. That means that if a bird wing-waved five times to the stationary stuffed bird, he would also wing-wave five times to the wing-waving robot.
Anderson had hypothesized that the defending birds would match the signals of the intruding robots, but her team’s results suggest that the males are more individualistic and consistent in the level of aggressiveness that they want to signal, she said.
"That response makes sense, in retrospect, since attacks can be devastating," Anderson said. Because of the risk, the real males may only want to signal a certain level of aggression to see if they could scare off an intruder without the conflict coming to a fight and possible death.

Filed under robosparrow animal behavior robotics robots aggression aggressive communication wing waves biology neuroscience science

69 notes

Insects inspiring new technology
Scientists from the University of Lincoln and Newcastle University have created a computerised system which allows for autonomous navigation of mobile robots based on the locust’s unique visual system.
The work could provide the blueprint for the development of highly accurate vehicle collision sensors and surveillance technology, and could even aid video game programming, according to the research published today.
Locusts have a distinctive way of processing information through electrical and chemical signals, giving them an extremely fast and accurate warning system for impending collisions.
The insect has incredibly powerful data processing systems built into its biology, which can in theory be recreated in robotics.
Inspired by the visual processing power built into these insects’ biology, Professor Shigang Yue from the University of Lincoln’s School of Computer Science and Dr Claire Rind from Newcastle University’s Institute of Neuroscience created the computerised system.
Their findings are published in the International Journal of Advanced Mechatronic Systems.
The research started by understanding the anatomy, responses and development of the circuits in the locust brain that allow it to detect approaching objects and avoid them when in flight or on the ground.
A visually stimulated motor control (VSMC) system was then created which consists of two movement detector types and a simple motor command generator. Each detector processes images and extracts relevant visual cues, which are then converted into motor commands.
Prof Yue said: “We were inspired by the way the locusts’ visual system works when interacting with the outside world and the potential to simulate such complex systems in software and hardware for various applications. We created a system inspired by the locusts’ motion sensitive interneuron – the lobula giant movement detector. This system was then used in a robot to enable it to explore paths or interact with objects, effectively using visual input only.”
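In that spirit, a minimal looming-detector sketch: excitation from frame-to-frame luminance change is blunted by lateral inhibition from neighbouring pixels, then summed into a single alarm signal that drives a steering command. The kernel, weights, and threshold here are invented for illustration; this is not the published LGMD model.

```python
import numpy as np

def lgmd_step(prev_frame, frame, threshold=8.0):
    """One update of a toy looming detector: excitation from luminance
    change, reduced by lateral inhibition from the four neighbours."""
    excitation = np.abs(frame - prev_frame)
    pad = np.pad(excitation, 1, mode="edge")
    neighbours = (pad[:-2, 1:-1] + pad[2:, 1:-1] +
                  pad[1:-1, :-2] + pad[1:-1, 2:]) / 4.0
    membrane = np.clip(excitation - 0.5 * neighbours, 0.0, None)
    activity = float(membrane.sum())
    command = "turn" if activity > threshold else "forward"
    return command, activity

# An approaching object expands between frames, so its growing silhouette
# produces widespread luminance change and a strong alarm signal.
prev = np.zeros((16, 16))
curr = np.zeros((16, 16))
curr[4:12, 4:12] = 1.0
print(lgmd_step(prev, curr))   # large change -> "turn"
```

The appeal of the scheme for robotics is the same as for the locust: no object recognition, no depth maps, just a cheap measure of how fast the image is changing, which is why it can run on very modest hardware.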
Funded by the European Union’s Seventh Framework Programme (FP7), the research was carried out as part of a collaborative project with the University of Hamburg in Germany and Tsinghua University and Xi’an Jiaotong University, China.

Filed under robots robotics mobile robots navigation locust visual stimulation neural networks neuroscience science

85 notes

Lessons from cockroaches could inform robotics
Running cockroaches start to recover from being shoved sideways before their dawdling nervous system kicks in to tell their legs what to do, researchers have found. These new insights on how biological systems stabilize could one day help engineers design steadier robots and improve doctors’ understanding of human gait abnormalities.
In experiments, the roaches were able to maintain their footing mechanically—using their momentum and the spring-like architecture of their legs, rather than neurologically, relying on impulses sent from their central nervous system to their muscles.
"The response time we observed is more than three times longer than you’d expect," said Shai Revzen, an assistant professor of electrical engineering and computer science, as well as ecology and evolutionary biology, at the University of Michigan. Revzen is the lead author of a paper on the findings published online in Biological Cybernetics. It will appear in a forthcoming print edition.
"What we see is that the animals’ nervous system is working at a substantial delay," he said. "It could potentially act a lot sooner, within about a thirtieth of a second, but instead, it kicks in after about a step and a half or two steps—about a tenth of a second. For some reason, the nervous system is waiting and seeing how it shapes out."
Revzen said the new findings might imply that the biological brain, at least in cockroaches, adjusts the gait only at whole-step intervals rather than at any point in a step. Periodic, rather than continuous, feedback systems might lead to more stable (not to mention energy-efficient) walking robots—whether they travel on two feet or six.
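The contrast between continuous and whole-step feedback can be caricatured in a few lines. Everything below is invented for illustration — the passive decay, the correction strength, and the stride period are toy values, not a cockroach model — but it shows how a per-stride controller can still recover from a shove while issuing far fewer corrections.

```python
DT = 1.0 / 300.0      # simulation timestep, s
STRIDE_S = 0.07       # assumed stride period, s (roughly a 14 Hz gait)

def simulate(update_every_s, t_end=0.5, push=1.0):
    """Recover from a sideways shove, applying a neural correction only
    at the given interval; in between, a passive spring-like decay
    stands in for the mechanical self-stabilisation."""
    offset, t, next_update, corrections = push, 0.0, update_every_s, 0
    while t < t_end:
        offset *= 0.999                  # passive mechanical decay
        if t >= next_update:             # neural feedback fires
            offset *= 0.5                # halve the remaining error
            corrections += 1
            next_update += update_every_s
        t += DT
    return offset, corrections

continuous = simulate(DT)        # correct at every timestep
per_stride = simulate(STRIDE_S)  # correct once per stride
print("continuous:", continuous)
print("per-stride:", per_stride)
```

The continuous controller drives the error down faster, but the per-stride one gets within a hair of the same place on a handful of corrections — the kind of sensing and computation budget a small robot (or an insect nervous system) can actually afford.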
Robot makers often look to nature for inspiration. As animals move through the world, they have to respond to unexpected disturbances like rocky, uneven ground or damaged limbs. Revzen and his team believe that patterns in how they move as they adjust could give away how their machinery and neurology work together.
"The fundamental question is, ‘What can you do with a mechanical suspension versus one that requires electronic feedback?" Revzen said. "The animals obviously have much better mechanical designs than anything we know how to build. But if we could learn how they do it, we might be able to reproduce it."

Filed under robots robotics cockroaches gait disorders neuroscience technology science
