Neuroscience

Articles and news from the latest research reports.

Posts tagged prosthetics


Mind-controlled prosthetic arms that work in daily life are now a reality
In January 2013 a Swedish arm amputee was the first person in the world to receive a prosthesis with a direct connection to bone, nerves and muscles. An article about this achievement and its long-term stability will now be published in the Science Translational Medicine journal.
“Going beyond the lab to allow the patient to face real-world challenges is the main contribution of this work,” says Max Ortiz Catalan, research scientist at Chalmers University of Technology and leading author of the publication.
“We have used osseointegration to create a long-term stable fusion between man and machine, where we have integrated them at different levels. The artificial arm is directly attached to the skeleton, thus providing mechanical stability. Then the human’s biological control system, that is nerves and muscles, is also interfaced to the machine’s control system via neuromuscular electrodes. This creates an intimate union between the body and the machine; between biology and mechatronics.”
The direct skeletal attachment is created by what is known as osseointegration, a technology in limb prostheses pioneered by associate professor Rickard Brånemark and his colleagues at Sahlgrenska University Hospital. Rickard Brånemark led the surgical implantation and collaborated closely with Max Ortiz Catalan and Professor Bo Håkansson at Chalmers University of Technology on this project.
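The control scheme described above, decoding signals from implanted neuromuscular electrodes into prosthesis commands, can be sketched in miniature. The snippet below is an illustrative toy, not the Chalmers implementation: the channel layout, motion classes, and nearest-centroid decoder are all invented for the example.

```python
import numpy as np

MOTIONS = ["open", "close", "rest"]

def rms_features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square amplitude per channel; window shape = (samples, channels)."""
    return np.sqrt(np.mean(window ** 2, axis=0))

class NearestCentroidDecoder:
    """Maps a window of muscle activity to the closest trained motion class."""

    def fit(self, windows, labels):
        feats = np.array([rms_features(w) for w in windows])
        labels = np.array(labels)
        self.centroids_ = {m: feats[labels == m].mean(axis=0) for m in MOTIONS}
        return self

    def predict(self, window):
        f = rms_features(window)
        return min(self.centroids_, key=lambda m: np.linalg.norm(f - self.centroids_[m]))

# Toy data: "close" shows up as high channel-0 activity, "open" as channel-1.
rng = np.random.default_rng(0)

def synth(scale):
    """Simulated two-channel EMG window with per-channel amplitude `scale`."""
    return rng.normal(0.0, scale, size=(200, 2))

windows = ([synth([1.0, 0.1]) for _ in range(5)]
           + [synth([0.1, 1.0]) for _ in range(5)]
           + [synth([0.1, 0.1]) for _ in range(5)])
labels = ["close"] * 5 + ["open"] * 5 + ["rest"] * 5

decoder = NearestCentroidDecoder().fit(windows, labels)
print(decoder.predict(synth([1.0, 0.1])))  # close
```

Real systems use richer features and classifiers, but the shape of the problem is the same: turn a short window of muscle activity into one of a small set of motion commands.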
The patient’s arm was amputated over ten years ago. Before the surgery, his prosthesis was controlled via electrodes placed over the skin. Robotic prostheses can be very advanced, but such a control system makes them unreliable and limits their functionality, and patients commonly reject them as a result.
Now, the patient has been given a control system that is directly connected to his own. He has a physically demanding job as a truck driver in northern Sweden, and since the surgery he has found that he can cope with all the situations he faces, from clamping his trailer load and operating machinery to unpacking eggs and tying his children’s skates, regardless of the environmental conditions.
The patient is also one of the first in the world to take part in an effort to achieve long-term sensation via the prosthesis. Because the implant is a bidirectional interface, it can also be used to send signals in the opposite direction – from the prosthetic arm to the brain. This is the researchers’ next step, to clinically implement their findings on sensory feedback.
“Reliable communication between the prosthesis and the body has been the missing link for the clinical implementation of neural control and sensory feedback, and this is now in place,” says Max Ortiz Catalan. “So far we have shown that the patient has a long-term stable ability to perceive touch in different locations in the missing hand. Intuitive sensory feedback and control are crucial for interacting with the environment, for example to reliably hold an object despite disturbances or uncertainty. Today, no patient walks around with a prosthesis that provides such information, but we are working towards changing that in the very short term.”
The researchers plan to treat more patients with the novel technology later this year.
“We see this technology as an important step towards more natural control of artificial limbs,” says Max Ortiz Catalan. “It is the missing link for allowing sophisticated neural interfaces to control sophisticated prostheses. So far, this has only been possible in short experiments within controlled environments.”

Filed under prosthetics artificial limbs sensory perception osseointegration neuroscience science


Amputees discern familiar sensations across prosthetic hand
Even before he lost his right hand to an industrial accident four years ago, Igor Spetic had family open his medicine bottles. Cotton balls give him goose bumps.
Now, blindfolded during an experiment, he feels his arm hairs rise when a researcher brushes the back of his prosthetic hand with a cotton ball.
Spetic, of course, can’t feel the ball. But a computer sends patterns of electrical signals into nerves in his arm and on to his brain, which tell him otherwise. “I knew immediately it was cotton,” he said.
That’s one of several types of sensation Spetic, of Madison, Ohio, can feel with the prosthetic system being developed by Case Western Reserve University and the Louis Stokes Cleveland Veterans Affairs Medical Center.
Spetic was excited just to “feel” again, and quickly received an unexpected benefit. The phantom pain he’d suffered, which he’s described as a vise crushing his closed fist, subsided almost completely. A second patient, who had less phantom pain after losing his right hand and much of his forearm in an accident, said his, too, is nearly gone.
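The “patterns of electric signals” described above can be pictured as pulse trains whose rate and pulse width encode what the hand is touching. The sketch below is purely illustrative; the texture names come from the article, but the frequencies and pulse widths are invented, not the Case Western parameters.

```python
def pulse_train(freq_hz: float, width_us: float, duration_s: float):
    """Return (time_s, width_us) pairs for a constant-rate stimulation pulse train."""
    period = 1.0 / freq_hz
    n = int(duration_s * freq_hz)
    return [(i * period, width_us) for i in range(n)]

# Distinct textures mapped to distinct (rate, pulse-width) codes the nerve
# could carry. Values are hypothetical.
TEXTURE_CODES = {
    "cotton":    dict(freq_hz=20.0, width_us=100.0),
    "sandpaper": dict(freq_hz=80.0, width_us=200.0),
}

train = pulse_train(duration_s=0.5, **TEXTURE_CODES["cotton"])
print(len(train))      # 10 pulses in half a second at 20 Hz
print(train[1])        # (0.05, 100.0): second pulse at 50 ms
```

The key idea is that the computer does not replay raw sensor data into the nerve; it translates each sensation into a distinct, repeatable pattern that the brain learns to read.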

Filed under prosthetics prosthetic arm sense of touch haptic sensation phantom pain neuroscience science


New prosthetic arm controlled by neural messages 
The design aims to identify the memory of movement in the amputee’s brain and translate it into commands for manipulating the device.
Controlling a prosthetic arm just by imagining a motion may become possible through the work of Mexican scientists at the Centre for Research and Advanced Studies (CINVESTAV), who are developing an arm replacement that identifies movement patterns from brain signals.
“First, it is necessary to know whether there is a memory pattern in the amputee’s brain recording how the arm moved, in order to translate it into instructions for the prosthesis,” says Roberto Muñoz Guerrero, researcher at the Department of Electrical Engineering and project leader at CINVESTAV.
He explains that the electric signal won’t come from the muscles that form the stump, but from the movement patterns of the brain. “If this phase is successful, the patient would be able to move the prosthesis by imagining different movements.”
However, Muñoz Guerrero acknowledges this is not an easy task, because the brain registers a wide range of activities occurring throughout the human body, and the movement pattern must be extracted from among them. “Therefore, the first step is to locate the patterns in the EEG and define there the memory that can be electrically recorded. Then we need to evaluate how sensitive the signal is to external disturbances, such as light or blinking.”
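The two steps Muñoz Guerrero describes, extracting a movement-related EEG pattern and checking its sensitivity to artifacts such as blinking, can be illustrated with a toy computation. This is a hedged sketch with synthetic signals, not the CINVESTAV pipeline: it measures mu-band (8-12 Hz) power, a feature commonly used in motor-imagery work, and shows that a slow blink-like deflection barely changes it.

```python
import numpy as np

FS = 250  # sampling rate in Hz (an assumed value)

def band_power(signal, lo, hi, fs=FS):
    """Mean power of `signal` in the [lo, hi] Hz band, via the FFT."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].mean()

t = np.arange(0, 2, 1 / FS)
rng = np.random.default_rng(1)
eeg = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.normal(size=t.size)  # 10 Hz rhythm + noise

# A blink appears as a large, slow deflection (mostly below 4 Hz), so it
# should barely affect the 8-12 Hz feature.
blink = 5.0 * np.exp(-((t - 1.0) ** 2) / 0.01)

mu_clean = band_power(eeg, 8, 12)
mu_blink = band_power(eeg + blink, 8, 12)
print(abs(mu_blink - mu_clean) / mu_clean < 0.2)  # True: feature is robust here
```

A real evaluation would of course use recorded EEG and measured blinks; the point of the sketch is only the workflow: define a recordable feature first, then quantify how much external disturbances move it.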
It should be noted that the prosthesis could only be used by individuals who once had an entire arm and lost it to an accident or illness. Such patients once moved the arm naturally, and their memory stores the motor patterns that the prosthesis would draw on.
According to the researcher, the prosthesis must be provided with a mechanical and electronic system, the elements necessary to activate it, and a section to interpret the brain signals. “Regarding the material from which it must be built, this has not yet been fully defined, because the prosthesis must weigh between two and three kilograms, similar to the weight of the missing arm.”
The prosthesis belongs to an emerging area of bioelectronics called BCI (brain-computer interface): a direct communication pathway between the brain and an external device, used to assist or repair sensory and motor functions. “An additional benefit is the ability to create motion paths for the prosthesis, which is not possible with commercial products,” says Muñoz Guerrero.

Filed under BCI prosthetics prosthetic arm motor movement EEG neuroscience science


(Image caption: A schematic of the interactions that occur between the saccade and reach brain systems when deciding where to look and reach. Credit: Bijan Pesaran, New York University)
Complexity of eye-hand coordination
People use their eyes not only to see, but also to move. It takes a fraction of a second to execute the loop that travels from the brain to the eyes, and then to the hands and arms. Bijan Pesaran is trying to figure out what occurs in the brain during this process.
"Eye-hand coordination is the result of a complex interplay between two systems of the brain, but there are many regions where this interaction takes place," says Pesaran, an associate professor of neural science at New York University. "One of the things about the current state of knowledge is that it is focused on the different pieces of the brain and how each works individually. Relatively little work has been done to link how they work together at the cellular level."
The thrust of his research involves studying how neurons in these parts of the brain communicate with one another.
"The cerebral cortex contains a mosaic of brain areas that are connected to form distributed networks," says the National Science Foundation (NSF)-funded scientist. "In the frontal and parietal cortex, these networks are specialized for movements such as saccadic (voluntary) eye movements and reaches, that is, hand and arm movements. Before each movement we decide to make, these areas contain specific patterns of neural activity which can be used to predict what we will do."
A more sophisticated understanding of the brain’s role in eye-hand coordination can be an important model for discovering how brain systems interact to carry out cognitive processes in general, he says. Such insights could lead to new neural technologies that translate thoughts into actions, for example, to control a robotic arm or prompt speech.
"There is a whole new set of technologies called neural prostheses," Pesaran says. "In the future, there could be devices in the brain that will help people remember, to think more clearly, and to help them move."
Using eye movements to prompt hand and arm movements involves building a spatial representation, “which is improved by moving our eyes,” he says. “The command that is sent to the eyes moves the eyes, which effectively measure space when they move, and that is used to improve the accuracy of the reach. We move our eyes to improve our movement, not just to see better.”
He often describes the behavior of high level ping pong players to explain how it works.
"You keep your eye on the ball so you know where it is, so you can hit it," he says. "But right up until the minute you hit the ball, something important is happening, which is that your brain is sending a command to your arm to hit the ball. But the visual signals are delayed. At the time you hit the ball, the vision of the ball won’t enter your brain for another fraction of a second, so there is no point in looking at the ball. You can look all you want, but your arm already has moved.
"When ping pong players are playing at a high level, they look at the ball up to the point where they hit it. As soon as the paddle makes contact with the ball, you can see their eyes and head turn to now look at their opponent. They think they are looking at their opponent when they are hitting the ball, but they are looking at the ball. Their eyes are tracking the ball, even though they are aware of their opponent.
"This helps the brain keep a very high resolution of space to make the stroke more accurate," he continues. "It’s not about seeing the ball, because by then it’s too late. It’s about moving the eyes with the ball so that the stroke is more accurate. And the brain orchestrates this complicated pattern of behavior."
Visual signals are always delayed. They enter the brain, are converted into a movement command, and then leave the brain for the arm muscles. “It’s a loop that takes about 200 milliseconds, one-fifth of a second, and in that time the ball has moved,” he says.
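A quick back-of-the-envelope calculation makes the point concrete. The loop delay is the 200 ms quoted above; the ball speed (10 m/s, a brisk rally) is an assumption for illustration.

```python
# How far does the ball travel while the visual signal is still in flight?
LOOP_DELAY_S = 0.200    # ~one-fifth of a second, as quoted
BALL_SPEED_M_S = 10.0   # assumed rally speed

distance_m = BALL_SPEED_M_S * LOOP_DELAY_S
print(f"{distance_m:.1f} m")  # 2.0 m, most of the length of a 2.74 m table
```

By the time the image of the ball reaches the brain, the ball has crossed most of the table, which is why the stroke must be driven by prediction rather than by the latest visual snapshot.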
Pesaran is conducting his research under an NSF Faculty Early Career Development (CAREER) award, which he received in 2010. The award supports junior faculty who exemplify the role of teacher-scholars through outstanding research, excellent education and the integration of education and research within the context of the mission of their organization.
To test his hypothesis that two regions in the brain (the parietal reach region and the parietal eye field, both in the parietal cortex) must talk to each other to prompt movement, Pesaran and his team are recording the activity of neurons, the brain cells that communicate via electrical signals called “spikes.” They do so by placing micro-electrodes into the brains of animals that look and reach, much like humans, and study the correlation and patterns in those signals.
"We think we can measure these signals when they are leaving one area, and coming into another," he says. "How does this show that this reflects communication between those two areas? Because something happens, something changes. We set up these movements in a particular way that requires communication between the eye and the arm centers, and we then made measurements in the brain from those centers. Then we linked the changes in the activity between the two areas to the changes in how the eyes and arm move."
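The measurement logic, finding the lag at which activity in one area best predicts activity in another, can be sketched with synthetic spike trains. This is an illustrative toy, not the lab’s actual analysis pipeline: area B is constructed to echo area A fifteen milliseconds later, and the cross-correlation recovers that lag.

```python
import numpy as np

rng = np.random.default_rng(2)
n_bins = 5000   # spike trains in 1 ms bins
LAG = 15        # by construction, area B fires ~15 ms after area A

area_a = (rng.random(n_bins) < 0.05).astype(float)       # ~50 Hz Poisson-like train
area_b = np.roll(area_a, LAG)                            # delayed copy ...
area_b = np.clip(area_b + (rng.random(n_bins) < 0.02), 0, 1)  # ... plus noise spikes

def best_lag(a, b, max_lag=50):
    """Lag (in bins) of peak cross-correlation; a leads b when the result is > 0."""
    lags = list(range(-max_lag, max_lag + 1))
    xc = [np.dot(a[:n_bins - abs(l)], np.roll(b, -l)[:n_bins - abs(l)]) for l in lags]
    return lags[int(np.argmax(xc))]

print(best_lag(area_a, area_b))  # 15
```

In the real experiment the interesting question is how such lead-lag structure changes with behavior, i.e. whether the coordination between areas tightens when the eyes and the arm must move together.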
As part of the grant’s educational component, Pesaran is trying to show youngsters how far neuroscience has come, and encourage them to learn about it. He and his colleagues are working with middle school children in Brooklyn, and have presented demonstrations at the American Museum of Natural History about the field of brain science.
"We go into schools and teach children about what we know about the brain," he says. "We had a brain computer interface, where they had the chance to control the cursor on the screen with their minds. We placed an EEG sensor on their heads, which measures brain activity. When they concentrate, it changes the position of the ball, and moves it up or down."
School children typically are unaware of neuroscience as an emerging field “that involves medicine, biology, engineering, a whole range of disciplines that come together,” he says. “Increasing their sophistication and tools in this discipline early will be a hallmark of the next generation of brain scientists.”

Filed under eye-hand coordination eye movements parietal cortex prosthetics neural activity psychology neuroscience science


By Restoring Sense of Touch to Amputees, HAPTIX Seeks to Overcome Physical and Psychological Effects of Upper Limb Loss
To understand the meaning of “proprioception,” try a simple experiment. Close your eyes and lift your right arm above your head. Then, move it down so that it’s parallel to the ground. Make a fist and release it. Move it forward, and then swing it around behind you like you’re stretching. Finally, freeze in place, open your eyes, and look. Is your arm positioned where you thought it would be?
For most people, the answer will be, “Yes.” That’s because your brain and nervous system worked together to move your body according to your intent and processed the sensory feedback to know where your arm was in space despite not being able to visually track it.
For many upper-limb amputees using prosthetic devices, the answer would be, “No.” They wouldn’t have confidence that their device would be where they think it is because current prostheses lack provisions for providing complex tactile and proprioceptive feedback to the user. Without this feedback, even the most advanced prosthetic limbs will remain numb to the user and manipulation functions will be impaired.
DARPA’s new Hand Proprioception and Touch Interfaces (HAPTIX) program seeks to deliver those kinds of naturalistic sensations to amputees, and in the process, enable intuitive, dexterous control of advanced prosthetic devices that substitute for amputated limbs, provide the psychological benefit of improving prosthesis “embodiment,” and reduce phantom limb pain. The program builds on neural-interface technologies advanced during DARPA’s Revolutionizing Prosthetics and Reliable Neural-Interface Technology (RE-NET) programs that made major steps forward in providing a direct and powerful link between user intent and prosthesis control.
HAPTIX aims to achieve its goals by developing interface systems that measure and decode motor signals recorded in peripheral nerves and/or muscles. The program will adapt one of the advanced prosthetic limb systems developed under Revolutionizing Prosthetics to incorporate sensors that provide tactile and proprioceptive feedback to the user, delivered through patterned stimulation of sensory pathways in the peripheral nerve. One of the key challenges will be to identify stimulation patterning strategies that elicit naturalistic sensations of touch and movement. The ultimate goal is to create a fully-implantable device that is safe, reliable, effective, and approved for human use.
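One simple form such “patterned stimulation” could take is a rate code: press harder, pulse faster. The sketch below is hypothetical; the range limits, sensor scale, and linear mapping are assumptions for illustration, not HAPTIX specifications.

```python
MIN_RATE_HZ, MAX_RATE_HZ = 10.0, 100.0  # assumed usable stimulation band
MAX_FORCE_N = 10.0                      # assumed fingertip sensor full scale

def encode_force(force_n: float) -> float:
    """Linear rate code: harder touch -> faster pulses, clamped to the band."""
    frac = min(max(force_n / MAX_FORCE_N, 0.0), 1.0)
    return MIN_RATE_HZ + frac * (MAX_RATE_HZ - MIN_RATE_HZ)

print(encode_force(0.0))   # 10.0 Hz at no contact
print(encode_force(5.0))   # 55.0 Hz at half scale
print(encode_force(50.0))  # 100.0 Hz, clamped at full scale
```

The research challenge the program names is precisely that such simple codes may not feel natural; finding patterning strategies that elicit lifelike sensations of touch and movement is an open problem.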
“Peripheral nerves are information-rich and readily accessible targets for interfacing with the human nervous system. Research performed under DARPA’s RE-NET program and elsewhere showed that these nerves maintain motor and sensory fibers that previously innervated the amputated limb, and that these fibers remain functional for decades after limb loss,” said Doug Weber, the DARPA program manager. “HAPTIX will try to tap in to these biological communication pathways so that users can control and sense the prosthesis via the same neural signaling pathways used for intact hands and arms.”
In addition to the improved motor performance that restored touch and proprioception would convey to the user, mounting evidence suggests that sensory stimulation in amputees may provide important psychological benefits such as improving prosthesis “embodiment” and reducing the phantom limb pain that is suffered by approximately 80 percent of amputees. For this reason, DARPA seeks the inclusion of psychologists in the multi-disciplinary teams of scientists, engineers, and clinicians proposing to develop the electrodes, algorithms, and electronics technology components for the HAPTIX system. Teams will need to consider how the use of HAPTIX system may impact the user in several important domains including motor and sensory function, psychology, pain, and quality of life.
“We have the opportunity to not only significantly improve an amputee’s ability to control a prosthetic limb, but to make a profound, positive psychological impact,” Weber said. “Amputees view existing prostheses as if they were tools, like a wrench, used only to perform a specific job, so many people abandon their prostheses unless absolutely needed. We believe that HAPTIX will create a sensory experience so rich and vibrant that the user will want to wear his or her prosthesis full-time and accept it as a natural extension of the body. If we can achieve that, DARPA is even closer to fulfilling its commitment to help restore full and natural functionality to wounded service members.”
The program plan culminates with a 12-month, take-home trial of the complete HAPTIX prosthesis system. To aid performers in the completion of the steps necessary to achieve regulatory approvals for human trials, DARPA consulted with the U.S Food and Drug Administration to incorporate regulatory timelines into the program process.
“Once development of the HAPTIX system is complete, we want people to benefit immediately and be able to use their limb all day, every day, and in every aspect of their lives,” Weber said. “The experience needs to be comfortable and easy. Take-home trials are the first step in making that vision a reality.”
If it is successful, the HAPTIX program will create fully-implantable, modular, and reconfigurable neural-interface microsystems that communicate wirelessly with external modules, such as the prosthesis interface link. Because such technology would have broad application and could fuel future medical devices, HAPTIX also plans to fund teams to pursue the science and technology that would support next-generation HAPTIX capabilities.
Full details of the HAPTIX opportunity are available on the Federal Business Opportunities website at: http://go.usa.gov/kyjJ.

By Restoring Sense of Touch to Amputees, HAPTIX Seeks to Overcome Physical and Psychological Effects of Upper Limb Loss

To understand the meaning of “proprioception,” try a simple experiment. Close your eyes and lift your right arm above your head. Then move it down so that it’s parallel to the ground. Make a fist and release it. Move your arm forward, and then swing it around behind you as if you were stretching. Finally, freeze in place, open your eyes, and look. Is your arm positioned where you thought it would be?

For most people, the answer will be, “Yes.” That’s because your brain and nervous system worked together to move your body according to your intent and processed the sensory feedback to know where your arm was in space despite not being able to visually track it.

For many upper-limb amputees using prosthetic devices, the answer would be, “No.” They wouldn’t have confidence that their device would be where they think it is because current prostheses lack provisions for providing complex tactile and proprioceptive feedback to the user. Without this feedback, even the most advanced prosthetic limbs will remain numb to the user and manipulation functions will be impaired.

DARPA’s new Hand Proprioception and Touch Interfaces (HAPTIX) program seeks to deliver those kinds of naturalistic sensations to amputees, and in the process, enable intuitive, dexterous control of advanced prosthetic devices that substitute for amputated limbs, provide the psychological benefit of improving prosthesis “embodiment,” and reduce phantom limb pain. The program builds on neural-interface technologies advanced during DARPA’s Revolutionizing Prosthetics and Reliable Neural-Interface Technology (RE-NET) programs that made major steps forward in providing a direct and powerful link between user intent and prosthesis control.

HAPTIX aims to achieve its goals by developing interface systems that measure and decode motor signals recorded in peripheral nerves and/or muscles. The program will adapt one of the advanced prosthetic limb systems developed under Revolutionizing Prosthetics to incorporate sensors that provide tactile and proprioceptive feedback to the user, delivered through patterned stimulation of sensory pathways in the peripheral nerve. One of the key challenges will be to identify stimulation patterning strategies that elicit naturalistic sensations of touch and movement. The ultimate goal is to create a fully-implantable device that is safe, reliable, effective, and approved for human use.
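The decoding step described above can be sketched in miniature. The toy Python below rectifies and smooths a simulated muscle (EMG) signal into an amplitude envelope, then maps that envelope to a proportional grip command; the signal values, thresholds, and function names are hypothetical illustrations, not taken from any DARPA system.

```python
# Illustrative sketch only: a minimal proportional EMG decoder of the kind
# that HAPTIX-style interfaces build on. All constants are hypothetical.

def emg_envelope(samples, alpha=0.1):
    """Rectify the raw EMG samples and smooth them with a one-pole
    low-pass filter, yielding an amplitude envelope."""
    env, out = 0.0, []
    for s in samples:
        env = (1 - alpha) * env + alpha * abs(s)
        out.append(env)
    return out

def grip_command(envelope, rest_level=0.05, max_level=1.0):
    """Map the latest envelope value to a 0..1 grip-closure command,
    ignoring activity below a resting threshold."""
    level = envelope[-1]
    if level <= rest_level:
        return 0.0
    return min((level - rest_level) / (max_level - rest_level), 1.0)

# A burst of simulated muscle activity drives the grip partly closed;
# a quiet signal leaves it open.
raw = [0.0] * 20 + [0.8, -0.9, 0.85, -0.8] * 10
cmd = grip_command(emg_envelope(raw))
```

A real interface would add feature extraction across many channels and a trained decoder, but the shape of the pipeline — measure, condition, map to an actuator command — is the same.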

“Peripheral nerves are information-rich and readily accessible targets for interfacing with the human nervous system. Research performed under DARPA’s RE-NET program and elsewhere showed that these nerves maintain motor and sensory fibers that previously innervated the amputated limb, and that these fibers remain functional for decades after limb loss,” said Doug Weber, the DARPA program manager. “HAPTIX will try to tap into these biological communication pathways so that users can control and sense the prosthesis via the same neural signaling pathways used for intact hands and arms.”

In addition to the improved motor performance that restored touch and proprioception would convey to the user, mounting evidence suggests that sensory stimulation in amputees may provide important psychological benefits, such as improving prosthesis “embodiment” and reducing the phantom limb pain suffered by approximately 80 percent of amputees. For this reason, DARPA seeks the inclusion of psychologists in the multi-disciplinary teams of scientists, engineers, and clinicians proposing to develop the electrodes, algorithms, and electronics technology components for the HAPTIX system. Teams will need to consider how the use of the HAPTIX system may impact the user in several important domains, including motor and sensory function, psychology, pain, and quality of life.

“We have the opportunity to not only significantly improve an amputee’s ability to control a prosthetic limb, but to make a profound, positive psychological impact,” Weber said. “Amputees view existing prostheses as if they were tools, like a wrench, used only to perform a specific job, so many people abandon their prostheses unless absolutely needed. We believe that HAPTIX will create a sensory experience so rich and vibrant that the user will want to wear his or her prosthesis full-time and accept it as a natural extension of the body. If we can achieve that, DARPA is even closer to fulfilling its commitment to help restore full and natural functionality to wounded service members.”

The program plan culminates with a 12-month, take-home trial of the complete HAPTIX prosthesis system. To aid performers in completing the steps necessary to achieve regulatory approval for human trials, DARPA consulted with the U.S. Food and Drug Administration to incorporate regulatory timelines into the program process.

“Once development of the HAPTIX system is complete, we want people to benefit immediately and be able to use their limb all day, every day, and in every aspect of their lives,” Weber said. “The experience needs to be comfortable and easy. Take-home trials are the first step in making that vision a reality.”

If it is successful, the HAPTIX program will create fully-implantable, modular, and reconfigurable neural-interface microsystems that communicate wirelessly with external modules, such as the prosthesis interface link. Because such technology would have broad application and could fuel future medical devices, HAPTIX also plans to fund teams to pursue the science and technology that would support next-generation HAPTIX capabilities.

Full details of the HAPTIX opportunity are available on the Federal Business Opportunities website at: http://go.usa.gov/kyjJ.

Filed under proprioception prosthetics HAPTIX phantom limb pain amputation neuroscience science

219 notes

Silicon-based probe microstructure could underpin safer neural implants

Neural probe arrays are expected to significantly benefit the lives of amputees and people affected by spinal cord injuries or severe neuromotor diseases. By providing a direct route of communication between the brain and artificial limbs, these arrays record and stimulate neurons in the cerebral cortex.

(Image caption: The compact neural probe array consists of a three-dimensional probe array, a custom 100-channel neural recording chip and a flexible polyimide polymer cable. Credit: A*STAR Institute of Microelectronics)

The need for neural probe arrays that are compact, reliable and deliver high performance has prompted researchers to use microfabrication techniques to manufacture probe arrays. Now, a team led by Ming-Yuan Cheng from the A*STAR Institute of Microelectronics, Singapore, has developed a three-dimensional probe array for chronic and long-term implantation in the brain. This array is compact enough to freely float along with the brain when implanted on the cortex.

The neural probe array needs to be implanted in the subarachnoid space of the brain, a narrow region, 1–2.5 millimeters deep, that lies between the pia mater and the dura mater. “A high-profile array may touch the skull and damage the tissue when relative micromotions occur between the brain and the probes,” explains Cheng. To avoid this problem, the array should be as thin as possible.
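The depth constraint Cheng describes amounts to a simple design check: the stacked height of the implant must stay below the minimum reported subarachnoid depth, with some margin. The Python sketch below illustrates that check; all component thicknesses and the safety margin are invented for illustration and are not A*STAR’s actual figures.

```python
# Illustrative sketch only: checking a candidate probe-array stack-up
# against the subarachnoid depth constraint. Thicknesses are hypothetical.

SUBARACHNOID_DEPTH_MM = (1.0, 2.5)  # reported depth range in millimeters

def fits_subarachnoid(layer_thicknesses_mm, margin_mm=0.2):
    """Return True if the stacked implant profile, plus a safety margin,
    stays below the *minimum* reported subarachnoid depth, so the array
    cannot touch the skull even in the shallowest case."""
    profile = sum(layer_thicknesses_mm)
    return profile + margin_mm <= SUBARACHNOID_DEPTH_MM[0]

# Hypothetical stack: probe base plate, recording chip, cable exit.
ok = fits_subarachnoid([0.3, 0.3, 0.1])        # 0.7 mm profile -> fits
too_tall = fits_subarachnoid([0.8, 0.5, 0.2])  # 1.5 mm profile -> too high
```

Designing against the minimum of the anatomical range, rather than the average, is what forces the aggressive thinning the article describes.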

Read more

Filed under neural probe arrays neural implants prosthetics cerebral cortex neuroscience science

70 notes

CYBATHLON 2016

The Championship for Robot-Assisted Parathletes
Hallenstadion Zurich, 8 October 2016

The Cybathlon is a championship for racing pilots with disabilities (i.e. parathletes) who use advanced assistive devices, including robotic technologies. The competitions comprise different disciplines that employ the most modern powered knee prostheses, wearable arm prostheses, powered exoskeletons, powered wheelchairs, electrically stimulated muscles and novel brain-computer interfaces. The assistive devices can include commercially available products provided by companies, but also prototypes developed by research labs. There will be two medals for each competition: one for the pilot, who drives the device, and one for the provider of the device. The event is organized on behalf of the Swiss National Competence Center of Research in Robotics (NCCR Robotics).

The main objectives of the Cybathlon are:

  • to promote the development of novel assistive systems and reinforce the scientific exchange,
  • to improve the public awareness about the challenges and opportunities of assistive technologies, and
  • to enable pilots with disabilities to compete in races, making this a unique event.

Filed under cybathlon robotics prosthetics artificial limbs BCI exoskeleton technology neuroscience science

506 notes

The Next Big Thing You Missed: 3-D Printing Promises Better Bionic Limbs for the War-Wounded

David Sengeh grew up in Sierra Leone during the African country’s decade-long civil war. The horribly bloody conflict was defined not just by the enormous death toll, but by the way rebel armies systematically severed the limbs of their enemies, leaving thousands of men, women, and children with missing arms and legs. Though the war ended more than a decade ago, Sengeh says, many victims are still struggling through life with artificial limbs that are too uncomfortable to wear.

But at the famed MIT Media Lab, the 27-year-old doctoral student is now using 3-D printing and advanced math to create a new kind of artificial limb he believes can significantly improve the lives of amputees in Sierra Leone and across the rest of the world. Sengeh relies on data-backed digital models to fashion prosthetics that he says better match the contours of the human body. And because these prosthetics are fabricated by 3-D printers, he says, they become far easier to produce.

The key problem with today’s prosthetics, Sengeh says, is that they don’t fit. Many people who have lost limbs — whether they’re Sierra Leone civilians or U.S. war vets — don’t wear their prostheses because the sockets aren’t tailored to their bodies. The tools needed to make well-fitting artificial limbs today are neither affordable nor widespread. “It does not matter how powerful your prosthetic ankle is,” Sengeh said on Monday during a talk at TED, the global ideas conference being held this year in Vancouver, British Columbia. “If your prosthetic socket is uncomfortable, you will not use your leg.”

Read more

Filed under artificial limbs prosthetics 3-d printing tech science

357 notes

The Future of Brain Implants

What would you give for a retinal chip that let you see in the dark or for a next-generation cochlear implant that let you hear any conversation in a noisy restaurant, no matter how loud? Or for a memory chip, wired directly into your brain’s hippocampus, that gave you perfect recall of everything you read? Or for an implanted interface with the Internet that automatically translated a clearly articulated silent thought (“the French sun king”) into an online search that digested the relevant Wikipedia page and projected a summary directly into your brain?

Science fiction? Perhaps not for very much longer. Brain implants today are where laser eye surgery was several decades ago. They are not risk-free and make sense only for a narrowly defined set of patients—but they are a sign of things to come.

Unlike pacemakers, dental crowns or implantable insulin pumps, neuroprosthetics—devices that restore or supplement the mind’s capacities with electronics inserted directly into the nervous system—change how we perceive the world and move through it. For better or worse, these devices become part of who we are.

Neuroprosthetics aren’t new. They have been around commercially for three decades, in the form of the cochlear implants used in the ears (the outer reaches of the nervous system) of more than 300,000 hearing-impaired people around the world. Last year, the Food and Drug Administration approved the first retinal implant, made by the company Second Sight.

Both technologies exploit the same principle: An external device, either a microphone or a video camera, captures sounds or images and processes them, using the results to drive a set of electrodes that stimulate either the auditory or the optic nerve, approximating the naturally occurring output from the ear or the eye.
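That shared capture-process-stimulate pipeline can be illustrated with a toy example in the style of a cochlear implant’s channel filterbank: split an audio frame into frequency bands and take each band’s energy as the drive level for one electrode. The channel count, band edges, and sample rate below are hypothetical, chosen only to make the example concrete.

```python
# Illustrative sketch only: the capture -> process -> stimulate principle,
# shown as a toy cochlear-implant-style filterbank. Parameters are invented.

import math

def band_energies(frame, sample_rate, band_edges):
    """Split one audio frame into frequency bands (via a plain DFT)
    and return the energy in each band -- one value per electrode."""
    n = len(frame)
    # Magnitude of each DFT bin up to the Nyquist frequency.
    mags = []
    for k in range(n // 2):
        re = sum(frame[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(frame[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mags.append(math.hypot(re, im))
    energies = []
    for lo, hi in zip(band_edges[:-1], band_edges[1:]):
        k_lo = int(lo * n / sample_rate)
        k_hi = int(hi * n / sample_rate)
        energies.append(sum(m * m for m in mags[k_lo:k_hi]))
    return energies

# A pure 1 kHz tone should excite mainly the electrode whose band contains it.
rate, n = 8000, 256
tone = [math.sin(2 * math.pi * 1000 * t / rate) for t in range(n)]
levels = band_energies(tone, rate, [200, 700, 1500, 3500])  # three "electrodes"
```

A retinal implant follows the same scheme with a camera in place of the microphone and a spatial grid of electrodes in place of frequency channels.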

Read more

Filed under brain implants prosthetics technology neuroscience science
