Neuroscience

Articles and news from the latest research reports.

Posts tagged prosthetics

247 notes

Phantom limb pain relieved when amputated arm is put back to work

Max Ortiz Catalan has developed a new method for the treatment of phantom limb pain (PLP) after an amputation. The method is based on a unique combination of several technologies, and has been initially tested on a patient who has suffered from severe phantom limb pain for 48 years. A case study shows a drastic reduction of pain.

People who lose an arm or a leg often experience phantom sensations, as if the missing limb were still there. Seventy per cent of amputees experience pain in the amputated limb even though it no longer exists. Phantom limb pain can be a serious, chronic and deteriorating condition that considerably reduces the person's quality of life. The exact cause of phantom limb pain and other phantom sensations is still unknown.

Phantom limb pain is currently treated with several different methods. Examples include mirror therapy, different types of medication, acupuncture and hypnosis. In many cases, however, nothing helps. This was the case for the patient that Chalmers researcher Max Ortiz Catalan selected for a case study of the new treatment method he has envisaged as a potential solution.

The patient lost his arm 48 years ago, and had since that time suffered from phantom pain varying from moderate to unbearable. He was never entirely free of pain.

The patient's pain was drastically reduced after a period of treatment with the new method. He now has periods in which he is entirely free of pain, and he is no longer awakened at night by intense bursts of pain, as he was previously.
The new method uses muscle signals from the patient's arm stump to drive an augmented reality system. The electrical signals in the muscles are sensed by electrodes on the skin and translated into arm movements by complex algorithms. The patient can see himself on a screen with a superimposed virtual arm, which he controls with his own neural commands in real time.
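The decoding chain described here – surface EMG from skin electrodes, an estimate of muscle activation, and a mapping onto a virtual limb – can be sketched in a few lines. This is a generic myoelectric-control toy, not Ortiz Catalan's published algorithm, and the smoothing constant, gain and angle limit are all invented:

```python
def emg_to_angle(raw_emg, alpha=0.2, gain=300.0, max_angle=90.0):
    """Map a stream of raw EMG samples to a virtual elbow angle.

    1. Rectify each sample.
    2. Smooth with an exponential moving average (the 'envelope').
    3. Scale the envelope into a joint angle, clamped to [0, max_angle].
    Returns the angle trajectory, one value per input sample.
    """
    envelope = 0.0
    angles = []
    for sample in raw_emg:
        envelope = (1.0 - alpha) * envelope + alpha * abs(sample)
        angles.append(min(max_angle, gain * envelope))
    return angles
```

Fed a sustained contraction, the virtual joint flexes smoothly toward its limit; at rest the envelope decays back toward zero, which is the kind of behaviour the patient sees mirrored by the on-screen arm.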

“There are several features of this system which, combined, might be the cause of pain relief,” says Max Ortiz Catalan. “The motor areas in the brain needed for movement of the amputated arm are reactivated, and the patient obtains visual feedback that tricks the brain into believing there is an arm executing such motor commands. He experiences himself as a whole, with the amputated arm back in place.”

Modern therapies that use conventional mirrors or virtual reality are based on visual feedback via the opposite arm or leg. For this reason, people who have lost both arms or both legs cannot be helped using these methods.

“Our method differs from previous treatments because the control signals are retrieved from the arm stump, and thus the affected arm is in charge,” says Max Ortiz Catalan. “The promotion of motor execution and the vivid sensation of completion provided by augmented reality may be the reason for the patient's improvement, where mirror therapy and medication had not helped previously.”

A clinical study of the new treatment will now be conducted. The treatment has been developed in a collaboration between Chalmers University of Technology, Sahlgrenska University Hospital, the University of Gothenburg and Integrum. Three Swedish hospitals and other European clinics will cooperate on the study, which will target patients with conditions resembling the one in the case study – that is, people who suffer from phantom pain and have not responded to other currently available treatments.

The research group has also developed a system that can be used at home, so that patients will be able to apply the therapy on their own once it has been approved. The treatment could also be extended to other patient groups who need to rehabilitate their mobility, such as stroke victims or some patients with spinal cord injuries.

Filed under amputation phantom limb phantom limb pain prosthetics virtual reality technology neuroscience science

3,034 notes

Amputee Feels in Real-Time with Bionic Hand

Nine years after an accident caused the loss of his left hand, Dennis Aabo Sørensen from Denmark became the first amputee in the world to feel – in real-time – with a sensory-enhanced prosthetic hand that was surgically wired to nerves in his upper arm. Silvestro Micera and his team at EPFL Center for Neuroprosthetics and SSSA (Italy) developed the revolutionary sensory feedback that allowed Sørensen to feel again while handling objects. A prototype of this bionic technology was tested in February 2013 during a clinical trial in Rome under the supervision of Paolo Maria Rossini at Gemelli Hospital (Italy). The study is published in the February 5, 2014 edition of Science Translational Medicine, and represents a collaboration called Lifehand 2 between several European universities and hospitals.

“The sensory feedback was incredible,” reports the 36-year-old amputee from Denmark. “I could feel things that I hadn’t been able to feel in over nine years.” In a laboratory setting wearing a blindfold and earplugs, Sørensen was able to detect how strongly he was grasping, as well as the shape and consistency of different objects he picked up with his prosthetic. “When I held an object, I could feel if it was soft or hard, round or square.”

From Electrical Signal to Nerve Impulse
Micera and his team enhanced the artificial hand with sensors that detect information about touch. This was done by measuring the tension in artificial tendons that control finger movement and turning this measurement into an electrical current. But this electrical signal is too coarse to be understood by the nervous system. Using computer algorithms, the scientists transformed the electrical signal into an impulse that sensory nerves can interpret. The sense of touch was achieved by sending the digitally refined signal through wires into four electrodes that were surgically implanted into what remains of Sørensen’s upper arm nerves.
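As a toy version of that measurement-to-impulse step, one common laboratory encoding maps the sensor reading onto a stimulation pulse frequency within a fixed band. The thresholds and frequency range below are hypothetical; the study's actual, patient-calibrated transformation is not described in this article:

```python
def tension_to_stimulation(tension, t_min=0.05, t_max=1.0,
                           f_min=10.0, f_max=100.0):
    """Convert a normalized tendon-tension reading into a stimulation
    pulse frequency (Hz).

    Readings below t_min produce no stimulation (a dead zone for
    sensor noise); readings are clamped at t_max; in between, tension
    maps linearly onto the [f_min, f_max] frequency band.
    """
    if tension < t_min:
        return 0.0
    tension = min(tension, t_max)
    frac = (tension - t_min) / (t_max - t_min)
    return f_min + frac * (f_max - f_min)
```

Under this scheme a light grasp produces a slow pulse train and a firm grasp a fast one, giving the nervous system a graded pressure signal.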

“This is the first time in neuroprosthetics that sensory feedback has been restored and used by an amputee in real-time to control an artificial limb,” says Micera.

“We were worried about reduced sensitivity in Dennis’ nerves since they hadn’t been used in over nine years,” says Stanisa Raspopovic, first author and scientist at EPFL and SSSA. These concerns faded away as the scientists successfully reactivated Sørensen’s sense of touch.

Connecting Electrodes to Nerves

On January 26, 2013, Sørensen underwent surgery in Rome at Gemelli Hospital. A specialized group of surgeons and neurologists, led by Paolo Maria Rossini, implanted so-called transneural electrodes into the ulnar and median nerves of Sørensen’s left arm. After 19 days of preliminary tests, Micera and his team connected their prosthetic to the electrodes – and to Sørensen – every day for an entire week.

The ultra-thin, ultra-precise electrodes, developed by Thomas Stieglitz’s research group at Freiburg University (Germany), made it possible to relay extremely weak electrical signals directly into the nervous system. A tremendous amount of preliminary research was done to ensure that the electrodes would continue to work even after the formation of post-surgery scar tissue. It is also the first time that such electrodes have been transversally implanted into the peripheral nervous system of an amputee.

The First Sensory-Enhanced Artificial Limb
The clinical study provides the first step towards a bionic hand, although a sensory-enhanced prosthetic is years away from being commercially available and the bionic hand of science fiction movies is even further away.

The next step involves miniaturizing the sensory feedback electronics for a portable prosthetic. In addition, the scientists will fine-tune the sensory technology for better touch resolution and increased awareness about the angular movement of fingers.

The electrodes were removed from Sørensen’s arm after one month due to safety restrictions imposed on clinical trials, although the scientists are optimistic that they could remain implanted and functional without damage to the nervous system for many years.

Psychological Strength an Asset
Sørensen’s psychological strength was an asset for the clinical study. He says, “I was more than happy to volunteer for the clinical trial, not only for myself, but to help other amputees as well.” Now he faces the challenge of having experienced touch again for only a short period of time. 

Sørensen lost his left hand while handling fireworks during a family holiday. He was rushed to the hospital where his hand was immediately amputated. Since then, he has been wearing a commercial prosthetic that detects muscle movement in his stump, allowing him to open and close his hand, and hold onto objects.

“It works like a brake on a motorbike,” explains Sørensen about the conventional prosthetic he usually wears. “When you squeeze the brake, the hand closes. When you relax, the hand opens.” Without sensory information being fed back into the nervous system, though, Sørensen cannot feel what he’s trying to grasp and must constantly watch his prosthetic to avoid crushing the object.
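The brake analogy is essentially threshold-based on/off myoelectric control. A minimal sketch, with invented thresholds and a small hysteresis band so that noise near the switching point does not make the hand chatter:

```python
def grip_command(emg_level, current_state, close_at=0.35, open_at=0.25):
    """Brake-style grip control: squeezing (high EMG) closes the hand,
    relaxing (low EMG) opens it.

    The gap between close_at and open_at adds hysteresis: once closed,
    the hand only reopens when the signal drops clearly below the
    closing threshold.  All values are illustrative.
    """
    if current_state == "open" and emg_level >= close_at:
        return "close"
    if current_state == "close" and emg_level <= open_at:
        return "open"
    return current_state
```

Without sensory feedback, nothing in this loop tells the user how hard the closed hand is squeezing – which is exactly the gap the Lifehand 2 work aims to fill.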

Sørensen recounts what the doctor told him just after the amputation: “There are two ways you can view this. You can sit in the corner and feel sorry for yourself. Or, you can get up and feel grateful for what you have. I believe you’ll adopt the second view.”

“He was right,” says Sørensen.

Filed under bionic hand artificial limb transneural electrodes prosthetics sensory feedback robotics neuroscience science

189 notes

Researchers discover how parts of the brain work together, or alone

Our brains have billions of neurons grouped into different regions. These regions often work alone but sometimes must join forces. How do regions communicate selectively?

Stanford researchers may have solved a riddle about the inner workings of the brain, which consists of billions of neurons, organized into many different regions, with each region primarily responsible for different tasks.

The various regions of the brain often work independently, relying on the neurons inside that region to do their work. At other times, however, two regions must cooperate to accomplish the task at hand. The riddle is this: what mechanism allows two brain regions to communicate when they need to cooperate yet avoid interfering with one another when they must work alone?

In a paper published today in Nature Neuroscience, a team led by Stanford electrical engineering professor Krishna Shenoy reveals a previously unknown process that helps two brain regions cooperate when joint action is required to perform a task.

“This is among the first mechanisms reported in the literature for letting brain areas process information continuously but only communicate what they need to,” said Matthew T. Kaufman, who was a postdoctoral scholar in the Shenoy lab when he co-authored the paper.
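The kind of mechanism described here – activity patterns that a downstream area simply cannot "see" – can be illustrated with plain linear algebra. If a downstream region reads out upstream activity x through fixed weights W as y = Wx, then any change in x lying in the null space of W is invisible downstream, while changes along the readout direction are communicated. The numbers below are a toy example, not data from the study:

```python
def readout(W, x):
    """Downstream signal y = W x: each output channel is a weighted
    sum of upstream neuron activities."""
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

def add(x, step):
    """Elementwise sum of two activity vectors."""
    return [a + b for a, b in zip(x, step)]

W = [[1.0, 1.0, 0.0]]         # one downstream channel reading 3 neurons
x = [0.5, 0.5, 0.0]           # baseline upstream activity
null_step = [1.0, -1.0, 2.0]  # W maps this to 0: "private" computation
potent_step = [1.0, 1.0, 0.0] # changes the readout: communication
```

Moving along `null_step` changes the upstream population state without changing what arrives downstream, whereas `potent_step` is broadcast – one candidate answer to how regions can "process information continuously but only communicate what they need to."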

(Source: engineering.stanford.edu)

Filed under cortical activity motor cortex arm movements neurons prosthetics neuroscience science

147 notes

Researchers reveal more about how our brains control our arms

Ready, set, go.

Sometimes that’s how our brains work. When we anticipate a physical act, such as reaching for the keys we noticed on the table, the neurons that control the task adopt a state of readiness, like sprinters bent into a crouch.

Other times, however, our neurons must simply react, such as if someone were to toss us the keys without gesturing first, to prepare us to catch.

How do the neurons in the brain control planned versus unplanned arm movements?

Krishna Shenoy, a Stanford professor of electrical engineering, neurobiology (by courtesy) and bioengineering (affiliate), wanted to answer that question as part of his group’s ongoing efforts to develop and improve brain-controlled prosthetic devices.

In a paper published today in the journal Neuron, Shenoy and first author Katherine Cora Ames, a doctoral student in the Neurosciences Graduate Program, present a mathematical analysis of the brain activity of monkeys as they make anticipated and unanticipated reaching motions.

Monitoring the neurons

The experimental data came from recording the electrical activity of neurons in the brain that control motor and premotor functions. The idea was to observe and understand the activity levels of these neurons during experiments in which the monkeys made planned or reactive arm movements. What the researchers found is that when the monkeys knew what arm movement they were supposed to make and were simply waiting for the cue to act, electrical readings showed that the neurons went into what scientists call the prepare-and-hold state – the brain’s equivalent of ready, set, waiting for the cue to go.

But when the monkeys made unplanned or unexpected movements, the neurons did not go through the expected prepare-and-hold state. “This was a surprise,” Ames said.

Before the experiment, the researchers had believed that a prepare-and-hold state had to precede movement. In short, they thought the neurons had to go into a “ready, set” crouch before acting on the “go” command. But they discovered otherwise in three variations of an experiment involving similar arm movements.

Experimental design

In all three cases, the monkeys were trained to touch a target that appeared on a display screen.

During each motion, the researchers measured the electrical activity of the neurons in control of arm movements.

In one set of experiments, the monkeys were shown the target but were trained not to touch it until they got the “go” signal. This is called a delayed reach experiment. It served as the planned action.

In a second set of experiments the monkeys were trained to touch the target as soon as it appeared. This served as the unplanned action.

In a third variant, the position of the target was changed. It briefly appeared in one location on the screen. The target then reappeared in a different location. This required the monkeys to revise their movement plan.

Monkey see, then monkey do

Ames said that, in all three instances, the first information to reach the neurons was awareness of the target.

“Perception always occurred first,” Ames said.

Then, about 50 milliseconds later, some differences appeared in the data. When the monkeys had to wait for the go command, the brain recordings showed that the neurons went into a discernible prepare-and-hold state. But in the other two cases, the neurons did not enter the prepare-and-hold state.

Instead, roughly 50 milliseconds after the electrical readings showed evidence of perception, a change in neuronal activity signaled the command to touch the target; it came with no apparent further preparation between perception and action. “Ready, set” was unnecessary. In these instances, the neurons just said, “Go!”
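The distinction the experiments draw – sustained preparatory firing before the go cue versus none – can be caricatured with a simple criterion on a delay-period firing-rate trace. The 2 Hz elevation and 80% duration cut-offs are invented for illustration; the paper's actual analysis was a far more careful population-level one:

```python
def shows_prepare_and_hold(delay_rates, baseline_rate,
                           elevation=2.0, fraction=0.8):
    """Return True if the firing rate stays elevated above baseline
    for most of the delay period -- a cartoon of the sustained
    'ready, set' state seen in the delayed-reach condition.

    delay_rates:   firing rates (Hz) sampled across the delay period
    baseline_rate: resting firing rate (Hz) before the target appears
    """
    elevated = sum(1 for r in delay_rates if r > baseline_rate + elevation)
    return elevated >= fraction * len(delay_rates)
```

A trace held at 15 Hz over a 10 Hz baseline would count as prepare-and-hold; a trace hovering at baseline, as in the immediate-reach trials, would not.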

Applications

“This study changes our view of how movement is controlled,” Ames said. “First you get the information about where to move. Then comes the decision to move. There is no specific prepare-and-hold stage unless you are waiting for the signal to move.”

These nuanced understandings are important to Shenoy. His lab develops and improves electronic systems that can convert neural activity into electronic signals in order to control a prosthetic arm or move the cursor on a computer screen.
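In its simplest textbook form, the conversion step described here – neural activity in, control signal out – is a linear decoder over binned spike counts. The weights below are arbitrary stand-ins; real BCI decoders are calibrated per session and typically smoothed over time with a filter such as a Kalman filter:

```python
def decode_velocity(spike_counts, weights_x, weights_y):
    """Map one time bin of per-channel spike counts to a 2-D cursor
    velocity via fixed linear weights."""
    vx = sum(w * c for w, c in zip(weights_x, spike_counts))
    vy = sum(w * c for w, c in zip(weights_y, spike_counts))
    return vx, vy

# Two recording channels with entirely made-up tuning: the first
# drives rightward cursor motion, the second upward motion.
wx, wy = [2.0, 0.0], [0.0, 3.0]
```

For example, `decode_velocity([1, 0], wx, wy)` moves the cursor right, while `decode_velocity([0, 2], wx, wy)` moves it up – the "think-and-click" principle in miniature.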

One example of such efforts is the BrainGate clinical trial here at Stanford, now being conducted under U.S. Food & Drug Administration supervision, to test the safety of brain-controlled, computer cursor systems – “think-and-click” communication for people who can’t move.

“In addition to advancing basic brain science, these new findings will lead to better brain-controlled prosthetic arms and communication systems for people with paralysis,” Shenoy said.

Filed under arm movement prosthetics BCI neural activity robotics neurons neuroscience science

1,231 notes

The Cyborg Era Has Started

Medical implants, complex interfaces between brain and machine, or remotely controlled insects: recent developments combining machines and organisms have great potential, but also give rise to major ethical concerns. In their review entitled “Chemie der Cyborgs – zur Verknüpfung technischer Systeme mit Lebewesen” (The Chemistry of Cyborgs – Interfacing Technical Devices with Organisms), KIT scientists discuss the state of the art of the research, its opportunities, and its risks. The review has now been published by the renowned journal “Angewandte Chemie Int. Ed.”

They are known from science fiction novels and films – technically modified organisms with extraordinary abilities, so-called cyborgs. The name derives from the term “cybernetic organism”. In fact, cyborgs that combine technical systems with living organisms are already a reality. The KIT researchers Professor Christof M. Niemeyer and Dr. Stefan Giselbrecht of the Institute for Biological Interfaces 1 (IBG 1) and Dr. Bastian E. Rapp of the Institute of Microstructure Technology (IMT) point out that this applies especially to medical implants.

In recent years, major progress has been achieved with medical implants: smart materials that automatically react to changing conditions, computer-supported design and fabrication based on magnetic resonance tomography datasets, and surface modifications for improved tissue integration. For successful tissue integration and the prevention of inflammatory reactions, special surface coatings have been developed, among others at KIT under the multidisciplinary Helmholtz program “BioInterfaces”.

Progress in microelectronics and semiconductor technology has been the basis of electronic implants that control, restore or improve functions of the human body, such as cardiac pacemakers, retina implants, hearing implants, or implants for deep brain stimulation in pain or Parkinson's therapies. Currently, bioelectronic developments are being combined with robotic systems to design highly complex neuroprostheses. Scientists are working on brain-machine interfaces (BMIs) that make direct physical contact with the brain. BMIs are used, among other things, to control prostheses and complex movements, such as gripping. Moreover, they are important tools in the neurosciences, as they provide insight into the functioning of the brain. Apart from electric signals, substances released by implanted micro- and nanofluidic systems in a spatially or temporally controlled manner can be used for communication between technical devices and organisms.

BMIs are often thought of as data suppliers. However, they can also be used to feed signals into the brain, which is a highly controversial issue from an ethical point of view. “Implanted BMIs that feed signals into nerves, muscles or directly into the brain are already used on a routine basis, e.g. in cardiac pacemakers or implants for deep brain stimulation,” Professor Christof M. Niemeyer, KIT, explains. “But these signals are neither intended nor suited to control the entire organism – the brains of most living organisms are far too complex.”

The brains of lower organisms, such as insects, are less complex: as soon as a signal is coupled in, a certain movement program, such as running or flying, is started. So-called biobots – large insects with implanted electronic and microfluidic control units – are being used in a new generation of tools, such as small flying objects for monitoring and rescue missions. In addition, they serve as model systems in the neurosciences for understanding basic relationships.

Electrically active medical implants that are used for longer terms depend on a reliable power supply. Presently, scientists are working on methods to harvest the thermal, kinetic, electric or chemical energy of the patient’s own body.

In their review, the KIT researchers conclude that developments combining technical devices with organisms have fascinating potential. In the medical sector in particular, they may considerably improve the quality of life of many people. However, ethical and social aspects always have to be taken into account.

Filed under cybernetic organism medical implants brain-machine interface prosthetics deep brain stimulation medicine neuroscience science

231 notes

Mind-controlled prostheses offer hope for disabled
The first kick of the 2014 FIFA World Cup may be delivered in Sao Paulo next June by a Brazilian who is paralyzed from the waist down. If all goes according to plan, the teenager will walk onto the field, cock back a foot and swing at the soccer ball, using a mechanical exoskeleton controlled by the teen’s brain.
Motorized metal braces tested on monkeys will support and bend the kicker’s legs. The braces will be stabilized by gyroscopes and powered by a battery carried by the kicker in a backpack. German-made sensors will relay a feeling of pressure when each foot touches the ground. And months of training on a virtual-reality simulator will have prepared the teenager — selected from a pool of 10 candidates — to do all this using a device that translates thoughts into actions.
“We want to galvanize people’s imaginations,” says Miguel Nicolelis, the Brazilian neuroscientist at Duke University who is leading the Walk Again Project’s efforts to create the robotic suit. “With enough political will and investment, we could make wheelchairs obsolete.”
Mind-controlled leg armor may sound more like the movie “Iron Man” than modern medicine. But after decades of testing on rats and monkeys, neuroprosthetics are finally beginning to show promise for people. Devices plugged directly into the brain seem capable of restoring some self-reliance to stroke victims, car crash survivors, injured soldiers and others hampered by incapacitated or missing limbs.

Filed under prosthetics mind control walk again project robotics neuroscience science

138 notes

Neural prosthesis restores behavior after brain injury

Scientists from Case Western Reserve University and University of Kansas Medical Center have restored behavior—in this case, the ability to reach through a narrow opening and grasp food—using a neural prosthesis in a rat model of brain injury.

Ultimately, the team hopes to develop a device that rapidly and substantially improves function after brain injury in humans. No commercial treatment of this kind exists for the 1.5 million Americans, including soldiers in Afghanistan and Iraq, who suffer traumatic brain injuries (TBI) each year, or for the nearly 800,000 Americans annually who suffer weakness or paralysis from stroke.

The prosthesis, called a brain-machine-brain interface, is a closed-loop microelectronic system. It records signals from one part of the brain, processes them in real time, and then bridges the injury by stimulating a second part of the brain that had lost connectivity.

Their work is published online this week in the journal Proceedings of the National Academy of Sciences.

“If you use the device to couple activity from one part of the brain to another, is it possible to induce recovery from TBI? That’s the core of this investigation,” said Pedram Mohseni, professor of electrical engineering and computer science at Case Western Reserve, who built the brain prosthesis.

“We found that, yes, it is possible to use a closed-loop neural prosthesis to facilitate repair of a brain injury,” he said.

The researchers tested the prosthesis in a rat model of brain injury in the laboratory of Randolph J. Nudo, professor of molecular and integrative physiology at the University of Kansas. Nudo mapped the rat’s brain and developed the model in which anterior and posterior parts of the brain that control the rat’s forelimbs are disconnected.

Atop each animal’s head, the brain-machine-brain interface is a microchip on a circuit board smaller than a quarter connected to microelectrodes implanted in the two brain regions.

The device amplifies signals, which are called neural action potentials and produced by the neurons in the anterior of the brain. An algorithm separates these signals, recorded as brain spike activity, from noise and other artifacts. With each spike detected, the microchip sends a pulse of electric current to stimulate neurons in the posterior part of the brain, artificially connecting the two brain regions.
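The closed-loop step described above — detect a spike in the anterior recording, fire a stimulation pulse at the posterior site — can be sketched in a few lines of Python. This is only an illustration of the idea: the threshold detector, synthetic signal and pulse amplitude below are invented, and the actual device implements a more sophisticated spike-discrimination algorithm on a custom microchip.

```python
import numpy as np

def detect_spikes(signal, threshold):
    """Return sample indices where the recorded signal crosses the
    detection threshold (rising edges only), a basic way to separate
    action potentials from background noise."""
    above = signal > threshold
    # rising edge: a below-threshold sample followed by an above-threshold one
    return np.flatnonzero(~above[:-1] & above[1:]) + 1

def closed_loop_step(recorded, threshold, pulse_amplitude_uA):
    """For every detected spike in the anterior recording, schedule one
    stimulation pulse for the posterior site (index, amplitude)."""
    spikes = detect_spikes(recorded, threshold)
    return [(int(i), pulse_amplitude_uA) for i in spikes]

# Synthetic recording: low-amplitude background plus three clear "spikes"
trace = 0.5 * np.sin(np.linspace(0, 50, 1000))
trace[[120, 480, 900]] += 8.0
pulses = closed_loop_step(trace, threshold=4.0, pulse_amplitude_uA=30.0)
print(len(pulses))  # one stimulation pulse per detected spike
```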

Two weeks after the prosthesis had been implanted and run continuously, the rat models using the full closed-loop system had recovered nearly all function lost due to injury, successfully retrieving a food pellet close to 70 percent of the time, or as well as normal, uninjured rats. Rat models that received random stimuli from the device retrieved less than half the pellets and those that received no stimuli retrieved about a quarter of them.

“A question still to be answered is must the implant be left in place for life?” Mohseni said. “Or can it be removed after two months or six months, if and when new connections have been formed in the brain?”

Brain studies have shown that, during periods of growth, neurons that regularly communicate with each other develop and solidify connections.

Mohseni and Nudo said they need more systematic studies to determine what happens in the brain that leads to restoration of function. They also want to determine if there is an optimal time window after injury in which they must implant the device in order to restore function.

(Source: blog.case.edu)

Filed under TBI brain injury prosthetics BMI brain damage neuroscience science

218 notes

Robotic advances promise artificial legs that emulate healthy limbs
Recent advances in robotics technology make it possible to create prosthetics that can duplicate the natural movement of human legs. This capability promises to dramatically improve the mobility of lower-limb amputees, allowing them to negotiate stairs and slopes and uneven ground, significantly reducing their risk of falling as well as reducing stress on the rest of their bodies.
That is the view that Michael Goldfarb, the H. Fort Flowers Professor of Mechanical Engineering, and his colleagues at Vanderbilt University’s Center for Intelligent Mechatronics express in a perspective article in the Nov. 6 issue of the journal Science Translational Medicine.
For the last decade, Goldfarb’s team has been doing pioneering research in lower-limb prosthetics. It developed the first robotic prosthesis with both powered knee and ankle joints. And the design became the first artificial leg controlled by thought when researchers at the Rehabilitation Institute of Chicago created a neural interface for it.
In the article, Goldfarb and graduate students Brian Lawson and Amanda Shultz describe the technological advances that have made robotic prostheses viable. These include lithium-ion batteries that can store more electricity, powerful brushless electric motors with rare-earth magnets, miniaturized sensors built into semiconductor chips, particularly accelerometers and gyroscopes, and low-power computer chips.
The size and weight of these components are small enough that they can be combined into a package comparable to a biological leg, one that duplicates all of its basic functions. The electric motors play the role of muscles. The batteries store enough power for the robotic legs to operate for a full day on a single charge. The sensors serve the function of the nerves of the peripheral nervous system, providing vital information such as the angle between the thigh and lower leg and the force being exerted on the bottom of the foot. The microprocessor provides the coordination function normally provided by the central nervous system. And, in the most advanced systems, a neural interface enhances integration with the brain.
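The microprocessor's coordination role can be illustrated with a toy control loop: read the sensors, infer the current gait phase from foot load, and drive the knee toward a phase-appropriate angle. The sensor fields, thresholds and gain below are hypothetical, and real prosthesis controllers use far more elaborate finite-state and impedance-control schemes.

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    knee_angle_deg: float   # angle between thigh and lower leg
    foot_load_N: float      # force on the bottom of the foot

def gait_phase(frame, load_threshold_N=50.0):
    """Classify the gait phase from the foot-load sensor:
    'stance' while the foot carries weight, 'swing' otherwise."""
    return "stance" if frame.foot_load_N > load_threshold_N else "swing"

def knee_torque_Nm(frame, targets_deg, gain=0.8):
    """Proportional controller: drive the knee toward the target
    angle for the current phase, much as a muscle reflex would."""
    target = targets_deg[gait_phase(frame)]
    return gain * (target - frame.knee_angle_deg)

targets = {"stance": 5.0, "swing": 60.0}  # illustrative flexion targets
frame = SensorFrame(knee_angle_deg=20.0, foot_load_N=300.0)
print(gait_phase(frame), knee_torque_Nm(frame, targets))  # extension torque in stance
```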
Unlike passive artificial legs, robotic legs are capable of moving independently and out of sync with their user’s movements. So the development of a system that integrates the movement of the prosthesis with the movement of the user is “substantially more important with a robotic leg,” according to the authors.
Not only must this control system coordinate the actions of the prosthesis within an activity, such as walking, but it must also recognize a user’s intent to change from one activity to another, such as moving from walking to stair climbing.
Identifying the user’s intent requires some connection with the central nervous system. Currently, there are several different approaches to establishing this connection that vary greatly in invasiveness. The least invasive method uses physical sensors that divine the user’s intent from his or her body language. Another method – the electromyography interface – uses electrodes implanted into the user’s leg muscles. The most invasive techniques involve implanting electrodes directly into a patient’s peripheral nerves or directly into his or her brain. The jury is still out on which of these approaches will prove to be best. “Approaches that entail a greater degree of invasiveness must obviously justify the invasiveness with substantial functional advantage,” the article states.
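The least invasive approach, inferring intent from body-worn sensors, amounts to a pattern-classification problem. A minimal sketch, with invented stride features and activity templates (nothing here comes from the article), might compare each stride against stored templates and pick the closest:

```python
import math

# Hypothetical per-stride features: (cadence in steps/min, mean thigh pitch in deg)
TEMPLATES = {
    "walking":        (110.0, 20.0),
    "stair_climbing": (80.0, 45.0),
}

def classify_intent(features):
    """Nearest-template classifier: return the activity whose stored
    feature template is closest to the latest stride's features."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return min(TEMPLATES, key=lambda k: dist(features, TEMPLATES[k]))

print(classify_intent((85.0, 42.0)))  # close to the stair-climbing template
```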
There are a number of potential advantages of bionic legs, the authors point out.
Studies have shown that users equipped with lower-limb prostheses with powered knee and ankle joints naturally walk faster with decreased hip effort while expending less energy than when they are using passive prostheses.
In addition, amputees using conventional artificial legs experience falls that lead to hospitalization at a higher rate than the institutionalized elderly. The rate is actually highest among younger amputees, presumably because they are less likely to limit their activities and terrain. There are several reasons why a robotic prosthesis should decrease the rate of falls: Because it moves like a natural leg, users don’t have to compensate for deficiencies in its movement as they do with passive legs. Both walking and standing, it can compensate better for uneven ground. And active responses that help users recover from stumbles can be programmed into the robotic leg.
Before individuals in the U.S. can begin realizing these benefits, however, the new devices must be approved by the U.S. Food and Drug Administration (FDA).
Single-joint devices are currently considered Class I medical devices, so they are subject to the least amount of regulatory control. Currently, transfemoral prostheses are generally constructed by combining two single-joint prostheses. As a result, they have also been considered Class I devices.
In robotic legs the knee and ankle joints are electronically linked. According to the FDA that makes them multi-joint devices, which are considered Class II medical devices. This means that they must meet a number of additional regulatory requirements, including the development of performance standards, post-market surveillance, establishing patient registries and special labeling requirements.
Another translational issue that must be resolved before robotic prostheses can become viable products is the need to provide additional training for the clinicians who prescribe prostheses. Because the new devices are substantially more complex than standard prostheses, the clinicians will need additional training in robotics, the authors point out.
In addition to the robotic leg, Goldfarb’s Center for Intelligent Mechatronics has developed an advanced exoskeleton that allows paraplegics to stand up and walk, work that led Popular Mechanics magazine to name him one of the 10 innovators who changed the world in 2013, as well as a robotic hand with dexterity approaching that of the human hand.


Filed under robotics robotic leg artificial limbs prosthetics CNS technology neuroscience science

68 notes

Monkeys Use Minds to Move Two Virtual Arms

In a study led by Duke researchers, monkeys have learned to control the movement of both arms on an avatar using just their brain activity.

The findings, published Nov. 6, 2013, in the journal Science Translational Medicine, advance efforts to develop bilateral movement in brain-controlled prosthetic devices for severely paralyzed patients.

To enable the monkeys to control two virtual arms, researchers recorded nearly 500 neurons from multiple areas in both cerebral hemispheres of the animals’ brains, the largest number of neurons recorded and reported to date.

Millions of people worldwide suffer from sensory and motor deficits caused by spinal cord injuries. Researchers are working to develop tools to help restore their mobility and sense of touch by connecting their brains with assistive devices. The brain-machine interface approach, pioneered at the Duke University Center for Neuroengineering in the early 2000s, holds promise for reaching this goal. However, until now brain-machine interfaces could only control a single prosthetic limb.

“Bimanual movements in our daily activities — from typing on a keyboard to opening a can — are critically important,” said senior author Miguel Nicolelis, M.D., Ph.D., professor of neurobiology at Duke University School of Medicine. “Future brain-machine interfaces aimed at restoring mobility in humans will have to incorporate multiple limbs to greatly benefit severely paralyzed patients.”

Nicolelis and his colleagues studied large-scale cortical recordings to see if they could provide sufficient signals to brain-machine interfaces to accurately control bimanual movements.

The monkeys were trained in a virtual environment within which they viewed realistic avatar arms on a screen and were encouraged to place their virtual hands on specific targets in a bimanual motor task. The monkeys first learned to control the avatar arms using a pair of joysticks, but were able to learn to use just their brain activity to move both avatar arms without moving their own arms.

As the animals’ performance in controlling both virtual arms improved over time, the researchers observed widespread plasticity in cortical areas of their brains. These results suggest that the monkeys’ brains may incorporate the avatar arms into their internal image of their bodies, a finding recently reported by the same researchers in the journal Proceedings of the National Academy of Sciences.

The researchers also found that cortical regions showed specific patterns of neuronal electrical activity during bimanual movements that differed from the neuronal activity produced for moving each arm separately.

The study suggests that very large neuronal ensembles — not single neurons — define the underlying physiological unit of normal motor functions. Small neuronal samples of the cortex may be insufficient to control complex motor behaviors using a brain-machine interface.

“When we looked at the properties of individual neurons, or of whole populations of cortical cells, we noticed that simply summing up the neuronal activity correlated to movements of the right and left arms did not allow us to predict what the same individual neurons or neuronal populations would do when both arms were engaged together in a bimanual task,” Nicolelis said. “This finding points to an emergent brain property — a non-linear summation — for when both hands are engaged at once.”
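The non-linear summation Nicolelis describes can be illustrated with invented numbers for a single neuron (these rates are purely illustrative, not data from the study): the firing rate observed during a bimanual movement need not equal the sum of the rates observed when each arm moves alone.

```python
# Hypothetical firing rates (spikes/s) for one recorded cortical neuron
rate_left_only = 12.0    # left arm moving alone
rate_right_only = 9.0    # right arm moving alone
rate_bimanual = 14.5     # observed when both arms move together

# A linear model would predict the bimanual rate as the simple sum
linear_prediction = rate_left_only + rate_right_only
print(rate_bimanual, linear_prediction)  # observed rate falls short of the sum
```

Because the observed bimanual rate is not the sum of the unimanual rates, a decoder trained only on single-arm movements would mispredict bimanual activity, which is why the ensemble had to be recorded during the bimanual task itself.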

Nicolelis is incorporating the study’s findings into the Walk Again Project, an international collaboration working to build a brain-controlled neuroprosthetic device. The Walk Again Project plans to demonstrate its first brain-controlled exoskeleton, which is currently being developed, during the opening ceremony of the 2014 FIFA World Cup.

Filed under brain activity prosthetics bimanual movements neurons plasticity neuroscience science

504 notes

A Blueprint for Restoring Touch with a Prosthetic Hand
New research at the University of Chicago is laying the groundwork for touch-sensitive prosthetic limbs that one day could convey real-time sensory information to amputees via a direct interface with the brain.
The research, published early online in the Proceedings of the National Academy of Sciences, marks an important step toward new technology that, if implemented successfully, would increase the dexterity and clinical viability of robotic prosthetic limbs.
“To restore sensory motor function of an arm, you not only have to replace the motor signals that the brain sends to the arm to move it around, but you also have to replace the sensory signals that the arm sends back to the brain,” said the study’s senior author, Sliman Bensmaia, PhD, assistant professor in the Department of Organismal Biology and Anatomy at the University of Chicago. “We think the key is to invoke what we know about how the brain of the intact organism processes sensory information, and then try to reproduce these patterns of neural activity through stimulation of the brain.”
Bensmaia’s research is part of Revolutionizing Prosthetics, a multi-year Defense Advanced Research Projects Agency (DARPA) project that seeks to create a modular, artificial upper limb that will restore natural motor control and sensation in amputees. Managed by the Johns Hopkins University Applied Physics Laboratory, the project has brought together an interdisciplinary team of experts from academic institutions, government agencies and private companies.
Bensmaia and his colleagues at the University of Chicago are working specifically on the sensory aspects of these limbs. In a series of experiments with monkeys, whose sensory systems closely resemble those of humans, they identified patterns of neural activity that occur during natural object manipulation and then successfully induced these patterns through artificial means.
The first set of experiments focused on contact location, or sensing where the skin has been touched. The animals were trained to identify several patterns of physical contact with their fingers. Researchers then connected electrodes to areas of the brain corresponding to each finger and replaced physical touches with electrical stimuli delivered to the appropriate areas of the brain. The result: The animals responded the same way to artificial stimulation as they did to physical contact.
Next the researchers focused on the sensation of pressure. In this case, they developed an algorithm to generate the appropriate amount of electrical current to elicit a sensation of pressure. Again, the animals’ response was the same whether the stimuli were felt through their fingers or through artificial means.
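Such a mapping from sensed pressure to stimulation current can be sketched simply. The linear, safety-clamped function and all of its parameter values below are illustrative placeholders, not the algorithm the researchers developed:

```python
def pressure_to_current_uA(pressure_kPa, base_uA=10.0,
                           gain_uA_per_kPa=1.5, max_uA=100.0):
    """Map sensed fingertip pressure to a stimulation current amplitude.
    Monotonic so that harder touches feel stronger, with a ceiling as a
    safety limit on the charge delivered to the electrode."""
    if pressure_kPa <= 0:
        return 0.0
    return min(base_uA + gain_uA_per_kPa * pressure_kPa, max_uA)

print(pressure_to_current_uA(20.0))  # 40.0 microamps for a 20 kPa touch
```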
Finally, Bensmaia and his colleagues studied the sensation of contact events. When the hand first touches or releases an object, it produces a burst of activity in the brain. Again, the researchers established that these bursts of brain activity can be mimicked through electrical stimulation.
The result of these experiments is a set of instructions that can be incorporated into a robotic prosthetic arm to provide sensory feedback to the brain through a neural interface. Bensmaia believes such feedback will bring these devices closer to being tested in human clinical trials.
“The algorithms to decipher motor signals have come quite a long way, where you can now control arms with seven degrees of freedom. It’s very sophisticated. But I think there’s a strong argument to be made that they will not be clinically viable until the sensory feedback is incorporated,” Bensmaia said. “When it is, the functionality of these limbs will increase substantially.”


Filed under BCI neural activity robotics prosthetics touch technology neuroscience science
