Neuroscience

Articles and news from the latest research reports.

Posts tagged BCI

378 notes

Harvard creates brain-to-brain interface, allows humans to control other animals with thoughts alone
Researchers at Harvard University have created the first noninvasive brain-to-brain interface (BBI) between a human… and a rat. Simply by thinking the appropriate thought, the BBI allows the human to control the rat’s tail. This is one of the most important steps towards BBIs that allow for telepathic links between two or more humans — which is a good thing in the case of friends and family, but terrifying if you stop to think about the nefarious possibilities of a fascist dictatorship with mind control tech.
In recent years there have been huge advances in the field of brain-computer interfaces, where your thoughts are detected and “understood” by a sensor attached to a computer, but relatively little work has been done in the opposite direction (computer-brain interfaces). This is because it’s one thing for a computer to work out what a human is thinking (by asking or observing their actions), but another thing entirely to inject new thoughts into a human brain. To put it bluntly, we have almost no idea of how thoughts are encoded by neurons in the brain. For now, the best we can do is create a computer-brain interface that stimulates a region of the brain that’s known to create a certain reaction — such as the specific part of the motor cortex that’s in charge of your fingers. We don’t have the power to move your fingers in a specific way — that would require knowing the brain’s encoding scheme — but we can make them jerk around.
Which brings us neatly onto Harvard’s human-rat brain-to-brain interface. The human wears a run-of-the-mill EEG-based BCI, while the rat is equipped with a focused ultrasound (FUS) computer-brain interface (CBI). FUS is a relatively new technology that allows the researchers to excite a very specific region of neurons in the rat’s brain using an ultrasound signal. The main advantage of FUS is that, unlike most brain-stimulation techniques, such as deep brain stimulation (DBS), it isn’t invasive. For now the FUS equipment is fairly bulky, but future versions might be small enough for use in everyday human CBIs.
With the EEG cap in place, the BCI detects whenever the human looks at a specific pattern on a computer screen. The BCI then fires off a command to the rat’s CBI, which causes ultrasound to be beamed into the region of the rat’s motor cortex that deals with tail movement. As you can see in the video above, this causes the rat’s tail to move. The researchers report that the human BCI has an accuracy of 94%, and that the entire process — from the human deciding to look at the screen, through to the movement of the rat’s tail — generally takes around 1.5 seconds. In theory, the human could trigger a rodent tail-wag by simply thinking about it, rather than having to look at a specific pattern — but presumably, for the sake of this experiment, the researchers wanted to focus on the FUS CBI, rather than the BCI.
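The pattern-on-a-screen trigger is characteristic of SSVEP-style BCIs, in which a stimulus flickering at a known frequency evokes a matching oscillation over visual cortex that stands out in the EEG spectrum. The paper’s actual pipeline isn’t described here, so the following is only a minimal sketch of how such a detector might work — the sampling rate, flicker frequency, and threshold are all illustrative assumptions:

```python
import numpy as np

def ssvep_detected(eeg, fs=256.0, flicker_hz=15.0, threshold=3.0):
    """Return True if the EEG shows a spectral peak at the flicker frequency.

    eeg        : 1-D array of samples from an occipital channel
    fs         : sampling rate in Hz (assumed)
    flicker_hz : flicker rate of the on-screen pattern (assumed)
    threshold  : peak-to-background power ratio that counts as a detection
    """
    spectrum = np.abs(np.fft.rfft(eeg * np.hanning(len(eeg)))) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    target = spectrum[np.argmin(np.abs(freqs - flicker_hz))]
    # Compare the target bin against the median power of the surrounding band.
    band = (freqs > 5.0) & (freqs < 40.0)
    background = np.median(spectrum[band])
    return target / background > threshold

# Simulated 2-second recording: background noise plus a 15 Hz response.
fs = 256.0
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(0)
signal = 2.0 * np.sin(2 * np.pi * 15.0 * t) + rng.normal(0, 1, t.size)
print(ssvep_detected(signal, fs))  # prints True
```

A real system would run this continuously over a sliding window and only fire the CBI command once the detection is stable, which is consistent with the roughly 1.5-second end-to-end latency the researchers report.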
Moving forward, the researchers now need to work on transmitting more complex ideas, such as hunger or sexual arousal, from human to rat. At some point, they’ll also have to put the FUS CBI on a human, to see if thoughts can be transferred in the opposite direction. Finally, we’ll need to combine an EEG and FUS into a single unit, to allow for bidirectional sharing of thoughts and ideas. Human-to-human telepathy is the most obvious use, but what if the same bidirectional technology also allows us to really communicate with animals, such as dogs? There would be huge ethical concerns, of course, especially if a dictatorial tyrant uses the tech to control our thoughts — but the same can be said of almost every futuristic, transhumanist technology.

Filed under brain-to-brain interface transcranial focused ultrasound neural activity BCI neuroscience science

69 notes

Carbon Nanotube Harpoon Catches Individual Brain Cell Signals

Neuroscientists may soon be modern-day harpooners, snaring individual brain-cell signals instead of whales with tiny spears made of carbon nanotubes.

(This image, taken with a scanning electron microscope, shows a new brain electrode that tapers to a point as thick as a single carbon nanotube. Credit: Inho Yoon and Bruce Donald, Duke)

The new brain cell spear is a millimeter long, only a few nanometers wide and harnesses the superior electromechanical properties of carbon nanotubes to capture electrical signals from individual neurons.

"To our knowledge, this is the first time scientists have used carbon nanotubes to record signals from individual neurons, what we call intracellular recordings, in brain slices or intact brains of vertebrates," said Bruce Donald, a professor of computer science and biochemistry at Duke University who helped developed the probe. 

He and his collaborators describe the carbon nanotube probes June 19 in PLOS ONE.

"The results are a good proof of principle that carbon nanotubes could be used for studying signals from individual nerve cells," said Duke neurobiologist Richard Mooney, a study co-author. "If the technology continues to develop, it could be quite helpful for studying the brain."

Scientists want to study signals from individual neurons and their interactions with other brain cells to better understand the computational complexity of the brain. 

Currently, they use two main types of electrodes, metal and glass, to record signals from brain cells. Metal electrodes record spikes from a population of brain cells and work well in live animals. Glass electrodes also measure spikes, as well as the computations individual cells perform, but are delicate and break easily.

"The new carbon nanotubes combine the best features of both metal and glass electrodes. They record well both inside and outside brain cells, and they are quite flexible. Because they won’t shatter, scientists could use them to record signals from individual brain cells of live animals," said Duke neurobiologist Michael Platt, who was not involved in the study.

In the past, other scientists have experimented with carbon nanotube probes. But the electrodes were thick, causing tissue damage, or they were short, limiting how far they could penetrate into brain tissue. They could not probe inside individual neurons.

To change this, Donald began working on a harpoon-like carbon-nanotube probe with Duke neurobiologist Richard Mooney five years ago. The two met during their first year at Yale in 1976, kept in touch throughout graduate school and began meeting to talk about their research after they both came to Duke.

Mooney told Donald about his work recording brain signals from live zebra finches and mice. The work was challenging, he said, because the probes and machinery to do the studies were large and bulky on the small head of a mouse or bird.

With Donald’s expertise in nanotechnology and robotics and Mooney’s in neurobiology, the two thought they could work together to shrink the machinery and improve the probes with nano-materials.

To make the probe, graduate student Inho Yoon and Duke physicist Gleb Finkelstein used the tip of an electrochemically sharpened tungsten wire as the base and extended it with self-entangled multi-wall carbon nanotubes to create a millimeter-long rod. The scientists then sharpened the nanotubes into a tiny harpoon using a focused ion beam at North Carolina State University.

Yoon then took the nano-harpoon to Mooney’s lab and jabbed it into slices of mouse brain tissue and then into the brains of anesthetized mice. The results show that the probe transmits brain signals as well as, and sometimes better than, conventional glass electrodes and is less likely to break off in the tissue. The new probe also penetrates individual neurons, recording the signals of a single cell rather than the nearest population of them. 

Based on the results, the team has applied for a patent on the nano-harpoon. Platt said scientists might use the probes in a range of applications, from basic science to human brain-computer interfaces and brain prostheses.

Donald said the new probe makes advances in those directions, but the insulation layers, electrical recording abilities and geometry of the device still need improvement.

Filed under carbon nanotubes nerve cells BCI neurobiology nanotechnology neuroscience science

98 notes

New tasks become as simple as waving a hand with brain-computer interfaces
Small electrodes placed on or inside the brain allow patients to interact with computers or control robotic limbs simply by thinking about how to execute those actions. This technology could improve communication and daily life for a person who is paralyzed or has lost the ability to speak from a stroke or neurodegenerative disease.
Now, University of Washington researchers have demonstrated that when humans use this technology – called a brain-computer interface – the brain behaves much like it does when completing simple motor skills such as kicking a ball, typing or waving a hand. Learning to control a robotic arm or a prosthetic limb could become second nature for people who are paralyzed.
“What we’re seeing is that practice makes perfect with these tasks,” said Rajesh Rao, a UW professor of computer science and engineering and a senior researcher involved in the study. “There’s a lot of engagement of the brain’s cognitive resources at the very beginning, but as you get better at the task, those resources aren’t needed anymore and the brain is freed up.”
Rao and UW collaborators Jeffrey Ojemann, a professor of neurological surgery, and Jeremiah Wander, a doctoral student in bioengineering, published their results online June 10 in the Proceedings of the National Academy of Sciences.
In this study, seven people with severe epilepsy were hospitalized for a monitoring procedure that tries to identify where in the brain seizures originate. Physicians cut through the scalp, drilled into the skull and placed a thin sheet of electrodes directly on top of the brain. While they were watching for seizure signals, the researchers also conducted this study.
The patients were asked to move a mouse cursor on a computer screen by using only their thoughts to control the cursor’s movement. Electrodes on their brains picked up the signals directing the cursor to move, sending them to an amplifier and then a laptop to be analyzed. Within 40 milliseconds, the computer calculated the intentions transmitted through the signal and updated the movement of the cursor on the screen.
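The article doesn’t give the decoding details, but a common ECoG cursor-control scheme maps the power in a high-frequency band of the control electrode onto cursor velocity, recomputed once per window. A toy version of a single 40 ms update step — the sampling rate, frequency band and gain here are invented for illustration, not taken from the study:

```python
import numpy as np

FS = 1000      # sampling rate in Hz (assumed)
SAMPLES = 40   # one 40 ms window per update, as reported
GAIN = 0.05    # band power -> cursor velocity scaling (illustrative)

def band_power(chunk, fs=FS, lo=70.0, hi=110.0):
    """Mean spectral power in a high-gamma band, a common ECoG control signal."""
    spectrum = np.abs(np.fft.rfft(chunk)) ** 2
    freqs = np.fft.rfftfreq(len(chunk), d=1.0 / fs)
    return spectrum[(freqs >= lo) & (freqs <= hi)].mean()

def update_cursor(x, chunk, baseline):
    """Advance the cursor in proportion to band power above the resting baseline."""
    return x + GAIN * (band_power(chunk) - baseline)

# One simulated update: a burst of 75 Hz activity moves the cursor.
t = np.arange(SAMPLES) / FS
active = np.sin(2 * np.pi * 75.0 * t)
quiet = np.zeros(SAMPLES)
baseline = band_power(quiet)
print(update_cursor(0.0, active, baseline) > 0.0)  # prints True
```

Running this loop 25 times per second gives the 40-millisecond update rate described above; a real decoder would also smooth the velocity estimate across windows.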
Researchers found that when patients started the task, a lot of brain activity was centered in the prefrontal cortex, an area associated with learning a new skill. But often after as little as 10 minutes, frontal brain activity lessened, and the brain signals transitioned to patterns similar to those seen during more automatic actions.
“Now we have a brain marker that shows a patient has actually learned a task,” Ojemann said. “Once the signal has turned off, you can assume the person has learned it.”
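The article doesn’t say how such a marker would be computed, but in practice a “learned” flag like the one Ojemann describes could track task-related frontal activation trial by trial and fire once it settles back toward baseline. A hypothetical sketch, with all numbers invented for illustration:

```python
def has_learned(frontal_activation, baseline=1.0, margin=0.1, run=3):
    """Flag learning once frontal activation stays near baseline.

    frontal_activation : per-trial activation of prefrontal electrodes,
                         normalized so the resting baseline is ~1.0 (assumed)
    margin             : how close to baseline counts as "turned off"
    run                : consecutive near-baseline trials required
    """
    streak = 0
    for level in frontal_activation:
        streak = streak + 1 if abs(level - baseline) <= margin else 0
        if streak >= run:
            return True
    return False

# Early trials show elevated frontal activity; it decays with practice.
trials = [2.1, 1.9, 1.6, 1.3, 1.1, 1.05, 1.02, 1.0]
print(has_learned(trials))  # prints True
```

The streak requirement is just one way to guard against a single noisy trial being mistaken for genuine automatization.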
While researchers have demonstrated success in using brain-computer interfaces in monkeys and humans, this is the first study that clearly maps the neurological signals throughout the brain. The researchers were surprised at how many parts of the brain were involved.
“We now have a larger-scale view of what’s happening in the brain of a subject as he or she is learning a task,” Rao said. “The surprising result is that even though only a very localized population of cells is used in the brain-computer interface, the brain recruits many other areas that aren’t directly involved to get the job done.”
Several types of brain-computer interfaces are being developed and tested. The least invasive is a device placed on a person’s head that can detect weak electrical signatures of brain activity. Basic commercial gaming products are on the market, but this technology isn’t very reliable yet because signals from eye blinking and other muscle movements interfere too much.
A more invasive alternative is to surgically place electrodes inside the brain tissue itself to record the activity of individual neurons. Researchers at Brown University and the University of Pittsburgh have demonstrated this in humans as patients, unable to move their arms or legs, have learned to control robotic arms using the signal directly from their brain.
The UW team tested electrodes on the surface of the brain, underneath the skull. This allows researchers to record brain signals at higher frequencies and with less interference than measurements from the scalp. A future wireless device could be built to remain inside a person’s head for a longer time to be able to control computer cursors or robotic limbs at home.
“This is one push as to how we can improve the devices and make them more useful to people,” Wander said. “If we have an understanding of how someone learns to use these devices, we can build them to respond accordingly.”
The research team, along with the National Science Foundation’s Engineering Research Center for Sensorimotor Neural Engineering headquartered at the UW, will continue developing these technologies.

Filed under BCI brainwaves motor skills brain activity epilepsy neuroscience science

262 notes

Incredible Technology: How to See Inside the Mind
Human experience is defined by the brain, yet much about this 3-lb. organ remains a mystery. Even so, from brain imaging to brain-computer interfaces, scientists have made impressive strides in developing technologies to peer inside the mind.
Imaging the brain
Currently, scientists who study the brain can look at its structure or its function. In structural imaging, machines take snapshots of the brain’s large-scale anatomy that can be used to diagnose tumors or blood clots, for example. Functional imaging provides a dynamic view of the brain, showing which areas are active during thinking and perception.
Structural-imaging techniques include CAT scans, or computerized axial tomography, which take images of slices through the brain by beaming X-rays at the head from many different angles. CAT, or CT, scans are often used to diagnose a brain injury, for example. Another method, positron emission tomography (PET), generates both 2D and 3D images of the brain: A radioactively labeled chemical injected into the blood emits gamma rays that a scanner detects. And magnetic resonance imaging (MRI) provides a view of the brain’s overall structure by measuring the magnetic spin of atoms inside a strong magnetic field.
"There’s no question that MRI is probably the best way to see the brain," said Dr. Mauricio Castillo, a radiologist at the University of North Carolina at Chapel Hill and editor-in-chief of the American Journal of Neuroradiology.
In the realm of functional imaging, the current gold standard is functional MRI (fMRI). This technique measures changes in blood flow to different brain areas as a proxy for which areas are active when someone performs a task like reading a word or viewing a picture.
"The emphasis nowadays is to try to merge how the brain is wired with the activation of the cortex [the brain’s outermost layer]," Castillo said.
Several methods can be combined to merge brain structure and function. For example, MRI and PET scanning can be performed simultaneously, and the images can be combined to show physiological activity superimposed on an anatomical map of the brain. The end result can be used to tell a surgeon the location of a brain lesion so it can be removed, Castillo said.
Recently, a new technique has been developed to literally see inside the brain. Called CLARITY (originally for Clear Lipid-exchanged Acrylamide-hybridized Rigid Imaging/Immunostaining/In situ hybridization-compatible Tissue-hYdrogel), it can make a (nonliving) brain transparent to light while keeping its structure intact. The technique has already been used to visualize the neurological wiring of an adult mouse brain.
Decoding thoughts
Some scientists want to see inside the brain more figuratively. Enter brain-computer interfaces (BCIs or BMIs, brain-machine interfaces), devices that connect brain signals to an external device, such as a computer or prosthetic limb. BCIs range from noninvasive systems that consist of electrodes placed on the scalp, to more invasive ones that require the electrodes to be implanted in the brain itself.
Noninvasive BCIs include scalp-based electroencephalography (EEG), which records the activity of many neurons over large brain areas. The advantage of EEG-based systems is that they don’t require surgery. On the other hand, these systems can only detect generalized brain activity, so the user must focus his or her thoughts on just a single task.
More invasive systems include electrocorticography (ECoG), in which electrodes are implanted on the surface of the brain to record EEG signals from the cortex. Since Wilder Penfield and Herbert Jasper pioneered the technique in the early 1950s, it has been used, among other purposes, to identify brain regions where epileptic seizures begin.
Some BCIs use electrodes implanted inside the brain’s cortex. Although these systems are more invasive, they have much better resolution and can pick up the signals sent by individual neurons. BCIs can now even allow humans with quadriplegia (paralysis of all four limbs) to control a robotic arm through thought alone, or allow users to spell out words on a computer screen using just their mind.
Despite many advances, a lot remains unknown about the brain. To bridge this gap, American scientists are embarking on a new project to map the human brain, announced by President Barack Obama in April, called the BRAIN initiative (Brain Research through Advancing Innovative Neurotechnologies).
But neuroscientists have their work cut out for them. “The brain is probably the most complex machine in the universe,” Castillo said. “We’re still a long way from understanding it.”

Filed under brain brain imaging BCI neuroscience science

114 notes

Helicopter takes to the skies with the power of thought

A remote controlled helicopter has been flown through a series of hoops around a college gymnasium in Minnesota.

It sounds like your everyday student project; however, there is one twist: the helicopter was controlled using just the power of thought.

The experiments have been performed by researchers hoping to develop robots that could help restore autonomy to people who are paralysed or have neurodegenerative disorders.

Their study has been published today, 4 June 2013, in IOP Publishing’s Journal of Neural Engineering and is accompanied by a video of the helicopter control in action. 

Five subjects (three female, two male) took part in the study, and each was able to control the four-bladed helicopter, also known as a quadcopter, quickly and accurately for a sustained period.

Lead author of the study Professor Bin He, from the University of Minnesota College of Science and Engineering, said: “Our study shows that for the first time, humans are able to control the flight of flying robots using just their thoughts, sensed from noninvasive brain waves.”

The noninvasive technique used was electroencephalography (EEG), which recorded the electrical activity of the subjects’ brain through a cap fitted with 64 electrodes.

Facing away from the quadcopter, the subjects were asked to imagine using their right hand, left hand, or both hands together; these thoughts instructed the quadcopter to turn right, turn left, or rise, respectively, while relaxing let it descend. The quadcopter flew forward at a pre-set velocity and was steered through the sky by the subject’s thoughts.
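
The control scheme can be sketched as a simple label-to-command dispatch. The label and command names below are hypothetical, and the mapping of rest to descent is an assumption for illustration, not a detail taken from the published study:

```python
# Illustrative dispatch from motor-imagery classifier labels to quadcopter
# commands. All names here are invented; "rest" -> descend is an assumption.
COMMANDS = {
    "imagine_right_hand": "turn_right",
    "imagine_left_hand": "turn_left",
    "imagine_both_hands": "ascend",
    "rest": "descend",
}

def decode(label, forward_velocity=0.5):
    """Return (command, forward velocity) for one classified EEG label.

    The quadcopter always flies forward at a pre-set velocity; thought
    only steers it and changes its altitude.
    """
    return COMMANDS.get(label, "hold"), forward_velocity

print(decode("imagine_both_hands"))  # ('ascend', 0.5)
```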

The subjects were positioned in front of a screen which relayed images of the quadcopter’s flight through an on-board camera, allowing them to see which direction it was travelling in. Brain signals were recorded by the cap and sent to the quadcopter over WiFi.

“In previous work we showed that humans could control a virtual helicopter using just their thoughts. I initially intended to use a small helicopter for this real-life study; however, the quadcopter is more stable, smooth and has fewer safety concerns,” continued Professor He.

After several different training sessions, the subjects were required to fly the quadcopter through two foam rings suspended from the gymnasium ceiling and were scored on three aspects: the number of times they sent the quadcopter through the rings; the number of times the quadcopter collided with the rings; and the number of times they went outside the experiment boundary.

A number of statistical tests were used to quantify each subject’s performance.

A group of subjects also directed the quadcopter with a keyboard in a control experiment, allowing for a comparison between a standardised method and brain control.

This process is just one example of a brain–computer interface where a direct pathway between the brain and an external device is created to help assist, augment or repair human cognitive or sensory-motor functions; researchers are currently looking at ways to restore hearing, sight and movement using this approach.

“Our next goal is to control robotic arms using noninvasive brain wave signals, with the eventual goal of developing brain–computer interfaces that aid patients with disabilities or neurodegenerative disorders,” continued Professor He.

Filed under neurodegenerative diseases quadcopter brainwaves EEG BCI robotics neuroscience science

196 notes

Painting through the power of thought enabled by scientists

To the viewer it is an accomplished semiabstract image of flowers and clouds, but in fact this painting was produced by a paralysed woman solely through the power of thought.

Heide Pfützner, a former teacher from Leipzig, Germany, was diagnosed with Amyotrophic Lateral Sclerosis, also known as Motor Neurone Disease, yet she has managed to produce a series of the paintings with the aid of a new brain controlled computer.

She has been trained to master the device that uses brain waves to take control of a palette of colours, shapes and brushes to produce digital artworks.

Building on decades of knowledge about the meaning of the tiny electrical impulses created by the brain during thought, scientists have been able to create a computer programme which translates thoughts into electronic images.

As well as helping patients with progressive brain diseases like Mrs Pfützner, other users of the device include those who are “locked in” to a physically unresponsive state and therefore unable to communicate with the rest of the world.

The system works by detecting changes in the pattern of the user’s brain waves to allow them to select options in software and to move a cursor around a screen in front of them.

Filed under BCI brainwaves ALS art brain painting device neuroscience science

264 notes

Paralyzed Patient Moves Prosthetic Arm With Her Mind

It sounds like science fiction, but researchers are gaining ground in developing mind-controlled robotic arms that could give people with paralysis or amputated limbs more independence.

The technology, known as brain-computer (or brain-machine) interface, is in its infancy as far as human use — though scientists have been studying the concept for years. But experts say that people with paralysis or amputations could be using the technology at home within the next decade.

It basically boils down to people using their thoughts to control a robot arm that then performs a desired task, like grasping and moving a cup. That’s done via tiny electrode “grids” implanted in the brain that read the movement signals firing from individual nerve cells, then translate them to the robot arm.
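
The “translate” step is often a linear decoder that maps binned firing rates to endpoint velocities. The sketch below assumes such a decoder with random, invented weights; real systems calibrate them from training data (for example, population-vector or Kalman-filter decoders), and this is not the team’s actual algorithm:

```python
import numpy as np

# Minimal sketch of the "translate" step, assuming a linear decoder:
# endpoint velocity = W @ firing_rates + b. Weights and rates are random,
# invented purely for illustration.
rng = np.random.default_rng(42)
n_units = 96                                   # e.g. channels on an implanted grid
W = rng.normal(scale=0.05, size=(3, n_units))  # maps rates -> (vx, vy, vz)
b = np.zeros(3)                                # bias term

def decode_velocity(firing_rates):
    """Map one bin of per-unit firing rates (spikes/s) to a 3-D hand velocity."""
    return W @ firing_rates + b

rates = rng.poisson(lam=20, size=n_units).astype(float)  # one simulated bin
v = decode_velocity(rates)
print(v.shape)  # (3,)
```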

"We have the ability to capture information from the brain and use it to control the robotic arm," said Dr. Elizabeth Tyler-Kabara, who presented her team’s latest findings on the technology Tuesday, at the annual meeting of the American Association of Neurological Surgeons, in New Orleans.

However, she stressed, “we still have a ton to learn.”

Right now, the robot arm is confined to the lab. After getting their electrodes implanted, study patients come to the lab to work with the robotic limb under the researchers’ supervision. So far, Tyler-Kabara and her colleagues at the University of Pittsburgh School of Medicine have tested the approach in one patient. Researchers at Brown University in Providence, R.I., have done it in a handful of others.

One of the big questions, Tyler-Kabara said, is “how much control is enough?” That is, how well does the mind-controlled arm need to work to bring real everyday benefits to people?

At the meeting on Tuesday, Tyler-Kabara presented an update on how her team’s patient is faring. The 53-year-old woman had long-standing quadriplegia due to a disease called spinocerebellar degeneration — where, for unknown reasons, the connections between the brain and muscles slowly deteriorate.

Tyler-Kabara performed the surgery, where two tiny electrode grids were placed in the area of the brain that would normally control the movement of the right hand and arm. The electrode points penetrate the brain’s surface by about one-sixteenth of an inch.

"The idea is pretty scary," Tyler-Kabara acknowledged. But her team’s patient had no complications from the surgery and left the hospital the next day. There’ve been no longer-term problems either, she said — though, in theory, there would be concerns about infection or bleeding over the long haul.

The surgery left the patient with two terminals that protrude through her skull. The researchers used those to connect the implanted electrodes to a computer, where they could see brain cells firing when the patient thought about moving her hand.

She was quickly able to master simple movements with the robotic arm, like high-fiving the researchers. And after six months, she was performing “10-degrees-of-freedom” movements, Tyler-Kabara reported at the meeting.

That includes not only moving the arm, but also flexing and rotating the wrist, grasping objects and affecting several different hand “postures.” She has accomplished feats like feeding herself chocolate.

The researchers initially used a computer in training sessions with the patient, but the robot arm is now linked directly to the electrodes, so there is no need for “computer assistance,” according to Tyler-Kabara.

Still, before the technology can ultimately be used at home, she said, researchers have to devise a “fully implanted” wireless system for controlling the robot arm.

"This is one more encouraging step toward developing something practical that people can use in their daily lives," said Dr. Robert Grossman, a neurosurgeon at Methodist Neurological Institute in Houston, who was not involved in the research.

It’s hard to put a timeline on it all, Grossman said, since technological advances could change things. He also noted that several research groups are looking at different approaches to brain-computer interfaces.

One, Grossman said, is to do it noninvasively, through electrodes placed on the scalp.

Study author Tyler-Kabara said the noninvasive approach has met with success in helping people perform simple tasks, like moving a cursor on a computer screen. “But I don’t think it will ever be good enough for performing complicated tasks,” she said, noting that it can’t work as precisely as the implanted electrodes.

A next step, Tyler-Kabara said, is to develop a “two-way” electrode system that stimulates the brain to generate sensation — with the aim of helping people adjust the robot’s grip strength.

She said there is also much to learn about which people will ultimately be good candidates for the technology. There may, for example, be some brain injuries that prevent people from benefiting.

Because this study was presented at a medical meeting, the data and conclusions should be viewed as preliminary until published in a peer-reviewed journal.

(Source: health.usnews.com)

Filed under BCI robots robotics prosthetic limbs prosthetic arm neuroscience science

99 notes

Non-Invasive Brain-to-Brain Interface (BBI): Establishing Functional Links between Two Brains

Transcranial focused ultrasound (FUS) is capable of modulating the neural activity of specific brain regions, with a potential role as a non-invasive computer-to-brain interface (CBI). In conjunction with the use of brain-to-computer interface (BCI) techniques that translate brain function to generate computer commands, we investigated the feasibility of using the FUS-based CBI to non-invasively establish a functional link between the brains of different species (i.e. human and Sprague-Dawley rat), thus creating a brain-to-brain interface (BBI). The implementation was aimed to non-invasively translate the human volunteer’s intention to stimulate a rat’s brain motor area that is responsible for the tail movement. The volunteer initiated the intention by looking at a strobe light flicker on a computer display, and the degree of synchronization in the electroencephalographic steady-state-visual-evoked-potentials (SSVEP) with respect to the strobe frequency was analyzed using a computer. Increased signal amplitude in the SSVEP, indicating the volunteer’s intention, triggered the delivery of a burst-mode FUS (350 kHz ultrasound frequency, tone burst duration of 0.5 ms, pulse repetition frequency of 1 kHz, given for 300 msec duration) to excite the motor area of an anesthetized rat transcranially. The successful excitation subsequently elicited the tail movement, which was detected by a motion sensor. The interface was achieved at 94.0±3.0% accuracy, with a time delay of 1.59±1.07 sec from the thought-initiation to the creation of the tail movement. Our results demonstrate the feasibility of a computer-mediated BBI that links central neural functions between two biological entities, which may confer unexplored opportunities in the study of neuroscience with potential implications for therapeutic applications.
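
The trigger logic and burst timing described in the abstract can be sketched as follows. The SSVEP threshold value is invented; the burst parameters (0.5 ms tone bursts at a 1 kHz pulse repetition frequency for a 300 ms train) are the reported ones:

```python
# Sketch of the BBI trigger logic: when the SSVEP amplitude at the strobe
# frequency crosses a threshold, one burst-mode FUS train is delivered.
TONE_BURST_S = 0.5e-3   # tone burst duration, seconds (reported)
PRF_HZ = 1_000.0        # pulse repetition frequency, Hz (reported)
TRAIN_S = 0.3           # total train duration, seconds (reported)

def fus_train_stats():
    """Bursts per train, and the fraction of each period the transducer is on."""
    n_bursts = round(TRAIN_S * PRF_HZ)   # 300 bursts per trigger
    duty_cycle = TONE_BURST_S * PRF_HZ   # 0.5 ms on out of every 1 ms
    return n_bursts, duty_cycle

def should_trigger(ssvep_amplitude, threshold=2.0):
    """Fire one FUS train when the volunteer's SSVEP amplitude crosses threshold."""
    return ssvep_amplitude >= threshold

print(fus_train_stats())  # (300, 0.5)
```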

Filed under brain-to-brain interface transcranial focused ultrasound neural activity computer-to-brain interface BCI neuroscience science

62 notes

Wireless, implanted sensor broadens range of brain research

A compact, self-contained sensor recorded and transmitted brain activity data wirelessly for more than a year in early stage animal tests, according to a study funded by the National Institutes of Health. In addition to allowing for more natural studies of brain activity in moving subjects, this implantable device represents a potential major step toward cord-free control of advanced prosthetics that move with the power of thought. The report is in the April 2013 issue of the Journal of Neural Engineering.

“For people who have sustained paralysis or limb amputation, rehabilitation can be slow and frustrating because they have to learn a new way of doing things that the rest of us do without actively thinking about it,” said Grace Peng, Ph.D., who oversees the Rehabilitation Engineering Program of the National Institute of Biomedical Imaging and Bioengineering (NIBIB), part of NIH. “Brain-computer interfaces harness existing brain circuitry, which may offer a more intuitive rehab experience, and ultimately, a better quality of life for people who have already faced serious challenges.”

Recent advances in brain-computer interfaces (BCI) have shown that it is possible for a person to control a robotic arm through implanted brain sensors linked to powerful external computers. However, such devices have relied on wired connections, which pose infection risks and restrict movement, or were wireless but had very limited computing power.

Building on this line of research, David Borton, Ph.D., and Ming Yin, Ph.D., of Brown University, Providence, R.I., and colleagues surmounted several major barriers in developing their sensor. To be fully implantable within the brain, the device needed to be very small and completely sealed off to protect the delicate machinery inside the device and the even more delicate tissue surrounding it. At the same time, it had to be powerful enough to convert the brain’s subtle electrical activity into digital signals that could be used by a computer, and then boost those signals to a level that could be detected by a wireless receiver located some distance outside the body. Like all cordless machines, the device had to be rechargeable, but in the case of an implanted brain sensor, recharging must also be done wirelessly.

The researchers consulted with brain surgeons on the shape and size of the sensor, which they built out of titanium, commonly used in joint replacements and other medical implants. They also fitted the device with a window made of sapphire, which electromagnetic signals pass through more easily than other materials, to assist with wireless transmission and inductive charging, a method of recharging also used in electronic toothbrushes. Inside, the device was densely packed with the electronics specifically designed to function on low power to reduce the amount of heat generated by the device and to extend the time it could work on battery power.

Testing the device in animal models — two pigs and two rhesus macaques — the researchers were able to receive and record data from the implanted sensors in real time over a broadband wireless connection. The sensors could transmit signals more than three feet and have continued to perform for over a year with little degradation in quality or performance.

The ability to remotely record brain activity data as an animal interacts naturally with its environment may help inform studies on muscle control and the movement-related brain circuits, the researchers say. While testing of the current devices continues, the researchers plan to refine the sensor for better heat management and data transmission, with use in human medical care as the goal.

“Clinical applications may include thought-controlled prostheses for severely neurologically impaired patients, wireless access to motorized wheelchairs or other assistive technologies, and diagnostic monitoring such as in epilepsy, where patients currently are tethered to the bedside during assessment,” said Borton.

Filed under brain activity implants prosthetics limb amputation BCI animal model neuroscience science

459 notes

Clever Battery Completes Stretchable Electronics Package

Northwestern University’s Yonggang Huang and the University of Illinois’ John A. Rogers are the first to demonstrate a stretchable lithium-ion battery — a flexible device capable of powering their innovative stretchable electronics.

No longer needing to be connected by a cord to an electrical outlet, the stretchable electronic devices now could be used anywhere, including inside the human body. The implantable electronics could monitor anything from brain waves to heart activity, succeeding where flat, rigid batteries would fail.

Huang and Rogers have demonstrated a battery that continues to work — powering a commercial light-emitting diode (LED) — even when stretched, folded, twisted and mounted on a human elbow. The battery can work for eight to nine hours before it needs recharging, which can be done wirelessly.

The new battery enables true integration of electronics and power into a small, stretchable package. Details are published by the online journal Nature Communications.

“We start with a lot of battery components side by side in a very small space, and we connect them with tightly packed, long wavy lines,” said Huang, a corresponding author of the paper. “These wires provide the flexibility. When we stretch the battery, the wavy interconnecting lines unfurl, much like yarn unspooling. And we can stretch the device a great deal and still have a working battery.”

Huang led the portion of the research focused on theory, design and modeling. He is the Joseph Cummings Professor of Civil and Environmental Engineering and Mechanical Engineering at Northwestern’s McCormick School of Engineering and Applied Science.

The power and voltage of the stretchable battery are similar to a conventional lithium-ion battery of the same size, but the flexible battery can stretch up to 300 percent of its original size and still function.
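
Huang’s “unfurling” description implies a simple geometric bound: a wavy wire can be stretched until it is pulled straight, so the maximum stretch ratio is roughly its contour length divided by its resting span. A back-of-the-envelope sketch with invented numbers, not figures from the paper:

```python
# A wavy interconnect of contour length L spanning a straight-line distance s
# can be pulled taut until its span equals L, giving a maximum stretch of
# about L / s. The 4:1 ratio below is invented, chosen only to show that such
# a geometry comfortably covers the 300 percent stretch reported.
def max_stretch_percent(contour_length_mm, span_mm):
    """Largest stretched span, as a percentage of the resting span."""
    return 100.0 * contour_length_mm / span_mm

print(max_stretch_percent(4.0, 1.0))  # 400.0, i.e. beyond the 300% reported
```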

Filed under battery stretchable battery BCI implantable electronics implants technology science
