Posts tagged EEG
A UT Arlington assistant engineering professor has developed a computational model that can more accurately predict when an epileptic seizure will occur next based on the patient’s personalized medical information.
The research, conducted by Shouyi Wang, an assistant professor in the Department of Industrial and Manufacturing Systems Engineering, has been published as the paper “Online Seizure Prediction Using an Adaptive Learning Approach” in IEEE Transactions on Knowledge and Data Engineering.
Wang’s model analyzes electroencephalography, or EEG, readings from an individual to predict future seizures. Early warnings could allow a patient to take medication to combat an oncoming seizure, he said.
“The challenge with seizure prediction has been that every epileptic is different. Some patients suffer several seizures a day. Others will go several years without experiencing a seizure,” Wang said. “But if we use the EEG readings to build a personalized data profile, we’re better able to understand what’s happening to that person.”
Epilepsy is one of the most common neurological disorders, characterized by recurrent seizures. Epilepsy and seizures affect nearly 3 million Americans at an estimated annual cost of $17.6 billion in direct and indirect costs, according to the national Epilepsy Foundation. About 10 percent of the American population will experience a seizure in their lifetime, the foundation says.
Wang teamed with Wanpracha Art Chaovalitwongse of the University of Washington and Stephen Wong of the Rutgers Robert Wood Johnson Medical School for the research.
Wang said early indications are that the new computational model could provide 70 percent accuracy or better and give a prediction horizon of about 30 minutes before the actual seizure would occur.
The current model collects data through a cap embedded with EEG wires. Wang’s team is working to develop a less obtrusive EEG cap that will record and transmit readings to a box for easy data download or transmission.
Victoria Chen, professor and chairwoman of the Industrial and Manufacturing Systems Engineering Department, said Wang’s work in the area of bioinformatics offers hope for the many people who suffer from epilepsy.
“This computational model might be used to predict other life-threatening episodes of diseases,” Chen said.
Wang said his model builds upon an adaptive learning framework and achieves increasingly accurate predictions for each individual patient as it accumulates more personalized medical data.
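The adaptive-learning idea can be sketched in a few lines of code. The toy predictor below is only an illustration of the general approach (a single hand-picked EEG feature and running per-class averages that personalize a decision threshold over time); it is not the model described in the paper, and all feature choices and parameters here are assumptions.

```python
# Toy sketch of adaptive, personalized seizure prediction from EEG windows.
# NOT the published model: the "line length" feature, the running means,
# and the learning rate are illustrative stand-ins.

class OnlinePredictor:
    """Maintains a per-patient threshold that adapts with each labeled window."""

    def __init__(self, learning_rate=0.1):
        self.baseline_mean = 0.0   # running mean feature of non-seizure windows
        self.seizure_mean = 0.0    # running mean feature of pre-seizure windows
        self.threshold = 0.0
        self.lr = learning_rate

    @staticmethod
    def line_length(window):
        # "Line length" is a cheap, common EEG feature: the total
        # sample-to-sample absolute variation within the window.
        return sum(abs(b - a) for a, b in zip(window, window[1:]))

    def predict(self, window):
        # Flag an oncoming seizure when the feature exceeds the threshold.
        return self.line_length(window) > self.threshold

    def update(self, window, seizure_followed):
        # As more labeled data arrives for this patient, nudge the class
        # averages and place the threshold midway between them.
        f = self.line_length(window)
        if seizure_followed:
            self.seizure_mean += self.lr * (f - self.seizure_mean)
        else:
            self.baseline_mean += self.lr * (f - self.baseline_mean)
        self.threshold = (self.baseline_mean + self.seizure_mean) / 2

model = OnlinePredictor()
calm = [0.0, 0.1, 0.0, 0.1, 0.0]         # low variation: baseline activity
agitated = [0.0, 2.0, -2.0, 2.0, -2.0]   # high variation: pre-seizure pattern
for _ in range(20):                       # personalize on labeled history
    model.update(calm, seizure_followed=False)
    model.update(agitated, seizure_followed=True)
print(model.predict(agitated))  # True
```

The point of the sketch is the update loop: the threshold is not fixed in advance but drifts toward each patient’s own data, which is the sense in which the approach becomes more accurate as more personalized data is collected.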
“As a society, we’ve gotten really good at looking at the big picture,” Wang said. “We can tell you the likelihood of suffering a heart attack if you’re over a certain age, of a certain weight and if you smoke. But we have only started to personalize that data for individuals who are all different.”
When Chris Chafe and Josef Parvizi began transforming recordings of brain activity into music, they did so with artistic aspirations. The professors soon realized, though, that the work could lead to a powerful biofeedback tool for identifying brain patterns associated with seizures.
Josef Parvizi was enjoying a performance by the Kronos Quartet when the idea struck. The musical troupe was midway through a piece in which the melodies were based on radio signals from outer space, and Parvizi, a neurologist at Stanford Medical Center, began wondering what the brain’s electrical activity might sound like set to music.
He didn’t have to look far for help. Chris Chafe, a professor of music research at Stanford, is one of the world’s foremost experts in “musification,” the process of converting natural signals into music. One of his previous works involved measuring the changing carbon dioxide levels near ripening tomatoes and converting those changing levels into electronic performances.
Parvizi, an associate professor, specializes in treating patients suffering from intractable seizures. To locate the source of a seizure, he places electrodes in patients’ brains to create electroencephalogram (EEG) recordings of both normal brain activity and a seizure state.
He shared a consenting patient’s EEG data with Chafe, who began setting the electrical spikes of the rapidly firing neurons to music. Chafe used a tone close to a human’s voice, in hopes of giving the listener an empathetic and intuitive understanding of the neural activity.
Upon a first listen, the duo realized they had done more than create an interesting piece of music. [Listen to the audio here]
"My initial interest was an artistic one at heart, but, surprisingly, we could instantly differentiate seizure activity from non-seizure states with just our ears," Chafe said. "It was like turning a radio dial from a static-filled station to a clear one."
If they could achieve the same result with real-time brain activity data, they might be able to develop a tool to allow caregivers for people with epilepsy to quickly listen to the patient’s brain waves to hear whether an undetected seizure might be occurring.
Parvizi and Chafe dubbed the device a “brain stethoscope.”
The sound of a seizure
The EEGs Parvizi conducts register brain activity from more than 100 electrodes placed inside the brain; Chafe selects certain electrode/neuron pairings and allows them to modulate notes sung by a female singer. As the electrode captures increased activity, it changes the pitch and inflection of the singer’s voice.
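The core mapping can be sketched numerically: each electrode’s moment-to-moment amplitude bends the pitch of a sung tone around a base register. The base frequency, sensitivity, and clamping range below are illustrative assumptions, not Chafe’s actual synthesis parameters.

```python
# Toy pitch mapping for EEG sonification: amplitude excursions push a
# "voice" away from its base pitch, so high-amplitude seizure spikes are
# heard as wide, erratic leaps. All constants are illustrative.

BASE_HZ = 220.0          # roughly the register of a human voice
SEMITONE = 2 ** (1 / 12)  # equal-temperament semitone ratio

def amplitude_to_pitch(amplitude, sensitivity=12.0):
    """Map a normalized electrode amplitude (-1..1) to a frequency in Hz."""
    # Clamp to two octaves either way so extreme artifacts stay audible
    # rather than shooting out of the hearing range.
    semitones = max(-24.0, min(24.0, amplitude * sensitivity))
    return BASE_HZ * (SEMITONE ** semitones)

quiet = [amplitude_to_pitch(a) for a in (0.0, 0.05, -0.05)]
storm = [amplitude_to_pitch(a) for a in (0.9, -0.8, 1.0)]
print(round(quiet[0], 1))  # 220.0: baseline activity stays near the base pitch
print(round(storm[2], 1))  # 440.0: a full-scale spike leaps a whole octave
```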
Before the seizure begins – during the so-called pre-ictal stage – the peeps and pops from each “singer” almost synchronize and fall into a clear rhythm, as if they’re following a conductor, Chafe said.
In the moments leading up to the seizure event, though, each of the singers begins to improvise. The notes become progressively louder and more scattered, as the full seizure event occurs (the ictal state). The way Chafe has orchestrated his singers, one can hear the electrical storm originate on one side of the brain and eventually cross over into the other hemisphere, creating a sort of sing-off between the two sides of the brain.
After about 30 seconds of full-on chaos, the singers begin to calm, trailing off into their post-ictal rhythm. Occasionally, one or two will pipe up erratically, but on the whole, the choir sounds extremely fatigued.
It’s the perfect representation of the three phases of a seizure event, Parvizi said.
Part art exhibit, part experiment
Caring for a person with seizures can be very difficult, as not all seizure activity manifests itself with behavioral cues. It’s often impossible to know whether a person with epilepsy is acting confused because they are having a seizure, or if they are experiencing the type of confusion that is a marker of the post-ictal seizure phase.
To that end, Parvizi and Chafe hope to apply their work to develop a device that listens for the telltale brain patterns of an ongoing seizure or a post-ictal fatigued brain state.
"Someone – perhaps a mother caring for a child – who hasn’t received training in interpreting visual EEGs can hear the seizure rhythms and easily appreciate that there is a pathological brain phenomenon taking place," Parvizi said.
The device can also offer biofeedback to non-epileptic patients who want to hear the music their own brain waves create.
The effort to build this device is funded by Stanford’s Bio-X Interdisciplinary Initiatives Program (Bio-X IIP), which provides money for interdisciplinary projects that have potential to improve human health in innovative ways. Bio-X seed grants have funded 141 research collaborations connecting hundreds of faculty since 2000. The proof-of-concept projects have produced hundreds of publications, dozens of patents, and more than a tenfold return on research funds to Stanford.
From a clinical perspective, the work is still very experimental.
"We’ve really just stuck our finger in there," Chafe said. "We know that the music is fascinating and that we can hear important dynamics, but there are still wonderful revelations to be made."
Next year, Chafe and Parvizi plan to unveil a version of the system at Stanford’s Cantor Arts Center. Visitors will don a headset that will transmit an EEG of their brain activity to their handheld device, which will convert it into music in real time.
"This is what I like about Stanford," Parvizi said. "It nurtures collaboration between fields that are seemingly light-years apart – we’re neurology and music professors! – and our work together will hopefully make a positive impact on the world we live in."
Researchers from the University of Montreal and their colleagues have found brain activity beyond a flat line EEG, which they have called Nu-complexes (from the Greek letter nu, ν). According to existing scientific data, researchers and doctors had established that beyond the so-called “flat line” (flat electroencephalogram, or EEG), there is nothing at all: no brain activity, no possibility of life. This major discovery suggests that there is a whole new frontier in animal and human brain functioning.
The researchers observed a human patient in an extreme deep hypoxic coma under powerful anti-epileptic medication that he had been required to take due to his health issues. “Dr. Bogdan Florea from Romania contacted our research team because he had observed unexplainable phenomena on the EEG of a coma patient. We realized that there was cerebral activity, unknown until now, in the patient’s brain,” says Dr. Florin Amzica, director of the study and professor at the University of Montreal’s School of Dentistry.
Dr. Amzica’s team then decided to recreate the patient’s state in cats, the standard animal model for neurological studies. Using the anesthetic isoflurane, they placed the cats in an extremely deep—but completely reversible—coma. The cats passed the flat (isoelectric) EEG line, which is associated with silence in the cortex (the governing part of the brain). The team observed cerebral activity in 100% of the cats in deep coma, in the form of oscillations generated in the hippocampus, the part of the brain responsible for memory and learning processes. These oscillations, unknown until now, were transmitted to the master part of the brain, the cortex. The researchers concluded that the observed EEG waves, or Nu-complexes, were the same as those observed in the human patient.
Dr. Amzica stresses the importance of understanding the implications of these findings. “Those who have decided to or have to ‘unplug’ a near-brain-dead relative needn’t worry or doubt their doctor. The current criteria for diagnosing brain death are extremely stringent. Our finding may perhaps in the long term lead to a redefinition of the criteria, but we are far from that. Moreover, this is not the most important or useful aspect of our study,” Dr. Amzica said.
From Nu-complexes to therapeutic comas
The most useful aspect of this finding is the therapeutic potential, the neuroprotection, of the extreme deep coma. After a major injury, some patients are in such serious condition that doctors deliberately place them in an artificial coma to protect their body and brain so they can recover. But Dr. Amzica believes that the extreme deep coma tested on the cats may be more protective.
“Indeed, an organ or muscle that remains inactive for a long time eventually atrophies. It is plausible that the same applies to a brain kept for an extended period in a state corresponding to a flat EEG,” says Professor Amzica. “An inactive brain coming out of a prolonged coma may be in worse shape than a brain that has had minimal activity. Research on the effects of extreme deep coma, during which the hippocampus is active through Nu-complexes, is absolutely vital for the benefit of patients.”
“Another implication of this finding is that we now have evidence that the brain is able to survive an extremely deep coma if the integrity of the nervous structures is preserved,” said lead author of the study, Daniel Kroeger. “We also found that the hippocampus can send ‘orders’ to the brain’s commander in chief, the cortex. Finally, the possibility of studying the learning and memory processes of the hippocampus during a state of coma will help further understanding of them. In short, all sorts of avenues for basic research are now open to us.”
University of Washington researchers have performed what they believe is the first noninvasive human-to-human brain interface, with one researcher able to send a brain signal via the Internet to control the hand motions of a fellow researcher.
Using electrical brain recordings and a form of magnetic stimulation, Rajesh Rao sent a brain signal to Andrea Stocco on the other side of the UW campus, causing Stocco’s finger to move on a keyboard.
While researchers at Duke University have demonstrated brain-to-brain communication between two rats, and Harvard researchers have demonstrated it between a human and a rat, Rao and Stocco believe this is the first demonstration of human-to-human brain interfacing.
“The Internet was a way to connect computers, and now it can be a way to connect brains,” Stocco said. “We want to take the knowledge of a brain and transmit it directly from brain to brain.”
The researchers captured the full demonstration on video recorded in both labs.
Rao, a UW professor of computer science and engineering, has been working on brain-computer interfacing in his lab for more than 10 years and just published a textbook on the subject. In 2011, spurred by the rapid advances in technology, he believed he could demonstrate the concept of human brain-to-brain interfacing. So he partnered with Stocco, a UW research assistant professor in psychology at the UW’s Institute for Learning & Brain Sciences.
On Aug. 12, Rao sat in his lab wearing a cap with electrodes hooked up to an electroencephalography machine, which reads electrical activity in the brain. Stocco was in his lab across campus wearing a purple swim cap marked with the stimulation site for the transcranial magnetic stimulation coil that was placed directly over his left motor cortex, which controls hand movement.
The team had a Skype connection set up so the two labs could coordinate, though neither Rao nor Stocco could see the Skype screens.
Rao looked at a computer screen and played a simple video game with his mind. When he was supposed to fire a cannon at a target, he imagined moving his right hand (being careful not to actually move his hand), causing a cursor to hit the “fire” button. Almost instantaneously, Stocco, who wore noise-canceling earbuds and wasn’t looking at a computer screen, involuntarily moved his right index finger to push the space bar on the keyboard in front of him, as if firing the cannon. Stocco compared the feeling of his hand moving involuntarily to that of a nervous tic.
“It was both exciting and eerie to watch an imagined action from my brain get translated into actual action by another brain,” Rao said. “This was basically a one-way flow of information from my brain to his. The next step is having a more equitable two-way conversation directly between the two brains.”
The technologies used by the researchers for recording and stimulating the brain are both well-known. Electroencephalography, or EEG, is routinely used by clinicians and researchers to record brain activity noninvasively from the scalp. Transcranial magnetic stimulation is a noninvasive way of delivering stimulation to the brain to elicit a response. Its effect depends on where the coil is placed; in this case, it was placed directly over the brain region that controls a person’s right hand. By activating these neurons, the stimulation convinced the brain that it needed to move the right hand.
Computer science and engineering undergraduates Matthew Bryan, Bryan Djunaedi, Joseph Wu and Alex Dadgar, along with bioengineering graduate student Dev Sarma, wrote the computer code for the project, translating Rao’s brain signals into a command for Stocco’s brain.
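The translation step can be caricatured in a few lines: detect a motor-imagery signature in the sender’s EEG and emit a single discrete command for the receiver’s stimulator. The feature and threshold below are illustrative stand-ins; the project’s actual signal processing is not described in this article.

```python
# Schematic of a one-way brain-to-brain pipeline: EEG window in, one
# discrete command out. Feature extraction here is a crude placeholder.

def band_power(samples):
    # Crude proxy for power in a frequency band of interest:
    # mean squared amplitude of the window.
    return sum(s * s for s in samples) / len(samples)

def decode_intent(samples, threshold=0.5):
    # A real decoder looks for mu-rhythm changes over motor cortex during
    # imagined hand movement; here that is reduced to a single
    # over-threshold trigger.
    return "FIRE" if band_power(samples) > threshold else None

def run_trial(sender_window, transmit):
    command = decode_intent(sender_window)
    if command == "FIRE":
        transmit(command)  # in the demo this crossed campus via the Internet

sent = []
run_trial([0.9, -1.1, 1.0, -0.8], sent.append)  # strong imagery signature
run_trial([0.1, -0.1, 0.05, 0.0], sent.append)  # resting baseline
print(sent)  # ['FIRE']
```

The design point worth noting is the narrow interface: only a single, pre-agreed command crosses the network, which matches Rao’s caution that the system reads simple brain signals, not thoughts.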
“Brain-computer interface is something people have been talking about for a long, long time,” said Chantel Prat, assistant professor in psychology at the UW’s Institute for Learning & Brain Sciences, and Stocco’s wife and research partner who helped conduct the experiment. “We plugged a brain into the most complex computer anyone has ever studied, and that is another brain.”
At first blush, this breakthrough brings to mind all kinds of science fiction scenarios. Stocco jokingly referred to it as a “Vulcan mind meld.” But Rao cautioned this technology only reads certain kinds of simple brain signals, not a person’s thoughts. And it doesn’t give anyone the ability to control your actions against your will.
Both researchers were in the lab wearing highly specialized equipment and under ideal conditions. They also had to obtain and follow a stringent set of international human-subject testing rules to conduct the demonstration.
“I think some people will be unnerved by this because they will overestimate the technology,” Prat said. “There’s no possible way the technology that we have could be used on a person unknowingly or without their willing participation.”
Stocco said years from now the technology could be used, for example, by someone on the ground to help a flight attendant or passenger land an airplane if the pilot becomes incapacitated. Or a person with disabilities could communicate his or her wish, say, for food or water. The brain signals from one person to another would work even if they didn’t speak the same language.
Rao and Stocco next plan to conduct an experiment that would transmit more complex information from one brain to the other. If that works, they then will conduct the experiment on a larger pool of subjects.
It always happens in front of a million people and feels like an eternity. You’re strolling along when suddenly you stumble—the brain realizes you’re falling, but your muscles aren’t doing anything to stop it.
For a young person, a fall is usually just embarrassing. However, for the elderly, falling can be life threatening. Among the elderly who break a hip, 80 percent die within a year.
University of Michigan researchers believe that the critical window of time between when the brain senses a fall and the muscles respond may help explain why so many older people suffer these serious falls. A better understanding of what happens in the brain and muscles during this lag could go a long way toward prevention.
To that end, researchers at the U-M School of Kinesiology developed a novel way of looking at the electrical response in the brain before and during a fall by using an electroencephalogram.
Findings showed that many areas of the brain sense and respond to a fall, but that happens well before the muscles react. Lead researcher Daniel Ferris likened the study method to recording an orchestra with many microphones and then teasing out the sounds of specific instruments. In the study, researchers measured electrical activity in different regions of the brain.
"We’re using an EEG in a way others don’t, to look at what’s going on inside the brain," said Ferris, a professor in kinesiology. "We were able to determine what parts of the brain first identify when you are losing your balance during walking."
During the study, healthy young subjects with electrodes attached to their scalps walked on a balance beam mounted to a treadmill. When participants lost their balance and went off the beam, they simply continued walking on the moving treadmill, thus avoiding injury.
Ferris and colleagues then used a method called independent components analysis to separate and visualize the electrical activity in different parts of the brain. They found that people sense the start of a fall much better with both feet on the ground. Two grounded feet make it easier to determine where the ground is relative to the body, but people aren’t as sure of their stability on one foot.
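The “orchestra with many microphones” analogy can be made concrete: each scalp electrode records a mixture of underlying cortical sources, and independent components analysis seeks the unmixing that separates them. The sketch below cheats for clarity by mixing two known sources with a fixed 2×2 matrix and recovering them with its exact inverse; real ICA estimates that inverse from the data alone, and the source signals here are invented.

```python
# Illustration of the mixing/unmixing idea behind ICA on EEG (the actual
# analysis estimates the unmixing blindly from statistics of the data).

def mix(sources, a, b, c, d):
    # Scalp channels = mixing matrix [[a, b], [c, d]] applied to sources.
    s1, s2 = sources
    return ([a * x + b * y for x, y in zip(s1, s2)],
            [c * x + d * y for x, y in zip(s1, s2)])

def unmix(channels, a, b, c, d):
    # Apply the inverse of the 2x2 mixing matrix to recover the sources.
    det = a * d - b * c
    ch1, ch2 = channels
    return ([(d * x - b * y) / det for x, y in zip(ch1, ch2)],
            [(-c * x + a * y) / det for x, y in zip(ch1, ch2)])

motor = [0.0, 1.0, 0.0, -1.0]   # one hypothetical cortical "instrument"
visual = [1.0, 1.0, 1.0, 1.0]   # a second, independent source
electrodes = mix((motor, visual), 0.6, 0.4, 0.3, 0.7)
recovered = unmix(electrodes, 0.6, 0.4, 0.3, 0.7)
print([round(v, 6) for v in recovered[0]])  # [0.0, 1.0, 0.0, -1.0]
```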
The researchers were surprised that so many different parts of the brain activate during a fall, and they didn’t expect the brain to recognize a loss of balance as early as it does.
Future studies comparing the elderly with younger subjects could determine if the elderly sense falls too late, in which case, pharmaceuticals might help them regain their balance. If it’s a simple motor problem such as muscles not responding properly, strengthening exercises could help.
Other experiments under the same grant in the Ferris lab look to separate sensory and motor contributions to brain activity during walking.
The study, “Loss of balance during balance beam walking elicits a broadly distributed theta band electrocortical response,” was published in advance online in the Journal of Neurophysiology.
Inhibitory self control – not picking up a cigarette, not having a second drink, not spending when we should be saving – can operate without our awareness or intention.
That was the finding by scientists at the University of Pennsylvania’s Annenberg School for Communication and the University of Illinois at Urbana-Champaign. They demonstrated through neuroscience research that inaction-related words in our environment can unconsciously influence our self-control. Although we may mindlessly eat cookies at a party, stopping ourselves from over-indulging may seem impossible without a deliberate, conscious effort. However, it turns out that overhearing someone – even in a completely unrelated conversation – say something as simple as “calm down” might trigger us to stop our cookie eating frenzy without realizing it.
The findings were reported in the journal Cognition by Justin Hepler, M.A., University of Illinois; and Dolores Albarracín, Ph.D., the Martin Fishbein Chair of Communication and a Professor of Psychology at Penn.
Volunteers completed a study where they were given instructions to press a computer key when they saw the letter “X” on the computer screen, or not press a key when they saw the letter “Y.” Their actions were affected by subliminal messages flashing rapidly on the screen. Action messages (“run,” “go,” “move,” “hit,” and “start”) alternated with inaction messages (“still,” “sit,” “rest,” “calm,” and “stop”) and nonsense words (“rnu,” or “tsi”). The participants were equipped with electroencephalogram recording equipment to measure brain activity.
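The trial structure of this go/no-go task is simple enough to sketch. The prime word lists come from the article; the timing, masking, and EEG recording are omitted, and the scoring function is a hypothetical illustration of how such a block might be tallied.

```python
# Sketch of the go/no-go paradigm: press for "X", withhold for "Y",
# with a masked prime flashed before each target.

ACTION_PRIMES = ["run", "go", "move", "hit", "start"]
INACTION_PRIMES = ["still", "sit", "rest", "calm", "stop"]
NONSENSE_PRIMES = ["rnu", "tsi"]

def correct_response(target):
    # The task rule: respond to X, inhibit the response to Y.
    return "press" if target == "X" else "withhold"

def score_block(trials):
    """trials: list of (prime, target, response) tuples; returns # correct."""
    return sum(resp == correct_response(tgt) for _prime, tgt, resp in trials)

block = [
    ("go", "X", "press"),       # action prime, go trial, correct
    ("calm", "Y", "withhold"),  # inaction prime, successful inhibition
    ("rnu", "Y", "press"),      # nonsense prime, failed inhibition
]
print(score_block(block))  # 2
```

The experimental twist the article describes lives outside this sketch: the primes never predict the target, so any effect they have on inhibition performance (or on the EEG) must be unconscious.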
The unique aspect of this test is that the action or inaction messages had nothing to do with the actions or inactions volunteers were doing, yet Hepler and Albarracín found that the action/inaction words had a definite effect on the volunteers’ brain activity. Unconscious exposure to inaction messages increased the activity of the brain’s self-control processes, whereas unconscious exposure to action messages decreased this same activity.
“Many important behaviors such as weight loss, giving up smoking, and saving money involve a lot of self-control,” the researchers noted. “While many psychological theories state that actions can be initiated automatically with little or no conscious effort, these same theories view inhibition as an effortful, consciously controlled process. Although reaching for that cookie doesn’t require much thought, putting it back on the plate seems to require a deliberate, conscious intervention. Our research challenges the long-held assumption that inhibition processes require conscious control to operate.”
The full article, “Complete unconscious control: Using (in)action primes to demonstrate completely unconscious activation of inhibitory control mechanisms,” will be available in the September issue of the journal.
(Image: Getty Images)
Physicists and neuroscientists from The University of Nottingham and University of Birmingham have unlocked one of the mysteries of the human brain, thanks to new research using functional Magnetic Resonance Imaging (fMRI) and electroencephalography (EEG).
The work will enable neuroscientists to map a kind of brain function that up to now could not be studied, allowing a more accurate exploration of how both healthy and diseased brains work.
Functional MRI is commonly used to study how the brain works, by providing spatial maps of where in the brain external stimuli, such as pictures and sounds, are processed. The fMRI scan does this by detecting indirect changes in the brain’s blood flow in response to changes in electrical signalling during the stimulus.
A signal change that happens after the stimulus has stopped is also observed with the fMRI scan. This is called the post-stimulus signal and up until now it has not been used to study how the brain works because its origin was uncertain.
In novel experiments, the research team has now combined fMRI techniques with EEG, which measures electrical activity in the brain, to show that the post-stimulus signal actually reflects changes in brain signalling.
Eighteen healthy volunteers were monitored using EEG to measure the electrical activity generated by their brains’ neurons (the signalling cells) while fMRI measurements were recorded simultaneously. A stimulus of electrical pulses was used to activate the part of the brain that controls movement in the right thumb.
The scientists then compared the EEG and fMRI signals and found that they both vary in the same way after the stimulus stops. This provides compelling evidence that the post-stimulus fMRI signal is a measure of neuronal activity rather than just changes in the brain’s blood flow. Curiously, the team also found the post-stimulus fMRI signal was not consistent, even though the stimulus input to the brain was the same each time. This natural variability in the brain response was also reflected by the EEG activity and the researchers suggest that this signal might help the brain make the transition from processing stimuli back to their internal thoughts in different ways.
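The logic of the comparison can be reduced to a correlation check: if the post-stimulus fMRI signal tracks neuronal activity, it should co-vary with the EEG from trial to trial. The sketch below uses invented amplitude values purely to illustrate that check; it is not the study’s analysis pipeline.

```python
# Toy version of the EEG-vs-fMRI comparison: do the two post-stimulus
# measures vary together across trials? (Example data are invented.)

from math import sqrt

def pearson(x, y):
    # Standard Pearson correlation coefficient between two equal-length series.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical trial-by-trial post-stimulus amplitudes (arbitrary units).
# If the fMRI undershoot reflected only blood flow, it would not need to
# follow the EEG's natural trial-to-trial variability.
eeg_post = [1.2, 0.8, 1.5, 0.6, 1.1, 0.9]
fmri_post = [1.1, 0.7, 1.4, 0.7, 1.0, 0.8]
print(pearson(eeg_post, fmri_post) > 0.9)  # True: the measures co-vary
```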
Dr Karen Mullinger from The University of Nottingham’s Sir Peter Mansfield Magnetic Resonance Centre said: “This work opens a new window of time in the fMRI signal in which we can look at what the brain is doing. It may also open up new research avenues in exploring the function of the healthy brain and the study of neurological diseases.”
Dr Stephen Mayhew from Birmingham University Imaging Centre said “We do not know what the exact role of the post-stimulus activity is or why this response is not always consistent when the stimulus input to the brain is the same. We have already secured funding through the Birmingham-Nottingham Strategic Collaboration Fund to continue this research into further understanding of human brain function using combinations of neuroimaging methods.”
Director of the Sir Peter Mansfield Magnetic Resonance Centre, Professor Peter Morris, said: “Functional magnetic resonance imaging is the main tool available to cognitive neuroscientists for the investigation of human brain function. The demonstration in this paper, that the secondary fMRI response (the post-stimulus undershoot) is not simply a passive blood flow response, but is directly related to synchronous neural activity, as measured with EEG, heralds an exciting new chapter in our understanding of the workings of the human mind.”
The work has been funded by the Medical Research Council (MRC), Engineering and Physical Science Research Council (EPSRC), The University of Nottingham Anne McLaren Fellowships and University of Birmingham Fellowship and is published in the Proceedings of the National Academy of Sciences (PNAS).
Latest advances in capturing data on brain activity and eye movement are being combined to open up a host of ‘mindreading’ possibilities for the future. These include the potential development of a system that can detect when drivers are in danger of falling asleep at the wheel.
The research has been undertaken at the University of Leicester with funding from the Engineering and Physical Sciences Research Council (EPSRC), and in collaboration with the University of Buenos Aires in Argentina.
The breakthrough involves bringing two recent developments in the world of technology together: high-speed eye tracking that records eye movements in unprecedented detail using cutting-edge infra-red cameras; and high-density electroencephalograph (EEG) technology that measures electrical brain activity with millisecond precision through electrodes placed on the scalp.
The research has overcome previous technological challenges which made it difficult to monitor eye movement and brain activity simultaneously. The team has done this by developing novel signal processing techniques.
This could be the first step towards a system that combines brain and eye monitoring to automatically alert drivers who are showing signs of drowsiness. The system would be built into the vehicle and connected unobtrusively to the driver, with the EEG looking out for brain signals that only occur in the early stages of sleepiness. The eye tracker would reinforce this by looking for erratic gaze patterns symptomatic of someone starting to feel drowsy and different from those characteristic of someone driving who is constantly looking out for hazards. Fatigue has been estimated to account for around 20 per cent of traffic accidents on the UK’s motorways.
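The proposed fusion can be sketched as a simple conjunction: alert only when the EEG shows an early-sleep signature and the gaze pattern also looks drowsy, so a single noisy channel cannot trigger a false alarm. Both detectors, their input features, and the thresholds below are hypothetical stand-ins, not the Leicester team’s algorithms.

```python
# Minimal sketch of EEG + eye-tracking fusion for a drowsiness alert.
# Features and thresholds are illustrative assumptions.

def eeg_drowsy(theta_alpha_ratio, threshold=1.2):
    # Rising slow-wave (theta) power relative to alpha is a common early
    # EEG sign of sleep onset.
    return theta_alpha_ratio > threshold

def gaze_drowsy(gaze_erraticness, threshold=0.7):
    # Erratic, unfocused gaze differs from a driver actively scanning
    # mirrors and hazards.
    return gaze_erraticness > threshold

def should_alert(theta_alpha_ratio, gaze_erraticness):
    # Require both channels to agree before alerting the driver.
    return eeg_drowsy(theta_alpha_ratio) and gaze_drowsy(gaze_erraticness)

print(should_alert(1.5, 0.9))  # True: both channels agree the driver is drowsy
print(should_alert(1.5, 0.2))  # False: EEG alone is not enough
```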
The breakthrough achieved by the University of Leicester could also ultimately be built on to deliver many other everyday applications in the years ahead.
A remote controlled helicopter has been flown through a series of hoops around a college gymnasium in Minnesota.
It sounds like your everyday student project; however, there is one twist: the helicopter was controlled using just the power of thought.
The experiments have been performed by researchers hoping to develop future robots that can help restore the autonomy of paralysed victims or those suffering from neurodegenerative disorders.
Their study has been published today, 4 June 2013, in IOP Publishing’s Journal of Neural Engineering and is accompanied by a video of the helicopter control in action.
There were five subjects (three female, two male) who took part in the study and each one was able to successfully control the four-blade helicopter, also known as a quadcopter, quickly and accurately for a sustained amount of time.
Lead author of the study Professor Bin He, from the University of Minnesota College of Science and Engineering, said: “Our study shows that for the first time, humans are able to control the flight of flying robots using just their thoughts, sensed from noninvasive brain waves.”
The noninvasive technique used was electroencephalography (EEG), which recorded the electrical activity of the subjects’ brain through a cap fitted with 64 electrodes.
Facing away from the quadcopter, the subjects were asked to imagine using their right hand, left hand, and both hands together; this would instruct the quadcopter to turn right, left, lift, and then fall, respectively. The quadcopter was driven with a pre-set forward moving velocity and controlled through the sky with the subject’s thoughts.
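The control mapping described above amounts to a lookup from decoded motor imagery to a discrete quadcopter command, with forward velocity held fixed. The 64-channel EEG decoding itself is omitted here, and the mapping of relaxation to descent is an assumption for illustration, since the article does not spell out which imagery state drives the “fall” command.

```python
# Sketch of the imagery-to-command layer of the quadcopter BCI.
# The decoder that classifies the EEG into these imagery labels is omitted.

IMAGERY_TO_COMMAND = {
    "right_hand": "turn_right",
    "left_hand": "turn_left",
    "both_hands": "rise",
    "rest": "fall",  # assumption: relaxing maps to descent
}

def steer(decoded_imagery):
    # Unrecognized decoder output leaves the quadcopter on its fixed
    # forward path rather than issuing a spurious command.
    return IMAGERY_TO_COMMAND.get(decoded_imagery, "hold_course")

flight = [steer(s) for s in ("right_hand", "both_hands", "noise", "left_hand")]
print(flight)  # ['turn_right', 'rise', 'hold_course', 'turn_left']
```

Defaulting unknown decoder output to “hold course” reflects a general BCI safety choice: with a pre-set forward velocity, doing nothing is safer than guessing.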
The subjects were positioned in front of a screen which relayed images of the quadcopter’s flight through an on-board camera, allowing them to see which direction it was travelling in. Brain signals were recorded by the cap and sent to the quadcopter over WiFi.
“In previous work we showed that humans could control a virtual helicopter using just their thoughts. I initially intended to use a small helicopter for this real-life study; however, the quadcopter is more stable, smooth and has fewer safety concerns,” continued Professor He.
After several different training sessions, the subjects were required to fly the quadcopter through two foam rings suspended from the gymnasium ceiling and were scored on three aspects: the number of times they sent the quadcopter through the rings; the number of times the quadcopter collided with the rings; and the number of times they went outside the experiment boundary.
A number of statistical tests were used to calculate how each subject performed.
A group of subjects also directed the quadcopter with a keyboard in a control experiment, allowing for a comparison between a standardised method and brain control.
This process is just one example of a brain–computer interface where a direct pathway between the brain and an external device is created to help assist, augment or repair human cognitive or sensory-motor functions; researchers are currently looking at ways to restore hearing, sight and movement using this approach.
“Our next goal is to control robotic arms using noninvasive brain wave signals, with the eventual goal of developing brain–computer interfaces that aid patients with disabilities or neurodegenerative disorders,” continued Professor He.
Scientists from the University of Southampton have developed a device which records the brain activity of worms to help test the effects of drugs.
NeuroChip is a microfluidic electrophysiological device, which can trap the microscopic worm Caenorhabditis elegans and record the activity of discrete neural circuits in its ‘brain’ - a worm equivalent of the EEG.
C. elegans have been enormously important in providing insight into fundamental signalling processes in the nervous system and this device opens the way for a new analysis. Prior to this development, electrophysiological recordings that resolve the activity of excitatory and inhibitory nerve cells in the nervous system of the worm required a high level of technical expertise - single microscopic (1mm long) worms have to be trapped on the end of a glass tube, a microelectrode, in order to make the recording. The worms are very mobile as well as being small and this can be a challenging procedure.
The microfluidic invention consists of a reservoir through which worms can be fed, one after the other, into a narrow fluid-filled channel. The channel tapers at one end and this captures the worm by the front end. The worm is then in the correct orientation for recording the activity of the nervous system in the anterior of its body. The device incorporates metal electrodes, which are connected to an amplifier to make the recording. The design of the trapping channel has been optimised by PhD student Chunxiao Hu, so that the quality of the worm ‘EEG’ recording is sufficient to resolve the activity of components of the neural circuit in the worm’s nervous system.
This device has been used to detect the effects of drugs and is highly suitable for high throughput screens (which allow researchers to quickly conduct millions of chemical, genetic or pharmacological tests) in neurotoxicology and for generic screening for neuroactive drugs. It has more power to resolve discrete effects on excitatory, inhibitory or modulatory transmission than previously possible with behavioural screens.
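In sketch form, a screen built on such recordings compares circuit activity before and after a compound is applied. The spike-counting rule, thresholds, and example traces below are invented for illustration; the actual analysis of the worm ‘EEG’ is more sophisticated.

```python
# Toy drug-screen logic on a worm "EEG" trace: count spikes before and
# after a compound and flag large changes in circuit activity.
# All values and thresholds are illustrative.

def count_spikes(trace, threshold=1.0):
    # A spike is counted each time the signal crosses the threshold upward.
    return sum(1 for prev, cur in zip(trace, trace[1:])
               if prev <= threshold < cur)

def neuroactive(baseline_trace, drugged_trace, min_change=0.5):
    # Flag the compound if spike frequency changes by at least min_change
    # (50%) relative to the worm's own baseline.
    base = count_spikes(baseline_trace)
    drugged = count_spikes(drugged_trace)
    if base == 0:
        return drugged > 0
    return abs(drugged - base) / base >= min_change

baseline = [0, 2, 0, 2, 0, 2, 0, 2, 0]  # regular circuit activity
silenced = [0, 2, 0, 0, 0, 0, 0, 2, 0]  # activity suppressed by a compound
print(neuroactive(baseline, silenced))  # True
```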
Lindy Holden-Dye, Professor of Neuroscience at the University of Southampton and lead author of the paper, says: “We are particularly interested in using this as a sensitive new tool for screening compounds for neurotoxicity. It will allow us to precisely quantify sub-lethal effects on neural network activity. It can also provide an information rich platform by reporting the effects of compounds on a diverse array of neurotransmitter pathways, which are implicated in mammalian toxicology.”
The research, which is published in the latest issue of the journal PLOS One, is a joint project between the University’s Centre for Biological Sciences and the Hybrid Biodevices Group.