Neuroscience

Articles and news from the latest research reports.

Posts tagged technology

290 notes

Brain surgery through the cheek
For those most severely affected, treating epilepsy means drilling through the skull deep into the brain to destroy the small area where the seizures originate – invasive, dangerous and with a long recovery period.
Five years ago, a team of Vanderbilt engineers wondered: could epilepsy be treated in a less invasive way? They concluded that it could. Because the area of the brain involved, the hippocampus, sits at the bottom of the brain, a robotic device could poke through the cheek and enter the brain from underneath, avoiding the need to drill through the skull and starting much closer to the target area.
To do so, however, meant developing a shape-memory alloy needle that can be precisely steered along a curving path and a robotic platform that can operate inside the powerful magnetic field created by an MRI scanner.
The engineers have developed a working prototype, which was unveiled in a live demonstration this week at the Fluid Power Innovation and Research Conference in Nashville by David Comber, the graduate student in mechanical engineering who did much of the design work.
The business end of the device is a 1.14 mm nickel-titanium needle that operates like a mechanical pencil: concentric tubes, some of them curved, allow the tip to follow a curved path into the brain. (Unlike many common metals, nickel-titanium is MRI-compatible.) Using compressed air, a robotic platform steers and advances the needle segments a millimeter at a time.
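The article doesn't give the needle's kinematics, but pre-curved concentric-tube needles are commonly modeled as following constant-curvature arcs. A minimal sketch of how millimeter-step insertion translates into tip position, under that simplified planar model (the function name and the assumed 50 mm radius of curvature are illustrative, not from the Vanderbilt system):

```python
import math

def tip_position(insertion_mm: float, curvature_per_mm: float):
    """Planar constant-curvature model of a pre-curved tube tip.

    Returns (forward, lateral) displacement in mm after inserting
    `insertion_mm` of tube with the given curvature (1/radius).
    """
    if curvature_per_mm == 0:
        return insertion_mm, 0.0
    r = 1.0 / curvature_per_mm
    angle = insertion_mm * curvature_per_mm  # arc angle swept (radians)
    return r * math.sin(angle), r * (1.0 - math.cos(angle))

# Advance in small steps, as the article describes, and log the tip path.
for step in (0, 10, 20, 30):
    fwd, lat = tip_position(step, curvature_per_mm=0.02)  # 50 mm radius (assumed)
    print(f"{step:3d} mm inserted -> forward {fwd:5.1f} mm, lateral {lat:4.1f} mm")
```

Advancing in millimeter increments keeps each step small enough for the surgeon to confirm the tip's position on an MRI scan before the next step.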
According to Comber, they have measured the accuracy of the system in the lab and found that it is better than 1.18 mm, which is considered sufficient for such an operation. In addition, the needle is inserted in tiny, millimeter steps so the surgeon can track its position by taking successive MRI scans.
According to Associate Professor of Mechanical Engineering Eric Barth, who headed the project, the next stage in the surgical robot’s development is testing it with cadavers. He estimates it could be in operating rooms within the next decade.
To come up with the design, the team began with capabilities that they already had.
“I’ve done a lot of work in my career on the control of pneumatic systems,” Barth said. “We knew we had this ability to have a robot in the MRI scanner, doing something in a way that other robots could not. Then we thought, ‘What can we do that would have the highest impact?’”
At the same time, Associate Professor of Mechanical Engineering Robert Webster had developed a system of steerable surgical needles. “The idea for this came about when Eric and I were talking in the hallway one day and we figured that his expertise in pneumatics was perfect for the MRI environment and could be combined with the steerable needles I’d been working on,” said Webster.
The engineers identified epilepsy surgery as an ideal, high-impact application through discussions with Associate Professor of Neurological Surgery Joseph Neimat. They learned that neuroscientists already use a through-the-cheek approach to implant electrodes that track brain activity and pinpoint where seizures originate. But the straight needles they use can’t reach the source region, so they must drill through the skull and insert the needle that destroys the misbehaving neurons through the top of the head.
Comber and Barth shadowed Neimat through brain surgeries to understand how their device would work in practice.
“The systems we have now that let us introduce probes into the brain – they deal with straight lines and are only manually guided,” Neimat said. “To have a system with a curved needle and unlimited access would make surgeries minimally invasive. We could do a dramatic surgery with nothing more than a needle stick to the cheek.”
The engineers have designed the system so that much of it can be made with 3-D printing in order to keep the price low. They achieved this by collaborating with Jonathon Slightam and Vito Gervasi at the Milwaukee School of Engineering, who specialize in novel applications of additive manufacturing.

Filed under brain surgery epilepsy hippocampus robotics 3D printing neuroscience technology science

65 notes

Microrobots armed with new force-sensing system to probe cells
Inexpensive microrobots capable of probing and manipulating individual cells and tissue for biological research and medical applications are closer to reality with the design of a system that senses the minute forces exerted by a robot’s tiny probe.
Microrobots small enough to interact with cells already exist. However, there is no easy, inexpensive way to measure the small forces applied to cells by the robots. Measuring these microforces is essential to precisely control the bots and to use them to study cells.
"What is needed is a useful tool biologists can use every day and at low cost," said David Cappelleri, an assistant professor of mechanical engineering at Purdue University.
Now researchers have designed and built a “vision-based micro force sensor end-effector,” which is attached to the microrobots like a tiny proboscis. A camera is used to measure the probe’s displacement while it pushes against cells, allowing a simple calculation that reveals the force applied.
The approach could make it possible to easily measure the micronewton-scale forces applied at the cellular level. Such a tool is needed to better study cells and to understand how they interact with microforces. The forces can be used to transform cells into specific cell lines, including stem cells, for research and medical applications. The measurement of microforces also can be used to study how cells respond to certain medications and to diagnose disease.
"You want a device that is low-cost, that can measure micronewton-level forces and that can be easily integrated into standard experimental test beds," Cappelleri said.
Microrobots used in research are controlled with magnetic fields to guide them into position.
"But this is the first one with a truly functional end effector to measure microforces," he said.
Current methods for measuring the forces applied by microrobots are impractical and expensive, requiring an atomic force microscope or cumbersome sensors with complex designs that are difficult to manufacture. The new system records the probe’s displacement with a camera as it pushes against a cell or tissue. Researchers already know the stiffness of the probe. When combined with displacement, a simple calculation reveals the force applied.
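The calculation described here is essentially Hooke's law applied to a camera-measured deflection: convert pixels to microns, then multiply by the known probe stiffness. A minimal sketch, with assumed calibration values (the function name and all numbers are illustrative, not from the paper):

```python
def probe_force_uN(displacement_px: float, microns_per_px: float,
                   stiffness_uN_per_um: float) -> float:
    """Vision-based force estimate: convert the camera-measured probe
    deflection to microns, then apply Hooke's law (F = k * x).

    All parameter values used with this function are illustrative.
    """
    deflection_um = displacement_px * microns_per_px
    return stiffness_uN_per_um * deflection_um

# Example: a 12-pixel deflection at 0.5 um/pixel with a 0.8 uN/um probe
force = probe_force_uN(12, microns_per_px=0.5, stiffness_uN_per_um=0.8)
print(f"estimated force: {force:.1f} uN")
```

The appeal of the design is that the only sensor is the camera already present in the experimental setup; the probe itself is a passive, calibrated spring.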
Findings were detailed in a research paper presented during the International Conference on Intelligent Robots and Systems in September. The paper was authored by postdoctoral research associate Wuming Jing and Cappelleri.
The new system combined with the microrobot is about 700 microns square, and the researchers are working to create versions about 500 microns square. To put this scale into perspective, the mini-machine is about one-half the size of the “E” in “One Cent” on a U.S. penny.
"We are currently working on scaling it down," he said.
Future research also may focus on automating the microrobots.

Filed under microrobots robotics stem cells medicine technology science

145 notes

Neuroimaging could be the key to a better society
Neuroimaging techniques are a rapidly emerging technology and could bring about a revolution in various areas of society, provided we choose in good time the direction in which to steer these developments. This is one of the conclusions from a series of dialogues between neuroscientists and future users, organised for the research project Towards an appropriate societal embedding of neuroimaging. The project is part of the NWO research programme Responsible Innovation.
Read more

Filed under neuroimaging technology neuroscience science

114 notes

Research mimics brain cells to boost memory power

RMIT University researchers have brought ultra-fast, nano-scale data storage within striking reach, using technology that mimics the human brain.

The researchers have built a novel nano-structure that offers a new platform for the development of highly stable and reliable nanoscale memory devices. 

The pioneering work will feature on a forthcoming cover of prestigious materials science journal Advanced Functional Materials (11 November). 

Project leader Dr Sharath Sriram, co-leader of the RMIT Functional Materials and Microsystems Research Group, said the nanometer-thin stacked structure was created using a thin film of functional oxide material more than 10,000 times thinner than a human hair.

“The thin film is specifically designed to have defects in its chemistry to demonstrate a ‘memristive’ effect – where the memory element’s behaviour is dependent on its past experiences,” Dr Sriram said.

“With flash memory rapidly approaching fundamental scaling limits, we need novel materials and architectures for creating the next generation of non-volatile memory. 

“The structure we developed could be used for a range of electronic applications – from ultrafast memory devices that can be shrunk down to a few nanometers, to computer logic architectures that replicate the versatility and response time of a biological neural network.

“While more investigation needs to be done, our work advances the search for next-generation memory technology that can replicate the complex functions of the human neural system – bringing us one step closer to the bionic brain.”

The research relies on memristors, touted as a transformational replacement for current memory and storage technologies such as flash, SSD and DRAM. Memristors have the potential to be fashioned into non-volatile solid-state memory and offer building blocks for computing that could be trained to mimic synaptic interfaces in the human brain.
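The article doesn't give the RMIT device's equations, but the "memristive" effect it describes (behavior that depends on past experience) is often illustrated with the linear ion-drift model popularized by HP Labs. A rough sketch under that textbook model, with made-up parameter values:

```python
def simulate_memristor(currents_A, dt=1e-3,
                       r_on=100.0, r_off=16e3, d=10e-9, mu_v=1e-14):
    """Linear ion-drift memristor sketch (textbook HP-style model).

    Resistance depends on the charge that has already flowed through the
    device, which is the 'memory' in memristor. All parameter values are
    illustrative, not the RMIT device's.
    """
    w = d / 2                     # doped-region width: the state variable
    history = []
    for i in currents_A:
        w += mu_v * (r_on / d) * i * dt   # state drifts with charge i*dt
        w = min(max(w, 0.0), d)           # clamp to physical bounds
        history.append(r_on * (w / d) + r_off * (1 - w / d))
    return history

# Forward current lowers the resistance; reverse current restores it,
# so the device 'remembers' its drive history without power.
rs = simulate_memristor([1e-3] * 5 + [-1e-3] * 5)
print(f"{rs[0]:.0f} -> {rs[4]:.0f} -> {rs[-1]:.0f} ohms")
```

Reading the state is just measuring resistance, which is why such elements are candidates for non-volatile memory cells.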

(Source: alphagalileo.org)

Filed under memristor memory perovskite oxide brain cells technology neuroscience science

102 notes

Say ‘ahh’ to let your smartphone check for Parkinson’s disease
Smartphones are designed to be curious. Having already learned about your friendships, your family and the pattern of your daily routine, designers are now interested in your health and fitness.
A new crop of apps and wearable devices continuously measure and analyse vital signs such as movement and heart rate, claiming to count calories, optimise sleep quality and guide diet. While cynics might be tempted to dismiss these products as glorified pedometers for lycra-clad smartphone addicts, new research shows that the hardware inside existing consumer devices can already reliably detect degenerative, life-changing disorders, including Parkinson’s disease.
Parkinson’s currently affects between seven and 10 million people worldwide, and there is no cure. The disease can be diagnosed from a number of characteristic symptoms, including muscle tremor, changes in speech and difficulty of movement. However, diagnosis is challenging and usually involves regular visits to the doctor. It is estimated that one in five people with Parkinson’s is never diagnosed. Even with a diagnosis, it can be difficult to accurately assess how effective treatment is in managing the disease.
Read more

Filed under parkinson's disease technology health science

312 notes

Control your environment through brain commands
Many patients with amyotrophic lateral sclerosis (ALS, or Lou Gehrig’s Disease) and other neurodegenerative conditions live every day with a frustrating inability to do small, everyday tasks, such as turning on the lights, changing the volume on the TV, or even communicating with their friends and loved ones.
Today, a first-ever proof of concept demonstrates how wearable technology and consumer products can be brought together with digital innovations to let a person with no mobility control their environment using brain commands, via a custom-built tablet application and wearable display interface.
This proof of concept demonstrates the potential to improve the quality of life for ALS patients – or any person with limited muscle and speech function – by giving them the ability to interact, communicate and issue commands without moving their body or using their voice.
Read more

Filed under ALS Lou Gehrig’s disease brainwaves EEG Emotiv Insight Brainware technology neuroscience science

2,344 notes

Chinese Doctors Use 3D-Printing in Pioneering Surgery to Replace Half of Man’s Skull

Surgeons at Xijing Hospital in Xi’an, Shaanxi province in Northwest China are using 3D-printing in a pioneering surgery to help rebuild the skull of a man who suffered brain damage in a construction accident.

Hu, a 46-year-old farmer, was overseeing construction to expand his home in Zhouzhi county last October when he was hit by a pile of wood and fell down three storeys.

Although he survived the fall, the left side of his skull was severely crushed and the shattered bone fragments had to be removed, leaving a depression on one side of his head.

Due to his injuries, Hu cannot see well out of his left eye, experiences double vision (diplopia), and is unable to speak or write.

Read more

Filed under 3D printing head reconstruction implants technology medicine neuroscience science

119 notes

Bats bolster brain hypothesis, maybe technology, too
Amid a neuroscience debate about how people and animals focus on distinct objects within cluttered scenes, some of the newest and best evidence comes from the way bats “see” with their ears, according to a new paper in the Journal of Experimental Biology. In fact, the perception process in question could improve sonar and radar technology.
Bats demonstrate remarkable skill in tracking targets such as bugs through the trees in the dark of night. James Simmons, professor of neuroscience at Brown University, the review paper’s author, has long sought to explain how they do that.
It turns out that experiments in Simmons’ lab point to the “temporal binding hypothesis” as an explanation. The hypothesis proposes that people and animals focus on objects versus the background when a set of neurons in the brain attuned to features of an object all respond in synchrony, as if shouting in unison, “Yes, look at that!” When the neurons do not respond together to an object, the hypothesis predicts, an object is relegated to the perceptual background.
Because bats have an especially acute need to track prey through crowded scenes, albeit with echolocation rather than vision, they have evolved to become an ideal testbed for the hypothesis.
“Sometimes the most critical questions about systems in biology that relate to humans are best approached by using an animal species whose lifestyle requires that the system in question be exaggerated in some functional sense so its qualities are more obvious,” said Simmons, who plans to discuss the research at the 2014 Cold Spring Harbor Asia Conference the week of September 15 in Suzhou, China.
A focus of frequencies
Here’s how he’s determined over the years that temporal binding works in a bat. As the bat flies it emits two spectra of sound frequencies — one high and one low — into a wide cone of space ahead of it. Within the spectra are harmonic pairs of high and low frequencies, for example 33 kilohertz and 66 kilohertz. These harmonic pairs reflect off of objects and back to the bat’s ears, triggering a response from neurons in its brain. Objects that reflect these harmonic pairs in perfect synchrony are the ones that stand out clearly for the bat.
Of course it’s more complicated than just that. Many things could reflect the same frequency pairs back at the same time. The real question is how a target object would stand out. The answer, Simmons writes, comes from the physics of the echolocation sound waves and how bat brains have evolved to process their signal. Those factors conspire to ensure that whatever the bat keeps front-and-center in its echolocation cone will stand out from surrounding interference.
The higher frequency sounds in the bat’s spectrum weaken more in transit through the air than the lower frequency sounds. The bat also sends the lower frequencies out over a wider span of angles than the high frequencies. So for any given harmonic pair, the farther away or more peripheral a reflecting object is, the weaker the higher frequency reflection in the pair will be. In the brain, Simmons writes, the bat converts this difference in signal strength into a delay in time (about 15 microseconds per decibel), so that harmonic pairs with large differences in signal strength end up being perceived as far out of synchrony. The temporal binding hypothesis predicts that distant or peripheral objects with these out-of-sync signals will be perceived as background, while front-and-center objects that reflect both harmonics back with equal strength will rise above their desynchronized competitors.
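The amplitude-latency trading described above (roughly 15 microseconds of perceived delay per decibel of level difference between the harmonics) can be sketched directly. The synchrony window used here is an assumption for illustration; the trading rate is the figure cited in the article:

```python
SHIFT_US_PER_DB = 15.0  # amplitude-latency trading rate cited in the article

def harmonic_desync_us(low_harmonic_db: float, high_harmonic_db: float) -> float:
    """Perceived timing offset between a harmonic pair, given their
    received levels. A weaker high harmonic registers as arriving late."""
    return SHIFT_US_PER_DB * (low_harmonic_db - high_harmonic_db)

def is_foreground(low_db: float, high_db: float, window_us: float = 30.0) -> bool:
    """Objects whose harmonics stay within the synchrony window stand out;
    the 30-microsecond window is an illustrative assumption."""
    return abs(harmonic_desync_us(low_db, high_db)) <= window_us

# A centered target returns both harmonics at equal strength and stays
# in the foreground; a peripheral object that loses 6 dB off its high
# harmonic is perceived 90 microseconds out of sync and fades into the
# background.
print(is_foreground(60.0, 60.0))
print(harmonic_desync_us(60.0, 54.0))
print(is_foreground(60.0, 54.0))
```

This same bookkeeping explains the jamming experiments below: artificially shifting a distractor's weakened harmonic back into synchrony restores its ability to compete with the target.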
With support from sources including the U.S. Navy, Simmons’s research group has experimentally verified this. In key experiments (some dating back 40 years) they have placed big brown bats at the base of a Y-shaped platform with a pair of objects – one a target with a food reward and the other a distractor – on the tines of the Y. When the objects are at different distances, the bat can tell them apart and accurately crawl to the target. When the objects are equidistant, the bat becomes confused. Crucially, when the experimenters artificially weaken the high-pitched harmonic from the distracting object, even when it remains equidistant, the bat’s ability to find the target is restored.
In further experiments in 2010 and 2011, Simmons’ team showed that if they shifted the distractor object’s weakened high-frequency signal by the right amount of time (15 microseconds per decibel) they could restore the distractor’s ability to interfere with the target object by restoring the synchrony of the distractor’s harmonics. In other words, they used the specific predictions of the hypothesis and their understanding of how it works in bats to jam the bat’s echolocation ability.
If targeting and jamming sound like words associated with radar and sonar, that’s no coincidence. Simmons works with the U.S. Navy on applications of bat echolocation to navigation technology. He recently began a new research grant from the Office of Naval Research that involves bat sonar work in collaboration with researcher Jason Gaudette at the Naval Undersea Warfare Center in Newport, R.I.
Simmons said he believes the evidence he has gathered about the neuroscience of bats not only supports the temporal binding hypothesis, but also can inspire new technology.
“This is a better way to design a radar or sonar system if you need it to perform well in real-time for a small vehicle in complicated tasks,” he said.

Filed under biosonar echolocation bats temporal binding hypothesis technology neuroscience science
