Neuroscience

Articles and news from the latest research reports.

Posts tagged sensory information

95 notes

Researchers identify molecule that orients neurons for high definition sensing

Many animals have highly developed senses, such as vision in carnivores, touch in mice, and hearing in bats. New research from the RIKEN Brain Science Institute has uncovered a brain molecule that can explain the existence of such finely-tuned sensory capabilities, revealing how brain cells responsible for specific senses are positioned to receive incoming sensory information.

The study, led by Dr. Tomomi Shimogori and published in the journal Science, sought to uncover the molecule that enables high acuity sensing by examining brain regions that receive information from the senses. They found that areas responsible for touch in mice and vision in ferrets contain a protein called BTBD3 that optimizes neuronal shape to receive sensory input more efficiently.

Neurons have a highly specialized shape, sending signals through one long projection called an axon, while receiving signals from many branch-like projections called dendrites. The final shape and connections to other neurons are typically completed after birth. Some neurons have dendrites distributed equally all around the cell body, like a starfish, while in others they extend only from one side, like a squid, steering towards axons that are actively bringing in information from the peripheral nerves. It was previously unknown what enables neurons to have highly oriented dendrites.

“We were fascinated by the dendrite patterning changes that occurred during the early postnatal stage that is controlled by neuronal input,” says Dr. Shimogori. “We found a fundamental process that is important to remove unnecessary dendrites to prevent mis-wiring and to make efficient neuronal circuits.”

The researchers searched for genes that are active exclusively in the mouse somatosensory cortex, the brain region responsible for the sense of touch. They found that the gene coding for the protein BTBD3 was active in neurons of the barrel cortex, which receives input from the whiskers, the highly sensitive tactile sensors of mice, and that these neurons had unidirectional dendrites.

Using gene manipulations in the embryonic mouse brain, the authors found that eliminating BTBD3 made dendrites distribute uniformly around neurons in the barrel cortex. In contrast, artificially introducing BTBD3 into the visual cortex of mice, where it is not normally found, reoriented the normally symmetrically positioned dendrites to one side. The same mechanism shaped neurons in the visual cortex of ferrets, which, unlike that of the mouse, contains BTBD3.

“High acuity sensory function may have been enabled by the evolution of BTBD3 and related proteins in brain development,” adds Dr. Shimogori. “Finding BTBD3 selectively in the visual and auditory cortex of the common marmoset, a species that relies heavily on high acuity vocal and visual communication for survival, and in mouse, where it is expressed in high-acuity tactile and olfactory areas, but not in low acuity visual cortex, supports this idea.” The authors plan to examine their theory by testing sensory function in mice without BTBD3 gene expression.

(Source: riken.jp)

Filed under neurons dendrites brain development BTBD3 sensory information neural circuits neuroscience science

31 notes

Rats take high-speed multisensory snapshots

When animals are on the hunt for food, they likely use many senses, and scientists have wondered how the different senses work together.

New research from the laboratory of CSHL neuroscientist and Assistant Professor Adam Kepecs shows that when rats actively use the senses of smell (sniffing) and touch (through their whiskers), the two processes are locked in synchrony. The team’s paper, published today in the Journal of Neuroscience, shows that sniffing and “whisking” movements are synchronized even when they run at different frequencies.

Studies in the 1960s suggested these two sensory activities were coordinated: sniffing, a sharp, deep intake of air; and whisking, the back-and-forth movement of the whiskers to sample the near environment, akin to the sensation of touch as felt through the fingers in humans. Such coordination could be important for decisions that depend on multiple types of sensory information, for instance, locating food. “The question is how two very different streams of sensory information, touch and smell, are integrated into a single multisensory ‘snapshot’ of the environment,” says Kepecs.

These snapshots can be taken at high frequency, up to 12 times a second. To determine whether these two sensorimotor rhythms are indeed phase-locked, Kepecs’ team, including postdocs Sachin Ranade and Balázs Hangya, simultaneously monitored sniffing and whisking in rats freely foraging for food pellets.

At frequencies ranging between 4 and 12 cycles per second, they found strong 1:1 phase locking; in other words, every time the rats extended their whiskers to feel their vicinity, they also smelled it. Surprisingly, they found that even when the sniffing and whisking rhythms were operating at different fundamental frequencies, they remained locked in phase. Key to this is that the phases of the sensory input, the start of inhalation and the onset of whisking, are aligned, which facilitates multisensory integration.
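The press release doesn't detail the team's statistics, but a standard way to quantify this kind of coupling is the phase-locking value (PLV): the magnitude of the time-averaged phase-difference vector, which is 1 for perfectly locked rhythms and falls toward 0 for independent ones. A minimal sketch with synthetic signals (all parameters illustrative, not from the paper):

```python
import numpy as np

def phase_locking_value(phase_a, phase_b):
    """PLV: 1.0 = perfectly phase locked, near 0 = independent rhythms."""
    return abs(np.mean(np.exp(1j * (phase_a - phase_b))))

np.random.seed(0)
fs = 1000                          # samples per second
t = np.arange(0, 10, 1 / fs)      # ten seconds of "recording"

# Two 8 Hz rhythms with a constant phase offset: perfectly locked.
sniff = 2 * np.pi * 8 * t
whisk = 2 * np.pi * 8 * t + 0.5

# A rhythm whose frequency jitters around 8 Hz: its phase drifts freely.
drift = np.cumsum(2 * np.pi * (8 + 5 * np.random.randn(len(t))) / fs)

print(phase_locking_value(sniff, whisk))   # high (locked)
print(phase_locking_value(sniff, drift))   # low (not locked)
```

In real data, the instantaneous phases would be extracted from measured sniffing and whisking traces (for example with a Hilbert transform) rather than constructed analytically as above.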

This is similar to the way a runner’s breathing rhythm settles into synchrony with their steps. In both cases, the coordination could be advantageous in terms of energy efficiency. A crucial difference, though, is that in humans the breathing rate has to catch up to the running rhythm after a change in pace, whereas sniffing and whisking in rats lock into phase immediately.

Even though human behavior doesn’t seem to be overtly tied to rhythms, there are hints that it could be. “Underneath the smoothly executed movements of humans there are rhythm generators, which are sometimes revealed in some diseases, for example the tremors seen in Parkinson’s disease, or in the brain waves that result from the synchronized firing of neurons,” says Kepecs. Studying the rhythms of multisensory inputs in rodents could provide clues to a fundamental principle underlying sensory and brain rhythms that are essential to all animals, including humans.

(Source: cshl.edu)

Filed under rats whiskers sense synchronicity sensory information sniffing neuroscience science

64 notes

Helpful for robotics: brain uses old information for new movements

Information from the senses has an important influence on how we move. For instance, you can see and feel when a mug is filled with hot coffee, and you lift it in a different way than if the mug were empty. Neuroscientist Julian Tramper discovered that the brain uses two forms of old information in order to execute new movements well. This discovery can be useful for the field of robotics. Tramper will receive his doctorate on Thursday 24 April from Radboud University Nijmegen.

Every time you move, the brain deals with two problems. First, there is a slight delay in the sensory information needed to execute the movement. Second, the command from the brain directing the muscles to move is not entirely precise, because neuronal signals contain a certain amount of natural noise. According to Tramper, the brain has a clever way of getting around both problems: it combines the old information from the senses with experience gained through similar movements made in the past. This means that the brain uses two forms of old information in order to make new movements.
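Tramper's actual model isn't given in this summary, but the combination he describes, a delayed noisy measurement merged with predictions learned from past movements, is classically captured by a Kalman-style estimator. A one-dimensional sketch (all names and numbers are illustrative, not from the thesis):

```python
import random

def blend(x_pred, p_pred, z, r):
    """Weight a model prediction against a noisy measurement by uncertainty."""
    k = p_pred / (p_pred + r)        # gain: how much to trust the measurement
    x = x_pred + k * (z - x_pred)    # corrected estimate
    p = (1 - k) * p_pred             # reduced uncertainty after the update
    return x, p

random.seed(1)
x_est, p_est = 0.0, 1.0
q, r = 0.01, 0.25                    # model noise and sensor noise (variances)
history = [0.0]
for step in range(1, 11):
    history.append(float(step))                  # true position: 1 unit/step
    x_pred, p_pred = x_est + 1.0, p_est + q      # learned model: velocity = 1
    z = history[step - 1] + random.gauss(0, r ** 0.5)  # delayed, noisy sense
    # The learned model bridges the delay: predict the old reading forward.
    x_est, p_est = blend(x_pred, p_pred, z + 1.0, r)
print(round(x_est, 1))               # close to the true position, 10
```

The point of the sketch is the two "old" ingredients: the measurement `z` is already one step stale when it arrives, and the internal model (velocity = 1) stands in for experience from past movements.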

Computer versus test subject
Understanding the brain processes behind movement can be of great importance to fields like robotics. Tramper is therefore trying to model his findings so that they can be applied in robots in the future. He has already succeeded in this for certain hand-eye coordination experiments, to the extent that a computer can perform at about the same level as human test subjects. As a postdoctoral researcher at the Donders Institute, Tramper is investigating how these types of models can be integrated into bio-inspired robots (robots based on biological principles).

SpaceCog
Tramper is currently working on a project called SpaceCog. The goal of this project is to develop a robot which can independently orient itself in space, something that humans do automatically. This is difficult to achieve, because each time a robot moves, it must reinterpret the information from its cameras and other sensors in order to determine whether the changes to its input are the result of its own movement or an external cause. The researchers involved in SpaceCog want to figure out how our brain has solved this problem. Tramper has three years to come up with a good computer model addressing this issue.
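The release doesn't say how SpaceCog's model works internally, but the textbook approach to this self-versus-world attribution problem is a forward model: predict the sensory consequences of your own motor command and treat only the unexplained remainder as external. A toy sketch (function names and numbers are hypothetical):

```python
def attribute_change(sensor_before, sensor_after, motor_command, forward_model):
    """Split an observed sensor change into self-generated and external parts."""
    predicted = forward_model(sensor_before, motor_command)   # expected reading
    self_part = predicted - sensor_before                     # due to own motion
    external_part = sensor_after - predicted                  # unexplained residue
    return self_part, external_part

# Toy forward model: turning right by `angle` shifts a landmark's bearing
# left by the same amount.
turn_model = lambda bearing, angle: bearing - angle

# The robot turns 10 degrees; a landmark's bearing changes from 30 to 15.
self_part, external_part = attribute_change(30.0, 15.0, 10.0, turn_model)
print(self_part, external_part)   # -10.0 from own motion, -5.0 external
```

If the residue is zero, the robot's sensors changed only because it moved; any non-zero residue signals that something in the world itself changed.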

Looking towards the future
Tramper is studying hand-eye coordination by having test subjects play a special computer game. The subjects use a game controller to move a digital right hand and left hand on a screen. They have to move the two hands independently of one another and make them each follow a particular path in order to reach a final destination (see film 1). It turned out that the test subjects’ eyes moved ahead of the digital hands. In other words, the eyes looked at a point that the hands would reach in the future (see film 2). This type of eye movement is called smooth pursuit, and until now it had only been observed with external stimuli, when a subject was following an object’s movement. Tramper detected smooth pursuit eye movements at locations the hands had not yet reached, meaning these movements were triggered by internal stimuli.

Smooth pursuit
Tramper explains, ‘We’d previously demonstrated for other types of eye movement that the eye anticipates and moves in advance of external movement. To our surprise, this is also the case with smooth pursuit. It is probably a compromise between where you are at a particular moment and where you want to get to. When moving, you need to keep track of your current location (which is constantly changing) and your target destination. Smooth pursuit eye movements can help you do this by letting your eye “hover” between both locations. If we can teach robots to do something like this, it will help make their movements much more natural. This will increase the number of ways in which robots can be put to work.’
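The "hovering" Tramper describes can be caricatured as gaze resting at a weighted point between the hand's current position and its goal; the 50/50 weight below is purely illustrative, not a measured value:

```python
def gaze_point(hand_pos, goal_pos, w=0.5):
    """Toy 'hovering' gaze: interpolate between current hand position and goal."""
    return tuple(h + w * (g - h) for h, g in zip(hand_pos, goal_pos))

# Hand at (2, 1), target at (6, 3): the eye settles halfway between them.
print(gaze_point((2.0, 1.0), (6.0, 3.0)))   # (4.0, 2.0)
```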

(Source: ru.nl)

Filed under sensory information robots robotics motor movements hand-eye coordination SpaceCog neuroscience science

87 notes

Virtual Games Help the Blind Navigate Unknown Territory

On March 27th JoVE (Journal of Visualized Experiments) published a new video article by Dr. Lotfi Merabet showing how researchers in the Department of Ophthalmology at Massachusetts Eye and Ear Infirmary and Harvard Medical School have developed a virtual gaming environment to help blind individuals improve navigation skills and develop a cognitive spatial map of unfamiliar buildings and public locations.

"For the blind, finding your way or navigating in a place that is unfamiliar presents a real challenge," Dr. Merabet explains. "As people with sight, we can capture sensory information through our eyes about our surroundings. For the blind that is a real challenge… the blind will typically use auditory and tactile cues."

The technique uses computer-generated layouts of public buildings and spatial sensory feedback to synthesize a virtual world that mimics a real-world navigation task. In the game, participants must find jewels and carry them out of the building without being intercepted by roaming monsters that steal the jewels and hide them elsewhere. Participants interface with the virtual building using a keyboard and wearing headphones that play auditory cues to orient them spatially in the world around them. This interaction helps users build an accurate mental layout of the mimicked building. Dr. Merabet and his colleagues are also exploring applications of this technology with other user interfaces, such as a Wii Remote or joystick.
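The article doesn't describe the game's audio engine, but the core idea, a cue whose left/right balance and loudness track the listener's pose relative to a sound source, can be sketched in a few lines (function names and formulas are my simplifications, not the published method):

```python
import math

def stereo_cue(player_xy, heading_deg, target_xy):
    """Toy left/right ear gains for a sound at target, given the player's pose."""
    dx, dy = target_xy[0] - player_xy[0], target_xy[1] - player_xy[1]
    rel = math.atan2(dy, dx) - math.radians(heading_deg)  # bearing vs. heading
    loudness = 1.0 / (1.0 + math.hypot(dx, dy))           # fades with distance
    pan = -math.sin(rel)                 # -1 = hard left, +1 = hard right
    left = loudness * (1 - pan) / 2
    right = loudness * (1 + pan) / 2
    return left, right

# A player facing "north" (90 degrees) hears a target due east in the right ear.
left, right = stereo_cue((0.0, 0.0), 90.0, (1.0, 0.0))
print(left, right)
```

Recomputing the gains as the player moves and turns is what lets a user build a spatial map of the rooms from sound alone.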
"We have developed software called ABES, the Audio Based Environment Simulator that represents the actual physical environment of the Carol Center for the Blind in Newton Massachusetts. The participants will use the game metaphor to get a sense of the whole building through open discovery, allowing people to learn room layouts more naturally than if they were just following directions."
The technology will invariably be useful for the 285 million blind people world-wide, 6 million of which live in the United States. It will also have applications beyond the blind community for individuals with other visual impairments, cognitive deficits, or those recovering from brain injuries.
Dr. Merabet considers publication in JoVE’s video format especially helpful. “It is conceptually difficult for a sighted person to understand ‘a video game for blind people.’ What JoVE allows us to do is break down layouts of the game and strategy, show how the auditory cues can be used and how we quantify performance going from the virtual game to the physical world.”

Filed under blind virtual gaming environment navigation skills sensory information cognitive map neuroscience science

882 notes

The great illusion of the self

As you wake up each morning, hazy and disoriented, you gradually become aware of the rustling of the sheets, sense their texture and squint at the light. One aspect of your self has reassembled: the first-person observer of reality, inhabiting a human body.

As wakefulness grows, so does your sense of having a past, a personality and motivations. Your self is complete, as both witness of the world and bearer of your consciousness and identity. You.

This intuitive sense of self is an effortless and fundamental human experience. But it is nothing more than an elaborate illusion. Under scrutiny, many common-sense beliefs about selfhood begin to unravel. Some thinkers even go as far as claiming that there is no such thing as the self.

In these articles, discover why “you” aren’t the person you thought you were.

Filed under self perception sensory information locus of control brain psychology neuroscience science
