Neuroscience

Articles and news from the latest research reports.

Posts tagged visual cortex

50 notes

Fear factor: Study shows brain’s response to scary stimuli

Driving through his hometown, a war veteran with post-traumatic stress disorder may see roadside debris and feel afraid, believing it to be a bomb. He’s ignoring his safe, familiar surroundings, focusing only on the debris; yet, when it comes to the visual cortex, a recent study at the University of Florida suggests this response is completely normal.

The findings, published last month in the Journal of Neuroscience, show that even people who don’t have anxiety disorders respond visually at the sight of something scary while ignoring signs that indicate safety. This contradicts a common belief that only people with anxiety disorders have difficulty processing comforting visual stimuli, or safety cues, said Andreas Keil, a professor of psychology in UF’s College of Liberal Arts and Sciences.

“We’ve established that, in terms of visual responding, it’s not a disorder to not respond to a safety cue,” Keil said. “We all do that. So now we can study at what stage in the processing stream, with given patients, is the problem occurring.”

Co-authors Keil and Vladimir Miskovic, both members of the UF Center for the Study of Emotion and Attention, examined the effect of competing danger and safety cues within the visual cortex. The study results could help distinguish between normal and abnormal processes within the visual cortex and identify what parts of the brain are targets for the treatment of anxiety disorders.

“You’d think the visual cortex would just faithfully code for visual information,” said Shmuel Lissek, an assistant professor of psychology at the University of Minnesota not involved in the study. “This kind of work is testing the idea that activations in the visual cortex are actually different if the stimulus has an emotional value than if it doesn’t.”

(Source: news.ufl.edu)

Filed under visual cortex visual stimuli PTSD brainwaves anxiety anxiety disorders neuroscience psychology science

225 notes

Out of Sight, Out of Mind? How the brain codes its surroundings beyond the field of view
Even when they are not directly in sight, we are aware of our surroundings: when our eyes are fixed on an interesting book, for example, we know that the door is to the right, the bookshelf is to the left and the window is behind us. However, research into the brain has so far concerned itself predominantly with how information from our field of vision is coded in the visual cortex. To date it has not been known how the brain codes our surroundings beyond the field of view from an egocentric perspective (that is, from the point of view of the observer).

In the latest issue of the renowned journal Current Biology, Andreas Schindler and Andreas Bartels, scientists at the Werner Reichardt Center for Integrative Neuroscience (CIN) of the University of Tübingen, present for the first time direct evidence of this kind of spatial information in the brain.

The participants in their study found themselves in the center of a virtual octagonal room, with a unique object in each corner. As the brain’s activity was monitored by means of functional magnetic resonance imaging, the participants stood in front of one corner and looked at its object. They were then instructed to determine the position of a second, randomly chosen object within the room relative to their current perspective (for example, the object behind them). After a few trials the participant turned around so that the next object was brought into the field of view, and the task was set up again. The whole procedure was repeated until every object had been looked at once.

The scientists discovered that patterns of activity in the parietal cortex code the participant’s egocentric position, that is, his or her position relative to the surroundings. The spatial information discovered there proved to be independent of the particular object, its absolute position in the room or that of the observer – i.e. it encoded egocentric spatial information about the three-dimensional surroundings. This result is particularly interesting because damage to the parietal cortex can lead to serious disruption of egocentric spatial awareness. Patients suffering from optic ataxia, for instance, find it difficult to carry out coordinated grasping movements. Lesions in the parietal cortex can also lead to a symptom called spatial neglect, in which patients have difficulties perceiving their surroundings on the side opposite the lesion. The brain areas identified in the present study coincided precisely with the areas of brain damage in such patients, and the study provides the first insights into their function in the healthy brain.

Filed under brain brain activity visual cortex spatial awareness parietal cortex neuroscience science

55 notes

Re-tuning responses in the visual cortex
New research led by Shigeru Tanaka of the University of Electro-Communications, a visiting scientist at the RIKEN Brain Science Institute, has shown that the responses of cells in the visual cortex can be ‘re-tuned’ by experience.

Experiments on kittens in the 1960s showed that the primary visual cortex contains neurons that fire selectively to straight lines of specific orientations. These cells are organized into alternating columns that receive inputs from the left or right eye. The kitten experiments also showed that proper brain development is highly dependent on sensory information. Closing one eye altered the organization of the columns, so that those that should have received inputs from the closed eye were reduced in width, whereas those that received inputs from the open eye were much wider than normal.

The normal columnar organization can be restored if the closed eye is re-opened within a critical period of brain development. The effect of sensory experience on the orientation selectivity of neurons in the primary visual cortex was, however, unknown.

To investigate, Tanaka and his colleagues reared mice fitted with specially designed goggles through which they could perceive only vertically oriented visual stimuli, for a one-week period at some point between 3 and 15 weeks of age. Immediately after removing the goggles, the researchers created a ‘window’ in the skull bone lying over the visual cortex to examine the cells’ responses under the microscope.

Rearing the mice in this way had a significant effect on the properties of neurons in the primary visual cortex. The researchers found that the number of cells responding to the vertical orientation increased, while the number responding to other orientations decreased. They also found that the extent of these changes depended on the age at which they fitted the animals with the goggles. Mice fitted with the goggles between 4 and 7 weeks of age had more cells that were sensitive to the experienced (vertical) orientation than those fitted later.

These findings show that there is a critical period of plasticity between 4 and 7 weeks of age, during which cells in the primary visual cortex are particularly sensitive to sensory experience, and that plasticity persists in older animals, albeit to a lesser extent. They also suggest that plasticity in younger and older animals involves different mechanisms.

“When we put similar goggles on kittens, the age at which we started goggle rearing determined the reversibility of orientation selectivity,” says Tanaka. “We would now like to clarify the differences and commonalities of the mechanisms in cats and mice.”
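The kind of tally behind "more cells sensitive to the experienced orientation" can be sketched in a few lines. The cell counts and preferred orientations below are invented for illustration only; they are not data from the study.

```python
# Toy sketch: count what fraction of recorded cells prefer a given
# stimulus orientation, before and after biased visual experience.
# All preferred orientations here are made up for illustration.

def preferred_fraction(cell_preferences, orientation):
    """Fraction of cells whose preferred orientation matches `orientation`."""
    hits = sum(1 for p in cell_preferences if p == orientation)
    return hits / len(cell_preferences)

# Hypothetical preferred orientations (degrees) for a sample of cells.
before = [0, 45, 90, 90, 135, 0, 45, 135, 90, 45]  # roughly uniform
after  = [90, 90, 90, 45, 90, 90, 0, 90, 90, 135]  # biased toward vertical

print(preferred_fraction(before, 90))  # → 0.3
print(preferred_fraction(after, 90))   # → 0.7
```

A shift like the one from 0.3 to 0.7 in this made-up sample is the signature the researchers looked for, with the size of the shift depending on the age at which the goggles were fitted.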

Filed under visual cortex brain brain development cell response neuroscience science

741 notes

The Meaning of Pupil Dilation
For more than a century, scientists have known that our pupils respond to more than changes in light. They also betray mental and emotional commotion within. In fact, pupil dilation correlates with arousal so consistently that researchers use measurements of pupil size, a technique known as pupillometry, to investigate a wide range of psychological phenomena. And they do this without knowing exactly why our eyes behave this way. “Nobody really knows for sure what these changes do,” said Stuart Steinhauer, who directs the Biometrics Research Lab at the University of Pittsburgh School of Medicine.

While the visual cortex in the back of the brain assembles the images we see, a different, older part of our nervous system manages the continuous tuning of our pupil size, alongside other functions—like heart rate and perspiration—that operate mostly outside our conscious control. This autonomic nervous system dictates the movement of the iris, like the lens of a camera, to regulate the amount of light that enters the pupil.

The iris is made of two types of muscle: a ring of sphincter muscles that encircles the pupil and, in a brightly lit environment, constricts it down to as little as a couple of millimeters across; and a set of dilator muscles laid out like bicycle spokes, which in the dark can expand the pupil to up to 8 millimeters—approximately the diameter of a chickpea.

Cognitive and emotional events can also dictate pupil constriction and expansion, though such events occur on a smaller scale than the light reflex, causing changes generally less than half a millimeter. But that’s enough. By recording subjects’ eyes with infrared cameras and controlling for other factors that might affect pupil size, like brightness, color, and distance, scientists can use pupil movements as a proxy for other processes, like mental strain.
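Because the cognitive effect is so small relative to the light reflex, pupillometry analyses typically express pupil size as a change from a pre-stimulus baseline. The sketch below illustrates that generic step with invented numbers; the function name and the trace are hypothetical, not any particular lab's pipeline.

```python
# Generic pupillometry step: subtract a pre-stimulus baseline so that the
# small cognitively driven dilation (< 0.5 mm) stands out from the much
# larger resting diameter. The trace values are made up for illustration.

def baseline_corrected(trace_mm, baseline_samples):
    """Subtract the mean of the first `baseline_samples` points from the trace."""
    baseline = sum(trace_mm[:baseline_samples]) / baseline_samples
    return [round(x - baseline, 3) for x in trace_mm]

# Hypothetical pupil-diameter trace (mm): a 4-sample baseline, then a
# stimulus that evokes a small dilation on top of the resting diameter.
trace = [3.00, 3.02, 2.98, 3.00, 3.15, 3.30, 3.25, 3.10]
print(baseline_corrected(trace, 4))
```

The corrected trace hovers near zero during the baseline and peaks at about 0.3 mm after the stimulus, which is the scale of effect the article describes.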

(Image: Wikimedia Commons)

Filed under pupil dilation visual cortex pupillometry emotions cognition psychology neuroscience science

231 notes

Learning to control brain activity improves visual sensitivity
Researchers at the Wellcome Trust Centre for Neuroimaging at UCL used non-invasive, real-time brain imaging that enabled participants to watch their own brain activity on a screen, a technique known as neurofeedback. During the training phase, they were asked to try to increase activity in the area of the brain that processes visual information, the visual cortex, by imagining images and observing how their brains responded.

After the training phase, the participants’ visual perception was tested using a new task that required them to detect very subtle changes in the contrast of an image. When they were asked to repeat this task while clamping brain activity in the visual cortex at high levels, those who had successfully learned to control their brain activity could improve their ability to detect even very small changes in contrast.

This improved performance was only observed when participants were exercising control over their brain activity.

Lead author Dr Frank Scharnowski, who is now based at the University of Geneva, explains: “We’ve shown that we can train people to manipulate their own brain activity and improve their visual sensitivity, without surgery and without drugs.”

In the past, researchers have used recordings of electrical activity in the brain to train people on various tasks, including cutting their reaction times, altering their emotional responses and even improving their musical performance. In this study, the researchers used functional magnetic resonance imaging (fMRI) to provide the volunteers with real-time feedback on brain activity. The advantage of this technique is that you can see exactly where in the brain the training is having an effect, so you can target the training to particular brain areas that are responsible for specific tasks.

“The next step is to test this approach in the clinic to see whether we can offer any benefit to patients, for example to stroke patients who may have problems with perception, even though there is no damage to their vision,” adds Dr Scharnowski.
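The closed loop at the heart of neurofeedback (measure a target region, display the signal, let the participant adjust) can be sketched schematically. The signal values, threshold, and function below are all invented; a real system streams region-of-interest activity from the scanner in real time rather than from a list.

```python
# Schematic neurofeedback loop: convert each measurement of target-region
# activity into a feedback value (how far activity sits above a training
# threshold, floored at zero), which a participant would see on screen.
# The activity values are simulated, not scanner data.

def feedback_bar(signal, threshold):
    """Height of the on-screen feedback bar for one measurement."""
    return max(0.0, round(signal - threshold, 2))

# Hypothetical visual-cortex activity (arbitrary units) over a training
# run, drifting upward as the participant learns to up-regulate it.
run = [0.8, 0.9, 1.1, 1.3, 1.2, 1.5]
bars = [feedback_bar(s, 1.0) for s in run]
print(bars)  # → [0.0, 0.0, 0.1, 0.3, 0.2, 0.5]
```

The key design choice in such systems is the feedback mapping: participants never see raw fMRI data, only a simple summary like this bar height, which they learn to drive upward by mental imagery.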

Filed under brain brain activity neurofeedback visual perception visual cortex neuroscience psychology science

193 notes

Two heads are better than one
Dramatic expansion of the human cerebral cortex over the course of evolution accommodated new areas for specialized cognitive function, including language. Understanding the genetic mechanisms underlying these changes, however, remains a challenge to neuroscientists.

A team of researchers in Japan, led by Hideyuki Okano of Keio University School of Medicine and Tomomi Shimogori of the RIKEN Brain Science Institute, has now shed light on the mechanisms of cortical evolution. They used molecular techniques to compare gene expression patterns in mouse and monkey brains.

Using a technique called in situ hybridization to visualize the distribution of mRNA transcripts, Okano, Shimogori and their colleagues examined the expression patterns of genes that are known to regulate development of the mouse brain. They compared these patterns to those of the same genes in the brain of the common marmoset. They found that most of the genes had similar expression patterns in mice and marmosets, but that some had strikingly different patterns between the two species. Notably, some areas of the visual and prefrontal cortices showed expression patterns that were unique to marmosets.

The researchers also found differences in gene expression within regions that connect the prefrontal cortex and hippocampus, a structure that is critical for learning and memory.
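The comparison strategy itself is simple to state: summarize each gene's expression pattern in each species, then flag the genes whose summaries differ. The sketch below makes that concrete with entirely invented gene names and pattern labels; it is not data from the study.

```python
# Schematic cross-species comparison: for each gene, compare a coarse
# summary of its cortical expression pattern in mouse vs. marmoset and
# flag the genes whose patterns differ. All names/labels are invented.

mouse = {"geneA": "uniform", "geneB": "layered", "geneC": "uniform"}
marmoset = {"geneA": "uniform", "geneB": "layered", "geneC": "patchy"}

# Genes expressed similarly in both species are the majority; the
# interesting ones are those whose patterns diverge.
different = sorted(g for g in mouse if mouse[g] != marmoset[g])
print(different)  # → ['geneC']
```

In the study, most genes fell into the "same pattern" category, and it was the divergent minority, including patterns unique to the marmoset visual and prefrontal cortices, that pointed to candidate mechanisms of cortical evolution.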

Filed under visual cortex cognitive functioning brain structure neuron genes gene expression neuroscience science
