Neuroscience

Articles and news from the latest research reports.

Posts tagged visual cortex

282 notes

Running, Combined with Visual Experience, Restores Brain Function

In a new study by UC San Francisco scientists, running, when accompanied by visual stimuli, restored brain function to normal levels in mice that had been deprived of visual experience in early life.

In addition to suggesting a novel therapeutic strategy for humans with blindness in one eye caused by a congenital cataract, droopy eyelid, or misaligned eye, the new research—the latest in a series of UCSF studies exploring effects of locomotion on brain function—suggests that the adult brain may be far more capable of rewiring and repairing itself than previously thought.

In 2010, Michael P. Stryker, PhD, the W.F. Ganong Professor of Physiology, and postdoctoral fellow Cris Niell, PhD, now at the University of Oregon, made the surprising discovery that neurons in the visual area of the mouse brain fired much more robustly whenever the mice walked or ran.

Earlier this year, postdoctoral fellow Yu Fu, PhD, Stryker and a number of colleagues built on these findings, identifying and describing the neural circuit responsible for this locomotion-induced “high-gain state” in the visual cortex of the mouse brain.

Neither of these studies made clear, however, whether this circuit might have broader functional or clinical significance.

It has been known since the 1960s that visual areas of the brain do not develop normally if deprived of visual input during a “critical period” of brain development early in life. For example, in humans, if amblyopia (“lazy eye”) or other major eye problems are not surgically corrected in infancy, vision will never be normal in the affected eye—if such individuals lose sight in their “good” eye in later life, they are blind.

In the new research, published June 26, 2014 in the online journal eLife, Stryker and UCSF postdoctoral fellow Megumi Kaneko, MD, PhD, closed one eyelid of mouse pups at about 20 days after birth, and that eye was kept closed until the mice reached about five months of age.

As expected, the mice in which one eye had been closed during the critical developmental period showed sharply reduced neural activity in the part of the brain responsible for vision in that eye.

As in the previous UCSF experiments in this area, some mice were allowed to run freely on Styrofoam balls suspended on a cushion of air while recordings were made from their brains.

Little improvement was seen in the mice that had been deprived of visual input either when they were simply allowed to run or when they received visual training with the deprived eye not accompanied by walking or running.

But when the mice were exposed to the visual stimuli while they were running or walking, the results were dramatic: within a week the brain responses to those stimuli from the deprived eye were nearly identical to those from the normal eye, indicating that the circuits in the visual area of the brain representing the deprived eye had undergone a rapid reorganization, known in neuroscience as “plasticity.”

Interestingly, this recovery was stimulus-specific: if the brain activity of the mice was tested using a stimulus other than that they had seen while running, little or no recovery of function was apparent.

“We have no idea yet whether running puts the human cortex into a high-gain state that enhances plasticity, as it does the visual cortex of the mouse,” Stryker said, “but we are designing experiments to find out.”

Filed under visual cortex brain function brain activity amblyopia plasticity locomotion neuroscience science

175 notes

Scientists show how bigger brains could help us see better

It has become increasingly common to hear claims that big brains are unnecessary, or even an evolutionary fluke. However, the new study found that increases in the size of brain areas, such as the visual cortex, are an essential element of evolution.

As part of the study, the researchers found that an increase in the size of the visual part of the brain in different primate species, including humans, apes, and monkeys, is associated with enhanced visual processing.

It is controversial whether overall brain size can predict intelligence. However, the size of specialised areas within the brain is associated with specific changes in behaviour, such as reduced susceptibility to visual illusions and increased visual acuity (the level of fine detail that can be seen).

First author, Dr Alexandra de Sousa explained: “Primates with a bigger visual cortex have better visual resolution (the precision of vision) and reduced visual illusion strength. In essence, the bigger the brain area, the better the visual processing ability.

“The size of brain areas predicts not only the number of neurons (brain cells) in that area, but also the likelihood of connections between neurons. These connections allow for increasingly complex computations to be made that allow for more accurate, and more difficult, visual perception.”
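The scaling Dr de Sousa describes can be illustrated with simple arithmetic (this is an illustration only, not a calculation from the paper): the number of potential pairwise connections grows roughly quadratically with the number of neurons, so even a modest increase in neuron count dramatically enlarges the space of possible circuits.

```python
# Illustrative arithmetic only: real cortical wiring is sparse and
# structured, not all-to-all, but the potential connection count shows
# why more neurons permit more complex computations.
def potential_connections(n_neurons: int) -> int:
    """Number of distinct neuron pairs, i.e. possible direct connections."""
    return n_neurons * (n_neurons - 1) // 2

for n in (1_000, 10_000, 100_000):
    print(f"{n:>7,} neurons -> {potential_connections(n):>14,} potential pairs")
```

A tenfold increase in neurons yields roughly a hundredfold increase in potential pairings.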

Co-author, Dr Michael Proulx, Senior Lecturer (Associate Professor) in Psychology, added: “This paper is a novel attempt to bring together the micro and macro anatomy of the brain with behaviour. We link visual abilities, the size of brain areas, and the number of neurons that make up those brain areas to provide a framework that ties brain structure and function together.

“The theory of brain size that we discuss can be tested in the future with more behavioural tests of other species, gathering more comparative neuroanatomical data, and by testing other senses and multi-sensory perception, too. We might be able to even predict how well extinct species could sense the world based on fossil data.”

For the study, Dr Alexandra de Sousa, an expert in brain evolution, provided brain size measurements from her own and others’ neuroanatomical research. Dr Michael Proulx, an expert in perception, gathered psychological studies of visual illusions and visual acuity in the same species or genera of animals.

The paper ‘What can volumes reveal about human brain evolution? A framework for bridging behavioral, histometric and volumetric perspectives’ is published today in Frontiers in Neuroanatomy – an online, open access journal.

(Source: bath.ac.uk)

Filed under visual cortex vision brain size evolution brain cells neuroscience science

99 notes

(Image caption: Astrocyte activity is shown in green in this slice of tissue from the brain region that controls movement in mice. Internal, structural elements of the astrocytes are shown in magenta; cell bodies are in red. Credit: Amit Agarwal and Dwight Bergles, courtesy of Cell Press)

Fight-Or-Flight Chemical Prepares Cells to Shift the Brain From Subdued to Alert State

A new study from The Johns Hopkins University shows that the brain cells surrounding a mouse’s neurons do much more than fill space. According to the researchers, the cells, called astrocytes because of their star-shaped appearance, can monitor and respond to nearby neural activity, but only after being activated by the fight-or-flight chemical norepinephrine. Because astrocytes can alter the activity of neurons, the findings suggest that astrocytes may help control the brain’s ability to focus.

The study involved observing the cells in the brains of living, active mice over long periods of time. A combination of genetically engineered mice and advanced microscopy allowed the researchers to visualize the activity of astrocyte networks in different regions of the brain to learn how these abundant supporting cells are controlled.

The scientists monitored astrocytes in the area of the brain responsible for controlling movement and saw that the cells often increased their activity as the mice walked on treadmills — but not always, and sometimes astrocytes became active when the animals were not moving. This lack of consistency suggested to the researchers that the astrocytes were not responding to nearby neurons, as had been thought.

Similarly, astrocytes in the vision processing area of the brain did not necessarily become active when the mice were stimulated with light, but they were sometimes active, even in the dark. The team solved both mysteries when they tested the idea that the astrocytes needed a signal to “wake them up” before they could respond to nearby neurons. That is how they found that norepinephrine, the brain’s broadly distributed fight-or-flight signal, primes the astrocytes in both locations to “listen in” on nearby neuronal activity.

“Astrocytes are among the most abundant cells in the brain, but we know very little about how they are controlled and how they contribute to brain function,” says Dwight Bergles, Ph.D., professor of neuroscience, who led the study. “Since memory formation and other important functions of the brain require a state of attention, we’re interested in learning more about how astrocytes help create that state.”

For example, Bergles says, “We know that astrocytes can regulate local blood flow, provide energy to neurons and release signaling molecules that alter neuronal activity. They could be doing any or all of those things in response to being activated. It is also possible that they act as a sort of megaphone to broadcast local norepinephrine signals to every neuron in the brain.” Whatever the case may be, researchers now know that astrocytes are not idle loiterers. This ability to study astrocyte network activity in animals as they do different things will help to reveal how these cells contribute to brain function.

This research was published in the journal Neuron on June 18.

Filed under astrocytes neural activity norepinephrine visual cortex neuroscience science

101 notes

Neural Transplant Reduces Absence Epilepsy Seizures in Mice
New research from North Carolina State University pinpoints the areas of the cerebral cortex that are affected in mice with absence epilepsy and shows that transplanting embryonic neural cells into these areas can alleviate symptoms of the disease by reducing seizure activity. The work may help identify the areas of the human brain affected in absence epilepsy and lead to new therapies for sufferers.

Absence epilepsy primarily affects children. These seizures differ from “tonic-clonic” seizures in that they don’t cause muscle spasms; rather, patients “zone out” or stare into space for a period of time, with no memory of the episode afterward. Around one-third of patients with absence epilepsy fail to respond to medication, demonstrating the complexity of the disease.

NC State neurobiology professor Troy Ghashghaei and colleagues looked at a genetic mouse model for absence epilepsy to determine what was happening in their brains during these seizures. They found that the seizures were accompanied by hyperactivity in the areas of the brain associated with vision and touch – areas referred to as primary visual and primary somatosensory cortices in the occipital and parietal lobes, respectively.

“There are neurons that excite brain activity, and neurons that inhibit activity,” Ghashghaei says. “The inhibitory neurons work by secreting an inhibitory neurotransmitter called gamma-aminobutyric acid, or GABA. The ‘GABAergic’ interneurons were recently shown by others to be defective in the mice with absence seizures, and we surmised that these malfunctioning neurons might be part of the problem, especially in the visual and somatosensory cortical areas.”

Ghashghaei’s team took embryonic neural stem cells from a part of the developing brain that generates GABAergic interneurons for the cerebral cortex. They harvested these cells from normal mouse embryos and transplanted them into the occipital cortex of the genetic mice with absence seizures. Absence seizure activity in treated animals decreased dramatically, and the mice gained more weight and survived longer than untreated mice.

“This is a profound and remarkably effective first result, and adds to the recent body of evidence that these transplantation treatments can work in mouse models of epilepsy. But we still don’t understand the mechanisms behind what the normal inhibitory cells are doing in areas of the visual cortex of absence epileptic mice,” Ghashghaei says. “We know that you can get positive results even when a small number of transplanted neurons actually integrate into the cortex of affected mice, which is very interesting. But we don’t know how the transplanted cells are connecting with other cells in the cortex and how they alleviate the absence seizures in the mouse model we employed.

“Our next steps will be to explore these questions. In addition, we are very interested in methods being devised by multiple labs around the world to ‘reprogram’ cells from transplantation patients to generate normal GABAergic and other types of neurons. Once established, this would eliminate the need for embryonic stem cells for this type of treatment. The ultimate goal is to develop new therapies for humans suffering from various forms of epilepsies, especially those for whom drugs do not work.”

Filed under epilepsy cerebral cortex visual cortex interneurons epileptic seizures somatosensory cortex neuroscience science

156 notes

Sound and vision: visual cortex processes auditory information too

‘Seeing is believing’, so the idiom goes, but new research suggests vision also involves a bit of hearing.

Scientists studying the brain processes involved in sight have found that the visual cortex also uses information gleaned from the ears as well as the eyes when viewing the world.

They suggest this auditory input enables the visual system to predict incoming information and could confer a survival advantage.

Professor Lars Muckli, of the Institute of Neuroscience and Psychology at the University of Glasgow, who led the research, said: “Sounds create visual imagery, mental images, and automatic projections.

“So, for example, if you are in a street and you hear the sound of an approaching motorbike, you expect to see a motorbike coming around the corner. If it turned out to be a horse, you’d be very surprised.”

The study, published in the journal Current Biology, involved conducting five different experiments using functional Magnetic Resonance Imaging (fMRI) to examine the activity in the early visual cortex in 10 volunteer subjects.

In one experiment they asked the blindfolded volunteers to listen to three different sounds – birdsong, traffic noise and a talking crowd.

Using a special algorithm that can identify unique patterns in brain activity, the researchers were able to discriminate between the different sounds based on activity in the early visual cortex.
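The release doesn’t specify the algorithm, but decoding stimulus identity from brain activity is commonly done with multivariate pattern analysis: a classifier is trained on the pattern of activity across voxels for each stimulus category, then tested on held-out trials. A minimal sketch on synthetic data (the trial counts, decoder, and signal strengths here are illustrative assumptions, not details from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic voxel-pattern data: three sound categories (e.g. birdsong,
# traffic, crowd), each trial a noisy copy of a category-specific template.
n_per_class, n_voxels = 40, 50
templates = rng.normal(0.0, 1.0, (3, n_voxels))
labels = np.repeat(np.arange(3), n_per_class)
X = 0.8 * templates[labels] + rng.normal(0.0, 1.0, (len(labels), n_voxels))

# Split trials into interleaved train/test halves.
train, test = slice(0, None, 2), slice(1, None, 2)

# Nearest-class-mean decoder: assign each held-out pattern to the class
# whose average training pattern is closest in Euclidean distance.
class_means = np.stack(
    [X[train][labels[train] == c].mean(axis=0) for c in range(3)]
)
dists = np.linalg.norm(X[test][:, None, :] - class_means[None, :, :], axis=2)
pred = dists.argmin(axis=1)
accuracy = (pred == labels[test]).mean()
print(f"decoding accuracy: {accuracy:.2f} (chance = 0.33)")
```

Accuracy reliably above the 1-in-3 chance level is the evidence that the activity patterns carry category information, which is the logic behind the finding reported here.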

A second experiment revealed even imagined images, in the absence of both sight and sound, evoked activity in the early visual cortex.

Lars Muckli said: “This research enhances our basic understanding of how interconnected different regions of the brain are. The early visual cortex hasn’t previously been known to process auditory information, and while there is some anatomical evidence of interconnectedness in monkeys, our study is the first to clearly show a relationship in humans.

“In future we will test how this auditory information supports visual processing, but the assumption is it provides predictions to help the visual system to focus on surprising events which would confer a survival advantage.

“This might provide insights into mental health conditions such as schizophrenia or autism and help us understand how sensory perceptions differ in these individuals.”

(Source: gla.ac.uk)

Filed under visual cortex hearing vision auditory perception visual processing neuroscience science

81 notes

Controlling Brain Waves to Improve Vision

Have you ever accidentally missed a red light or a stop sign? Or have you heard someone mention a visible event that you passed by but totally missed seeing?

“When we have different things competing for our attention, we can only be aware of so much of what we see,” said Kyle Mathewson, Beckman Institute Postdoctoral Fellow. “For example, when you’re driving, you might really be concentrating on obeying traffic signals.”

But say there’s an unexpected event: an emergency vehicle, a pedestrian, or an animal running into the road—will you actually see the unexpected, or will you be so focused on your initial task that you don’t notice?

“In the car, we may see something so brief or so faint, while we’re paying attention to something else, that the event won’t come into our awareness,” says Mathewson. “If you present this scenario hundreds of times to someone, sometimes they will see the unexpected event, and sometimes they won’t because their brain is in a different preparation state.”

By using a novel technique to test brain waves, Mathewson and colleagues are discovering how the brain processes external stimuli that do and don’t reach our awareness. A paper about their results, “Dynamics of Alpha Control: Preparatory Suppression of Posterior Alpha Oscillations by Frontal Modulators Revealed with Combined EEG and Event-related Optical Signal,” published this month in the Journal of Cognitive Neuroscience, reveals how alpha waves, typically thought of as your brain’s electrical activity while it’s at rest, can actually influence what we see or don’t see.
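For context on what is being measured: “alpha” refers to neural oscillations of roughly 8 to 12 Hz in the EEG. A minimal sketch of how alpha-band power can be estimated from a signal with a Fourier transform (synthetic data; the sampling rate and analysis here are illustrative, not the study’s actual pipeline):

```python
import numpy as np

fs = 250.0                        # sampling rate in Hz, typical for EEG
t = np.arange(0, 4.0, 1.0 / fs)   # 4 seconds of signal

rng = np.random.default_rng(1)
# Synthetic EEG: a 10 Hz alpha rhythm embedded in broadband noise.
eeg = 2.0 * np.sin(2 * np.pi * 10.0 * t) + rng.normal(0.0, 1.0, t.size)

# Power spectrum via the FFT; alpha power is the mean power in 8-12 Hz.
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
power = np.abs(np.fft.rfft(eeg)) ** 2 / t.size
alpha = power[(freqs >= 8) & (freqs <= 12)].mean()
baseline = power[(freqs >= 20) & (freqs <= 40)].mean()
print(f"alpha/baseline power ratio: {alpha / baseline:.1f}")
```

Tracking this band-limited power over time is what lets researchers relate moment-to-moment alpha levels to whether a faint target is seen or missed.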

The researchers used both electroencephalography (EEG) and the event-related optical signal (EROS), developed in the Cognitive Neuroimaging Laboratory of Gabriele Gratton and Monica Fabiani, professors of psychology and members of the Beckman Institute’s Cognitive Neuroscience Group, and authors of the study.

While EEG records the electrical activity along the scalp, EROS uses infrared light passed through optical fibers to measure changes in optical properties in the active areas of the cerebral cortex. Because of the hard skull between the EEG sensors and the brain, it can be difficult to find exactly where signals are produced. EROS, which examines how light is scattered, can noninvasively pinpoint activity within the brain.

“EROS is based on near-infrared light,” explained Fabiani and Gratton via email. “It exploits the fact that when neurons are active, they swell a little, becoming slightly more transparent to light: this allows us to determine when a particular part of the cortex is processing information, as well as where the activity occurs.”

This allowed the researchers not only to measure activity in the brain but also to map where the alpha oscillations originated. Their discovery: the alpha waves are produced in the cuneus, located in the part of the brain that processes visual information.

Alpha waves can inhibit what is processed visually, making it hard for you to see something unexpected.

By focusing your attention and concentrating more fully on what you are experiencing, however, the executive function of the brain can come into play and provide “top-down” control—putting a brake on the alpha waves, thus allowing you to see things that you might have missed in a more relaxed state.

“We found that the same brain regions known to control our attention are involved in suppressing the alpha waves and improving our ability to detect hard-to-see targets,” said Diane Beck, a member of the Beckman’s Cognitive Neuroscience Group, and one of the study’s authors.

“Knowing where the waves originate means we can target that area specifically with electrical stimulation,” said Mathewson. “Or we can also give people moment-to-moment feedback, which could be used to alert drivers that they are not paying attention and should increase their focus on the road ahead, or in other situations alert students in a classroom that they need to focus more, or athletes, or pilots and equipment operators.”

The study examined 16 subjects and mapped the electrical and optical data onto individual MRI brain images.
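The alpha activity tracked in studies like this one is conventionally defined as EEG power in roughly the 8–12 Hz band. As a hedged illustration (not the study's actual pipeline; the signal and parameter names below are invented), the band's share of total spectral power can be estimated with a simple FFT in Python:

```python
import numpy as np

def alpha_band_power(eeg, fs, band=(8.0, 12.0)):
    """Fraction of spectral power falling in the alpha band (8-12 Hz by default)."""
    eeg = np.asarray(eeg, dtype=float)
    eeg = eeg - eeg.mean()                      # remove DC offset
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2    # one-sided power spectrum
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[in_band].sum() / spectrum.sum()

# Synthetic check: a pure 10 Hz sinusoid should register as almost entirely alpha.
fs = 250.0                                      # a typical EEG sampling rate
t = np.arange(0, 2.0, 1.0 / fs)
rest = np.sin(2 * np.pi * 10 * t)               # strong "resting" alpha rhythm
print(alpha_band_power(rest, fs) > 0.9)         # True
```

A real analysis would work channel by channel and in sliding windows, but the same band-power idea underlies the "preparation state" the researchers describe.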

Filed under brain activity brainwaves neural activity EROS EEG visual cortex alpha oscillations neuroscience science

111 notes

(Fig. 1: Humans have the ability to accurately estimate the speed of moving objects under good light conditions, such as a bird on a clear day (left). On a cloudy day (right), however, the sensory information may be more ambiguous and invoke a specific cognitive mechanism—perceptual bias—that is hardwired into the visual cortex. Image credit: Justin Gardner, RIKEN Brain Science Institute)

An early link to motion perception

When viewing a scene with low contrast, such as in cloudy or low-light situations, humans tend to perceive objects to be moving slower or flickering faster than in reality. This less-than-faithful interpretation of the sensory environment is known as perceptual bias and is thought to be a mechanism that can help humans interpret vague motion information. Brett Vintch and Justin Gardner from the Laboratory for Human Systems Neuroscience at the RIKEN Brain Science Institute have now shown that perceptual bias is encoded within the visual cortex—the region of the brain where visual stimuli first arrive and begin to be processed.

Although humans can estimate the speed of easily visible, high-contrast stimuli quite accurately, the speed of less-visible, low-contrast stimuli is harder to judge and is invariably underestimated. Speed perception is thought to be closely associated with the middle temporal zone of the visual cortex, but measurements have so far been unable to confirm this link.

Vintch and Gardner set out to resolve the link between cortical response and perception by conducting functional magnetic resonance imaging experiments on test subjects exposed to a series of low- and high-contrast images either moving across the screen at different speeds or flickering at different rates.

The researchers found that different speeds of motion in the visual stimulus evoked different patterns of activity in the visual cortex. So systematic was the observed pattern of activity that Vintch and Gardner could predict the motion speed or flicker frequency of what the observer was viewing simply by examining the measured brain responses. Using these predictions, they found that when the test subjects viewed low-contrast scenes, the patterns of activity shifted to match what the observer was perceiving rather than what was physically present.

The findings indicate that human perceptual bias about the movement of low-contrast stimuli originates from a shift in the response of neuronal populations in the parts of the brain that first start to process images. This early visual processing, which is hardwired into the visual cortex, may help humans make sense of ambiguous or vague visual information, such as moving or flickering scenes under low-contrast conditions (Fig. 1).

“Multiple aspects of human thought, such as sensory inference, language, cognition and reasoning, involve cognitive guesswork. We hope that our study of this very simple form of guessing by the nervous system will have implications for other high-level processes in the human brain,” explains Gardner.
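The decoding idea described above, predicting stimulus speed from a measured activity pattern, can be sketched with a toy nearest-centroid classifier. This is an illustration with invented data, not the study's actual analysis; the speeds and pattern dimensions below are hypothetical:

```python
import numpy as np

# Toy stand-in for pattern decoding: each stimulus speed evokes a
# characteristic voxel pattern, and a new response is classified by
# finding the closest training centroid.
rng = np.random.default_rng(0)
speeds = [1.0, 4.0, 16.0]                             # deg/s, hypothetical conditions
centroids = {s: rng.normal(size=50) for s in speeds}  # "template" patterns

def decode_speed(response, centroids):
    """Return the speed whose template pattern lies nearest the response."""
    return min(centroids, key=lambda s: np.linalg.norm(response - centroids[s]))

# A noisy response to the 4 deg/s condition should decode back to 4 deg/s.
noisy = centroids[4.0] + rng.normal(scale=0.3, size=50)
print(decode_speed(noisy, centroids))                 # 4.0
```

The study's key twist is that, at low contrast, the decoded value tracks the (biased) percept rather than the physical speed.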

Filed under perceptual bias visual cortex vision motion perception neuroscience science

173 notes

Movies synchronize brains

When we watch a movie, our brains react to it immediately in a way similar to other people’s brains.

Researchers at Aalto University in Finland have developed a method fast enough to observe immediate changes in brain function even while subjects watch a movie. Movies made it possible to investigate the human brain under experimental conditions close to natural viewing. Traditionally, neuroscience research has relied on simple stimuli, such as checkerboard patterns or single images.

Viewing a movie creates multilevel changes in brain function. Despite the complexity of the stimulus, the elicited brain activity patterns show remarkable similarities across different people – even at the time scale of fractions of a second.

The analysis revealed important similarities between the brain signals of different people during movie viewing. These synchronized signals were found in brain areas connected with early-stage processing of visual stimuli, detection of movement and persons, motor coordination, and cognitive functions. The results imply that the contents of the movie affected certain brain functions of the subjects in a similar manner, explains Kaisu Lankinen, describing the findings of her doctoral research.

So far, studies in this field have mainly been based on functional magnetic resonance imaging (fMRI). With its superior temporal resolution, on the order of milliseconds, magnetoencephalography (MEG) can provide a more complete picture of fast brain processes. With the help of MEG and new analysis methods, significantly faster brain processes can be investigated and brain activity can be detected at higher frequencies than before.

In the novel analysis, brain imaging was combined with machine-learning methods that mined similarly shaped signals from the brain data.

The results were recently published in the journal NeuroImage.
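The across-subject similarity described here is commonly quantified as inter-subject correlation. A minimal sketch with synthetic data follows; it is illustrative only, as the study's MEG pipeline involved spatial filtering and machine learning well beyond this:

```python
import numpy as np

def isc(signals):
    """Mean pairwise Pearson correlation across subjects' time series."""
    signals = np.asarray(signals, dtype=float)
    r = np.corrcoef(signals)               # subjects x subjects correlation matrix
    n = len(signals)
    upper = r[np.triu_indices(n, k=1)]     # each subject pair counted once
    return upper.mean()

rng = np.random.default_rng(1)
movie = np.sin(np.linspace(0, 20, 500))    # shared, stimulus-driven component
# Each "subject" sees the same movie but adds their own idiosyncratic noise.
subjects = [movie + rng.normal(scale=0.5, size=500) for _ in range(4)]
print(round(isc(subjects), 2))             # well above zero: synchronized signals
```

When the shared movie-driven component dominates, pairwise correlations are high; with independent noise alone, they hover near zero.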

Filed under spatial filtering neuroimaging brain activity visual cortex neuroscience science

158 notes

Motion-Sensing Cells in the Eye Let the Brain ‘Know’ About Directional Changes

How do we “know” from the movement of a speeding car in our field of view whether it is coming straight toward us or more likely to veer to the right or left?

Scientists have long known that our perceptions of the outside world are processed in our cortex, the six-layered structure in the outer part of our brains. But how much of that processing actually happens in the cortex? Do the eyes tell the brain a lot or a little about the content of the outside world and the objects moving within it?

In a detailed study of the neurons linking the eyes and brains of mice, biologists at UC San Diego discovered that the ability of our brains, and those of other mammals, to detect and process directional movement results from the activation, in the cortex, of signals that originate in the direction-sensing cells of the retina.

“Even though direction-sensing cells in the retina have been known about for half a century, what they actually do has been a mystery, mostly because no one knew how to follow their connections deep into the brain,” said Andrew Huberman, an assistant professor of neurobiology, neurosciences and ophthalmology at UC San Diego, who headed the research team, which also involved biologists at the Salk Institute for Biological Studies. “Our study provides the first direct link between direction-sensing cells in the retina and the cortex and thereby raises the new idea that we ‘know’ which direction things are moving specifically because of the activation of these direction-selective retinal neurons.” The study, recently published online, will appear in the March 20 print issue of Nature.

The discovery of the link between direction-sensing cells in the retina and the cortex has a number of practical implications for neuroscientists who treat disabilities in motion processing, such as dysgraphia, a condition sometimes associated with dyslexia that affects direction-oriented skills.

“Understanding the cells and neural circuits involved in sensing directional motion may someday help us understand defects in motion processing, such as those involved in dyslexia, and it may inform strategies to treat or even re-wire these circuits in response to injury or common neurodegenerative diseases, such as glaucoma or Alzheimer’s,” said Huberman.

He and his team discovered the link in mice by using new types of modified rabies viruses that were pioneered by Ed Callaway, a professor at the Salk Institute, and by imaging the activity of neurons deep in the brain during visual experience.
Filed under vision visual cortex retina retinal ganglion cells lateral geniculate nucleus neuroscience science

101 notes

Learning to see better in life and baseball

With a little practice on a computer or iPad—25 minutes a day, 4 days a week, for 2 months—our brains can learn to see better, according to a study of University of California, Riverside baseball players reported in the Cell Press journal Current Biology on February 17. The new evidence also shows that a visual training program can sometimes make the difference between winning and losing.

The study is the first, as far as the researchers know, to show that perceptual learning can produce improvements in vision in normally seeing individuals.

"The demonstration that seven players reached 20/7.5 acuity—the ability to read text at three times the distance of a normal observer—is dramatic and required players to stand forty feet back from the eye chart in order to get a measurement of their vision," says Aaron Seitz of the University of California, Riverside. For reference, 20/20 is considered normal visual acuity.

In the training game, the players’ task was to find and select visual patterns modeled after stimuli to which neurons in the early visual cortex of the brain respond best, Seitz explains. As game play progressed, those patterns were made dimmer and dimmer, exercising the players’ vision as they searched.

"The goal of the program is to train the brain to better respond to the inputs that it gets from the eye," Seitz says. "As with most other aspects of our function, our potential is greater than our normative level of performance. When we go to the gym and exercise, we are able to increase our physical fitness; it’s the same thing with the brain. By exercising our mental processes we can promote our mental fitness."

After the two-month training period, players reported “seeing the ball much better,” “greater peripheral vision,” “easy to see further,” “able to distinguish lower-contrasting things,” “eyes feel stronger, they don’t get tired as much,” and so on.

The players also showed greater-than-expected improvements in their game. They were less likely to strike out and scored more runs. The researchers estimate that those gains in batting statistics may have given the team an additional four or five wins in the 2013 season.

The researchers are now extending their work to include different groups, including members of the Los Angeles and Riverside Police Departments and people with low vision due to cataracts, macular degeneration, or amblyopia. They will also apply the same principles to other aspects of cognition, including memory and attention.

It all comes down to one thing: “Understanding the rules of brain plasticity unlocks great potential for improvement of health and wellbeing,” Seitz says.
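The dimming procedure described above resembles an adaptive staircase, in which stimulus contrast drops after a correct response and rises after an error, keeping the task near the player's threshold. A toy sketch follows, with made-up step sizes and limits rather than the program's actual rules:

```python
# Toy 1-up/1-down staircase: contrast falls after a hit, rises after a miss.
# The multipliers and bounds here are illustrative, not from the UCR program.
def step_contrast(contrast, correct, down=0.8, up=1.25, lo=0.01, hi=1.0):
    """Return the next stimulus contrast given the last response."""
    contrast *= down if correct else up
    return max(lo, min(hi, contrast))

# A run of mostly correct responses drives the contrast down toward threshold.
c = 1.0
for hit in [True, True, True, False, True]:
    c = step_contrast(c, hit)
print(round(c, 3))   # 0.512
```

Staircases like this converge on the contrast level a subject can just barely detect, which is exactly where perceptual-learning gains are thought to accrue.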

Filed under visual acuity vision visual cortex brain training perceptual learning neuroscience science
