Neuroscience

Articles and news from the latest research reports.

Posts tagged EEG


Brain imaging study examines second-language learning skills

With enough practice, some learners of a second language can process their new language as well as native speakers, research at the University of Kansas shows.


(Credit: bigstockphoto)

Using brain imaging, a trio of KU researchers was able to examine, millisecond by millisecond, how the brain processes a second language. They then compared their findings with their previous results for native speakers and saw that both groups followed similar patterns.

The research by Robert Fiorentino and Alison Gabriele, both associate professors in the linguistics department, and José Alemán Bañón, a former KU graduate student who is now a postdoctoral researcher at the University of Reading in the United Kingdom, was published this month in the journal Second Language Research.

For years, linguists have debated whether second-language learners would ever resemble native speakers in their ability to process language properties that differ between the first and second language, such as gender agreement, which is a property of Spanish but not English. In Spanish, all nouns are categorized as masculine or feminine, and various elements in the sentence, such as adjectives, need to carry the gender feature of the noun as well.

Some researchers argued that even those who spoke a second language with a high level of accuracy were using a qualitatively different mechanism than native speakers.

“We realized that these different theories, proposing that second-language learners use either the same mechanism or a different mechanism, could actually be teased apart by using brain-imaging techniques,” Gabriele said.

The team studied 26 high-level Spanish speakers who hadn’t learned to speak Spanish until after age 11 and grew up with English as the majority language. The speakers used Spanish on a daily basis and had spent an average of a year and a half in a Spanish-speaking country.

They were compared with 24 native speakers, who were raised in Spanish-speaking countries and stayed in their home country until age 17.

To measure language processing as it happens, the team used a method known as electroencephalography (EEG), which uses an array of electrodes placed on the scalp to detect patterns of brain activity with high accuracy in timing.

Once hooked up to the EEG, the test subjects were asked to read sentences, some of which had grammatical errors in either number agreement or gender agreement.

The researchers then compared the results of the second-language learners to native speakers. They found that the highly proficient second-language speakers showed the same patterns of brain activity as native speakers when processing grammatical violations in sentences.
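The logic of this comparison can be sketched in a few lines of Python. The snippet below simulates single-trial EEG with a P600-like positivity after grammatical violations, averages the trials into event-related potentials (ERPs), and measures the violation effect in a 500-800 ms window. Everything here is synthetic and illustrative, not the study's actual data or analysis pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 250                           # sampling rate (Hz)
t = np.arange(-0.2, 1.0, 1 / fs)   # epoch: 200 ms before to 1 s after the word

def simulate_epochs(n_trials, p600_amp):
    """Synthetic single-trial EEG: noise plus a P600-like positivity
    peaking ~600 ms after the critical word."""
    p600 = p600_amp * np.exp(-((t - 0.6) ** 2) / (2 * 0.1 ** 2))
    return rng.normal(0, 5, size=(n_trials, t.size)) + p600

grammatical = simulate_epochs(100, p600_amp=0.0)
violation = simulate_epochs(100, p600_amp=4.0)

# ERPs: averaging over trials cancels random noise and leaves
# the time-locked response
erp_gram = grammatical.mean(axis=0)
erp_viol = violation.mean(axis=0)

# Mean amplitude in the 500-800 ms P600 window, the kind of measure
# compared between learner and native-speaker groups
win = (t >= 0.5) & (t <= 0.8)
effect = erp_viol[win].mean() - erp_gram[win].mean()
print(f"P600 effect: {effect:.2f} µV")
```

In a real study the same comparison is run separately for learners and native speakers, and similar effect sizes and scalp distributions are taken as evidence for shared mechanisms.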

“We show that the learners’ brain activity looks qualitatively similar to that of native speakers, suggesting that they are using the same mechanisms,” Fiorentino said.

The study highlights the brain’s plasticity and its ability to acquire a new complex system even in adulthood.

“A lot of researchers have argued that there is some sort of language learning mechanism that might atrophy over the life span, particularly before puberty. And, we certainly have a lot of evidence that it is difficult to process your second language at nativelike levels and you have to go through quite a bit of effort to find people who can,” Gabriele said. “But I think what this paper shows is that it is possible.”

Gabriele and Fiorentino are working on a second phase of the research, studying how the brain processes a second language at the initial stages of exposure. Their preliminary results suggest that properties that are shared between the first and second language show patterns of brain activity that are very similar in learners and native speakers. This suggests that learners build on the representation for language that is already in place when learning a second language.

(Source: news.ku.edu)

Filed under language language acquisition brain imaging EEG brain activity neuroscience science


Neuroscientists study our love for deep bass sounds

Have you ever wondered why bass-range instruments tend to lay down musical rhythms, while instruments with a higher pitch often handle the melody?

According to new research from Laurel Trainor and colleagues at the McMaster Institute for Music and The Mind, this is no accident, but rather a result of the physiology of hearing.

In other words, when the bass is loud and rock solid, we have an easier time following along to the rhythm of a song.

Read more

Filed under auditory cortex pitch melody temporal perception EEG neuroscience science


Does the moon affect our sleep?

Popular beliefs about the influence of the moon on humans widely exist. Many people report sleeplessness around the time of full moon. In contrast to earlier studies, scientists from the Max Planck Institute of Psychiatry in Munich did not observe any correlation between human sleep and the lunar phases. The researchers analyzed preexisting data of a large cohort of volunteers and their sleep nights. Further identification of mostly unpublished null findings suggests that the conflicting results of previous studies might be due to a publication bias.

For centuries, people have believed that the moon cycle influences human health, behavior and physiology. Folklore mainly links the full moon with sleeplessness. But what about the scientific background?

Several studies have searched re-analyses of pre-existing datasets on human sleep for a lunar effect, but the results varied widely, and the effects on sleep have rarely been assessed with objective measures such as a sleep EEG. In some studies women appeared more affected by the moon phase, in others men. Two analyses of datasets from 2013 and 2014, each including between 30 and 50 volunteers, agreed on shorter total sleep duration in the nights around full moon. However, the two studies came to conflicting results on other variables. For example, in one analysis the onset of REM sleep, the phase in which we mainly dream, was delayed around new moon, whereas the other study observed the longest delay around full moon.

To overcome the problem of possible chance findings in small samples, the scientists analyzed sleep data from 1,265 volunteers across 2,097 nights. “Investigating this large cohort of test persons and sleep nights, we were unable to replicate previous findings,” states Martin Dresler, neuroscientist at the Max Planck Institute of Psychiatry in Munich, Germany, and the Donders Institute for Brain, Cognition and Behaviour in Nijmegen, Netherlands. “We could not observe a statistically relevant correlation between human sleep and the lunar phases.” Further, his team identified several unpublished null findings, including cumulative analyses of more than 20,000 sleep nights, which suggests that the conflicting results might be an example of publication bias (i.e. the file drawer problem).
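To get a feel for this kind of re-analysis, here is a minimal Python sketch: it approximates each night's lunar phase from the mean synodic month and correlates "distance to full moon" with sleep duration. The dates and sleep durations are synthetic, with no moon effect built in, so the near-zero correlation simply illustrates the pooled null result.

```python
import numpy as np
from datetime import date, timedelta

SYNODIC = 29.530588853           # mean length of a synodic month, in days
REF_NEW_MOON = date(2000, 1, 6)  # a known new moon, used as phase reference

def lunar_phase(d):
    """Approximate phase in [0, 1): 0 = new moon, 0.5 = full moon."""
    return ((d - REF_NEW_MOON).days % SYNODIC) / SYNODIC

rng = np.random.default_rng(1)
n = 2097                         # number of nights, as in the pooled analysis
dates = [date(2010, 1, 1) + timedelta(days=int(i))
         for i in rng.integers(0, 1500, n)]
sleep_min = rng.normal(420, 45, n)   # sleep duration: moon-independent by construction

# "Distance to full moon" makes the predictor linear: 0 at full moon, 0.5 at new
phase = np.array([lunar_phase(d) for d in dates])
dist_full = np.abs(phase - 0.5)
r = np.corrcoef(dist_full, sleep_min)[0, 1]
print(f"correlation with distance to full moon: r = {r:.3f}")
```

With a couple of thousand nights, even a small true effect would show up as a reliably nonzero correlation; here it does not, by design.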

The file drawer problem describes the phenomenon that many studies may be conducted but never reported – they remain in the file drawer. One much-discussed publication bias in science, medicine and pharmacy is the tendency to report experimental results that are positive or statistically significant and to omit results that are negative or inconclusive.
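The file drawer effect is easy to demonstrate in simulation. The sketch below (illustrative only, not taken from the paper) runs many small studies of a true null effect and "publishes" only the significant ones; the published record then shows a sizeable spurious effect.

```python
import numpy as np

rng = np.random.default_rng(2)
true_effect = 0.0        # no real effect, by assumption
n_per_study = 40         # small samples, as in the early moon studies
n_studies = 500

published = []
for _ in range(n_studies):
    # Each study estimates the effect from its own small sample
    sample = rng.normal(true_effect, 1.0, n_per_study)
    est = sample.mean()
    se = sample.std(ddof=1) / np.sqrt(n_per_study)
    # File drawer: only "significant" results (|z| > 1.96) get written up
    if abs(est / se) > 1.96:
        published.append(est)

print(f"{len(published)} of {n_studies} studies published")
print(f"mean published |effect|: {np.mean(np.abs(published)):.2f}")
```

Roughly 5% of the null studies come out "significant" by chance, and the literature built from them alone reports a clear effect where none exists.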

Until now, the influence of the lunar cycle on human sleep has been investigated only in re-analyses of earlier studies that originally served different purposes. “To overcome the obvious limitations of retrospective data analysis, carefully controlled studies specifically designed for the test of lunar cycle effects on sleep in large samples are required for a definite answer,” comments Dresler.

Filed under sleep lunar phases EEG moon cycle psychology neuroscience science


Does ‘free will’ stem from brain noise?

Our ability to make choices — and sometimes mistakes — might arise from random fluctuations in the brain’s background electrical noise, according to a recent study from the Center for Mind and Brain at the University of California, Davis.

"How do we behave independently of cause and effect?" said Jesse Bengson, a postdoctoral researcher at the center and first author on the paper. "This shows how arbitrary states in the brain can influence apparently voluntary decisions."

The brain has a normal level of “background noise,” Bengson said, as electrical activity patterns fluctuate across the brain. In the new study, decisions could be predicted based on the pattern of brain activity immediately before a decision was made.

Bengson sat volunteers in front of a screen and told them to fix their attention on the center, while using electroencephalography, or EEG, to record their brains’ electrical activity. The volunteers were instructed to make a decision to look either to the left or to the right when a cue symbol appeared on screen, and then to report their decision.

The cue to look left or right appeared at random intervals, so the volunteers could not consciously or unconsciously prepare for it.

The researchers found that the pattern of activity in the second or so before the cue symbol appeared — before the volunteers could know they were going to make a decision — could predict the likely outcome of the decision.
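The core claim, that the pre-cue brain state predicts the upcoming choice above chance without determining it, can be illustrated with a toy simulation. Nothing here is the study's actual analysis; the "pre-cue bias" signal, the logistic link, and all numbers are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n_trials = 400

# Hypothetical "pre-cue state": e.g. a hemispheric difference in ongoing
# activity. By construction it biases, but does not determine, the choice.
pre_cue_bias = rng.normal(0, 1, n_trials)
p_left = 1 / (1 + np.exp(-1.5 * pre_cue_bias))  # logistic link (assumed)
chose_left = rng.random(n_trials) < p_left

# "Decode" each decision from the pre-cue signal alone
predicted_left = pre_cue_bias > 0
accuracy = np.mean(predicted_left == chose_left)
print(f"prediction accuracy from pre-cue activity: {accuracy:.0%}")
```

Accuracy lands well above the 50% chance level but well below 100%, matching the idea that background fluctuations nudge rather than dictate the decision.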

"The state of the brain right before presentation of the cue determines whether you will attend to the left or to the right," Bengson said.

The experiment builds on a famous 1970s experiment by Benjamin Libet, a psychologist at UCSF who was later affiliated with the UC Davis Center for Neuroscience.

Libet also measured brain electrical activity immediately before a volunteer made a decision to press a switch in response to a visual signal. He found brain activity immediately before the volunteer reported deciding to press the switch.

The new results build on Libet’s finding, because they provide a model for how brain activity could precede a decision, Bengson said. Additionally, Libet had to rely on when volunteers said they made their decision. In the new experiment, the random timing means that “we know people aren’t making the decision in advance,” Bengson said.

Libet’s experiment raised questions of free will — if our brain is preparing to act before we know we are going to act, how do we make a conscious decision to act? The new work, though, shows how “brain noise” might actually create the opening for free will, Bengson said.

"It inserts a random effect that allows us to be freed from simple cause and effect," he said.

The work, which was funded by the National Institutes of Health, was published online in the Journal of Cognitive Neuroscience.

Filed under decision making brain activity EEG attention psychology neuroscience science


Research lays foundations for brain damage study

Researchers at The University of Queensland have made a key step that could eventually offer hope for stroke survivors and other people with brain damage.


The international study, led by researchers at UQ, could help explain a debilitating neurological condition known as unilateral spatial neglect, which commonly occurs after a stroke that damages the right side of the brain.

People with this condition become unaware of the left side of their sensory world, making everyday tasks such as eating and dressing almost impossible to perform.

ARC Discovery Early Career Research Fellow Dr Marta Garrido from UQ’s Queensland Brain Institute (QBI) said this lack of awareness of the left side might be caused by an uneven brain network that involves interactions between different brain regions.

“Patients with spatial neglect are impaired in attending to sensory information on the left or the right side of space, but this inability is a lot stronger for objects coming from the left,” she said.

“This research has enabled us to establish what happens in a healthy brain, so that we can then further understand exactly what goes on in the brain of someone who is experiencing spatial neglect.”

QBI co-investigator and ARC Australian Laureate Fellow Professor Jason Mattingley said the human brain performed many functions in an uneven way.

“We already know that in a healthy brain even basic perception can be lopsided. For example, when we look at others’ faces we tend to focus more on the left than the right side,” he said.

“Research like this helps us take a key step in understanding some of the puzzling symptoms observed in people following brain damage.”

The researchers at QBI collaborated with UQ’s School of Psychology, and colleagues from Aarhus University in Denmark, and University College London in the UK.

The study involved recording electrical activity in the brains of healthy adult volunteers using electroencephalography (EEG) while they listened to sequences of sounds from the left, right or centre.

The next step for the researchers will be to study how people with brain damage use the left and right sides of the brain when perceiving visual objects and sounds. 

Findings of the study were published in The Journal of Neuroscience.

(Source: uq.edu.au)

Filed under unilateral spatial neglect hemispatial neglect brain damage EEG audiospatial perception neuroscience science


A Mexican Scientist Just Invented a ‘Telekinesis’ Helmet

A researcher just made a remarkable breakthrough in the area of brain-computer interfaces—creating a rig that allows a user to operate machines with thought alone, almost literally granting a form of ‘telekinesis’ over attached devices.

Brain-computer interfaces are a rapidly expanding area of research and industry. Though the technology to read brainwaves from the head’s surface has been around for decades, scientists and engineers have only recently created numerous systems to read signals directly from the brain and translate them into commands that control computers.

In the future, these technologies could allow people with physical disabilities to control their environment through thought alone—the brain-computer interface effectively grants users a form of telekinesis. With an increasingly digital world, brain-computer interfaces (BCIs) could allow future generations to interact with technology telepathically. Many of the early BCI studies were promising, but the technology was difficult to use and mentally exhausting.
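A stripped-down version of such a pipeline, mapping an EEG band-power feature to a command, might look like the following Python sketch. The signals are synthetic, and the single-channel threshold rule stands in for the trained multi-channel classifiers real BCIs use.

```python
import numpy as np

fs = 250                        # sampling rate (Hz)
t = np.arange(0, 2, 1 / fs)     # a 2 s window of one EEG channel
rng = np.random.default_rng(5)

def band_power(signal, lo, hi):
    """Spectral power in the [lo, hi] Hz band, via the FFT."""
    freqs = np.fft.rfftfreq(signal.size, 1 / fs)
    spec = np.abs(np.fft.rfft(signal)) ** 2
    return spec[(freqs >= lo) & (freqs <= hi)].sum()

def decode(signal):
    """Toy rule: a dominant alpha rhythm (8-12 Hz, typical of relaxed,
    eyes-closed states) maps to 'stop'; otherwise 'go'. Real systems
    train classifiers on many channels instead of thresholding one."""
    alpha_fraction = band_power(signal, 8, 12) / band_power(signal, 1, 40)
    return "stop" if alpha_fraction > 0.5 else "go"

relaxed = rng.normal(0, 1, t.size) + 3 * np.sin(2 * np.pi * 10 * t)
focused = rng.normal(0, 1, t.size)
print(decode(relaxed), decode(focused))
```

The point of the sketch is only the shape of the loop: acquire a window of signal, extract a feature, map it to a device command.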

Read more

Filed under BCI EEG brainwaves mind control neuroscience science


A ‘hands-on’ approach could help babies develop spatial awareness

A study from the Department of Psychology published today found:

  • Changes in the way the brain processes touch in the first year of life
  • Babies start keeping track of where their hands are when their arms move around, from 8 months
  • Crossing the hands confuses the mind in young babies
  • The way we perceive touch in the outside world develops in the first year of life

The research, from Goldsmiths’ InfantLab, suggested that babies’ tactile experiences could be important for developing their sense of place in the world around them.

The InfantLab research team carried out their study on 66 babies aged six to ten months.

Babies felt harmless ‘buzzes’ on their arms

In the study, babies felt little tactile ‘buzzes’ on their hands first with their arms in an uncrossed position and then in a crossed position, while their brain activity was recorded through an EEG (electroencephalography) sensor net.

This is one of the first pieces of research to focus on the development of ‘touch perception’, which is crucial for investigating how babies learn to perceive how their own bodies fit into the world around them.

Dr Andy Bremner, InfantLab Director, explained: “We discovered that it takes time for babies to build up good mechanisms for perceiving how they fit into the outside world. Specifically, early on they do not appear to perceive the ways in which the body changes when their limbs, in this case their arms, move around.” 

Dr Silvia Rigato, researcher on the project, commented: “The vast majority of previous studies on infant perception has focussed on what babies perceive of a visual environment on a screen and out of reach, giving us a picture of what babies can do and understand when in couch potato mode.”

“Our research has taken this a step further. As adults we need good maps of where our bodies and limbs are in order to be able to act and move around competently. It seems these take time to develop in the first year, and we didn’t know that before.”

The full research paper ‘The neural basis of somatosensory remapping develops in human infancy’ was published in the journal Current Biology.

Filed under brain activity EEG infants somatosensory remapping brain development psychology neuroscience science


Electrical stimulation of brain alters dreams

Nighttime dreams in which you show up at work naked, encounter an ax-wielding psychopath or experience other tribulations may become a thing of the past thanks to a discovery reported on Sunday.

Applying electrical current to the brain, according to a study published online in Nature Neuroscience, induces “lucid dreaming,” in which the dreamer is aware that he is dreaming and can often gain control of the ongoing plot.

The findings are the first to show that inducing brain waves of a specific frequency produces lucid dreaming.

Read more

Filed under lucid dreaming dreams gamma waves EEG brainwaves self-awareness psychology neuroscience science


Controlling Brain Waves to Improve Vision

Have you ever accidentally missed a red light or a stop sign? Or have you heard someone mention a visible event that you passed by but totally missed seeing?

“When we have different things competing for our attention, we can only be aware of so much of what we see,” said Kyle Mathewson, Beckman Institute Postdoctoral Fellow. “For example, when you’re driving, you might really be concentrating on obeying traffic signals.”

But say there’s an unexpected event: an emergency vehicle, a pedestrian, or an animal running into the road—will you actually see the unexpected, or will you be so focused on your initial task that you don’t notice?

“In the car, we may see something so brief or so faint, while we’re paying attention to something else, that the event won’t come into our awareness,” says Mathewson. “If you present this scenario hundreds of times to someone, sometimes they will see the unexpected event, and sometimes they won’t because their brain is in a different preparation state.”

By using a novel technique to test brain waves, Mathewson and colleagues are discovering how the brain processes external stimuli that do and don’t reach our awareness. A paper about their results, “Dynamics of Alpha Control: Preparatory Suppression of Posterior Alpha Oscillations by Frontal Modulators Revealed with Combined EEG and Event-related Optical Signal,” published this month in the Journal of Cognitive Neuroscience, reveals how alpha waves, typically thought of as the brain’s electrical activity at rest, can actually influence what we see or don’t see.

The researchers used both electroencephalography (EEG) and the event-related optical signal (EROS), developed in the Cognitive Neuroimaging Laboratory of Gabriele Gratton and Monica Fabiani, professors of psychology, members of the Beckman Institute’s Cognitive Neuroscience Group, and authors of the study.

While EEG records electrical activity along the scalp, EROS uses infrared light passed through optical fibers to measure changes in the optical properties of active areas of the cerebral cortex. Because the skull lies between the EEG sensors and the brain, it can be difficult to determine exactly where signals are produced. EROS, which examines how light is scattered, can noninvasively pinpoint activity within the brain.

“EROS is based on near-infrared light,” explained Fabiani and Gratton via email. “It exploits the fact that when neurons are active, they swell a little, becoming slightly more transparent to light: this allows us to determine when a particular part of the cortex is processing information, as well as where the activity occurs.”

This allowed the researchers not only to measure activity in the brain, but also to map where the alpha oscillations were originating. Their discovery: the alpha waves are produced in the cuneus, located in the part of the brain that processes visual information.

Alpha waves can inhibit what is processed visually, making it hard for you to see something unexpected.

When you focus your attention and concentrate more fully on what you are experiencing, however, the brain’s executive function can come into play and provide “top-down” control—putting a brake on the alpha waves and thus allowing you to see things you might have missed in a more relaxed state.

“We found that the same brain regions known to control our attention are involved in suppressing the alpha waves and improving our ability to detect hard-to-see targets,” said Diane Beck, a member of the Beckman Institute’s Cognitive Neuroscience Group and one of the study’s authors.

“Knowing where the waves originate means we can target that area specifically with electrical stimulation,” said Mathewson. “Or we can also give people moment-to-moment feedback, which could be used to alert drivers that they are not paying attention and should increase their focus on the road ahead, or in other situations alert students in a classroom that they need to focus more, or athletes, or pilots and equipment operators.”

The study examined 16 subjects and mapped the electrical and optical data onto individual MRI brain images.

Controlling Brain Waves to Improve Vision

Have you ever accidentally missed a red light or a stop sign? Or has someone mentioned an event in plain view that you passed by but completely failed to see?

“When we have different things competing for our attention, we can only be aware of so much of what we see,” said Kyle Mathewson, Beckman Institute Postdoctoral Fellow. “For example, when you’re driving, you might really be concentrating on obeying traffic signals.”

But say there’s an unexpected event: an emergency vehicle, a pedestrian, or an animal running into the road—will you actually see the unexpected, or will you be so focused on your initial task that you don’t notice?

“In the car, we may see something so brief or so faint, while we’re paying attention to something else, that the event won’t come into our awareness,” says Mathewson. “If you present this scenario hundreds of times to someone, sometimes they will see the unexpected event, and sometimes they won’t because their brain is in a different preparation state.”

By using a novel technique to test brain waves, Mathewson and colleagues are discovering how the brain processes external stimuli that do and don’t reach our awareness. A paper about their results, “Dynamics of Alpha Control: Preparatory Suppression of Posterior Alpha Oscillations by Frontal Modulators Revealed with Combined EEG and Event-related Optical Signal,” published this month in the Journal of Cognitive Neuroscience, reveals how alpha waves, typically thought of as your brain’s electrical activity while it’s at rest, can actually influence what we see or don’t see.
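The "alpha waves" in question are rhythmic brain activity in roughly the 8-12 Hz band of the EEG. As a rough illustration of what "alpha power" means in practice, the sketch below computes band power from a simulated one-channel trace with a plain discrete Fourier transform. The sampling rate, signal, and noise level are all invented for demonstration; real EEG analyses use recorded data and dedicated toolboxes.

```python
import math
import random

FS = 250       # samples per second (hypothetical)
N = 2 * FS     # two seconds of "recording"

random.seed(0)
# Simulated EEG: a 10 Hz alpha rhythm buried in broadband noise
eeg = [2.0 * math.sin(2 * math.pi * 10 * n / FS) + random.gauss(0, 1.0)
       for n in range(N)]

def band_power(signal, fs, lo, hi):
    """Mean DFT power over integer frequencies lo..hi (in Hz)."""
    total = 0.0
    count = 0
    for f in range(lo, hi + 1):
        re = sum(x * math.cos(2 * math.pi * f * k / fs)
                 for k, x in enumerate(signal))
        im = sum(x * math.sin(2 * math.pi * f * k / fs)
                 for k, x in enumerate(signal))
        total += re * re + im * im
        count += 1
    return total / count

alpha = band_power(eeg, FS, 8, 12)    # alpha band
beta = band_power(eeg, FS, 13, 30)    # comparison band
print(alpha > beta)                   # the simulated alpha rhythm dominates -> True
```

Averaging power over the 8-12 Hz bins is the simplest version of the band-power measures that EEG studies track from moment to moment.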

The researchers used both electroencephalography (EEG) and the event-related optical signal (EROS), developed in the Cognitive Neuroimaging Laboratory of Gabriele Gratton and Monica Fabiani, professors of psychology and members of the Beckman Institute’s Cognitive Neuroscience Group, and authors of the study.

While EEG records electrical activity along the scalp, EROS uses infrared light, delivered through optical fibers, to measure changes in the optical properties of active areas of the cerebral cortex. Because the skull lies between the EEG sensors and the brain, it can be difficult to determine exactly where EEG signals are produced. EROS, which measures how light is scattered, can noninvasively pinpoint activity within the brain.

“EROS is based on near-infrared light,” explained Fabiani and Gratton via email. “It exploits the fact that when neurons are active, they swell a little, becoming slightly more transparent to light: this allows us to determine when a particular part of the cortex is processing information, as well as where the activity occurs.”

This allowed the researchers not only to measure activity in the brain, but also to map where the alpha oscillations originated. Their discovery: the alpha waves are produced in the cuneus, a region of the brain that processes visual information.

Alpha activity can inhibit visual processing, making it hard to notice something unexpected.

By focusing your attention and concentrating more fully on what you are experiencing, however, the brain's executive function can exert "top-down" control, putting a brake on the alpha waves and allowing you to see things you might have missed in a more relaxed state.
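The finding implies a simple statistical signature: trials with high pre-stimulus alpha power should more often be misses. The sketch below is not the study's actual pipeline; it simulates that relationship with invented numbers and recovers it using the standard comparison of mean pre-stimulus alpha on miss trials versus hit trials.

```python
import random

random.seed(1)
trials = []
for _ in range(1000):
    alpha_power = random.uniform(0.0, 1.0)   # pre-stimulus alpha (arbitrary units)
    # Assumption for illustration: higher alpha -> lower chance the
    # faint target reaches awareness
    detected = random.random() > alpha_power
    trials.append((alpha_power, detected))

hit_alpha = [a for a, d in trials if d]
miss_alpha = [a for a, d in trials if not d]

mean_hit = sum(hit_alpha) / len(hit_alpha)
mean_miss = sum(miss_alpha) / len(miss_alpha)
print(mean_miss > mean_hit)   # missed trials carry more pre-stimulus alpha -> True
```

Sorting real trials by whether the observer reported the target, then comparing pre-stimulus band power across the two groups, is the basic logic behind this kind of analysis.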

“We found that the same brain regions known to control our attention are involved in suppressing the alpha waves and improving our ability to detect hard-to-see targets,” said Diane Beck, a member of the Beckman Institute’s Cognitive Neuroscience Group and one of the study’s authors.

“Knowing where the waves originate means we can target that area specifically with electrical stimulation,” said Mathewson. “Or we can also give people moment-to-moment feedback, which could be used to alert drivers that they are not paying attention and should increase their focus on the road ahead, or in other situations alert students in a classroom that they need to focus more, or athletes, or pilots and equipment operators.”

The study examined 16 subjects and mapped the electrical and optical data onto individual MRI brain images.

Filed under brain activity brainwaves neural activity EROS EEG visual cortex alpha oscillations neuroscience science

510 notes

Scientists discover brain’s anti-distraction system

Two Simon Fraser University psychologists have made a brain-related discovery that could revolutionize doctors’ perception and treatment of attention-deficit disorders.

This discovery opens up the possibility that environmental and/or genetic factors may hinder or suppress a specific brain activity that the researchers have identified as helping us prevent distraction.

The Journal of Neuroscience has just published a paper about the discovery by John McDonald, an associate professor of psychology and his doctoral student John Gaspar, who made the discovery during his master’s thesis research.

This is the first study to reveal that our brains rely on an active suppression mechanism to avoid being distracted by salient irrelevant information when we want to focus on a particular item or task.

McDonald, a Canada Research Chair in Cognitive Neuroscience, and other scientists first discovered the existence of the specific neural index of suppression in his lab in 2009. But, until now, little was known about how it helps us ignore visual distractions.

“This is an important discovery for neuroscientists and psychologists because most contemporary ideas of attention highlight brain processes that are involved in picking out relevant objects from the visual field. It’s like finding Waldo in a Where’s Waldo illustration,” says Gaspar, the study’s lead author.

“Our results show clearly that this is only one part of the equation and that active suppression of the irrelevant objects is another important part.”

Given the proliferation of distracting consumer devices in our technology-driven, fast-paced society, the psychologists say their discovery could help scientists and health care professionals better treat individuals with distraction-related attentional deficits.

“Distraction is a leading cause of injury and death in driving and other high-stakes environments,” notes McDonald, the study’s senior author. “There are individual differences in the ability to deal with distraction. New electronic products are designed to grab attention. Suppressing such signals takes effort, and sometimes people can’t seem to do it.

“Moreover, disorders associated with attention deficits, such as ADHD and schizophrenia, may turn out to be due to difficulties in suppressing irrelevant objects rather than difficulty selecting relevant ones.”

The researchers are now turning their attention to understanding how we deal with distraction. They’re looking at when and why we can’t suppress potentially distracting objects, whether some of us are better at doing so and why that is the case.

“There’s evidence that attentional abilities decline with age and that women are better than men at certain visual attentional tasks,” says Gaspar, the study’s first author.

The study was based on three experiments in which 47 students performed an attention-demanding visual search task. Their mean age was 21. The researchers studied their neural processes related to attention, distraction and suppression by recording electrical brain signals from sensors embedded in a cap they wore.
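A "neural index" of suppression in EEG research is typically an event-related potential (ERP): a small, consistent brain response time-locked to the search display that is buried in noise on any single trial but emerges when many trials are averaged. The sketch below illustrates only that averaging idea; the trial counts, amplitudes, and waveform are invented and are not the study's data.

```python
import math
import random

FS = 250    # samples per second (hypothetical)
N = 125     # 500 ms epoch after display onset

random.seed(2)

def one_trial():
    # A 2-microvolt response peaking ~200 ms, plus heavy background noise
    return [2.0 * math.exp(-((k / FS - 0.2) ** 2) / 0.002) + random.gauss(0, 5.0)
            for k in range(N)]

trials = [one_trial() for _ in range(300)]

# Point-by-point average across trials yields the ERP
erp = [sum(tr[k] for tr in trials) / len(trials) for k in range(N)]

peak_idx = max(range(N), key=lambda k: erp[k])
print(abs(peak_idx / FS - 0.2) < 0.05)   # averaged waveform peaks near 200 ms -> True
```

Averaging cuts the noise by roughly the square root of the trial count, which is why ERP experiments like this one need many repetitions of the search task per participant.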

Filed under attention disorders attention distraction EEG psychology neuroscience science
