Posts tagged brainwaves

Judgment and decision-making: brain activity indicates there is more than meets the eye
Published today in PLOS ONE, the study is the first to show that abstract judgments can be predicted from brain waves, even when people are not conscious of making such judgments. The study also deepens our understanding of impulsive behaviours and how to regulate them.
It found that researchers could predict from participants’ brain activity how exciting they found a particular image to be, and whether a particular image made them think more about the future or the present. This is true even though the brain activity was recorded before participants knew they were going to be asked to make these judgments.
Lead authors Dr Stefan Bode from the Melbourne School of Psychological Sciences and Dr Carsten Murawski from the University of Melbourne Department of Finance said these findings illustrated there was more information encoded in brain activity than previously assumed.
“We have found that brain activity when looking at images can encode judgments such as time reference, even when the viewer is not aware of making such judgments. Moreover, our results suggest that certain images can prompt a person to think about the present or the future,” they said.
The authors said the results contributed to our understanding of impulsive behaviours, especially where those behaviours were caused by ‘prompts’ in the world around us.
“For instance, consider someone trying to quit gambling who sees a gambling advertisement on TV. Our results suggest that even if this person is trying to ignore the ad, their brain may be unconsciously processing it and making it more likely that they will relapse,” they said.
The researchers used electroencephalography (EEG) to measure the electrical activity of participants’ brains while they looked at different pictures. The pictures showed food, social scenes or status symbols such as cars and money.
After the EEG, researchers showed participants the same pictures again and asked questions about each image, such as how exciting they thought the image was or how strongly the image made them think of either the present or the future.
A statistical ‘decoding’ technique was then used to predict the judgments participants made about each of the pictures from the EEG brain activity that was recorded.
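The article does not specify the decoding method, but a standard approach to this kind of prediction is a cross-validated linear classifier trained on per-trial EEG features. The sketch below is a hypothetical illustration on synthetic data (the trial counts, channel layout, and effect size are all assumptions, not the study's):

```python
# Hedged sketch of EEG "decoding": a cross-validated linear classifier on
# flattened per-trial EEG features. Data here are synthetic stand-ins.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 200, 64, 50

# Labels: did the image later evoke a "present" (0) or "future" (1) judgment?
y = rng.integers(0, 2, size=n_trials)

# Synthetic single-trial EEG: noise plus a weak class-dependent signal.
X = rng.normal(size=(n_trials, n_channels, n_samples))
X[y == 1, :8, 20:30] += 0.4  # small amplitude shift on a few channels

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X.reshape(n_trials, -1), y, cv=5)
print(f"mean decoding accuracy: {scores.mean():.2f}")
```

The key point the example captures is that prediction accuracy is assessed on held-out trials, so above-chance accuracy implies the judgments really are encoded in the recorded activity.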
Co-author Daniel Bennett said that, just as certain prompts might cause impulsive behaviour, images could be used to prompt people to be more patient by supporting impulse control.
“Our results suggest that prompting people with images related to the future might cause processing outside awareness that could make it easier to think about the future. In theory, this could make people less impulsive and more likely to make healthy long-term decisions. These are hypotheses we will try to test in the future,” he said. The research was done in collaboration with the University of Cologne, Germany.
Brainwave Test Could Improve Autism Diagnosis and Classification
A new study by researchers at Albert Einstein College of Medicine of Yeshiva University suggests that measuring how fast the brain responds to sights and sounds could help in objectively classifying people on the autism spectrum and may help diagnose the condition earlier. The paper was published today in the online edition of the Journal of Autism and Developmental Disorders.
The U.S. Centers for Disease Control and Prevention estimates that 1 in 68 children has been identified with an autism spectrum disorder (ASD). The signs and symptoms of ASD vary significantly from person to person, ranging from mild social and communication difficulties to profound cognitive impairments.
“One of the challenges in autism is that we don’t know how to classify patients into subgroups or even what those subgroups might be,” said study leader Sophie Molholm, Ph.D., associate professor in the Dominick P. Purpura Department of Neuroscience and the Muriel and Harold Block Faculty Scholar in Mental Illness in the department of pediatrics at Einstein. “This has greatly limited our understanding of the disorder and how to treat it.”
Autism is diagnosed based on a patient’s behavioral characteristics and symptoms. “These assessments can be highly subjective and require a tremendous amount of clinical expertise,” said Dr. Molholm. “We clearly need a more objective way to diagnose and classify this disorder.”
An earlier study by Dr. Molholm and colleagues suggested that brainwave electroencephalogram (EEG) recordings could potentially reveal how severely ASD individuals are affected. That study found that children with ASD process sensory information—such as sound, touch and vision—less rapidly than typically developing children do.
The current study was intended to see whether sensory processing varies along the autism spectrum. Forty-three ASD children aged 6 to 17 were presented with either a simple auditory tone, a visual image (red circle), or a tone combined with an image, and instructed to press a button as soon as possible after hearing the tone, seeing the image or seeing and hearing the two stimuli together. Continuous EEG recordings were made via 70 scalp electrodes to determine how fast the children’s brains were processing the stimuli.
The speed with which the subjects processed auditory signals strongly correlated with the severity of their symptoms: the more time required for an ASD individual to process the auditory signals, the more severe that person’s autistic symptoms. “This finding is in line with studies showing that, in people with ASD, the microarchitecture in the brain’s auditory center differs from that of typically developing children,” Dr. Molholm said.
The study also found a significant though weaker correlation between the speed of processing combined audio-visual signals and ASD severity. No link was observed between visual processing and ASD severity.
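A correlation of this kind is typically quantified with a rank-based statistic such as Spearman's rho. The sketch below is purely illustrative: the numbers are synthetic, and the variable names (latency_ms, severity_score) are assumptions, not the authors' data:

```python
# Illustrative correlation between a processing-speed measure and a symptom
# severity score, on synthetic data matched only in sample size (n = 43).
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n = 43  # matches the study's sample size
latency_ms = rng.normal(150, 20, size=n)              # auditory EEG latency
severity_score = 0.1 * latency_ms + rng.normal(0, 1.5, size=n)

rho, p = spearmanr(latency_ms, severity_score)
print(f"Spearman rho = {rho:.2f}, p = {p:.3g}")
```

A positive rho here mirrors the study's pattern: longer processing latencies go with more severe symptoms.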
“This is a first step toward developing a biomarker of autism severity—an objective way to assess someone’s place on the ASD spectrum,” said Dr. Molholm. “Using EEG recordings in this way might also prove useful for objectively evaluating the effectiveness of ASD therapies.”
In addition, EEG recordings might help diagnose ASD earlier. “Early diagnosis allows for earlier treatment—which we know increases the likelihood of a better outcome,” said Dr. Molholm. “But currently, fewer than 15 percent of children with ASD are diagnosed before age 4. We might be able to adapt this technology to allow for early ASD detection and therapy for a much larger percentage of children.”
EEG Study Findings Reveal How Fear is Processed in the Brain
An estimated 8% of Americans will suffer from post-traumatic stress disorder (PTSD) at some point during their lifetime. Brought on by an overwhelming or stressful event or events, PTSD is the result of altered chemistry and physiology of the brain. Understanding how threat is processed in a normal brain versus one altered by PTSD is essential to developing effective interventions.
New research from the Center for BrainHealth at The University of Texas at Dallas, published online today in Brain and Cognition, illustrates how fear arises in the brain when individuals are exposed to threatening images. This novel study is the first to separate emotion from threat by controlling for arousal – the emotional reaction, whether positive or negative, provoked by a stimulus. Building on previous animal and human research, the study identifies an electrophysiological marker for threat in the brain.
“We are trying to find where thought exists in the mind,” explained John Hart, Jr., M.D., Medical Science Director at the Center for BrainHealth. “We know that groups of neurons firing on and off create a frequency and pattern that tell other areas of the brain what to do. By identifying these rhythms, we can correlate them with a cognitive unit such as fear.”
Utilizing electroencephalography (EEG), Dr. Hart’s research team identified theta and beta wave activity that signifies the brain’s reaction to visually threatening images.
“We have known for a long time that the brain prioritizes threatening information over other cognitive processes,” explained Bambi DeLaRosa, study lead author. “These findings show us how this happens. Theta wave activity starts in the back of the brain, in its fear center – the amygdala – and then interacts with the brain’s memory center – the hippocampus – before traveling to the frontal lobe, where thought processing areas are engaged. At the same time, beta wave activity indicates that the motor cortex is revving up in case the feet need to move to avoid the perceived threat.”
For the study, 26 adults (19 female, 7 male), ages 19-30 were shown 224 randomized images that were either unidentifiably scrambled or real pictures. Real pictures were separated into two categories: threatening (weapons, combat, nature or animals) and non-threatening (pleasant situations, food, nature or animals).
While wearing an EEG cap, participants were asked to push a button with their right index finger for real items and another button with their right middle finger for nonreal/scrambled items. Shorter response times were recorded for scrambled images than for real images. There was no difference in reaction time for threatening versus non-threatening images.
EEG results revealed that threatening images evoked an early increase in theta activity in the occipital lobe (the area in the brain where visual information is processed), followed by a later increase in theta power in the frontal lobe (where higher mental functions such as thinking, decision-making, and planning occur). A left lateralized desynchronization of the beta band, the wave pattern associated with motor behavior (like the impulse to run), also consistently appeared in the threatening condition.
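The band-specific measures described above (theta power, beta desynchronization) come from spectral analysis of the EEG. A minimal sketch, assuming a standard Welch power-spectrum estimate on a synthetic trace (the sampling rate and signal are invented for illustration, not taken from the study):

```python
# Band-limited power from an EEG-like trace via Welch's method.
import numpy as np
from scipy.signal import welch

fs = 250  # sampling rate in Hz (assumed)
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(2)
# Synthetic trace: a 6 Hz theta component buried in noise.
eeg = 2.0 * np.sin(2 * np.pi * 6 * t) + rng.normal(0, 1, size=t.size)

freqs, psd = welch(eeg, fs=fs, nperseg=fs)  # 1 Hz frequency resolution
theta = (freqs >= 4) & (freqs <= 8)
beta = (freqs >= 13) & (freqs <= 30)
theta_power = psd[theta].mean()
beta_power = psd[beta].mean()
print(f"theta/beta power ratio: {theta_power / beta_power:.1f}")
```

An "increase in theta power" in a given region and time window is, in essence, a rise in this band-averaged quantity relative to a baseline period.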
This study will serve as a foundation for future work that will explore normal versus abnormal fear associated with an object in other atypical populations including individuals with PTSD.
Control your environment through brain commands
Many patients with amyotrophic lateral sclerosis (ALS, or Lou Gehrig’s Disease) and other neurodegenerative conditions live every day with a frustrating inability to do small, everyday tasks, such as turning on the lights, changing the volume on the TV, or even communicating with their friends and loved ones.
Today, a first-ever proof of concept demonstrates how wearable technology and consumer products can be brought together with digital innovations to let a person with no mobility control their environment using brain commands, via a custom-built tablet application and wearable display interface.
This proof of concept demonstrates the potential to improve the quality of life for ALS patients – or any person with limited muscle and speech function – by giving them the ability to interact, communicate and issue commands without moving their body or using their voice.

Brain waves show learning to read does not end in 4th grade, contrary to popular theory
Teachers-in-training have long been taught that fourth grade is when students stop learning to read and start reading to learn. But a new Dartmouth study in the journal Developmental Science tested the theory by analyzing brain waves and found that fourth-graders do not experience a change in automatic word processing, a crucial component of the reading shift theory. Instead, some types of word processing become automatic before fourth grade, while others don’t become automatic until after fifth grade.
The findings mean that teachers at all levels of elementary school must think of themselves as reading instructors, said the study’s author, Associate Professor of Education Donna Coch.
"Until now, we lacked neurological evidence about the supposed fourth-grade shift," said Coch, also principal investigator for Dartmouth’s Reading Brains Lab. "The theory developed from behavioral evidence, and as a result of it, some teachers in fifth and sixth grade have not thought of themselves as reading instructors. Now we can see from brain waves that students in those grades are still learning to process words automatically; their neurological reading system is not yet adult-like."
Automatic word processing is the brain’s ability to determine whether a group of symbols constitutes a word within milliseconds, without the brain’s owner realizing the process is taking place.
To test how automatic word processing develops, Coch placed electrode caps on the heads of third-, fourth-, and fifth-graders, as well as college students. She had her test subjects view a screen that displayed a mix of real English words (such as “bed”), pseudo-words (such as “bem”), strings of letters (such as “mbe”), and strings of meaningless symbols one at a time. The setup allowed her to see how the subjects’ brains reacted to each kind of stimulus within milliseconds. In other words, she could watch their automatic word processing.
Next, Coch gave the participants a written test, in which they were asked to circle the real words in a list that also contained pseudo-words, strings of letters, and strings of meaningless symbols. This task was designed to test the participants’ conscious word processing, a much slower procedure.
Interestingly, most of the 96 participants got a nearly perfect score on the written test, showing that their conscious brains knew the difference between words and non-words.
However, the electrode cap revealed that only the college students processed meaningless symbols differently than real words. The third-, fourth-, and fifth-graders’ brains reacted to the meaningless symbols the same way they reacted to common English words.
"This tells us that, at least through the fifth grade, even children who read well are letting stimuli into the neural word processing system that more mature readers do not," Coch said. "Their brains are processing strings of meaningless symbols as if they were words, perhaps in case they turn out to be real letters. In contrast, by college, students have learned not to process strings of meaningless symbols as words, saving their brains precious time and energy."
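The comparison behind these results is the classic event-related-potential (ERP) approach: average the EEG time-locked to each stimulus type and compare the condition means. The sketch below is synthetic and schematic (the function simulate_erp, the amplitudes, and the timing are all assumptions), showing only the averaging logic:

```python
# Condition-averaged "ERPs" on synthetic data: children's brains respond to
# meaningless symbols like words; mature readers filter symbols out.
import numpy as np

rng = np.random.default_rng(5)
n_trials, n_samples = 100, 200

def simulate_erp(amplitude):
    """Single-condition trials: a fixed response component plus noise."""
    component = amplitude * np.exp(-((np.arange(n_samples) - 80) ** 2) / 200)
    return component + rng.normal(0, 1, size=(n_trials, n_samples))

words = simulate_erp(amplitude=1.5)           # responses to real words
symbols = simulate_erp(amplitude=1.5)         # child-like: symbols treated as words
adult_symbols = simulate_erp(amplitude=0.2)   # adult-like: symbols filtered out

# Peak of the trial-averaged waveform per condition.
peaks = {name: arr.mean(axis=0).max()
         for name, arr in [("words", words), ("symbols", symbols),
                           ("adult symbols", adult_symbols)]}
print(peaks)
```

Averaging over many trials suppresses the noise while the stimulus-locked component survives, which is why millisecond-scale automatic processing is visible at all.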
The phenomenon is evidence that young readers do not fully develop automatic word processing skills until after fifth grade, which contradicts the fourth-grade reading shift theory.
The brain waves also showed that the third-, fourth-, and fifth-graders processed real words, pseudo-words, and letter strings similarly to college students, suggesting that some automatic word processing begins before the fourth grade, and even before the third grade, also contradicting the reading shift theory.
"There is value to the theory of the fourth grade shift in that it highlights how reading skills and abilities develop at different times," Coch said. "But the neural data suggest that teachers should not expect their fourth-graders, or even their fifth-graders, to be completely automatic, adult-like readers."
Synchronized brain waves enable rapid learning
The human mind can rapidly absorb and analyze new information as it flits from thought to thought. These quickly changing brain states may be encoded by synchronization of brain waves across different brain regions, according to a new study from MIT neuroscientists.
The researchers found that as monkeys learn to categorize different patterns of dots, two brain areas involved in learning — the prefrontal cortex and the striatum — synchronize their brain waves to form new communication circuits.
“We’re seeing direct evidence for the interactions between these two systems during learning, which hasn’t been seen before. Category-learning results in new functional circuits between these two areas, and these functional circuits are rhythm-based, which is key because that’s a relatively new concept in systems neuroscience,” says Earl Miller, the Picower Professor of Neuroscience at MIT and senior author of the study, which appears in the June 12 issue of Neuron.
There are millions of neurons in the brain, each producing its own electrical signals. These combined signals generate oscillations known as brain waves, which can be measured by electroencephalography (EEG). The research team focused on EEG patterns from the prefrontal cortex —the seat of the brain’s executive control system — and the striatum, which controls habit formation.
The phenomenon of brain-wave synchronization likely precedes the changes in synapses, or connections between neurons, believed to underlie learning and long-term memory formation, Miller says. That process, known as synaptic plasticity, is too time-consuming to account for the human mind’s flexibility, he believes.
“If you can change your thoughts from moment to moment, you can’t be doing it by constantly making new connections and breaking them apart in your brain. Plasticity doesn’t happen on that kind of time scale,” says Miller, who is a member of MIT’s Picower Institute for Learning and Memory. “There’s got to be some way of dynamically establishing circuits to correspond to the thoughts we’re having in this moment, and then if we change our minds a moment later, those circuits break apart somehow. We think synchronized brain waves may be the way the brain does it.”
The paper’s lead author is former Picower Institute postdoc Evan Antzoulatos, who is now at the University of California at Davis.
Humming together
Miller’s lab has previously shown that during category-learning, neurons in the striatum become active early, followed by slower activation of neurons in the prefrontal cortex. “The striatum learns very simple things really quickly, and then its output trains the prefrontal cortex to gradually pick up on the bigger picture,” Miller says. “The striatum learns the pieces of the puzzle, and then the prefrontal cortex puts the pieces of the puzzle together.”
In the new study, the researchers wanted to investigate whether this activity pattern actually reflects communication between the prefrontal cortex and striatum, or if each region is working independently. To do this, they measured EEG signals as monkeys learned to assign patterns of dots into one of two categories.
At first, the animals were shown just two different examples, or “exemplars,” from each category. After each round, the number of exemplars was doubled. In the early stages, the animals could simply memorize which exemplars belonged to each category. However, the number of exemplars eventually became too large for the animals to memorize all of them, and they began to learn the general traits that characterized each category.
By the end of the experiment, when the researchers were showing 256 novel exemplars, the monkeys were able to categorize all of them correctly.
As the monkeys shifted from rote memorization to learning the categories, the researchers saw a corresponding shift in EEG patterns. Brain waves known as “beta bands,” produced independently by the prefrontal cortex and the striatum, began to synchronize with each other. This suggests that a communication circuit is forming between the two regions, Miller says.
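The article does not say which synchrony metric the researchers used; spectral coherence is one standard way to quantify rhythm-based coupling between two recordings. A hedged sketch on synthetic traces standing in for prefrontal and striatal signals that share a common ~20 Hz beta rhythm:

```python
# Beta-band synchronization quantified as spectral coherence between two
# synthetic "region" traces driven by a shared 20 Hz rhythm.
import numpy as np
from scipy.signal import coherence

fs = 500
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(3)
shared_beta = np.sin(2 * np.pi * 20 * t)  # common 20 Hz drive

pfc = shared_beta + rng.normal(0, 1, size=t.size)       # "prefrontal" trace
striatum = shared_beta + rng.normal(0, 1, size=t.size)  # "striatal" trace

freqs, coh = coherence(pfc, striatum, fs=fs, nperseg=fs)
beta_band = (freqs >= 13) & (freqs <= 30)
print(f"peak beta-band coherence: {coh[beta_band].max():.2f}")
```

Coherence near 1 in the beta band, as here, is the signature of the two regions "humming together" despite each trace also carrying independent noise.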
“There is some unknown mechanism that allows these resonance patterns to form, and these circuits start humming together,” he says. “That humming may then foster subsequent long-term plasticity changes in the brain, so real anatomical circuits can form. But the first thing that happens is they start humming together.”
A little later, as an animal nailed down the two categories, two separate circuits formed between the striatum and prefrontal cortex, each corresponding to one of the categories.
“This is the first paper that provides data suggesting that coupling in the beta-band between prefrontal cortex and striatum may play a key role in category-formation. In addition to revealing a novel mechanism involved in category-learning, the results also contribute to better understanding of the significance of coupled beta-band oscillations in the brain,” says Andreas Engel, a professor of physiology at the University Medical Center Hamburg-Eppendorf in Germany.
“Expanding your knowledge”
Previous studies have shown that during cognitively demanding tasks, there is increased synchrony between the frontal cortex and visual cortex, but Miller’s lab is the first to show specific patterns of synchrony linked to specific thoughts.
Miller and Antzoulatos also showed that once the prefrontal cortex learns the categories and sends them to the striatum, they undergo further modification as new information comes in, allowing more expansive learning to take place. This iteration can occur over and over.
“That’s how you get the open-ended nature of human thought. You keep expanding your knowledge,” Miller says. “The prefrontal cortex learning the categories isn’t the end of the game. The cortex is learning these new categories and then forming circuits that can send the categories down to the striatum as if it’s just brand-new material for the brain to elaborate on.”
In follow-up studies, the researchers are now looking at how the brain learns more abstract categories, and how activity in the striatum and prefrontal cortex might reflect that type of abstraction.
Deep sleep promotes our well-being, improves our memory and strengthens the body’s defences. Zurich and Fribourg researchers demonstrate how restorative slow-wave sleep (SWS) can also be increased without medication – using hypnosis.

Sleeping well is a crucial factor contributing to our physical and mental restoration. SWS in particular has a positive impact on, for instance, memory and the functioning of the immune system. During periods of SWS, growth hormones are secreted, cell repair is promoted and the body’s defence system is stimulated. If you feel sick or have had a hard working day, you often simply want to get some good, deep sleep – a wish that, according to a widely held preconception, you cannot influence through your own will.
Sleep researchers from the Universities of Zurich and Fribourg have now shown the opposite. In a study published in the scientific journal “Sleep”, they demonstrated that hypnosis has a surprisingly strong positive impact on the quality of sleep. “It opens up new, promising opportunities for improving the quality of sleep without drugs”, says biopsychologist Björn Rasch, who heads the study at the Psychological Institute of the University of Zurich in conjunction with the “Sleep and Learning” project*.
Brain waves – an indicator of sleep quality
Hypnosis is a method that can influence processes which are very difficult to control voluntarily. Patients with sleep disturbances can indeed be successfully treated with hypnotherapy. However, up to now it hadn’t been proven that this can lead to an objectively measurable change in sleep. To objectively measure sleep, electrical brain activity is recorded using an electroencephalogram (EEG). The characteristic feature of slow-wave sleep, which is deemed to have high restorative capacity, is a very even and slow oscillation in electrical brain activity.
70 healthy young women took part in the UZH study. They came to the sleep laboratory for a 90-minute midday nap. Before falling asleep, they listened over loudspeakers either to a special 13-minute slow-wave sleep hypnosis tape, developed by hypnotherapist Professor Angelika Schlarb, a sleep specialist, or to a neutral spoken text. At the beginning of the experiment, the subjects were divided into highly suggestible and low-suggestible groups using a standard procedure (the Harvard Group Scale of Hypnotic Susceptibility). Around half of the population is moderately suggestible. With this method, women achieve on average higher scores for hypnotic susceptibility than men; nevertheless, the researchers expect the same positive effects on sleep for highly suggestible men.
Slow-wave sleep increased by 80 percent
In their study, sleep researchers Maren Cordi and Björn Rasch were able to prove that highly suggestible women experienced 80 percent more slow-wave sleep after listening to the hypnosis tape compared with sleep after listening to the neutral text. In parallel, time spent awake was reduced by around one-third. In contrast to highly suggestible women, low suggestible female participants did not benefit as much from hypnosis. With additional control experiments the psychologists confirmed that the beneficial impact of hypnosis on slow-wave sleep could be attributed to the hypnotic suggestion to “sleep deeper” and could not be reduced to mere expectancy effects.
According to psychologist Maren Cordi “the results may be of major importance for patients with sleep problems and for older adults. In contrast to many sleep-inducing drugs, hypnosis has no adverse side effects”. Basically, everyone who responds to hypnosis could benefit from improved sleep through hypnosis.
* The project “Sleep and Learning” is headed by Professor Björn Rasch from the University of Fribourg and conducted at the Universities of Zurich and Fribourg. The project is financed by the Swiss National Fund and the University of Zurich (main area of clinical research “Sleep and Health”). The goal of the project is to identify psychological and neurophysiological mechanisms underlying the positive role of sleep for our memory and mental health.
(Source: mediadesk.uzh.ch)
Using thoughts to control airplanes
Pilots of the future could be able to control their aircraft by merely thinking commands. Scientists of the Technische Universität München and the TU Berlin have now demonstrated the feasibility of flying via brain control – with astonishing accuracy.
The pilot is wearing a white cap with myriad attached cables. His gaze is concentrated on the runway ahead of him. All of a sudden the control stick starts to move, as if by magic. The airplane banks and then approaches straight on towards the runway. The position of the plane is corrected time and again until the landing gear gently touches down. During the entire maneuver the pilot touches neither pedals nor controls.
This is not a scene from a science fiction movie, but rather the rendition of a test at the Institute for Flight System Dynamics of the Technische Universität München (TUM). Scientists working for Professor Florian Holzapfel are researching ways in which brain controlled flight might work in the EU-funded project “Brainflight”.
"A long-term vision of the project is to make flying accessible to more people," explains aerospace engineer Tim Fricke, who heads the project at TUM. "With brain control, flying, in itself, could become easier. This would reduce the work load of pilots and thereby increase safety. In addition, pilots would have more freedom of movement to manage other manual tasks in the cockpit."
Surprising accuracy
The scientists have logged their first breakthrough: they succeeded in demonstrating that brain-controlled flight is indeed possible – with amazing precision. Seven subjects took part in the flight simulator tests. They had varying levels of flight experience, including one person without any practical cockpit experience whatsoever. The accuracy with which the test subjects stayed on course by merely thinking commands would have sufficed, in part, to fulfill the requirements of a flying license test. “One of the subjects was able to follow eight out of ten target headings with a deviation of only 10 degrees,” reports Fricke. Several of the subjects also managed the landing approach under poor visibility. One test pilot even landed within only a few meters of the centerline.
The TU München scientists are now focusing in particular on the question of how the requirements for the control system and flight dynamics need to be altered to accommodate the new control method. Normally, pilots feel resistance in steering and must exert significant force when the loads induced on the aircraft become too large. This feedback is missing when using brain control. The researchers are thus looking for alternative methods of feedback to signal when the envelope is pushed too hard, for example.
Electrical potentials are converted into control commands
In order for humans and machines to communicate, brain waves of the pilots are measured using electroencephalography (EEG) electrodes connected to a cap. An algorithm developed by scientists from Team PhyPA (Physiological Parameters for Adaptation) of the Technische Universität Berlin allows the program to decipher electrical potentials and convert them into useful control commands.
Only the very clearly defined electrical brain impulses required for control are recognized by the brain-computer interface. “This is pure signal processing,” emphasizes Fricke. Mind reading is not possible.
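The article stresses that only clearly defined signals are translated into commands – "pure signal processing". A toy version of that idea, with all commands, templates, and thresholds being illustrative assumptions rather than the Brainflight system's actual design:

```python
# Toy command decoder: match a signal window against known templates and
# issue a command only when the best match clears a confidence threshold.
import numpy as np

COMMANDS = {0: "bank left", 1: "bank right", 2: "no command"}

def decode_command(window, templates, threshold=0.7):
    """Correlate a signal window against command templates; return the
    best-matching command index only if it exceeds the threshold."""
    scores = [np.corrcoef(window, tmpl)[0, 1] for tmpl in templates]
    best = int(np.argmax(scores))
    return best if scores[best] >= threshold else 2  # otherwise: do nothing

rng = np.random.default_rng(4)
t = np.linspace(0, 1, 250)
templates = [np.sin(2 * np.pi * 10 * t), np.sin(2 * np.pi * 15 * t)]

window = templates[1] + rng.normal(0, 0.3, size=t.size)  # noisy "right" signal
print(COMMANDS[decode_command(window, templates)])       # → bank right
```

The threshold is what makes the interface conservative: ambiguous brain activity produces no command at all, rather than a wrong one.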
A Mexican Scientist Just Invented a ‘Telekinesis’ Helmet
A researcher just made a remarkable breakthrough in the area of brain-computer interfaces—creating a rig that allows a user to operate machines with thought alone, almost literally granting a form of ‘telekinesis’ over attached devices.
Brain-computer interfaces are a rapidly expanding area of research and industry. Though the technology to read brainwaves from the head’s surface has been around for decades, scientists and engineers have only recently created numerous systems to read signals directly from the brain and translate them into commands that control computers.
In the future, these technologies could allow people with physical disabilities to control their environment through thought alone—the brain-computer interface effectively grants users a form of telekinesis. With an increasingly digital world, brain-computer interfaces (BCIs) could allow future generations to interact with technology telepathically. Many of the early BCI studies were promising, but the technology was difficult to use and mentally exhausting.
In his search to understand the role and function of brain waves, neuroscientist Ole Jensen (Radboud University) postulates a new theory of how the alpha wave controls attention to visual signals. His theory is published in Trends in Neurosciences on May 20. Alpha waves appear to play an even more active and important role than Jensen previously thought.

Our brain cells ‘spark’ all the time, and from this electrical activity brain waves emerge: oscillations at different frequencies. Just as a radio station uses a particular frequency to carry specific information far from the emitting source, so does the brain. And just as radio listeners with a certain musical preference tune in to the frequency that carries the music they prefer, brain areas tune in to the frequency relevant for their functioning.
Alpha waves aren’t boring
Ole Jensen, professor of Neuronal Oscillations at Radboud University’s Donders Institute for Brain, Cognition and Behaviour, tries to figure out in detail how this network of sending and receiving information through oscillations works. Earlier, he discovered a novel role for the alpha wave, which was long thought to be a boring one, emerging when the brain runs idle and a person is dozing off. Jensen shifted this interpretation by showing the importance of the alpha frequency: it helps to shut down brain areas that are irrelevant for a given task, and so helps us concentrate on what is really important at that moment.
To do list
In the Trends in Neurosciences paper that appeared today, Jensen postulates a new theory for how this actually works during a visual task. ‘We think that different phases of the alpha wave encode different parts of a visual scene. This helps break the visual information down into small jobs, which are then performed in a specific order – a to-do list for your visual attention system: focus on the face, focus on the hand, focus on the glass, look around. And then all over again.’
Jensen is now planning to test this new interpretation of the alpha wave in both animals and humans.
(Source: ru.nl)