Posts tagged psychology

ScienceDaily (Apr. 26, 2012) — Scientists at the Gladstone Institutes have unraveled a process by which depletion of a specific protein in the brain contributes to the memory problems associated with Alzheimer’s disease. These findings provide insights into the disease’s development and may lead to new therapies that could benefit the millions of people worldwide suffering from Alzheimer’s and other devastating neurological disorders.
The study, led by Gladstone Investigator Jorge J. Palop, PhD, revealed that low levels of a protein, called Nav1.1, disrupt the electrical activity between brain cells. Such activity is crucial for healthy brain function and memory. Indeed, the researchers found that restoring Nav1.1 levels in mice that were genetically modified to mimic key aspects of Alzheimer’s disease (AD-mice) improved learning and memory functions and increased their lifespan. Their findings are featured on the cover of the April 27 issue of Cell, available online April 26.
"It is estimated that more than 30 million people worldwide suffer from Alzheimer’s disease and that number is expected to rise dramatically in the near future," said Lennart Mucke, MD, who directs neurological research at Gladstone, an independent and nonprofit biomedical-research organization. "This research improves our understanding of the biological processes that underlie cognitive dysfunction in this disease and could open the door for new therapeutic interventions."
The researchers’ findings suggest that Nav1.1 levels in special regulatory nerve cells called parvalbumin cells, or PV cells, are essential to generate healthy brain-wave activity — and that problems in this process contribute to cognitive decline in AD-mice and possibly in patients with Alzheimer’s.
In the brain, neurons form highly interconnected networks, using chemical and electrical signals to communicate with each other. The researchers investigated whether this communication between neurons is disrupted in AD-mice, and if so, how this may affect the symptoms of Alzheimer’s disease.
To study this, they performed electroencephalogram (EEG) recordings — a technique that detects abnormalities in the brain’s electrical waves such as those found in patients with epilepsy. In the AD-mice, they found that similar abnormalities emerged during periods of reduced gamma-wave oscillations — a type of brain wave that is crucial to regulating learning and memory.
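The gamma-band measure described here can be illustrated with a short sketch (not from the study): band-pass filter an EEG trace and compare mean power in the gamma range against another band. The synthetic signal, sampling rate, and band edges below are all hypothetical.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def band_power(signal, fs, low, high, order=4):
    """Mean power of `signal` within the [low, high] Hz band."""
    nyq = fs / 2.0
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    filtered = filtfilt(b, a, signal)
    return np.mean(filtered ** 2)

# Synthetic 2-second EEG trace sampled at 500 Hz: a 10 Hz alpha
# component plus a weaker 40 Hz gamma component and a little noise.
fs = 500
t = np.arange(0, 2, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)
eeg += 0.1 * np.random.default_rng(0).standard_normal(t.size)

gamma = band_power(eeg, fs, 30, 80)   # gamma band (roughly 30-80 Hz)
alpha = band_power(eeg, fs, 8, 12)    # alpha band, for comparison
print(f"gamma power: {gamma:.3f}, alpha power: {alpha:.3f}")
```

Reduced gamma oscillations of the kind reported in the AD-mice would show up in such an analysis as a drop in the gamma-band power term relative to baseline recordings.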
"Like a conductor in an orchestra, PV cells regulate brain rhythms by precisely controlling excitatory brain activity," said Laure Verret, PhD, postdoctoral fellow and lead author. "We found that PV cells in patients with Alzheimer’s and in AD-mice have low levels of the protein Nav1.1 — likely contributing to PV cell dysfunction. As a consequence, AD-mice had abnormal brain rhythms. By restoring Nav1.1 levels, we were able to re-establish normal brain function."
Indeed, the scientists found that increasing Nav1.1 levels in PV cells improves brain wave activity, learning, memory and survival rates in AD-mice.
"Enhancing Nav1.1 activity, and consequently improving PV cell function, may help in the treatment of Alzheimer’s disease and other neurological disorders associated with gamma-wave alterations and cognitive impairments such as epilepsy, autism and schizophrenia," said Dr. Palop, who is also an assistant professor of neurology at the University of California, San Francisco, with which Gladstone is affiliated. "These findings may allow us to develop therapies to help patients with these devastating diseases."
Source: Science Daily
ScienceDaily (Apr. 26, 2012) — The ability to navigate using spatial cues was impaired in mice whose brains lacked a channel that delivers potassium — a finding that may have implications for humans with damage to the hippocampus, a brain structure critical to memory and learning, according to a Baylor University researcher.
Mice missing the channel also showed diminished learning ability in an experiment dealing with fear conditioning, said Joaquin Lugo, Ph.D., the lead author in the study and an assistant professor of psychology and neuroscience in Baylor’s College of Arts & Sciences. “By targeting chemical pathways that alter those potassium channels, we may eventually be able to apply the findings to humans and reverse some of the cognitive deficits in people with epilepsy and other neurological disorders,” Lugo said.
The research was done in the Baylor College of Medicine Intellectual and Developmental Disabilities Research Center Mouse Neurobehavior Core in Houston during Lugo’s time as a researcher there.
The findings are published online in the journal Learning & Memory.
The channel, called Kv4.2, delivers potassium, which aids neuron function in the brain’s hippocampus. The hippocampus forms memory for long-term storage in the brain. Potassium also helps to regulate excitability.
Individuals who have epilepsy sometimes exhibit altered or missing Kv4.2 channels or similar types of channels.
In the experiment investigating navigation, “knockout” mice — those without the channel — were tested in a water maze four feet in diameter and 12 inches deep, with eight trials daily — each lasting about a minute — over four days, he said. Their performance was compared with that of normal mice.
Both groups responded to visual cues — colored symbols — in learning their way around the maze, but the knockout mice did not respond as well as the normal mice in terms of spatial cues — hidden platforms in the water.
"When the mice don’t have this channel, it hurts their ability to learn," Lugo said. In a separate experiment examining fear conditioning, both knockout mice and normal mice were placed in a cage, and researchers sounded a tone before giving the mice a mild electric shock. In repeated trials, both groups began to freeze upon hearing the tone as they anticipated a shock. But the normal mice also reacted to the context — being placed in the cage — while the mice who did not have the Kv4.2 channel reacted only to the tone. The research was funded by the Epilepsy Foundation and the National Institutes of Health.
Source: Science Daily
ScienceDaily (Apr. 26, 2012) — They say you can’t teach an old dog new tricks. Fortunately, this is not always true. Researchers at the Netherlands Institute for Neuroscience (NIN-KNAW) have now discovered how the adult brain can adapt to new situations. The Dutch researchers’ findings are published on April 25 in the journal Neuron. Their study may be significant in developing treatments of neurodevelopmental disorders.

Two inhibitory synapses (yellow) disappear from the process of a nerve-cell (red) during learning. (Credit: Image courtesy of Netherlands Institute for Neuroscience)
Ability to learn
Our brain processes information in complex networks of nerve cells. The cells communicate and excite one another through special connections, called synapses. Young brains are capable of forming many new synapses, and they are consequently better at learning new things. That is why we acquire vital skills — walking, talking, hearing and seeing — early on in life. The adult brain stabilises the synapses so that we can use what we have learned in childhood for the rest of our lives.
Disappearing inhibitors
Earlier research found that approximately one fifth of the synapses in the brain inhibit rather than excite other nerve-cell activity. Neuroscientists have now shown that many of these inhibitory synapses disappear if the adult brain is forced to learn new skills. They reached this conclusion by labelling inhibitory synapses in mouse brains with fluorescent proteins and then tracking them for several weeks using a specialised microscope. They then closed one of the mice’s eyes temporarily to accustom them to seeing through just one eye. After a few days, the area of the brain that processes information from both eyes began to respond more actively to the open eye. At the same time, many of the inhibitory synapses disappeared and were later replaced by new synapses.
Regulating the information network
Inhibitory synapses are vital for the way networks function in the brain. “Think of the excitatory synapses as a road network, with traffic being guided from A to B, and the inhibitory synapses as the matrix signs that regulate the traffic,” explains research leader Christiaan Levelt. “The inhibitory synapses ensure an efficient flow of traffic in the brain. If they don’t, the system becomes overloaded, for example as in epilepsy; if they constantly indicate a speed of 20 kilometres an hour, then everything will grind to a halt, for example when an anaesthetic is administered. If you can move the signs to different locations, you can bring about major changes in traffic flows without having to entirely reroute the road network.”
Hope
Inhibitory synapses play a hugely influential role in learning in the young brain. People who have neurodevelopmental disorders — for example epilepsy, but also autism and schizophrenia — may have trouble forming inhibitory synapses. The discovery that the adult brain is still capable of pruning or forming these synapses offers hope that pharmacological or genetic intervention can be used to enhance or manage this process. This could provide important guideposts for treating the above-mentioned neurological disorders, and perhaps also for repairing damaged brain tissue.
Source: Science Daily
April 26, 2012
What happens at the level of individual neurons while we learn? This question intrigued the neuroscientist Daniel Huber, who recently arrived at the Department of Basic Neuroscience at the University of Geneva. During his stay in the United States, he and his team tried to unravel the network mechanisms underlying learning and memory at the level of the cerebral cortex.
“What’s the role of individual neurons in behavior? Do they always participate in the same functions? How do their responses evolve during learning?” asks the professor. One way to address these questions is to follow the activity of a large set of neurons while the subject learns a novel task. The goal is to link the behavioral changes with the changes in neuronal representations.
It’s currently impossible to follow the activity of a large number of individual neurons in humans, but the team of researchers quickly realized that mice are excellent subjects for such studies. “We were surprised by the capacities of these small rodents. They learn novel associations quickly and are able to focus for hours on complex behavioral tasks. However, it is important to keep them motivated by rewarding them accordingly. They are very similar to us in that way.”
The behavioral task of the mice consisted in sampling the area in front of their snout with their whiskers to search for a small object. The object was presented either within reach or out of reach of their whiskers. Each time the object was detected with the whiskers, the mouse had to respond by licking at a reward spout. Correct choices were rewarded with a drop of liquid. “In this task different sensory and motor circuits have to interact in order to establish a novel association, leading to better and better performance.”
There remained the problem of how to follow the activity of a large number of neurons across many days of learning. The researchers replaced a small part of the bone overlying the motor cortex with a tiny glass window. The neurons underneath the window were genetically modified to express a fluorescent marker whose intensity changes with the activity of the neurons. This window into the brain allowed the researchers around Daniel Huber to use two-photon microscopy to record the activity of the same set of 500 neurons across days of learning.
"We then correlated the activity of the individual neurons with the different actions of the mouse, such as moving the whiskers, touching the object or licking at the right moment. It’s like synchronizing the soundtrack with the images in a movie," adds the neuroscientist. The researchers analyzed this data using a series of computational approaches to establish a link between the neuronal activity and the different sensory and motor features of the task. This allowed them to build algorithmic models that can predict different motor outputs solely by monitoring the neuronal activity. Decoding the neuronal activity then allowed the researchers to construct functional maps of the recorded neurons and quantify each neuron’s link with the different aspects of the behavior.
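The decoding step described above can be sketched in miniature. The toy data, the neuron counts, and the plain logistic-regression decoder below are illustrative assumptions, not the study’s actual recordings or models:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy population activity: 200 time bins x 50 neurons. In the "lick"
# bins, a subset of neurons fires more strongly (hypothetical data;
# the real recordings tracked ~500 neurons across days of learning).
n_bins, n_neurons = 200, 50
licking = np.repeat([0, 1], n_bins // 2)          # behavioral label per bin
rates = rng.poisson(2.0, size=(n_bins, n_neurons)).astype(float)
rates[licking == 1, :10] += 3.0                   # 10 neurons carry the lick signal

# Fit a logistic-regression decoder by gradient descent.
X = np.c_[np.ones(n_bins), rates]                 # add a bias column
w = np.zeros(X.shape[1])
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-X @ w))              # predicted lick probability
    w -= 0.01 * X.T @ (p - licking) / n_bins      # gradient step

predicted = 1.0 / (1.0 + np.exp(-X @ w)) > 0.5
accuracy = np.mean(predicted == licking)
print(f"decoding accuracy: {accuracy:.2f}")
```

The fitted weights play the role of a functional map in miniature: each neuron’s weight quantifies how strongly its activity is linked to the licking response.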
These functional maps revealed several fundamental findings: “Although the movements of the whiskers became more and more precise and targeted to search for the object during learning, their relative neuronal representation remained relatively stable. In contrast, the representation of licking to respond and collect the rewards became more and more pronounced.” Taken together, only selected aspects of the learned behavior induced changes in the neuronal representation in the cortex. The scientists also found that different sensory and motor representations are spatially intermingled in the rodent brain.
Further analysis revealed that individual neurons remain stably linked to a given behavioral function, but retain the flexibility to stay silent on a given day. This functional stability, combined with the flexibility to join (or not join) a given representation, had in fact been suggested by theoretical work on learning.
"Whether these characteristics are limited to the motor cortex, or whether they are more general rules that apply across the cerebral cortex, remains open," says Daniel Huber. "That is in fact one of the questions we are currently investigating in my lab in Geneva."
Provided by University of Geneva
Source: medicalxpress.com
April 26, 2012 by Stuart Mason Dambrot
(Medical Xpress) — Remarkably, cortical maps show that neurons in the primary visual cortex have specific preferences for the location and orientation of a given visual field stimulus – but how these maps develop and what function they play in visual processing remains a mystery. Evidence suggests that the retinotopic map is established by molecular gradients, but little is known about how orientation maps are wired. One hypothesis: at their inception, these orientation maps are seeded by the spatial interference of ON- and OFF-center retinal receptive field mosaics. Recently, scientists in the Departments of Neurobiology and Psychology at the University of California, Los Angeles have shown that this proposed mechanism predicts a link between the layout of orientation preferences around singularities of different signs and the cardinal axes of the retinotopic map, and have confirmed this prediction in the tree shrew primary visual cortex. The researchers say their findings support the idea that spatially structured retinal input may provide a blueprint of sorts for the early development of cortical maps and receptive fields – and that the same may hold true for other senses as well.

Moiré interference of retinal mosaics predicts a link between retinotopic and orientation maps. (A) (Upper) Two hexagonal lattices representing ON- (red) and OFF-center (blue) ganglion cell receptive fields. (Lower) A cortical cell with input dominated by a dipole has a receptive field with side-by-side subregions of opposite sign and can be tuned for orientation. (B) (Upper) The orientation of dipoles in the interference pattern, indicated by the orientation of short line segments, changes over space, generating a blueprint for an orientation map. (Lower) The organization of orientation preferences around negative (Left) and positive (Right) singularities. Image courtesy of PNAS, doi: 10.1073/pnas.1118926109
Professor of Neurobiology and Psychology Dario L. Ringach explains the core of the finding: the hypothesis that orientation maps are initially seeded by the spatial interference of ON- and OFF-center retinal receptive field mosaics predicts a link between the layout of orientation preferences around singularities of different signs and the cardinal axes of the retinotopic map. “The cerebral cortex of higher mammals contains diverse maps,” he tells Medical Xpress, “where information about sensory input or motor planning is laid out systematically across the surface of a given cortical area. Some scientists have postulated that these computational maps are key to cortical function. However, we still do not know exactly what role cortical maps play in normal sensory and motor processes.” How cortical maps are wired during development, he stresses, and what types of pathology may arise from their faulty wiring, are fundamental questions of brain function.
Ringach also notes that neurons in primary visual cortex are selective to the orientation of a stimulus in visual space, and their preference changes systematically across the cortical surface in a periodic fashion. “We know these maps are present at the earliest stages of life,” he continues, “and do not require normal sensory experience to develop – but how do they wire themselves? We’ve postulated that the initial structure of these maps is biased by the spatial organization of the periphery.” In the visual system this is represented by the signals the retina within the eye conveys to the brain.
The researchers’ model postulates that at each location in the visual field, the input from the retina constrains the range of orientation preferences that the cortex can implement at that location. “We show, given what is known about the organization of retinal signals, that such constraints would be quasi-periodic, thereby potentially providing the blueprint for an orientation map in the cortex.” One prediction of this theory is that groups of neurons preferring the same orientation should be arranged on an approximate hexagonal lattice on the cortical surface — a prediction the researchers confirmed in a previous study.
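The moiré idea can be illustrated with a small sketch: two hexagonal mosaics with slightly different spacings are overlaid, and each nearest ON-OFF pair defines a "dipole" whose axis seeds a local orientation preference. The spacings, lattice sizes, and nearest-neighbour dipole rule below are simplifications chosen for illustration, not the published model’s parameters.

```python
import numpy as np

def hex_lattice(spacing, angle, n=15):
    """Points of a hexagonal lattice with given spacing and rotation."""
    i, j = np.meshgrid(np.arange(-n, n), np.arange(-n, n))
    x = spacing * (i + 0.5 * j)
    y = spacing * (np.sqrt(3) / 2) * j
    c, s = np.cos(angle), np.sin(angle)
    return np.c_[c * x.ravel() - s * y.ravel(),
                 s * x.ravel() + c * y.ravel()]

# Two mosaics with slightly different spacings, as in a moiré pattern.
on_cells = hex_lattice(1.00, 0.0)    # ON-center receptive field centers
off_cells = hex_lattice(1.05, 0.0)   # OFF-center receptive field centers

# For each ON cell, the nearest OFF cell forms a dipole whose axis
# predicts the preferred orientation seeded at that location.
orientations = []
for p in on_cells:
    d = off_cells - p
    nearest = d[np.argmin(np.einsum("ij,ij->i", d, d))]
    orientations.append(np.degrees(np.arctan2(nearest[1], nearest[0])) % 180)
orientations = np.array(orientations)
# The dipole orientations vary smoothly and quasi-periodically over
# space, sketching the blueprint for an orientation map.
```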
April 26, 2012 by Larry Hardesty
In this week’s issue of the journal Neurology, researchers at MIT and two Boston hospitals provide early evidence that a simple, unobtrusive wrist sensor could gauge the severity of epileptic seizures as accurately as electroencephalograms (EEGs) do — but without the ungainly scalp electrodes and electrical leads. The device could make it possible to collect clinically useful data from epilepsy patients as they go about their daily lives, rather than requiring them to come to the hospital for observation. And if early results are borne out, it could even alert patients when their seizures are severe enough that they need to seek immediate medical attention.
Rosalind Picard, a professor of media arts and sciences at MIT, and her group originally designed the sensors to gauge the emotional states of children with autism, whose outward behavior can be at odds with what they’re feeling. The sensor measures the electrical conductance of the skin, an indicator of the state of the sympathetic nervous system, which controls the human fight-or-flight response.
In a study conducted at Children’s Hospital Boston, the research team — Picard, her student Ming-Zher Poh, neurologist Tobias Loddenkemper and four colleagues from MIT, Children’s Hospital and Brigham and Women’s Hospital — discovered that the higher a patient’s skin conductance during a seizure, the longer it took for the patient’s brain to resume the neural oscillations known as brain waves, which EEG measures.
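The reported relationship (higher skin conductance, longer post-seizure suppression) is a simple positive correlation. The per-seizure values below are invented for illustration; only the computation itself is the point:

```python
import numpy as np

# Hypothetical per-seizure measurements (illustrative values only):
# peak skin-conductance change (microsiemens) and duration of
# post-seizure EEG brain-wave suppression (seconds).
conductance = np.array([0.5, 1.2, 2.0, 3.1, 4.4, 5.0, 6.3])
suppression = np.array([20., 45., 60., 95., 130., 150., 180.])

# Pearson correlation between the two measures across seizures.
r = np.corrcoef(conductance, suppression)[0, 1]
print(f"Pearson r = {r:.2f}")
```

A strongly positive r of this kind is what would let a wrist-worn conductance reading stand in for the EEG-derived severity measure.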
At least one clinical study has shown a correlation between the duration of brain-wave suppression after seizures and the incidence of sudden unexplained death in epilepsy (SUDEP), a condition that claims thousands of lives each year in the United States alone. With SUDEP, death can occur hours after a seizure.
Currently, patients might use a range of criteria to determine whether a seizure is severe enough to warrant immediate medical attention. One of them is duration. But during the study at Children’s Hospital, Picard says, “what we found was that this severity measure had nothing to do with the length of the seizure.” Ultimately, data from wrist sensors could provide crucial information to patients deciding whether to roll over and go back to sleep or get to the emergency room.
Surprising signals
The realization that the wrist sensors might be of use in treating epilepsy was something of a fluke. “We’d been working with kids on the autism spectrum, and I didn’t realize, but a lot of them have seizures,” Picard says. In reviewing data from their autism studies, Picard and her group found that seizures were sometimes preceded by huge spikes in skin conductance. It seemed that their sensors might actually be able to predict the onset of seizures.
At the time, several MIT students were working in Picard’s lab through MIT’s Undergraduate Research Opportunities Program (UROP); one of them happened to be the daughter of Joseph Madsen, director of the Epilepsy Surgery Program at Children’s Hospital. “I decided it was time to meet my UROP’s dad,” Picard says.
In a project that would serve as the basis of Poh’s doctoral dissertation, Madsen agreed to let the MIT researchers test the sensors on patients with severe epilepsy, who were in the hospital for as much as a week of constant EEG monitoring. Poh and Picard considered several off-the-shelf sensors for the project, but “at the time, there was nothing we could buy that did what we needed,” Picard says. “Finally, we just built our own.”
"It’s a big challenge to make a device robust enough to withstand long hours of recording," Poh says. "We were recording days or weeks in a row." In early versions of the sensors, some fairly common gestures could produce false signals. Eliminating the sensors’ susceptibility to such sources of noise was largely a process of trial and error, Picard says.
Blending in
Additionally, Poh says, “I put a lot of thought into how to make it really comfortable and as nonintrusive as possible. So I packaged it all into typical sweatbands.” Since the patients in the study were children, “I allowed them to choose their favorite character on their wristband — for example, Superman, or Dora the Explorer, whatever they like,” Poh says. “To them, they were wearing a wristband. But there was a lot of complicated sensing going on inside the wristband.” Indeed, Picard says, the researchers actually lost five of their homemade sensors because hospital cleaning staff saw what they thought were ratty sweatbands lying around recently vacated rooms and simply threw them out.
Picard is continuing to investigate the possibility that initially intrigued her — that the devices could predict seizures. In the meantime, however, her collaborators at Children’s Hospital are conducting a study that will follow up on the one reported in Neurology, and a similar study is beginning at Brigham and Women’s Hospital. Rather than sweatbands with TV and comic-book characters, however, the new studies will use sensors produced by Affectiva, a company that Picard started in order to commercialize her lab’s work.
Provided by Massachusetts Institute of Technology
Source: medicalxpress.com
April 26, 2012
(Medical Xpress) — The progression of the debilitating disease Multiple Sclerosis (MS) could be slowed or even halted by blocking a protein that contributes to nerve damage, according to a new study.

Professor Claude Bernard and Dr Steven Petratos
In research published today in the journal Brain, scientists from the Monash Immunology and Stem Cell Laboratories (MISCL), the University of Toronto, Yale and the University of Western Australia, have demonstrated the key role played by the collapsin response mediator protein 2 (CRMP-2) in the development of MS.
Led by MISCL’s Dr Steven Petratos, also of RMIT University, and Professor Claude Bernard, the research team found that a modified version of CRMP-2 is present in active MS lesions, which indicate damage to the nervous system, in a laboratory model of MS.
The modified CRMP-2 interacts with another protein to cause nerve fibre damage that can result in numbness, blindness, difficulties with speech and motor skills, and cognitive impairments in sufferers.
When either the modified CRMP-2 or the interaction between the two proteins was blocked, using a method already approved in both the US and Australia, the progression of the disease was halted.
Director of MISCL, Professor Richard Boyd said the discovery could lead to new treatments for MS.
“Blocking the same protein in people with MS could provide a ‘handbrake’ to the progression of the disease,” Professor Boyd said.
Dr Petratos said the method used to block the protein was approved for the treatment of other disease conditions by both the US Food and Drug Administration and Australia’s Therapeutic Goods Administration.
“This should mean that clinical trials – once they start – will be fast tracked as the form of administration has already been approved,” Dr Petratos said.
MS Australia estimates that the disease affects more than 20,000 people in Australia, and up to 2.5 million worldwide. The disease tends to strike early in adulthood, with women three times more likely than men to be diagnosed. The total cost to the Australian community of the disease is estimated at $1 billion annually.
The research received major funding from the National Multiple Sclerosis Society of the United States of America and partial funding from MS Research Australia.
Provided by Monash University
Source: medicalxpress.com
April 26, 2012
Huntington disease (HD) is an inherited neurodegenerative disorder caused by a defect on chromosome four where, within the Huntingtin gene, a CAG repeat occurs too many times. Most individuals begin experiencing symptoms in their 40s or 50s, but studies have shown that significant brain atrophy occurs several years prior to an official HD diagnosis. As a result, the field has sought a preventive treatment that could be administered prior to the development of actual symptoms that might delay the onset of illness.
Using data from the ongoing PREDICT-HD study and led by Dr. Elizabeth Aylward, author and Associate Director at the Center for Integrative Brain Research, Seattle Children’s Research Institute, researchers examined whether neuroimaging measures can improve the accuracy of prediction of disease onset.
The PREDICT- HD study is an international, multi-site, long-term study of individuals who carry the gene mutation for Huntington disease but entered the study prior to onset of diagnosable motor impairment. Participants underwent structural magnetic resonance imaging (MRI) scans, which allowed for the comparison of individuals who developed HD during the course of the study and those who had not yet been diagnosed with HD.
They found that striatum and white matter volumes in the brain were significantly smaller in individuals diagnosed 1 to 4 years following the initial scan, suggesting that these volumetric measures can assist in determining which individuals are closest to disease onset.
"We believe that the results of this study will be important in designing future clinical trials for individuals who have the Huntington disease gene mutation, but who are not yet showing symptoms. We also believe this group of individuals is well suited for drug intervention studies, as their brain involvement is not as severe as those who have already been diagnosed," said Dr. Aylward.
"Huntington disease can be considered a model neuropsychiatric disorder, since it is caused by a single gene and has such predictable and well-characterized brain changes. It may guide thinking about other disorders with genetic contribution, such as schizophrenia," commented Dr. Christopher A. Ross, co-author and Professor of Psychiatry, Neurology and Neuroscience, Johns Hopkins University. "If we could better understand the natural history of brain changes in schizophrenia, for instance, we may be able to identify genetically vulnerable individuals, and intervene therapeutically, not just to treat symptoms, but to alter the biology and course of the disease."
Dr. John Krystal, Editor of Biological Psychiatry, agreed, noting that “biomarkers of illness progress are critical for all neuropsychiatric disorders.”
For now, these results may enhance the formulas used to calculate age of onset and aid in the planning of future clinical trials aimed at delaying disease onset.
"Identifying individuals who are close to onset of diagnosable symptoms will allow feasible studies that use onset of symptoms as the primary outcome measure to determine if a drug intervention is effective," Dr. Aylward added. "Although it would be unreasonable to suggest that all potential clinical trial participants receive MRI scans one to four years prior to taking part in a trial, there are many individuals who have participated in pre-HD observational studies who already have such data available."
Perhaps more importantly, Dr. Krystal concluded that “the development of good disease staging using MRI in Huntington disease could assist investigators studying novel treatments and affected individuals and family members anxious to learn about disease progress.”
Provided by Elsevier
Source: medicalxpress.com
ScienceDaily (Apr. 26, 2012) — A team led by psychology professor Ian Spence at the University of Toronto reveals that playing an action videogame, even for a relatively short time, causes differences in brain activity and improvements in visual attention.

Playing an action videogame, even for a relatively short time, causes differences in brain activity and improvements in visual attention. (Credit: © j0yce / Fotolia)
Previous studies have found differences in brain activity between action videogame players and non-players, but these could have been attributed to pre-existing differences in the brains of those predisposed to playing videogames and those who avoid them. This is the first time research has attributed these differences directly to playing video games.
Twenty-five subjects — who had not previously played videogames — played a game for a total of 10 hours in one- to two-hour sessions. Sixteen of the subjects played a first-person shooter game and, as a control, nine subjects played a three-dimensional puzzle game.
Before and after playing the games, the subjects’ brain waves were recorded while they tried to detect a target object among other distractions over a wide visual field. Subjects who played the shooter videogame and also showed the greatest improvement on the visual attention task showed significant changes in their brain waves. The remaining subjects — including those who had played the puzzle game — did not.
"After playing the shooter game, the changes in electrical activity were consistent with brain processes that enhance visual attention and suppress distracting information," said Sijing Wu, a PhD student in Spence’s lab in U of T’s Department of Psychology and lead author of the study.
"Studies in different labs, including here at the University of Toronto, have shown that action videogames can improve selective visual attention, such as the ability to quickly detect and identify a target in a cluttered background," said Spence. "But nobody has previously demonstrated that there are differences in brain activity which are a direct result of playing the videogame."
"Superior visual attention is crucial in many important everyday activities," added Spence. "It’s necessary for things such as driving a car, monitoring changes on a computer display, or even avoiding tripping while walking through a room with children’s toys scattered on the floor."
Source: Science Daily