Neuroscience

Articles and news from the latest research reports.


New ‘Flight Simulator’ Technology Gives Neurosurgeons A Peek Inside Brain Before Surgery

NYU Langone Medical Center is now using a novel technology that serves as a “flight simulator” for neurosurgeons, allowing them to rehearse complicated brain surgeries before making an actual incision on a patient.


The new simulator, called the Surgical Rehearsal Platform (SRP), creates an individualized walkthrough for neurosurgeons based on 3D imaging reconstructed from the patient’s CT and MRI scans. Surgeons then plan and rehearse procedures in the software, which combines lifelike tissue reaction with accurate modeling of surgical tools and clamps, letting them navigate multi-angle models of a patient’s brain and vasculature.

The SRP was developed by Surgical Theater of Cleveland, Ohio. This augmented reality technology may help improve safety and efficiency during surgeries for conditions including pituitary tumors, skull base tumors, intrinsic brain tumors, aneurysms, and arteriovenous malformations (AVMs), and could potentially allow surgeons from around the world to simultaneously collaborate on a patient’s case in real-time.
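The starting point for any such platform is turning a stack of 2D scan slices into a navigable 3D volume. As a minimal, hypothetical sketch of that first reconstruction step (NumPy arrays standing in for real DICOM data; this is not Surgical Theater’s actual pipeline):

```python
import numpy as np

def stack_slices(slices):
    """Stack equally sized 2D scan slices (e.g. axial CT/MRI) into a 3D volume."""
    return np.stack(slices, axis=0)

def segment(volume, threshold):
    """Crude intensity threshold to isolate a structure of interest (e.g. bone)."""
    return volume > threshold

# Toy data: 4 slices of 8x8 "intensity" values (0, 1, 2, 3)
slices = [np.full((8, 8), i, dtype=float) for i in range(4)]
volume = stack_slices(slices)
mask = segment(volume, 1.5)
print(volume.shape)  # (4, 8, 8)
print(int(mask.sum()))  # 128 voxels above threshold (the two brightest slices)
```

A real surgical-planning pipeline would add registration between the CT and MRI series, anatomical segmentation, and interactive volume rendering on top of this basic volume representation.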

“We are excited to partner with Surgical Theater to bring their Surgery Rehearsal Platform to our institution,” said John G. Golfinos, MD, chair of the Department of Neurosurgery at NYU Langone Medical Center and associate professor of neurosurgery at NYU School of Medicine. “The reaction of tissue in these 3D images is incredibly life-like, and the modeling of surgical tools is equally impressive. The SRP also will enhance the training of medical students, residents and fellows and help them hone their skills in new and more meaningful ways.”

When using the SRP, surgeons can rehearse a specific patient’s case on computer monitors connected to controllers that simulate surgical tools. For example, when rehearsing surgery for an aneurysm, the SRP reacts realistically when the surgeon virtually applies a clip to the blood vessel. The surgeon can then assess the tissue’s mechanical properties and view realistic microscopic characteristics, including shadowing and texture, to plan approaches, so that by the time the real surgery is performed, doctors have already rehearsed it and have a mental picture of what they will see in the OR.

The SRP obtained clearance from the U.S. Food and Drug Administration (FDA) in February 2013 as pre-operative software for simulating and evaluating surgical treatment options.

In addition, a newer generation of this technology from Surgical Theater, the Surgical Navigation Advanced Platform (SNAP), has an application pending with the FDA that would allow the tool to be taken into the operating room, so surgeons can see behind arteries and other critical structures in real time.

(Source: communications.med.nyu.edu)

Filed under surgical rehearsal platform 3d imaging augmented reality technology medicine science


To Advance Care for Patients with Brain Metastases: Reject Five Myths

A blue-ribbon team of national experts on brain cancer says that professional pessimism and out-of-date “myths,” rather than current science, are guiding — and compromising — the care of patients with cancers that spread to the brain.

In a special article published in the July issue of Neurosurgery, the team, led by an NYU Langone Medical Center neurosurgeon, argues that many key past clinical trials were designed around out-of-date assumptions, and that the tendency of some physicians to “lump together” brain metastases from diverse kinds of cancer often results in less-than-optimal care for individual patients. Furthermore, payers question the best care when it deviates from these misconceptions, the authors conclude.

“It’s time to abandon this unjustifiable nihilism and think carefully about more individualized care,” says lead author of the article, Douglas S. Kondziolka, M.D., MSc, FRCSC, Vice Chair of Clinical Research and Director of the Gamma Knife Program in the Department of Neurosurgery at NYU Langone.

The authors — who also say medical insurers help perpetuate the myths by denying coverage that deviates from them — identify five leading misconceptions that often lead to poorer care:

  1. All tumor cell types act the same way once they spread to the brain. This oversimplification means that doctors assume histologically diverse cancers respond the same way to chemotherapy and are equally sensitive (or insensitive) to radiation. It also means that all patients are assumed to be at the same risk of subsequent brain cancer relapses and of developing additional metastatic lesions, and that survival rates are similar as well. The authors point out that this type of thinking overlooks important biological differences in brain metastases resulting from different types of cancer, such as those originating in the lung, breast or skin.
  2. The number of brain metastases is the best indicator for guiding management of the disease. Such strict adherence to quantity, the authors say, can wrongly limit treatment options. Physicians should look at total tumor burden, including the size and scope of metastases, rather than just how many metastases occur.
  3. All cancers detectable in the brain already reflect the presence of micrometastases, or smaller, newly formed tumors too minuscule to detect. The evidence, the authors say, suggests otherwise, and aggressively monitoring for, and treating, individual brain metastases can, in fact, improve tumor control and patient survival.
  4. Whole brain radiation (WBR) is generally unjustified because it will cause disabling cognitive dysfunction if a patient lives long enough. Dr. Kondziolka and his co-authors say the risks and benefits of WBR should be evaluated for each patient, and that new studies examining the cognitive impact of WBR on thinking and learning are underway.
  5. Most brain metastases cause obvious symptoms, making regular screening for them unnecessary, and unlikely to affect survival. The authors counter that advances in screening allow metastases to be detected earlier, and treated sooner, before symptoms occur.

“We are in an era of personalized medicine,” Dr. Kondziolka says, “and we need to begin thinking that way.” The authors further write: “It is time for fresh thinking and new critical analyses,” urging consideration of updated clinical trial designs that include comparison of matched cohorts and cost effectiveness factors. In addition to research that pays more attention to specific cell types and overall tumor burden, investigators should focus on tools available from advances in molecular biology and genetic subtyping and on efforts to learn “why some patients with a given primary cancer develop brain tumors and others do not.”

Ultimately, the authors hope better stratifying patients will improve care for patients with diverse brain metastases.

Filed under brain cancer tumor cells medicine science


Fatal cell malfunction ID’d in Huntington’s disease

Researchers believe they have learned how mutations in the gene that causes Huntington’s disease kill brain cells, a finding that could open new opportunities for treating the fatal disorder. Scientists first linked the gene to the inherited disease more than 20 years ago.


Huntington’s disease affects five to seven people out of every 100,000. Symptoms, which typically begin in middle age, include involuntary jerking movements, disrupted coordination and cognitive problems such as dementia. Drugs cannot slow or stop the progressive decline caused by the disorder, which leaves patients unable to walk, talk or eat.

Lead author Hiroko Yano, PhD, of Washington University School of Medicine in St. Louis, found in mice and in mouse brain cell cultures that the disease impairs the transfer of proteins to energy-making factories inside brain cells. The factories, known as mitochondria, need these proteins to maintain their function. When disruption of the supply line disables the mitochondria, brain cells die.

“We showed the problem could be fixed by making cells overproduce the proteins that make this transfer possible,” said Yano, assistant professor of neurological surgery, neurology and genetics. “We don’t know if this will work in humans, but it’s exciting to have a solid new lead on how this condition kills brain cells.”

The findings are available online in Nature Neuroscience.

Huntington’s disease is caused by a defect in the huntingtin gene, which makes the huntingtin protein. Life expectancy after initial onset is about 20 years.

Scientists have known for some time that the mutated form of the huntingtin protein impairs mitochondria and that this disruption kills brain cells. But they have had difficulty understanding specifically how the gene harms the mitochondria.

For the new study, Yano and collaborators at the University of Pittsburgh worked with mice that were genetically modified to simulate the early stages of the disorder.

Yano and her colleagues found that the mutated huntingtin protein binds to a protein complex called TIM23. This complex normally helps transfer essential proteins and other supplies to the mitochondria. The researchers discovered that the mutated huntingtin protein impairs that process.

The problem seems to be specific to brain cells early in the disease. At the same point in the disease process, the scientists found no evidence of impairment in liver cells, which also produce the mutated huntingtin protein.

The researchers speculated that brain cells might be particularly reliant on their mitochondria to power the production and recycling of the chemical signals they use to transmit information. This reliance could make the cells vulnerable to disruption of the mitochondria.

Other neurodegenerative conditions, including Alzheimer’s disease and amyotrophic lateral sclerosis, also known as Lou Gehrig’s disease, have been linked to problems with mitochondria. Scientists may be able to build upon these new findings to better understand these disorders.

(Source: news.wustl.edu)

Filed under huntington’s disease huntingtin mitochondria brain cells gene mutation neuroscience science


Cocoa Extract May Counter Specific Mechanisms of Alzheimer’s Disease

A specific preparation of cocoa extract called Lavado may reduce damage to nerve pathways seen in the brains of Alzheimer’s disease patients long before they develop symptoms, according to a study conducted at the Icahn School of Medicine at Mount Sinai and published June 20 in the Journal of Alzheimer’s Disease (JAD).

image

Specifically, the study results, using mice genetically engineered to mimic Alzheimer’s disease, suggest that Lavado cocoa extract prevents the protein β-amyloid (Aβ) from gradually forming sticky clumps in the brain, which are known to damage nerve cells as Alzheimer’s disease progresses.

Lavado cocoa is especially rich in polyphenols, antioxidants also found in fruits and vegetables, and past studies suggest that polyphenols help prevent degenerative diseases of the brain.

The Mount Sinai study results revolve around synapses, the gaps between nerve cells. Within healthy nerve pathways, each nerve cell sends an electric pulse down itself until it reaches a synapse where it triggers the release of chemicals called neurotransmitters that float across the gap and cause the downstream nerve cell to “fire” and pass on the message.

Disease-causing Aβ oligomers – groups of molecules loosely attracted to each other – build up around synapses. The theory is that these sticky clumps physically interfere with synaptic structures and disrupt mechanisms that maintain memory circuits’ fitness. In addition, Aβ triggers immune inflammatory responses, as in an infection, bringing on a rush of chemicals and cells meant to destroy invaders but that damage our own cells instead.

“Our data suggest that Lavado cocoa extract prevents the abnormal formation of Aβ into clumped oligomeric structures, to prevent synaptic insult and eventually cognitive decline,” says lead investigator Giulio Maria Pasinetti, MD, PhD, Saunders Family Chair and Professor of Neurology at the Icahn School of Medicine at Mount Sinai. “Given that cognitive decline in Alzheimer’s disease is thought to start decades before symptoms appear, we believe our results have broad implications for the prevention of Alzheimer’s disease and dementia.”

The evidence in the current study is the first to suggest that adequate quantities of specific cocoa polyphenols in the diet over time may prevent Aβ from clumping into oligomers that damage the brain, and thereby help prevent Alzheimer’s disease.

The research team led by Dr. Pasinetti tested the effects of extracts from Dutched, Natural, and Lavado cocoa, which contain different levels of polyphenols. Each cocoa type was evaluated for its ability to reduce the formation of Aβ oligomers and to rescue synaptic function. Lavado extract, which has the highest polyphenol content and anti-inflammatory activity among the three, was also the most effective in both reducing formation of Aβ oligomers and reversing damage to synapses in the study mice.  

“There have been some inconsistencies in medical literature regarding the potential benefit of cocoa polyphenols on cognitive function,” says Dr. Pasinetti. “Our finding of protection against synaptic deficits by Lavado cocoa extract, but not Dutched cocoa extract, strongly suggests that polyphenols are the active component that rescue synaptic transmission, since much of the polyphenol content is lost by the high alkalinity in the Dutching process.”  

Because loss of synaptic function may have a greater role in memory loss than the loss of nerve cells, rescue of synaptic function may serve as a more reliable target for an effective Alzheimer’s disease drug, said Dr. Pasinetti.

The new study provides experimental evidence that Lavado cocoa extract may influence Alzheimer’s disease mechanisms by modifying the physical structure of Aβ oligomers. It also strongly supports further studies to identify the metabolites of Lavado cocoa extract that are active in the brain and identify potential drug targets.

In addition, turning cocoa-based Lavado into a dietary supplement may provide a safe, inexpensive and easily accessible means to prevent Alzheimer’s disease, even in its earliest, asymptomatic stages.

(Source: mountsinai.org)

Filed under alzheimer's disease beta amyloid cocoa extract synapses memory neuroscience science


Study shows puzzle games can improve mental flexibility

A recent study by Nanyang Technological University (NTU) scientists showed that adults who played the physics-based puzzle video game Cut the Rope regularly, for as little as an hour a day, had improved executive functions.

The executive functions in your brain are important for making decisions in everyday life when you have to deal with sudden changes in your environment – better known as thinking on your feet. An example would be when the traffic light turns amber and a driver has to decide in an instant whether he will be able to brake in time or whether it is safer to travel across the intersection.

The video game study by Assistant Professor Michael D. Patterson and his PhD student Mr Adam Oei tested four different games for the mobile platform, as their previous research had shown that different games train different skills.

The games varied in their genres, which included a first person shooter (Modern Combat); arcade (Fruit Ninja); real-time strategy (StarFront Collision); and a complex puzzle (Cut the Rope).

NTU undergraduates, who were non-gamers, were then selected to play an hour a day, 5 days a week on their iPhone or iPod Touch. This video game training lasted for 4 weeks, a total of 20 hours.

Prof Patterson said students who played Cut the Rope showed significant improvement on executive function tasks, while no significant improvements were observed in those playing the other three games.

“This finding is important because previously, no video games have demonstrated this type of broad improvement to executive functions, which are important for general intelligence, dealing with new situations and managing multitasking,” said Prof Patterson, an expert in the psychology of video games.

“This indicates that while some games may help to improve mental abilities, not all games give you the same effect. To improve the specific ability you are looking for, you need to play the right game,” added Mr Oei.

The abilities tested in this study included how fast the players could switch tasks (an indicator of mental flexibility); how fast the players could adapt to a new situation instead of relying on the same strategy (the ability to inhibit prepotent or predominant responses); and how well they could focus on information while blocking out distractors or inappropriate responses (measured by what is known as the Flanker task in cognitive psychology).
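The last of these, the flanker paradigm, is easy to picture: the participant responds to a central target while ignoring surrounding distractors that either match it (congruent) or conflict with it (incongruent). A minimal, hypothetical sketch of how such arrow stimuli are constructed (not the study’s actual task code):

```python
import random

def make_flanker_trial(congruent, rng):
    """Build one arrow-flanker stimulus; the participant responds to the CENTER arrow.

    Congruent trials look like '<<<<<' (flankers match the target);
    incongruent trials look like '<<><<' (flankers point the other way).
    """
    target = rng.choice("<>")
    flank = target if congruent else ("<" if target == ">" else ">")
    return flank * 2 + target + flank * 2, target

rng = random.Random(0)
stim, target = make_flanker_trial(congruent=False, rng=rng)
print(stim, "-> respond to center:", target)
```

Performance is typically scored as the slowdown (or error increase) on incongruent relative to congruent trials, which indexes how well distractors are suppressed.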

Prof Patterson said the reason Cut the Rope improved executive function in their players was probably due to the game’s unique puzzle design. Strategies which worked for earlier levels would not work in later levels, and regularly forced the players to think creatively and try alternate solutions. This is unlike most other video games which keep the same general mechanics and goals, and just speed up or increase the number of items to keep track of. 

After 20 hours of game play, players of Cut the Rope could switch between tasks 33 per cent faster, were 30 per cent faster in adapting to new situations, and 60 per cent better in blocking out distractions and focusing on the tasks at hand than before training.
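Percentages like these are ordinarily computed from before- and after-training response times. A small illustration with hypothetical timings (the article reports only the percentages, not the raw times):

```python
def percent_faster(before, after):
    """Percent speed-up when a task's completion time drops from `before` to `after` seconds."""
    return (before - after) / before * 100

# Hypothetical reaction times; only the resulting percentages match the article.
print(round(percent_faster(1.00, 0.67)))  # 33 (% faster task switching)
print(round(percent_faster(1.00, 0.70)))  # 30 (% faster adaptation to new situations)
```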

All three tests were done one week after the 52 students had finished playing their assigned game, to ensure that these were not temporary gains due to motivation or arousal effects.

The study will be published in the academic journal Computers in Human Behavior this August, but is already available online. This is the first study to show broad transfer to several different executive functions, providing further evidence that video games can be effective in training human cognition.

“This result could have implications in many areas such as educational, occupational and rehabilitative settings,” Prof Patterson said.

“In future, with more studies, we will be able to know what type of games improves specific abilities, and prescribe games that will benefit people aside from just being entertainment.”

In their previous study, published last year in PLoS ONE, a top academic journal, Prof Patterson and Mr Oei studied the effects mobile gaming had on 75 NTU undergraduates.

The non-gamers were instructed to play one of the following games: “match three” game Bejeweled, virtual life simulation game The Sims, and action shooter Modern Combat.

The study findings showed that adults who played action games improved their ability to track multiple objects in a short span of time, useful when driving during a busy rush hour, while other games improved the participants’ ability in visual search tasks, useful when picking out an item in a large supermarket.

Moving forward, Prof Patterson is keen to look at whether experienced adult gamers show any improvement from playing such games, and how much improvement one can make through playing games.

Filed under executive function video games cognition psychology neuroscience science


How the brain processes visual information

MSU’s Behrad Noudoost was a co-author, with Marc Zirnsak and other neuroscientists from the Tirin Moore Lab at Stanford University, of a recent paper on the research in Nature, an international weekly journal for the natural sciences.

Noudoost and the team studied saccadic eye movements, the movements in which the eye jumps from one point of focus to another, in an effort to determine how the brain handles these jumps without overwhelming us with visual information.

To introduce the study, Noudoost first gets his audience to think about eye movements at the most basic level. “Look in the mirror and stare at one eye,” Noudoost said. “Then look at the other eye. We are essentially blind during eye movement, as we cannot see our eyes move, even though we know they did.”

According to Noudoost, scientists have been trying to learn exactly how the brain processes visual stimuli during saccadic eye movement, and this research offers new evidence that the prefrontal cortex of the brain is responsible for visual stability.

"Visual stability is what keeps our vision stable in spite of changing input. It is similar to the stabilizer button on a video camera," Noudoost said.

"We wanted to know what causes the brain to filter out unnecessary information when we shift our vision from one focal target to another," Noudoost said. "Without that filter the visual information would overwhelm us."

According to the scientists, the study offers evidence that neurons in the prefrontal cortex start processing information in anticipation of where we are going to look before we ever do it, suggesting that selective processing might be the mechanism for visual stability.

Noudoost said this new information can help scientists better understand the underlying causes of problems such as dyslexia and attention deficit disorders.

According to Frances Lefcort, the head of the Department of Cell Biology and Neuroscience, the team’s basic research may have implications for understanding a myriad of mental health issues.

"Schizophrenia and attention deficit disorders have been linked to visual stability, so the work Behrad is doing offers valuable knowledge to other scientists working in cognitive neuroscience," Lefcort said.

"Understanding how a healthy brain works is important in terms of knowing its impact on cognitive functions such as memory, learning and, in this case, attention," Noudoost said. "By exploring normal brain function, we can better understand what happens in someone with a mental illness."

According to Lefcort, Noudoost and neuroscience professor Charles Gray are strengthening MSU’s contribution to the field of cognitive neuroscience.

"Behrad is an exquisitely trained neuroscientist. He offers students a viewpoint as both a scientist and a physician," Lefcort said. "We are thrilled to have him; he has already brought new energy and is bolstering our impact on the growing field of brain research."

Noudoost joined MSU’s Department of Cell Biology and Neuroscience last summer from Stanford University and has already been awarded a $225,000 Whitehall Foundation grant for neuroscience. Whitehall Foundation grants are awarded to established scientists working in neurobiology.

"I am colorblind, and I wanted to see the world as others could see it," Noudoost said, explaining why he was first drawn to this type of research. "Although I still don’t see the world in the same colors as everyone else, I am more amazed every day by the brain."

How the brain processes visual information

MSU’s Behrad Noudoost was a co-author with Marc Zirnsak and other neuroscientists from the Tirin Moore Lab at Stanford University in publishing a recent paper on the research in Nature, an international weekly journal for natural sciences.

Noudoost and the team studied saccadic eye movements, the rapid jumps the eye makes from one point of focus to another, in an effort to determine how the brain accomplishes these shifts without being overwhelmed by too much visual information.

To introduce the study, Noudoost first gets his audience to think about eye movements at the most basic level. “Look in the mirror and stare at one eye,” Noudoost said. “Then look at the other eye. We are essentially blind during eye movement as we cannot see our eyes move, even though we know they did.”

According to Noudoost, scientists have long tried to learn exactly how the brain handles visual stimuli during saccadic eye movements, and this research offers new evidence that the prefrontal cortex is responsible for visual stability.

"Visual stability is what keeps our vision stable in spite of changing input. It is similar to the stabilizer button on a video camera," Noudoost said.

"We wanted to know what causes the brain to filter out unnecessary information when we shift our vision from one focal target to another," Noudoost said. "Without that filter the visual information would overwhelm us."

According to the scientists, the study offers evidence neurons in the prefrontal cortex of the brain start processing information in anticipation of where we are going to look before we ever do it, suggesting that selective processing might be the mechanism for visual stability.
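The anticipatory processing described here can be illustrated with a toy model. The sketch below is not the study's actual analysis; the receptive-field coordinates and the shift fraction are invented purely for illustration of the idea that neurons begin processing a location before the eyes arrive there:

```python
# Toy sketch of anticipatory processing before a saccade: a neuron's
# receptive field (RF) is modeled as shifting toward the upcoming saccade
# target, so the neuron starts processing the location the eyes are about
# to fixate. All numbers and the 0.5 shift fraction are illustrative.

def remap_toward_target(rf_center, target, fraction=0.5):
    """Move an RF center a given fraction of the way toward the saccade target."""
    return tuple(c + fraction * (t - c) for c, t in zip(rf_center, target))

current_rf = (5.0, 0.0)        # RF centered 5 degrees right of fixation
saccade_target = (15.0, 0.0)   # eyes about to jump 15 degrees right

print(remap_toward_target(current_rf, saccade_target))  # (10.0, 0.0)
```

In this toy picture, the neuron's processing has already moved partway toward the future gaze position before the eye movement begins, which is one way selective, anticipatory processing could support visual stability.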

Noudoost said this new information can help scientists better understand the underlying causes of problems such as dyslexia and attention deficit disorders.

According to Frances Lefcort, the head of the Department of Cell Biology and Neuroscience, the team’s basic research may have implications for understanding a myriad of mental health issues.

"Schizophrenia and attention deficit disorders have been linked to visual stability, so the work Behrad is doing offers valuable knowledge to other scientists working in cognitive neuroscience," Lefcort said.

"Understanding how a healthy brain works is important in terms of knowing its impact on cognitive functions such as memory, learning and in this case attention," Noudoost said. "By exploring normal brain function, we can better understand what happens in someone with a mental illness."

According to Lefcort, Noudoost and neuroscience professor Charles Gray are strengthening MSU’s contribution to the field of cognitive neuroscience.

"Behrad is an exquisitely trained neuroscientist. He offers students a viewpoint as both scientist and a physician," Lefcort said. "We are thrilled to have him and he has already brought new energy and is bolstering our impact on the growing field of brain research."

Noudoost joined MSU’s Department of Cell Biology and Neuroscience last summer from Stanford University and has already been awarded a $225,000 Whitehall Foundation grant for neuroscience. Whitehall Foundation grants are awarded to established scientists working in neurobiology.

"I am colorblind and I wanted to see the world as others could see it," Noudoost said, explaining why he was first drawn to this type of research. "Although I still don’t see the world in the same colors as everyone else, I am more amazed every day by the brain."

Filed under eye movements prefrontal cortex visual processing visual system mental illness neuroscience science

175 notes

Scientists show how bigger brains could help us see better

It has become increasingly common to hear reports that big brains are unnecessary, or even an evolutionary fluke. However, the new article found that increases in the size of brain areas, such as the visual cortex, are an essential element of evolution.


As part of the study, the researchers found that an increase in the size of the visual part of the brain in different primate species, including humans, apes, and monkeys, is associated with enhanced visual processing.

It is controversial whether overall brain size can predict intelligence. However, the size of specialised areas within the brain is associated with specific changes in behaviour, such as reduced susceptibility to visual illusions and increased visual acuity, the fineness of detail that can be seen.

First author Dr Alexandra de Sousa explained: “Primates with a bigger visual cortex have better visual resolution (the precision of vision) and reduced visual illusion strength. In essence, the bigger the brain area, the better the visual processing ability.

“The size of brain areas predicts not only the number of neurons (brain cells) in that area, but also the likelihood of connections between neurons. These connections allow for increasingly complex computations to be made that allow for more accurate, and more difficult, visual perception.”
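Dr de Sousa's point, that area size predicts both the number of neurons and the likelihood of connections between them, can be sketched numerically. The snippet below is a toy scaling model; the neuron density constant and the linear volume-to-neuron relationship are illustrative assumptions, not measured values from the paper:

```python
# Toy scaling sketch: if neuron count grows roughly in proportion to the
# volume of a brain area, the number of *possible* pairwise connections
# grows roughly with the square of the neuron count (n * (n - 1) / 2).

def neurons_from_volume(volume_mm3, density_per_mm3=50_000):
    """Illustrative neuron count; the density constant is an assumption."""
    return int(volume_mm3 * density_per_mm3)

def possible_connections(n_neurons):
    """Upper bound on distinct neuron pairs that could be connected."""
    return n_neurons * (n_neurons - 1) // 2

small = neurons_from_volume(10)   # 10 mm^3 area
large = neurons_from_volume(20)   # doubled volume

# Doubling the area roughly quadruples the potential connections,
# one intuition for why bigger areas can support richer computation.
print(possible_connections(large) / possible_connections(small))
```

The ratio printed is close to 4, illustrating the quadratic growth of connection possibilities that the quoted framework ties to more complex visual computation.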

Co-author Dr Michael Proulx, Senior Lecturer (Associate Professor) in Psychology, added: “This paper is a novel attempt to bring together the micro and macro anatomy of the brain with behaviour. We link visual abilities, the size of brain areas, and the number of neurons that make up those brain areas to provide a framework that ties brain structure and function together.

“The theory of brain size that we discuss can be tested in the future with more behavioural tests of other species, gathering more comparative neuroanatomical data, and by testing other senses and multi-sensory perception, too. We might be able to even predict how well extinct species could sense the world based on fossil data.”

For the study, Dr Alexandra de Sousa, an expert in brain evolution, provided brain size measurements from her own and others’ neuroanatomical research. Dr Michael Proulx, an expert in perception, found psychological studies of visual illusions and visual acuity in the same species or genera of animals.

The paper ‘What can volumes reveal about human brain evolution? A framework for bridging behavioral, histometric and volumetric perspectives’ is published today in Frontiers in Neuroanatomy – an online, open access journal.

(Source: bath.ac.uk)

Filed under visual cortex vision brain size evolution brain cells neuroscience science

235 notes


People with tinnitus process emotions differently from their peers

Patients with persistent ringing in the ears – a condition known as tinnitus – process emotions differently in the brain from those with normal hearing, researchers report in the journal Brain Research.

Tinnitus afflicts 50 million people in the United States, according to the American Tinnitus Association, and causes those with the condition to hear noises that aren’t really there. These phantom sounds are not speech, but rather whooshing noises, train whistles, cricket noises or whines. Their severity often varies day to day.

University of Illinois speech and hearing science professor Fatima Husain, who led the study, said previous studies showed that tinnitus is associated with increased stress, anxiety, irritability and depression, all of which are affiliated with the brain’s emotional processing systems.

“Obviously, when you hear annoying noises constantly that you can’t control, it may affect your emotional processing systems,” Husain said. “But when I looked at experimental work done on tinnitus and emotional processing, especially brain imaging work, there hadn’t been much research published.”

She decided to use functional magnetic resonance imaging (fMRI) brain scans to better understand how tinnitus affects the brain’s ability to process emotions. These scans show the areas of the brain that are active in response to stimulation, based upon blood flow to those areas.

Three groups of participants were used in the study: people with mild-to-moderate hearing loss and mild tinnitus; people with mild-to-moderate hearing loss without tinnitus; and a control group of age-matched people without hearing loss or tinnitus. Each person was put in an fMRI machine and listened to a standardized set of 30 pleasant, 30 unpleasant and 30 emotionally neutral sounds (for example, a baby laughing, a woman screaming and a water bottle opening). The participants pressed a button to categorize each sound as pleasant, unpleasant or neutral.
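The three-group design and the 90-sound categorization task lend themselves to a simple analysis sketch. The snippet below uses entirely synthetic reaction times (invented for illustration, not the study's data) to show how mean reaction time per sound category might be compared within each group:

```python
from statistics import mean

# Synthetic reaction times (seconds) illustrating the analysis structure:
# mean RT per sound category, computed within each participant group.
trials = [
    # (group, sound_category, reaction_time_s) -- all values invented
    ("tinnitus", "pleasant", 1.10), ("tinnitus", "neutral", 1.25),
    ("tinnitus", "unpleasant", 1.12), ("normal", "pleasant", 0.95),
    ("normal", "neutral", 1.05), ("normal", "unpleasant", 0.93),
]

def mean_rt(group, category):
    """Mean reaction time for one group and one sound category."""
    return mean(rt for g, c, rt in trials if g == group and c == category)

# In this invented data, emotion-inducing sounds draw faster responses
# than neutral ones, mirroring the pattern reported for the tinnitus
# and normal-hearing groups.
print(mean_rt("tinnitus", "pleasant") < mean_rt("tinnitus", "neutral"))  # True
```

The real analysis would aggregate many trials per participant and test the group differences statistically; this sketch only shows the shape of the comparison.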

The tinnitus and normal-hearing groups responded more quickly to emotion-inducing sounds than to neutral sounds, while patients with hearing loss had a similar response time to each category of sound. Overall, the tinnitus group’s reaction times were slower than those of the normal-hearing group.

Activity in the amygdala, a brain region associated with emotional processing, was lower in the tinnitus and hearing-loss patients than in people with normal hearing. Tinnitus patients also showed more activity than normal-hearing people in two other brain regions associated with emotion, the parahippocampus and the insula. The findings surprised Husain.

“We thought that because people with tinnitus constantly hear a bothersome, unpleasant stimulus, they would have an even higher amount of activity in the amygdala when hearing these sounds, but it was lesser,” she said. “Because they’ve had to adjust to the sound, some plasticity in the brain has occurred. They have had to reduce this amygdala activity and reroute it to other parts of the brain because the amygdala cannot be active all the time due to this annoying sound.”

Because of the sheer number of people who suffer from tinnitus in the United States, a group that includes many combat veterans, Husain hopes her group’s future research will be able to increase tinnitus patients’ quality of life.

“It’s a communication issue and a quality-of-life issue,” she said. “We want to know how we can get better in the clinical realm. Audiologists and clinicians are aware that tinnitus affects emotional aspects, too, and we want to make them aware that these effects are occurring so they can better help their patients.”

Filed under tinnitus emotions amygdala neuroimaging hearing neuroscience science

99 notes


Researchers publish one of the longest longitudinal studies of cognition in MS

Researchers at Kessler Foundation and the Cleveland Clinic have published one of the longest longitudinal studies of cognition in multiple sclerosis (MS). The article, “Cognitive impairment in multiple sclerosis: An 18-year follow-up study,” was published online in Multiple Sclerosis and Related Disorders on April 13, 2014. The results provide insight into the natural evolution of cognitive changes over time, an important consideration for researchers and clinicians. The authors are Lauren B. Strober, PhD, of Kessler Foundation, and Stephen M. Rao, PhD, Jar-Chi Lee, Elizabeth Fisher, PhD, and Richard Rudick, MD, of the Cleveland Clinic.

“While cognitive impairment is known to affect 40 to 65% of individuals with MS, few studies have followed the pattern of cognitive decline over time, which is important for understanding long-term care and outcomes associated with MS,” said Dr. Strober, senior research scientist at Kessler Foundation. “Our study was based on a unique sample of 22 patients who underwent neuropsychological testing at entry into the original phase 3 clinical trial of intramuscular interferon beta-1a, and again at 18-year followup.”

At baseline, 9 patients (41%) had cognitive impairment; at 18-year follow-up, 13 patients (59%) were found to be impaired. Significant declines over time were found in information processing speed, auditory attention, memory, episodic learning and visual construction. Decline was steeper in the initially unimpaired group than in the impaired group, as indicated by the Symbol Digit Modalities Test (SDMT).
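As a quick arithmetic check, the reported impairment percentages follow directly from the 22-patient sample:

```python
# Verify the reported impairment proportions from the 22-patient sample.
n_patients = 22
impaired_baseline = 9
impaired_followup = 13

pct_baseline = round(100 * impaired_baseline / n_patients)   # 41
pct_followup = round(100 * impaired_followup / n_patients)   # 59

print(pct_baseline, pct_followup)  # 41 59
```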

"These longitudinal data contribute substantially to our knowledge of the course of cognitive decline in MS,” noted John DeLuca, PhD, VP of Research & Training at Kessler Foundation. “In light of the young age at diagnosis, this perspective is fundamental to the development of rehabilitation strategies that meet the needs of people dealing with the cognitive effects of MS.”

The study was funded by Biogen Idec.

Filed under MS cognitive impairment cognition psychology neuroscience science

169 notes

Neural sweet talk: Taste metaphors emotionally engage the brain

So accustomed are we to metaphors related to taste that when we hear a kind smile described as “sweet,” or a resentful comment as “bitter,” we most likely don’t even think of those words as metaphors. But while it may seem to our ears that “sweet” by any other name means the same thing, new research shows that taste-related words actually engage the emotional centers of the brain more than literal words with the same meaning.

Researchers from Princeton University and the Free University of Berlin report in the Journal of Cognitive Neuroscience the first study to experimentally show that the brain processes these everyday metaphors differently than literal language. In the study, participants read 37 sentences that included common metaphors based on taste while the researchers recorded their brain activity. Each taste-related word was then swapped with a literal counterpart so that, for instance, “She looked at him sweetly” became “She looked at him kindly.”

The researchers found that the sentences containing words that invoked taste activated areas known to be associated with emotional processing, such as the amygdala, as well as the areas known as the gustatory cortices that allow for the physical act of tasting. Interestingly, the metaphorical and literal words only resulted in brain activity related to emotion when part of a sentence, but stimulated the gustatory cortices both in sentences and as stand-alone words.

Metaphorical sentences may spark increased brain activity in emotion-related regions because they allude to physical experiences, said co-author Adele Goldberg, a Princeton professor of linguistics in the Council of the Humanities. Human language frequently uses physical sensations or objects to refer to abstract domains such as time, understanding or emotion, Goldberg said. For instance, people liken love to a number of afflictions including being “sick” or shot through the heart with an arrow. Similarly, “sweet” has a much clearer physical component than “kind.” The new research suggests that these associations go beyond just being descriptive to engage our brains on an emotional level and potentially amplify the impact of the sentence, Goldberg said.

"You begin to realize when you look at metaphors how common they are in helping us understand abstract domains," Goldberg said. "It could be that we are more engaged with abstract concepts when we use metaphorical language that ties into physical experiences."

If metaphors in general elicit an emotional response from the brain that is similar to that caused by taste-related metaphors, then that could mean that figurative language presents a “rhetorical advantage” when communicating with others, explained co-author Francesca Citron, a postdoctoral researcher of psycholinguistics at the Free University’s Languages of Emotion research center.

"Figurative language may be more effective in communication and may facilitate processes such as affiliation, persuasion and support," Citron said. "Further, as a reader or listener, one should be wary of being overly influenced by metaphorical language."

Colloquially, metaphors seem to be employed precisely to evoke an emotional reaction, yet the actual emotional effect of figurative phrases on the person hearing them has not before been deeply explored, said Benjamin Bergen, an associate professor of cognitive science at the University of California-San Diego who studies language comprehension, and metaphorical language and thought.

"There’s a lot of research on the conceptual effects of metaphors, such as how they allow people to think about new or abstract concepts in terms of concrete things they’re familiar with. But there’s very little work on the emotional impact of metaphor," said Bergen, who had no role in the research but is familiar with it.

"Emotional impact seems to be one of the main reasons people use metaphors to begin with. For instance, a senator might describe a bill as ‘job-killing’ to evoke an emotional reaction," he said. "These results suggest that using certain metaphorical expressions induces more of an emotional reaction than saying the same thing literally. Those expressions that have this property are likely to have the effects on reasoning, inference, judgment and decision-making that emotion is known to have."

The brain areas that taste-related words did not stimulate are also an important outcome of the study, Citron said. Existing research on metaphors and neural processing has shown that figurative language generally requires more brainpower than literal language, Citron and Goldberg wrote. But those bursts of neural activity have been attributed to the higher-order processing needed to work through an unfamiliar metaphor.

The brain activity Citron and Goldberg observed did not correlate with this process. In order to create the metaphorical- and literal-sentence stimuli, they had a group of people separate from the study participants rate sentences for familiarity, apparent arousal, imageability — which is how easily a phrase can be imagined in the reader’s mind — and how positive or negative each sentence was interpreted as being. The metaphorical and literal sentences were equal on all of these factors. In addition, each metaphorical phrase and its literal counterpart were rated as being highly similar in meaning.

These steps helped to ensure that the metaphorical and literal sentences were equally easy to comprehend. Thus, the brain activity the researchers recorded was unlikely to reflect any extra difficulty participants had in understanding the metaphors.

"It is important to rule out possible effects of familiarity, since less familiar items may require more processing resources to be understood and elicit enhanced brain responses in several brain regions," Citron said.

Citron and Goldberg plan to follow up on their results by examining whether figurative language is remembered more accurately than literal language, whether metaphors are more physically stimulating, and whether metaphors related to other senses also provoke an emotional response from the brain.

Filed under brain activity taste metaphorical expressions amygdala emotions psychology neuroscience science
