Neuroscience

Articles and news from the latest research reports.


Researchers identify new vision of how we explore our world
Brain researchers at Barrow Neurological Institute have discovered that we explore the world with our eyes in a different way than previously thought. Their results advance our understanding of how healthy observers and neurological patients interact and glean critical information from the world around them.
The research team was led by Dr. Susana Martinez-Conde, Director of the Laboratory of Visual Neuroscience at Barrow, in collaboration with fellow Barrow Neurological Institute researchers Jorge Otero-Millan, Rachel Langston, and Dr. Stephen Macknik, Director of the Laboratory of Behavioral Neurophysiology. The study, titled “An oculomotor continuum from exploration to fixation”, was published in the Proceedings of the National Academy of Sciences.
Previously, scientists thought that we sample visual information from the world in two main different modes: exploration and fixation. “We used to think that we make large eye movements to search for objects of interest, and then fix our gaze to see them with high detail,” says Martinez-Conde. “But now we know that’s not quite right.”
The discovery shows that even during visual fixation, we are actually scanning visual details with small eye movements — just like we explore visual scenes with big eye movements, but on a smaller scale. This means that exploration and fixation are two ends of the same continuum of oculomotor scanning.
Subjects viewed natural images while the team measured their eye movements with high-speed eye tracking. The images ranged in size from massive (presented on a room-sized video monitor in Barrow Neurological Institute’s Eller Telepresence Room, normally used by Barrow’s surgeons to collaborate on brain surgeries with colleagues around the world) to just half the width of your thumbnail.
In all cases, the researchers found that subjects’ eyes scanned the scenes with the same general strategy, along a smooth continuum of dynamical changes. “There was no abrupt change in the characteristics of the eye movements, whether the visual scenes were huge or tiny, or even when the subjects were fixing their gaze. That means that the brain controls eye movements in the same way when we explore and when we fixate,” said Dr. Martinez-Conde.
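One way to picture the continuum claim: if exploration and fixation were truly separate modes, eye-movement sizes pooled across conditions should split into two distinct clusters, whereas a single smooth distribution supports one scanning strategy. The sketch below shows a standard velocity-threshold way of pulling saccade amplitudes out of an eye-tracking trace; the synthetic data, sampling rate, and 30°/s threshold are illustrative assumptions, not the study's actual parameters:

```python
import numpy as np

def detect_saccades(gaze, fs=500.0, vel_thresh=30.0):
    """Velocity-threshold saccade detection.

    gaze: (N, 2) array of gaze positions in degrees, sampled at fs Hz.
    Returns a boolean mask of saccadic samples and the amplitude
    (start-to-end distance, in degrees) of each detected saccade.
    """
    vel = np.linalg.norm(np.diff(gaze, axis=0), axis=1) * fs  # deg/s
    moving = vel > vel_thresh
    amplitudes = []
    start = None
    for i, m in enumerate(moving):
        if m and start is None:
            start = i                      # saccade onset
        elif not m and start is not None:
            amplitudes.append(np.linalg.norm(gaze[i] - gaze[start]))
            start = None
    if start is not None:                  # saccade ran to end of trace
        amplitudes.append(np.linalg.norm(gaze[-1] - gaze[start]))
    return moving, amplitudes

# Tiny synthetic trace: fixational jitter, one ~5-degree saccade, more jitter.
rng = np.random.default_rng(0)
fix1 = rng.normal(0.0, 0.01, size=(100, 2))
jump = np.linspace([0.0, 0.0], [5.0, 0.0], 10)
fix2 = np.array([5.0, 0.0]) + rng.normal(0.0, 0.01, size=(100, 2))
gaze = np.vstack([fix1, jump, fix2])
_, amps = detect_saccades(gaze)
```

On real data, comparing histograms of `amps` across image sizes, from wall-sized down to thumbnail-sized, would show whether amplitudes shift smoothly or jump between distinct modes.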
Scientists have studied how the brain controls eye movements for over 100 years, and the idea challenged here — that fixation and exploration are fundamentally different behaviors — has been central to the field. This new perspective will affect future research and bring focus to the study of neurological diseases that impact oculomotor behavior.
(Image: Getty Images)



Breakthrough in neuroscience could help re-wire appetite control
Researchers at the University of East Anglia (UEA) have made a discovery in neuroscience that could offer a long-lasting solution to eating disorders such as obesity.
It was previously thought that the nerve cells in the brain associated with appetite regulation were generated entirely during an embryo’s development in the womb and therefore their numbers were fixed for life.
But research published today in the Journal of Neuroscience has identified a population of stem cells capable of generating new appetite-regulating neurons in the brains of young and adult rodents.
Obesity has reached epidemic proportions globally. More than 1.4 billion adults worldwide are overweight and more than half a billion are obese. Associated health problems include type 2 diabetes, heart disease, arthritis and cancer. And at least 2.8 million people die each year as a result of being overweight or obese.
The economic burden on the NHS in the UK is estimated to be more than £5 billion annually. In the US, the healthcare cost tops $60 billion.
Scientists at UEA investigated the hypothalamus, the region of the brain that regulates sleep and wake cycles, energy expenditure, appetite, thirst, hormone release and many other critical biological functions. The study looked specifically at the nerve cells that regulate appetite.
The researchers used ‘genetic fate mapping’ techniques to make their discovery – a method that tracks the development of stem cells and cells derived from them, at desired time points during the life of an animal.
They established that a population of brain cells called ‘tanycytes’ behave like stem cells and add new neurons to the appetite-regulating circuitry of the mouse brain after birth and into adulthood.
Lead researcher Dr Mohammad K. Hajihosseini, from UEA’s School of Biological Sciences, said: “Unlike dieting, translation of this discovery could eventually offer a permanent solution for tackling obesity.
“Loss or malfunctioning of neurons in the hypothalamus is the prime cause of eating disorders such as obesity.
“Until recently we thought that all of these nerve cells were generated during the embryonic period and so the circuitry that controls appetite was fixed.
“But this study has shown that the neural circuitry that controls appetite is not fixed in number and could possibly be manipulated numerically to tackle eating disorders.
“The next step is to define the group of genes and cellular processes that regulate the behaviour and activity of tanycytes. This information will further our understanding of brain stem cells and could be exploited to develop drugs that can modulate the number or functioning of appetite-regulating neurons.
“Our long-term goal of course is to translate this work to humans, which could take up to five or 10 years. It could lead to a permanent intervention in infancy for those predisposed to obesity, or later in life as the disease becomes apparent.”



A Sleep Aid Without the Side Effects
Insomniacs desperate for some zzzs may one day have a safer way to get them. Scientists have developed a new sleep medication that has induced sleep in rodents and monkeys without apparently impairing cognition, a potentially dangerous side effect of common sleep aids. The discovery, which originated in work explaining narcolepsy, could lead to a new class of drugs that help people who don’t respond to other treatments.
Between 10% and 15% of Americans chronically struggle to fall or stay asleep. Many of them turn to sleeping pills for relief, and most are prescribed drugs, such as zolpidem (Ambien) and eszopiclone (Lunesta), that slow down the brain by binding to receptors for GABA, a neurotransmitter that’s involved in mood, cognition, and muscle tone. But because the drugs target GABA indiscriminately, they can also impair cognition, causing amnesia, confusion, and other problems with learning and memory, along with a number of strange sleepwalking behaviors, including wandering, eating, and driving while asleep. This has led many researchers to seek out alternative mechanisms for inducing sleep.
Neuroscientist Jason Uslaner of Merck Research Laboratories in West Point, Pennsylvania, and colleagues decided to tap into the brain’s orexin system. Orexin (also known as hypocretin) is a protein that controls wakefulness and is missing in people with narcolepsy. Past studies successfully induced sleep by inhibiting orexin, but had not looked into its effects on cognition. The researchers developed a new orexin-inhibiting compound called DORA-22 and confirmed that it could induce sleep in rats and rhesus monkeys as effectively as the GABA-modulating drugs.
Then the researchers went about testing the drugs’ effects on the animals’ cognition. They measured the rats’ cognition and memory by assessing the rodents’ ability to recognize objects. They presented the rats with a new object—say, a cone or a sphere—that the rats then sniffed and explored. Then they took the object away for an hour. After that hour, the rats were exposed to a new object and the one they’d already gotten to know; if the rats remembered, they spent less time checking out the familiar object. With the primates, Uslaner’s team tested their ability to match colors on a touchscreen and to pay attention to and identify the origin of a flashing light. In all cases, the researchers found that the GABA-modulating sleeping pills caused both the rats and the primates to respond more slowly and less accurately. Monkeys taking the memory and attention tests, for example, were 20% less accurate on the highest dose of each of the GABA-modulating drugs. But DORA-22 had no such effect on cognition, the team reports today in Science Translational Medicine.
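The rats' object-memory measure described here is commonly summarized as a recognition index: the fraction of total exploration time spent on the novel object, where 0.5 means no preference and higher values mean the familiar object was remembered. A minimal sketch; the function name and the sample times are illustrative, not data from the study:

```python
def recognition_index(time_novel: float, time_familiar: float) -> float:
    """Fraction of exploration time spent on the novel object.

    0.5 means no preference (no memory of the familiar object);
    values above 0.5 mean the familiar object was remembered and
    therefore explored less.
    """
    total = time_novel + time_familiar
    if total == 0:
        raise ValueError("no exploration recorded")
    return time_novel / total

# Illustrative numbers: a rat that remembers the familiar object
# spends most of its exploration time on the new one.
remembered = recognition_index(time_novel=42.0, time_familiar=18.0)  # 0.7
no_memory = recognition_index(time_novel=30.0, time_familiar=30.0)   # 0.5
```

A drug that impairs memory would push this index toward 0.5 in treated animals relative to controls.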
"We were very excited," Uslaner says. "Folks who take sleep medications need to be able to perform cognitive tasks when they awake, and this [compound] could help them do so without impairment."
Although DORA-22 has not yet been tested in humans, it holds tremendous promise for helping people suffering from sleep disorders, says Emmanuel Mignot, a sleep researcher with the Stanford University School of Medicine in Palo Alto, California. “This study is encouraging and exciting, because there’s good reason to believe it would work differently from what we’ve used in the past,” says Mignot, who helped discover the link between orexin (or its absence) and narcolepsy. “Not every drug works for everyone, so it’s really, really good news to have a potential new drug on the horizon.”



Genetic markers ID second Alzheimer’s pathway

Researchers at Washington University School of Medicine in St. Louis have identified a new set of genetic markers for Alzheimer’s that point to a second pathway through which the disease develops.


Much of the genetic research on Alzheimer’s centers on amyloid-beta, a key component of brain plaques that build up in the brains of people with the disease.

In the new study, the scientists identified several genes linked to the tau protein, which is found in the tangles that develop in the brain as Alzheimer’s progresses and patients develop dementia. The findings may help provide targets for a different class of drugs that could be used for treatment.

The researchers report their findings online April 24 in the journal Neuron.

"We measured the tau protein in the cerebrospinal fluid and identified several genes that are related to high levels of tau and also affect risk for Alzheimer’s disease,” says senior investigator Alison M. Goate, DPhil, the Samuel and Mae S. Ludwig Professor of Genetics in Psychiatry. “As far as we’re aware, three of these genes have no effect on amyloid-beta, suggesting that they are operating through a completely different pathway.”

A fourth gene in the mix, APOE, had been identified long ago as a risk factor for Alzheimer’s. It has been linked to amyloid-beta, but in the new study, APOE appears to be connected to elevated levels of tau. Finding that APOE is influencing more than one pathway could help explain why the gene has such a big effect on Alzheimer’s disease risk, the researchers say.

“It appears APOE influences risk in more than one way,” says Goate, also a professor of genetics and co-director of the Hope Center for Neurological Disorders. “Some of the effects are mediated through amyloid-beta and others by tau. That suggests there are at least two ways in which the gene can influence our risk for Alzheimer’s disease.”

The new research by Goate and her colleagues is the largest genome-wide association study (GWAS) yet on tau in cerebrospinal fluid. The scientists analyzed genetic variants across the genomes of 1,269 individuals who had undergone spinal taps as part of ongoing Alzheimer’s research.
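For a quantitative trait like CSF tau, a GWAS typically tests each variant by regressing trait values on minor-allele dosage (0, 1, or 2 copies) across subjects. The sketch below shows that per-SNP test in its simplest form, on synthetic data; the study's actual pipeline, covariates, and multiple-testing correction are not reproduced here:

```python
import numpy as np

def snp_association(dosage, trait):
    """Slope and t statistic for trait ~ dosage, for one SNP.

    dosage: array of 0/1/2 minor-allele counts per subject.
    trait:  array of quantitative trait values (e.g. CSF tau level).
    """
    X = np.column_stack([np.ones_like(dosage), dosage])  # intercept + genotype
    beta, res, *_ = np.linalg.lstsq(X, trait, rcond=None)
    n, p = X.shape
    sigma2 = res[0] / (n - p)                            # residual variance
    se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])  # slope standard error
    return beta[1], beta[1] / se

# Synthetic cohort of 1,269 subjects (matching the study's sample size),
# with a variant whose minor allele raises the trait by 8 units on average.
rng = np.random.default_rng(1)
dosage = rng.integers(0, 3, size=1269).astype(float)
tau = 50.0 + 8.0 * dosage + rng.normal(0.0, 10.0, size=1269)
slope, tstat = snp_association(dosage, tau)
```

In a real GWAS this test is repeated for every variant, and only associations surviving genome-wide significance thresholds are reported.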

Whereas amyloid is known to collect in the brain and affect brain cells from the outside, the tau protein usually is stored inside cells, so it moves into the spinal fluid mainly when cells are damaged or die. Elevated tau has been linked to several forms of non-Alzheimer’s dementia, and first author Carlos Cruchaga, PhD, says that although amyloid plaques are a key feature of Alzheimer’s disease, it’s possible that excess tau has more to do with the dementia than the plaques do.

“We know there are some individuals with high levels of amyloid-beta who don’t develop Alzheimer’s disease,” says Cruchaga, an assistant professor of psychiatry. “We don’t know why that is, but perhaps it could be related to the fact that they don’t have elevated tau levels.”

In addition to APOE, the researchers found that a gene called GLIS3 and the genes TREM2 and TREML2 also affect both tau levels and Alzheimer’s risk.

Goate says she suspects changes in tau may be good predictors of advancing disease. As tau levels rise, she says people may be more likely to develop dementia. If drugs could be developed to target tau, they may prevent much of the neurodegeneration that characterizes Alzheimer’s disease and, in that way, help prevent or delay dementia.

The new research also suggests it may one day be possible to reduce Alzheimer’s risk by targeting both pathways.

“Since two mechanisms apparently exist, identifying potential drug targets along these pathways could be very useful,” she says. “If drugs that influence tau could be added to those that affect amyloid, we could potentially reduce risk through two different pathways.”

(Source: news.wustl.edu)



Shift of Language Function to Right Hemisphere Impedes Post-Stroke Aphasia Recovery
In a study designed to differentiate why some stroke patients recover from aphasia and others do not, investigators have found that a compensatory reorganization of language function to right hemispheric brain regions bodes poorly for language recovery. Patients who recovered from aphasia showed a return to normal left-hemispheric language activation patterns. These results, which may open up new rehabilitation strategies, are available in the current issue of Restorative Neurology and Neuroscience.
“Overall, approximately 30% of patients with stroke suffer from various types of aphasia, with this deficit most common in stroke with left middle cerebral artery territory damage. Some of the affected patients recover to a certain degree in the months and years following the stroke. The recovery process is modulated by several known factors, but the degree of the contribution of brain areas unaffected by stroke to the recovery process is less clear,” says lead investigator Jerzy P. Szaflarski, MD, PhD, of the Departments of Neurology at the University of Alabama and University of Cincinnati Academic Health Center.
For the study, 27 right-handed adults who suffered from a left middle cerebral artery infarction at least one year prior to study enrollment were recruited. After language testing, 9 subjects were considered to have normal language ability while 18 were considered aphasic. Patients underwent a battery of language tests as well as a semantic decision/tone decision cognitive task during functional MRI (fMRI) in order to map language function. MRI scans were used to determine stroke volume.
The authors found that linguistic performance was better in those who had stronger left-hemispheric fMRI signals while performance was worse in those who had stronger signal-shifts to the right hemisphere. As expected, they also found a negative association between the size of the stroke and performance on some linguistic tests. Right cerebellar activation was also linked to better post-stroke language ability.
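The left-versus-right comparison in such fMRI studies is often boiled down to a laterality index, LI = (L - R) / (L + R), computed from suprathreshold voxel counts or signal strength in homologous language regions; positive values indicate left-dominant activation. A schematic version, where the voxel counts are illustrative and not data from the paper:

```python
def laterality_index(left_activation: float, right_activation: float) -> float:
    """LI in [-1, 1]: +1 is fully left-lateralized, -1 fully right-lateralized."""
    total = left_activation + right_activation
    if total == 0:
        raise ValueError("no suprathreshold activation in either hemisphere")
    return (left_activation - right_activation) / total

# Illustrative suprathreshold voxel counts from a language-mapping contrast:
recovered = laterality_index(left_activation=820, right_activation=180)  # 0.64
shifted = laterality_index(left_activation=300, right_activation=700)    # -0.4
```

Under the study's finding, higher (more left-lateralized) LI values would go with better language recovery in adult stroke patients.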
The authors say that while a shift to the non-dominant right hemisphere can restore language function in children who have experienced left-hemispheric injury or stroke, for adults such a shift may impede recovery. For adults, it is the left hemisphere that is necessary for language function preservation and/or recovery.



Avoid impulsive acts by imagining future benefits
Why is it so hard for some people to resist the least little temptation, while others seem to possess incredible patience, passing up immediate gratification for a greater long-term good?
The answer, suggests a new brain imaging study from Washington University in St. Louis, lies in how effective people are at feeling good right now about all the future benefits that may come from passing up a smaller immediate reward. Researchers found that activity in two regions of the brain distinguished impulsive and patient people.
“Activity in one part of the brain, the anterior prefrontal cortex, seems to show whether you’re getting pleasure from thinking about the future reward you are about to receive,” explains study co-author Todd Braver, PhD, professor of psychology in Arts & Sciences. “People can relate to this idea that when you know something good is coming, just that waiting can feel pleasurable.”
The study, which was published in the first issue of the Journal of Neuroscience this year, was designed to examine what happens in the brain as people wait for a reward, especially whether people characterized as “impulsive” would show different brain responses than those considered “patient.”
The lead author of the study was Koji Jimura, then a postdoctoral researcher in Braver’s Cognitive Control and Psychopathology Laboratory, and now a research associate professor at the Tokyo Institute of Technology, in Japan.
Unlike previous research on delayed gratification that had people choose between hypothetical rewards of money over long delays (e.g., $500 now or $1,000 a year from now), this Washington University study presented its participants with real rewards of squirts of juice, which they chose to receive either immediately or after a delay of up to a minute.
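Hypothetical money choices like the one above are usually modeled with hyperbolic discounting, in which a delayed reward of amount A at delay D is worth V = A / (1 + kD), and the discount rate k indexes how impulsive a chooser is. A sketch of how k flips the $500-now-versus-$1,000-later choice; the k values are illustrative, not fitted to any data:

```python
def discounted_value(amount: float, delay_days: float, k: float) -> float:
    """Hyperbolic discounting: present subjective value of a delayed reward."""
    return amount / (1.0 + k * delay_days)

def prefers_delayed(now_amt: float, later_amt: float,
                    delay_days: float, k: float) -> bool:
    """True if the discounted later reward beats the immediate one."""
    return discounted_value(later_amt, delay_days, k) > now_amt

# $500 now vs. $1,000 in a year, for a patient and an impulsive chooser:
patient = prefers_delayed(500, 1000, 365, k=0.001)   # 1000/1.365 ≈ 733 > 500
impulsive = prefers_delayed(500, 1000, 365, k=0.01)  # 1000/4.65 ≈ 215 < 500
```

The juice paradigm let the researchers watch brain activity during the wait itself, rather than only at the moment of a hypothetical choice.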
“It’s kind of funny because we treated the people in our study like researchers that work with animals do, and we actually squirted juice into their mouths,” Braver says.
Results show that a brain region called the ventral striatum (VS) ramped up its activity in impulsive people as they got closer and closer to receiving their delayed reward. The VS activity of patient people, on the other hand, stayed more constant.
The researchers interpreted these different brain responses to mean that impulsive people initially did not find the prospect of waiting for a reward very appealing. However, as they approached the time they’d receive that reward, they became more excited and their VS reflected that excitement.
“This gradual increase may reflect impatience or excessive anticipation of the upcoming reward in impulsive individuals,” says Jimura. This was unlike patient people, who were likely content with waiting for the reward from the start, as no changes in VS activity were observed for them.
The most novel finding of the study concerned the anterior prefrontal cortex (aPFC). This is the part of the brain that helps you think about the future. Here, the researchers found that patient people showed heightened activity in the aPFC when they first started waiting for their reward, which then decreased as the time to receive the reward approached. Impulsive people didn’t show this brain activity pattern.
“The aPFC appears to allow you to create a mental simulation of the future. It helps you consider what it’ll be like getting the future reward. In this way, you can get access to the utility and satisfaction in the present,” says Braver.
By thinking about the future reward, patient people were able to gain what economists call “anticipatory utility.” While their reward was far away in time, they were giddy with anticipation in the present. Conversely, impulsive people weren’t thinking beyond the present and so did not feel pleasure when they were told they had to wait. Their excitement built only as they got closer to receiving their reward.
Overall, this study suggests that people may be impulsive because they do not or cannot imagine the future, so they prefer rewards right away. This research could be useful for assessing the effects of clinical treatments for impulsivity problems, which can lead to issues such as problem gambling and substance abuse disorders. A brain imaging approach similar to the one used in the Washington University study could allow clinicians to track the effects of an intervention not only on impulsive behavior but also on patients’ brain responses.
“One possible treatment approach could be to enhance mental functions in aPFC, a brain region well-known to be associated with cognitive control,” says Jimura. By increasing cognitive control, impulsive patients could learn to reject their immediate impulses.
Impulsivity occurs not only in a clinical setting but also every day in our own lives. Applying his research to his personal life, Braver says, “When I’m successful at achieving long-term goals it’s from explicitly trying to activate that goal and imagining each decision as helping me achieve it, to keep me on track.” Perhaps adopting this strategy of focusing on the long-term could help us move past present distractions and move toward our future goals.

Avoid impulsive acts by imagining future benefits

Why is it so hard for some people to resist the least little temptation, while others seem to possess incredible patience, passing up immediate gratification for a greater long-term good?

The answer, suggests a new brain imaging study from Washington University in St. Louis, lies in how effective people are at feeling good right now about all the future benefits that may come from passing up a smaller immediate reward. Researchers found that activity in two regions of the brain distinguished impulsive and patient people.

“Activity in one part of the brain, the anterior prefrontal cortex, seems to show whether you’re getting pleasure from thinking about the future reward you are about to receive,” explains study co-author Todd Braver, PhD, professor of psychology in Arts & Sciences. “People can relate to this idea that when you know something good is coming, just that waiting can feel pleasurable.”

The study, which was published in the first issue of the Journal of Neuroscience this year, was designed to examine what happens in the brain as people wait for a reward, especially whether people characterized as “impulsive” would show different brain responses than those considered “patient.”

The lead author of the study was Koji Jimura, then a postdoctoral researcher in Braver’s Cognitive Control and Psychopathology Laboratory, and now a research associate professor at the Tokyo Institute of Technology, in Japan.

Unlike previous research on delayed gratification that had people choose between hypothetical rewards of money over long delays (e.g., $500 now or $1,000 a year from now), this Washington University study presented its participants with real rewards of squirts of juice that they chose to receive either immediately or after a delay of up to a minute.

“It’s kind of funny because we treated the people in our study like researchers that work with animals do, and we actually squirted juice into their mouths,” Braver says.

Results show that a brain region called the ventral striatum (VS) ramped up its activity in impulsive people as they got closer and closer to receiving their delayed reward. The VS activity of patient people, on the other hand, stayed more constant.

The researchers interpreted these different brain responses to mean that impulsive people initially did not find the prospect of waiting for a reward very appealing. However, as they approached the time they’d receive that reward, they became more excited and their VS reflected that excitement.

“This gradual increase may reflect impatience or excessive anticipation of the upcoming reward in impulsive individuals,” says Jimura. This was unlike patient people, who were likely content with waiting for the reward from the start, as no changes in VS activity were observed for them.

The most novel finding of the study concerned the anterior prefrontal cortex (aPFC). This is the part of the brain that helps you think about the future. Here, the researchers found that patient people showed heightened activity in the aPFC when they first started waiting for their reward, which then decreased as the time to receive the reward approached. Impulsive people didn’t show this brain activity pattern.

“The aPFC appears to allow you to create a mental simulation of the future. It helps you consider what it’ll be like getting the future reward. In this way, you can get access to the utility and satisfaction in the present,” says Braver.

By thinking about the future reward, patient people were able to gain what economists call “anticipatory utility.” While their reward was far away in time, they were giddy with anticipation in the present. Conversely, impulsive people weren’t thinking beyond the present and so did not feel pleasure when they were told they had to wait. Their excitement built only as they got closer to receiving their reward.
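The economists’ notion of “anticipatory utility” invoked above can be made concrete with a toy calculation. The sketch below is purely illustrative (it is not the study’s model, and the `savoring` and `discount` parameters are invented for the example): someone who savors a future reward collects a little pleasure during every period of waiting, on top of the discounted reward itself.

```python
# Illustrative sketch (not the study's model) of "anticipatory utility"
# in the spirit of Loewenstein's savoring idea: while waiting for a
# reward, a person who anticipates it also enjoys a small fraction of
# its value each period, discounted by how far away it still is.

def total_utility(reward, wait, savoring=0.1, discount=0.9):
    """Utility of consuming `reward` after `wait` periods, plus the
    anticipation enjoyed during each period spent waiting."""
    consumption = discount ** wait * reward
    anticipation = sum(savoring * discount ** (wait - t) * reward
                       for t in range(wait))
    return consumption + anticipation

# A "patient" savorer (savoring=0.1) versus someone who does not
# anticipate at all (savoring=0): the same wait costs the savorer less.
print(total_utility(100, wait=5, savoring=0.1))
print(total_utility(100, wait=5, savoring=0.0))
```

On these made-up numbers, the savorer ends up with noticeably more total utility from the identical delayed reward, which is the article’s point: patient people extract pleasure from the wait itself.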

Overall, this study suggests that people may be impulsive because they do not or cannot imagine the future, so they prefer rewards right away. This research could be useful for assessing the effects of clinical treatments for impulsivity problems, which can lead to issues such as problem gambling and substance abuse disorders. A brain imaging approach similar to the one used in the Washington University study could allow clinicians to track the effects of an intervention not only on impulsive behavior but also on patients’ brain responses.

“One possible treatment approach could be to enhance mental functions in aPFC, a brain region well-known to be associated with cognitive control,” says Jimura. By increasing cognitive control, impulsive patients could learn to reject their immediate impulses.

Impulsivity occurs not only in a clinical setting but also every day in our own lives. Applying his research to his personal life, Braver says, “When I’m successful at achieving long-term goals it’s from explicitly trying to activate that goal and imagining each decision as helping me achieve it, to keep me on track.” Perhaps adopting this strategy of focusing on the long-term could help us move past present distractions and move toward our future goals.

Filed under brain brain activity prefrontal cortex impulsivity reward pleasure neuroscience science

191 notes

Either mad and bad or Jekyll and Hyde: media portrayals of schizophrenia
Stigma can take a heavy toll on people who suffer from mental illness. Being shunned, feared, devalued and discriminated against can impair recovery and deepen social isolation and distress. Many sufferers judge stigma to be more difficult to cope with than the symptoms of their illness.
Thankfully, there are grounds for hope. Australian researchers have shown that mental illness stigma, such as the unwillingness to interact with affected people, generally declined from 2003 to 2011. Some credit for this improvement must go to media campaigns by beyondblue and SANE, and to the willingness of many people to speak publicly about experiences that would once have been shamefully private.
The dark cloud inside this silver lining is schizophrenia, a serious condition that impairs thinking, emotion and motivation. While Australians’ attitudes towards depression have become more accepting, the stigma of schizophrenia has remained largely unchanged.
Misusing and misunderstanding
People with schizophrenia are still perceived as dangerous and unpredictable, and these perceptions have increased in recent years. Attitudes to people with schizophrenia have also worsened in the United States at the same time as attitudes to depressed people have improved.
Just as the media can take some credit for the declining stigma of other conditions, it must take some of the blame for the continuing stigma of schizophrenia. Media portrayals commonly associate it with violence and danger.
Schizophrenia is also often misused to refer to split personality or incoherence. This Jekyll-and-Hyde misconception persists despite countless corrections. One study of Italian newspapers, for instance, found that the term was employed in this way almost three times as often as it was used correctly to refer to people with the diagnosis or their illness.
But just how negative are current media depictions of schizophrenia? My students and I recently examined this question in a study that we published in the academic journal Psychosis. We located every story published in major national, state and territory online and print news media outlets in the year ending August 2012 that cited schizophrenia or schizophrenic.
We then counted how many stories misused these terms and coded how often the condition was linked to violence or presented in a stigmatising way.
Our results were striking. Almost half (47%) of stories linked schizophrenia to some form of violence, and 28% of these associated it with attempted or completed homicide. The schizophrenic person was identified as a perpetrator of violence six times more frequently than as its victim.
Schizophrenia was misused as a split metaphor in 13% of stories. And fully 46% of stories were coded as stigmatising.
It’s hardly surprising that the public’s views of the condition continue to be laced with fear and loathing if they usually find schizophrenia presented in the context of violent aggression or as a metaphor for internal contradiction.
Better ways
What can be done about all of this? For one thing, journalists and the general public need to become aware that schizophrenia doesn’t mean split personality and it bears no resemblance to caricatures of craziness. This mistaken usage should be retired not because the word police say it’s offensive, but because it perpetuates a misunderstanding that hurts real people.
Journalists and editors also need to think carefully before linking schizophrenia to violent behaviour. Often the proposed link is dubious and speculative, and adds nothing important to the story. Just as violence supposedly committed by people experiencing mental illness is over-reported – producing an exaggerated sense of their dangerousness – their victimisation is often under-reported.
An equally important corrective would be to publish more stories that feature people with schizophrenia living well, present their everyday struggles and adversities or showcase promising treatments and research findings.
Coverage can be improved. Our study found that stories from broadsheet newspapers were less stigmatising than tabloid stories, and longer, more developed stories were less stigmatising than briefer ones.
This is not a matter of white-washing the news. People with schizophrenia are indeed at a somewhat increased risk of committing violent offences (and of being their victims). They can behave in challenging ways. But the media landscape that our study surveyed is so tilted towards depicting schizophrenia as dangerous that it’s seriously unbalanced.
The news media can do better and, if they do, the stigma of schizophrenia may start to erode.


Filed under schizophrenia mental illness stigma society media psychology neuroscience

88 notes

Brain cell signal network genes linked to schizophrenia risk in families
New genetic factors that predispose to schizophrenia have been uncovered in five families with several affected relatives. The psychiatric disorder can disrupt thinking, feeling, and acting, and blur the border between reality and imagination.
Dr. Debby W. Tsuang, professor of psychiatry and behavioral sciences, and Dr. Marshall S. Horwitz, professor of pathology, both at the University of Washington in Seattle, led the multi-institutional study. Tsuang is also a staff physician at the Puget Sound Veterans Administration Health Care System.
The results are published in the April 3 online edition of JAMA Psychiatry.
Loss of brain nerve cell integrity occurs in schizophrenia, but scientists have not worked out the details of when and how this happens. In all five families in the present study, the researchers found rare variants in genes tied to the networking of certain signal receptors on nerve cells distributed throughout the brain. These N-methyl-D-aspartate, or NMDA, receptors are widespread molecular control towers in the brain. They regulate the release of chemical messages that influence the strength of brain cell connections and the ongoing remodeling of the networks.
These receptors respond to glutamate, one of the most common nerve-signaling chemicals in the brain, and they are also found on brain circuits that manage dopamine release. Dopamine is a nerve signal associated with reward-seeking, movement and emotions. Deficits in glutamate and dopamine function have both been implicated in schizophrenia but most of the medications that have been developed to treat schizophrenia have targeted dopamine receptors.
Tsuang and her group’s discovery of gene variations that disturb N-methyl-D-aspartate receptor networking functions supports the hypothesis that decreased NMDA receptor-mediated nerve-signal transmission contributes to some cases of schizophrenia.
Tsuang pointed out that several hallucinogenic drugs, such as ketamine and phencyclidine (PCP, or angel dust), block N-methyl-D-aspartate receptors and can produce symptoms similar to those of schizophrenia. This is among the strongest evidence implicating these receptors in the disorder. The drugs sometimes induce psychosis and terrifying sensory detachment, and reports of such effects in recreational drug users fingered faulty NMDA receptor networks as suspects in schizophrenia.
In all five of their study families, Tsuang’s team detected rare protein-altering variants in one of three genes involved with the N-methyl-D-aspartate receptor network. One of the genes, GRM5, is directly linked with glutamate signaling. In the other two genes, the links are indirect and connected through other proteins synthesized in brain cells. One of these proteins, PPEF2, appears to affect the levels of certain brain nerve-cell signaling mediators, and the other altered protein, LRP1B, may compete with a normal protein for a binding spot on a subunit of the NMDA receptor.
These discoveries provide additional clues to the molecular disarray that might occur in the brain nerve cells of some patients with schizophrenia, and suggest new targets for therapy for certain patients. For a disease occurring in about 1 percent of the population, the picture of how and why schizophrenia arises in all these people is far from complete.
“Disorders like schizophrenia are likely to have many underlying causes,” Tsuang noted. She added that it might eventually make sense to divide schizophrenia into categories based, for example, on which biochemical pathways in the brain are disrupted. Treatments might be developed to correct the exact malfunctioning mechanisms underlying various forms of the disease.
Tsuang gave an example: agents that stimulate N-methyl-D-aspartate receptor-mediated nerve-signal transmission, including glycine-site blockers and glycine-transport inhibitors, have shown some encouraging results in pre-clinical drug trials, but mostly as adjunctive treatment alongside standard antipsychotic therapy.
“But perhaps the data we have generated will help pharmaceutical companies target specific subunits of the NMDA receptors and pathways,” Tsuang said. She added, however, that effective treatments may lag by many years after these kinds of discoveries. Someday it may make sense to initiate such treatments in people at high genetic risk when early symptoms, such as apathy and lack of motivation, appear, and before brain dysfunction is severe.
Also, possessing the newly discovered gene mutations does not always mean that a person will become schizophrenic. In the recent family study, three of the five families had relatives with the protein-altering variants who did not have schizophrenia.
“This isn’t surprising,” Tsuang observed. “Given that schizophrenia is such a complex disorder, we would expect that not everyone who carries the variants would develop the disease.” In the future, researchers will seek to identify what triggers the gene variants to cause problems, other mutations in affected individuals’ genetic profiles that might promote or protect against disease, and non-genetic factors in the onset of the illness in genetically susceptible people.
The researchers also adopted a strategy of selecting more distant relatives of affected individuals for genetic sequencing. Distant kin share a smaller proportion of their genes than closely related family members: siblings on average share about 50 percent of their genes, whereas cousins on average share about 12.5 percent. The researchers also hypothesized that the causative mutation within each family would be the same variant.
This strategy helped the researchers narrow the number of genetic variants detected by sequencing and thereby concentrate on the strongest remaining candidates. They also filtered their results against publicly available sequencing databases, which allowed them to pick out genetic variants not seen in individuals without psychiatric illness.
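The filtering logic just described — keep only variants shared by all affected relatives in a family, then discard anything already seen in databases of people without psychiatric illness — can be sketched as simple set operations. This is a hypothetical illustration, not the team’s actual pipeline; the variant labels and database contents below are invented:

```python
# Hypothetical sketch of the variant-filtering strategy described above.
# Each relative's sequencing result is modeled as a set of variant IDs.

def candidate_variants(family_members, public_databases):
    """Keep variants shared by all affected relatives in a family,
    then drop any variant present in public sequencing databases of
    individuals without psychiatric illness."""
    shared = set.intersection(*(set(m) for m in family_members))
    known = set().union(*(set(db) for db in public_databases))
    return shared - known

# Toy example: two distant relatives both carry an invented GRM5
# variant, plus a common variant already in a public database.
relatives = [
    {"GRM5:p.A123T", "LRP1B:p.R45Q", "PPEF2:p.S9L"},
    {"GRM5:p.A123T", "LRP1B:p.R45Q"},
]
databases = [{"LRP1B:p.R45Q"}]
print(candidate_variants(relatives, databases))  # {'GRM5:p.A123T'}
```

Using distant relatives makes the intersection step especially powerful: because cousins share only about one eighth of their genes, far fewer variants survive the intersection by chance than would with siblings, leaving a much shorter candidate list.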
According to Tsuang, the research team was excited that recent advances in technology enabled them to uncover rare genetic variants not previously found in large populations without psychiatric conditions. The ability to rapidly sequence only those portions of the genome that code for proteins made this experiment possible.
The next step for the researchers will be to screen for the newly discovered genetic variants in a large sample of unrelated cases of schizophrenia compared to controls. They want to determine if the variants are statistically associated with the disease.


Filed under schizophrenia nerve cells signal receptors NMDA receptors glutamate dopamine genetics neuroscience science

277 notes

FYI: Do Lobotomies Work?
Surprisingly, yes.
The modern lobotomy originated in the 1930s, when doctors realized that by severing fiber tracts connected to the frontal lobe, they could help patients overcome certain psychiatric problems, such as intractable depression and anxiety. Over the next two decades, the procedure would become simple and popular, completed by poking a sharpened tool above the eyeball. According to one study, about two thirds of patients showed improvement after surgery.
Unfortunately, not all lobotomy practitioners were responsible, and the technique left some patients with severe side effects, including seizures, lethargy, changes in personality, and incontinence. In response, doctors refined their techniques. They replaced the lobotomy with more specialized approaches: the cingulotomy, the anterior capsulotomy, and the subcaudate tractotomy. Studies of these procedures found evidence of benefit for at least one fourth of patients suffering from problems such as OCD and depression.
Even with the risk of side effects, those in the field still say the procedures were by and large successful. “I feel that the principle behind ablative surgery was somewhat exonerated by the research findings, which showed that it worked for very specific indications,” says Konstantin Slavin, president of the American Society for Stereotactic and Functional Neurosurgery, and professor at the University of Illinois at Chicago.
By the 1980s, lobotomies had fallen out of fashion. “In general, the entire functional neurosurgery field moved away from destruction—from ablative surgery,” Slavin says. A then-new technique called deep-brain stimulation made ablative surgery obsolete. In the procedure, a surgeon drills holes in the head and inserts electrodes into the neural tissue. When current passes through the leads, they activate or inactivate patches of the brain. “The attractive part is that we don’t destroy the tissue,” Slavin says. Doctors can also adjust treatment if a patient suffers side effects. They can turn the current down or suspend it altogether—so as to “give the brain a holiday,” as Slavin calls it.
Most deep-brain stimulation is now used to treat movement disorders such as Parkinson’s Disease. The surgical treatment of patients with OCD is FDA-approved but reserved only for extreme cases. Slavin and his colleagues have been examining broader uses in an ongoing study. “Within the next five years, we hope we’ll have a definitive answer of whether or not it works.”

FYI: Do Lobotomies Work?

Surprisingly, yes.

The modern lobotomy originated in the 1930s, when doctors realized that by severing fiber tracts connected to the frontal lobe, they could help patients overcome certain psychiatric problems, such as intractable depression and anxiety. Over the next two decades, the procedure would become simple and popular, completed by poking a sharpened tool above the eyeball. According to one study, about two thirds of patients showed improvement after surgery.

Unfortunately, not all lobotomy practitioners were responsible, and the technique left some patients with severe side effects, including seizures, lethargy, changes in personality, and incontinence. In response, doctors refined their techniques. They replaced the lobotomy with more specialized approaches: the cingulotomy, the anterior capsulotomy, and the subcaudate tractotomy. Studies of these procedures found evidence of benefit for at least one fourth of patients suffering from problems such as OCD and depression.

Even with the risk of side effects, those in the field still say the procedures were by and large successful. “I feel that the principle behind ablative surgery was somewhat exonerated by the research findings, which showed that it worked for very specific indications,” says Konstantin Slavin, president of the American Society for Stereotactic and Functional Neurosurgery, and professor at the Uni­versity of Illinois at Chicago.

By the 1980s, lobotomies had fallen out of fashion. “In general, the entire functional neurosurgery field moved away from destruction—from ablative surgery,” Slavin says. A then-new technique called deep-brain stimulation made ablative surgery obsolete. In the procedure, a surgeon drills holes in the head and inserts electrodes into the neural tissue. When current passes through the leads, they activate or inactivate patches of the brain. “The attractive part is that we don’t destroy the tissue,” Slavin says. Doctors can also adjust treatment if a patient suffers side effects. They can turn the current down or suspend it altogether—so as to “give the brain a holiday,” as Slavin calls it.

Deep-brain stimulation is now used mostly to treat movement disorders such as Parkinson’s disease. The surgical treatment of patients with OCD is FDA-approved but reserved only for extreme cases. Slavin and his colleagues have been examining broader uses in an ongoing study. “Within the next five years, we hope we’ll have a definitive answer of whether or not it works.”

Filed under brain mental illness psychiatric disorders lobotomy deep brain stimulation neurology neuroscience

184 notes

Hallucinations of musical notation: new paper for neurology journal Brain by Oliver Sacks

Professor of neurology, physician, and author Oliver Sacks M.D. has outlined case studies of hallucinations of musical notation, and commented on the neural basis of such hallucinations, in a new paper for the neurology journal Brain.

In this paper, Dr Sacks builds on work done by Dominic ffytche et al. in 2000, which delineates more than a dozen types of hallucinations, particularly in relation to people with Charles Bonnet syndrome (a condition that causes patients with visual loss to have complex visual hallucinations). While ffytche believes that hallucinations of musical notation are rarer than some other types of visual hallucination, Sacks says that his own experience is different.

“Perhaps because I have investigated various musical syndromes,” writes Dr Sacks, “and people often write to me about these… I have seen or corresponded with a dozen or more people whose hallucinations include – and sometimes consist exclusively of – musical notation.”

Sacks goes on to detail eight fascinating case studies of people who have reported experiencing hallucinations of musical notation, including:

  • A 77-year-old woman with glaucoma who wrote of her “musical eyes”. She saw “music, lines, spaces, notes, clefs – in fact written music on everything [she] looked at.”
  • A surgeon and pianist suffering from macular degeneration, who saw unreadable and unplayable music on a white background.
  • A Sanskrit scholar who developed Parkinson’s disease in his 60s and later reported hallucinating ornately written music, accompanied by Sanskrit script. “Despite the exotic nature of the script the result is still western music,” he said.
  • A woman who reported seeing musical notation on her ceiling upon waking in the morning.
  • A woman who said she wasn’t a musician, but would hallucinate when she had high fevers as a child. She said that the notes were “angry, and [she] felt unease. The lines and notes were out of control and at times in a ball.”

It is striking that, of Dr Sacks’ eight case studies, seven were gifted musicians. Sacks comments, “This is perhaps a coincidence, but it makes one wonder whether there is something about musical scores that is radically different from verbal texts.” Musical scores are far more visually complex than standard (English) text, with not just a variety of notes, but also many symbols that indicate how the notes should be played.

Dr Sacks also says that he has a mild form of Charles Bonnet syndrome himself, in which he sees a variety of simple forms whenever he gazes at a blank surface. “When I recently returned to playing the piano and to studying scores minutely, I began to ‘see’ showers of flat signs along with the letters and runes on blank surfaces.”

Another striking feature of these hallucinations is that – like text hallucinations – they are generally unreadable. They can seem playable at first, but on closer inspection the music often turns out to be nonsensical or impossible to play. In one case study, for example, a hallucinated melody line ran three or more octaves above middle C, which would require half a dozen or more ledger lines above the treble staff.

Usually, the early visual system analyses forms and sends the information it has extracted to higher areas, where it gains coherence and meaning. Normally, in the act of perception, the entire visual system is engaged. Paradoxically, according to Sacks, “one may have to study disorders of the visual system to see how complex perceptual and cognitive processes are analysed and delegated to different levels… and hallucinations of musical notation can provide a very rich field of study here.”

Filed under hallucinations music musical notation Charles Bonnet syndrome Oliver Sacks visual system neurology neuroscience science
