Posts tagged neuroscience

One of the most controversial topics in neurology today is the prevalence of serious permanent brain damage after traumatic brain injury (TBI). Long-term studies and a search for genetic risk factors are required in order to predict an individual’s risk for serious permanent brain damage, according to a review article published by Sam Gandy, MD, PhD, from the Icahn School of Medicine at Mount Sinai in a special issue of Nature Reviews Neurology dedicated to TBI.
About one percent of the population in the developed world has experienced TBI, which can cause serious long-term complications such as Alzheimer’s disease (AD) or chronic traumatic encephalopathy (CTE), which is marked by neuropsychiatric features such as dementia, parkinsonism, depression, and aggression. Patients may be normal for decades after the TBI event before they develop AD or CTE. Although first described in boxers in the 1920s, the association of CTE with battlefield exposure and sports, such as football and hockey, has only recently begun to attract public attention.
"Athletes such as David Duerson and Junior Seau have brought to light the need for preventive measures and early diagnosis of CTE, but it remains highly controversial because hard data are not available that enable prediction of the prevalence, incidence, and individual risk for CTE," said Dr. Gandy, who is Professor of Neurology and Psychiatry and Director of the Center for Cognitive Health at Mount Sinai. "We need much more in the way of hard facts before we can advise the public of the proper level of concern."
Led by Dr. Gandy, the authors evaluated the pathological impact of single-incident TBI, such as that sustained during military combat, and of mild, repetitive TBI, as seen in boxers and National Football League (NFL) players, to learn what measures are needed to identify risk and incidence early and reduce long-term complications.
Mild, repetitive TBI, as is seen in boxers, football players, and occasionally military veterans who suffer multiple blows to the head, is most often associated with CTE, a condition once called “boxer’s dementia.” Boxing scoring includes a record of knockouts, providing researchers with a starting point in interpreting an athlete’s risk. But no such records exist for NFL players or soldiers on the battlefield.
Dr. Gandy and the authors of the Nature Reviews Neurology piece suggest recruiting large cohorts of players and military veterans in multi-center trials, where players and soldiers maintain a TBI diary for the duration of their lives. The researchers also suggest a genome-wide association study to clearly identify risk factors of CTE. “Confirmed biomarkers of risk, diagnostic tools, and long-term trials are needed to fully characterize this disease and develop prevention and treatment strategies,” said Dr. Gandy.
Amyloid imaging, which has recently been approved by the U.S. Food and Drug Administration, may be useful as a monitoring tool in TBI, since amyloid plaques are a hallmark of AD-type neurodegeneration. Amyloid imaging consists of a PET scan with an injection of a radioactive tracer called florbetapir, which binds to amyloid plaque in the brain, allowing researchers to visualize plaque deposits, determine whether the diagnosis is CTE or AD, and monitor progression over time. Tangle imaging is expected to be available soon, complementing amyloid imaging and providing an affirmative diagnosis of CTE. Dr. Gandy and colleagues recently reported the use of amyloid imaging to exclude AD in a retired NFL player with memory problems under their care at Mount Sinai.
Clinical diagnosis and evaluation of mild, repetitive TBI is a challenge, indicating a significant need for new biomarkers to identify damage, report the authors. Measuring cerebrospinal fluid (CSF) may reflect damage done to neurons post-TBI. Previous research has identified a marked increase in CSF biomarkers in boxers when the CSF is taken soon after a fight, and this may predict which boxers are more likely to develop detrimental long-term effects. CSF samples are now only obtained by invasive lumbar puncture; a blood test would be preferable.
"Biomarkers would be a valuable tool both from a research perspective in comparing them before and after injury and from a clinical perspective in terms of diagnostic and prognostic guidance," said Dr. Gandy. "Having the biomarker information will also help us understand the mechanism of disease development, the reasons for its delayed progression, and the pathway toward effective therapeutic interventions."
Currently, there are no treatments for boxer’s dementia or CTE, but these diseases are preventable. “With more protective equipment, adjustments in the rules of the game, and overall education among athletes, coaches, and parents, we should be able to offer informed consent to prospective sports players and soldiers. With the right combination of identified genetic risk factors, biomarkers, and better drugs, we should be able to dramatically improve the outcome of TBI and prevent the long-term, devastating effects of CTE,” said Dr. Gandy.
(Source: mountsinai.org)
People in their 20s don’t have much on their middle-aged counterparts when it comes to some fine motor movements, researchers from UT Arlington have found.
In a simple finger-tapping exercise, study participants’ speed declined only slightly with age until a marked drop in ability among participants in their mid-60s.

Priscila Caçola, an assistant professor of kinesiology at The University of Texas at Arlington, hopes the new work will help clinicians identify abnormal loss of function in their patients. Though motor ability in older adults has been studied widely, not a lot of research has focused on when deficits begin, she said.
The journal Brain and Cognition will include the study in its June 2013 issue. It is already available online.
“We have this so-called age decline, everybody knows that. I wanted to see if that was a gradual process,” Caçola said. “It’s good news really because I didn’t see differences between the young and middle-aged people.”
Caçola’s co-authors on the paper are Jerroed Roberson, a senior kinesiology major at UT Arlington, and Carl Gabbard, a professor in the Texas A&M University Department of Health and Kinesiology.
The researchers based their work on the idea that before movements are made, the brain makes a mental plan. They used an evaluation process called chronometry that compares the time of test participants’ imagined movements to actual movements. Study participants – 99 people ranging in age from 18 to 93 – were asked to imagine and perform a series of increasingly difficult, ordered finger movements. They were divided into three age groups – 18-32, 40-63 and 65-93 – and the results were analyzed.
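The chronometry comparison above can be sketched in a few lines. This is an illustrative example only, not the study's actual analysis code, and all the durations below are invented for demonstration: chronometry compares how long a movement takes to imagine against how long it takes to execute, with a ratio near 1.0 indicating a close match between the internal motor plan and the action itself.

```python
# Hedged sketch of mental chronometry: compare imagined vs. executed
# movement durations. All numbers are hypothetical, chosen only to
# illustrate the comparison across the study's three age groups.

def chronometry_ratio(imagined_s, executed_s):
    """Return the imagined/executed duration ratio for one trial."""
    return imagined_s / executed_s

# Hypothetical mean durations (seconds) for one finger-movement sequence,
# keyed by the age groups used in the study (18-32, 40-63, 65-93).
trials = {
    "18-32": (1.10, 1.05),
    "40-63": (1.22, 1.18),
    "65-93": (1.90, 1.55),
}

for group, (imagined, executed) in trials.items():
    r = chronometry_ratio(imagined, executed)
    print(f"{group}: imagined/executed ratio = {r:.2f}")
```

A ratio that stays close to 1.0 in the younger and middle-aged groups but drifts upward in the oldest group would mirror the pattern of a late, rather than gradual, decline.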
“What we found is that there is a significant drop-off after the age of 64,” Roberson said. “So if you see a drop-off in ability before that, then it could be a signal that there might be something wrong with that person and they might need further evaluation.”
The researchers also noted that the speed of imagined movements and executed actions tended to be closely associated within each group. That also could be useful knowledge for clinicians, the study said.
“The important message here is that clinicians should be aware that healthy older adults are slower than younger adults, but are able to create relatively accurate internal models for action,” the study said.
Caçola is a member of the UT Arlington Center for Healthy Living and Longevity. She has published previous research on the links between movement representation and motor ability in children.
Scientists discover how brains change with new skills
The phrase “practice makes perfect” has a neural basis in the brain. Researchers have discovered a set of common changes in the brain upon learning a new skill. They have essentially detected a neural marker for the reorganization the brain undergoes when a person practices and becomes proficient at a task.
Successful training not only prompts skill-specific changes in the brain, but also more global changes that are consistent across many different types of skills training, the researchers report in the journal Neurorehabilitation and Neural Repair. Their results indicate that as you become more adept at a skill, your brain no longer needs to work as hard at it: regardless of the specific type of training, the brain shifts from more controlled to more automatic processing as a skill is learned.
“The training-related changes we found – that signify a shift to a more ‘efficient’ configuration of brain networks – provide a potential new brain marker for training effectiveness,” said neuroscientist Nathan Spreng, assistant professor of human development and the Rebecca Q. and James C. Morgan Sesquicentennial Faculty Fellow in Cornell’s College of Human Ecology. “Such neural markers are increasingly being used to inform the design of new or more-targeted interventions to improve cognitive and motor functioning in aging, brain injury or disease,” he added.
The study is the most comprehensive review of the neural correlates of training to date and the first to associate training with alterations in large-scale brain networks, said Spreng, who was awarded the distinction of “rising star” in March by the Association for Psychological Science.
The researchers conducted a systematic meta-analysis of 38 neuroimaging studies of cognitive and motor skills training interventions in healthy young adults – more than 500 participants in all. Using a quantitative literature review method, they analyzed functional neuroimaging data and mapped the patterns of brain activity changes before and after the training across the individual experiments.
The researchers found that the brain regions involved in attention-demanding activities are less active after training than before, whereas the brain regions that typically are at rest (known as the default network) become more active.
Specifically, training resulted in decreased activity in brain regions involved in effortful control and attention that closely overlap with the frontoparietal control and dorsal attention networks. Increased activity was found after training, however, in the default network that is involved in self-reflective activities, including future planning or even daydreaming. Thus, skill mastery is associated with increased activity in areas not engaged in skill performance, and this shift can be detected in the large-scale networks of the brain.
“The power of meta-analysis methods to systematically and quantitatively review neuroimaging studies makes possible discoveries such as ours that can provide new insights into how the brain functions; this helps us lay the foundation for better treatments of brain disorders in the future,” said Spreng.
“There have now been over 100,000 neuroimaging papers published, so these types of meta-analytic reviews offer new opportunities to identify common patterns of brain activity across a larger and more diverse array of studies,” he added.

Researchers identify new vision of how we explore our world
Brain researchers at Barrow Neurological Institute have discovered that we explore the world with our eyes in a different way than previously thought. Their results advance our understanding of how healthy observers and neurological patients interact and glean critical information from the world around them.
The research team was led by Dr. Susana Martinez-Conde, Director of the Laboratory of Visual Neuroscience at Barrow, in collaboration with fellow Barrow Neurological Institute researchers Jorge Otero-Millan, Rachel Langston, and Dr. Stephen Macknik, Director of the Laboratory of Behavioral Neurophysiology. The study, titled “An oculomotor continuum from exploration to fixation”, was published in the Proceedings of the National Academy of Sciences.
Previously, scientists thought that we sample visual information from the world in two main different modes: exploration and fixation. “We used to think that we make large eye movements to search for objects of interest, and then fix our gaze to see them with high detail,” says Martinez-Conde. “But now we know that’s not quite right.”
The discovery shows that even during visual fixation, we are actually scanning visual details with small eye movements — just like we explore visual scenes with big eye movements, but on a smaller scale. This means that exploration and fixation are two ends of the same continuum of oculomotor scanning.
Subjects viewed natural images while the team measured their eye movements with high-speed eye tracking. The images ranged from massive ones, presented on a room-sized video monitor in the Barrow Neurological Institute’s Eller Telepresence Room (normally used by Barrow’s surgeons to collaborate on brain surgeries with colleagues around the world), to images just half the width of a thumbnail.
In all cases, the researchers found that subjects’ eyes scanned the scenes with the same general strategy, along a smooth continuum of dynamical changes. “There was no abrupt change in the characteristics of the eye movements, whether the visual scenes were huge or tiny, or even when the subjects were fixing their gaze. That means that the brain controls eye movements in the same way when we explore and when we fixate,” said Dr. Martinez-Conde.
Scientists have studied how the brain controls eye movements for over 100 years, and the idea, challenged here, that fixation and exploration are fundamentally different behaviors has been central to the field. This new perspective will affect future research and bring focus to the study of neurological diseases that impact oculomotor behavior.
Breakthrough in neuroscience could help re-wire appetite control
Researchers at the University of East Anglia (UEA) have made a discovery in neuroscience that could offer a long-lasting solution to eating disorders such as obesity.
It was previously thought that the nerve cells in the brain associated with appetite regulation were generated entirely during an embryo’s development in the womb and therefore their numbers were fixed for life.
But research published today in the Journal of Neuroscience has identified a population of stem cells capable of generating new appetite-regulating neurons in the brains of young and adult rodents.
Obesity has reached epidemic proportions globally. More than 1.4 billion adults worldwide are overweight and more than half a billion are obese. Associated health problems include type 2 diabetes, heart disease, arthritis and cancer. And at least 2.8 million people die each year as a result of being overweight or obese.
The economic burden on the NHS in the UK is estimated to be more than £5 billion annually. In the US, the healthcare cost tops $60 billion.
Scientists at UEA investigated the hypothalamus section of the brain – which regulates sleep and wake cycles, energy expenditure, appetite, thirst, hormone release and many other critical biological functions. The study looked specifically at the nerve cells that regulate appetite.
The researchers used ‘genetic fate mapping’ techniques to make their discovery – a method that tracks the development of stem cells and cells derived from them, at desired time points during the life of an animal.
They established that a population of brain cells called ‘tanycytes’ behave like stem cells and add new neurons to the appetite-regulating circuitry of the mouse brain after birth and into adulthood.
Lead researcher Dr Mohammad K. Hajihosseini, from UEA’s School of Biological Sciences, said: “Unlike dieting, translation of this discovery could eventually offer a permanent solution for tackling obesity.
“Loss or malfunctioning of neurons in the hypothalamus is the prime cause of eating disorders such as obesity.
“Until recently we thought that all of these nerve cells were generated during the embryonic period and so the circuitry that controls appetite was fixed.
“But this study has shown that the neural circuitry that controls appetite is not fixed in number and could possibly be manipulated numerically to tackle eating disorders.
“The next step is to define the group of genes and cellular processes that regulate the behaviour and activity of tanycytes. This information will further our understanding of brain stem cells and could be exploited to develop drugs that can modulate the number or functioning of appetite-regulating neurons.
“Our long-term goal of course is to translate this work to humans, which could take up to five or 10 years. It could lead to a permanent intervention in infancy for those predisposed to obesity, or later in life as the disease becomes apparent.”
A Sleep Aid Without the Side Effects
Insomniacs desperate for some zzzs may one day have a safer way to get them. Scientists have developed a new sleep medication that has induced sleep in rodents and monkeys without apparently impairing cognition, a potentially dangerous side effect of common sleep aids. The discovery, which originated in work explaining narcolepsy, could lead to a new class of drugs that help people who don’t respond to other treatments.
Between 10% and 15% of Americans chronically struggle with getting to or staying asleep. Many of them turn to sleeping pills for relief, and most are prescribed drugs, such as zolpidem (Ambien) and eszopiclone (Lunesta), that slow down the brain by binding to receptors for GABA, a neurotransmitter that’s involved in mood, cognition, and muscle tone. But because the drugs target GABA indiscriminately, they can also impair cognition, causing amnesia, confusion, and other problems with learning and memory, along with a number of strange sleepwalking behaviors, including wandering, eating, and driving while asleep. This has led many researchers to seek out alternative mechanisms for inducing sleep.
Neuroscientist Jason Uslaner of Merck Research Laboratories in West Point, Pennsylvania, and colleagues decided to tap into the brain’s orexin system. Orexin (also known as hypocretin) is a protein that controls wakefulness and is missing in people with narcolepsy. Past studies successfully induced sleep by inhibiting orexin, but had not looked into its effects on cognition. The researchers developed a new orexin-inhibiting compound called DORA-22 and confirmed that it could induce sleep in rats and rhesus monkeys as effectively as the GABA-modulating drugs.
Then the researchers went about testing the drugs’ effects on the animals’ cognition. They measured the rats’ cognition and memory by assessing the rodents’ ability to recognize objects. They presented the rats with a new object—say, a cone or a sphere—that the rats then sniffed and explored. Then they took the object away for an hour. After that hour, the rats were exposed to a new object and the one they’d already gotten to know; if the rats remembered, they spent less time checking out the familiar object. With the primates, Uslaner’s team tested their ability to match colors on a touchscreen and to pay attention to and identify the origin of a flashing light. In all the cases, the researchers found the GABA-modulating sleeping pills caused both the rats and the primates to respond more slowly and less accurately. Monkeys taking the memory and attention tests, for example, were 20% less accurate on the highest dose of each of the GABA-modulating drugs. But DORA-22 had no such effect on cognition, the team reports today in Science Translational Medicine.
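The object-recognition test described above is commonly summarized with a discrimination index: a rat that remembers the familiar object spends proportionally more time exploring the novel one. The sketch below is purely illustrative, with invented exploration times; it is not the paper's analysis, just a minimal version of the standard measure.

```python
# Hedged sketch of the novel-object recognition measure: the
# discrimination index contrasts exploration of the novel object with
# exploration of the familiar one. All times are invented.

def discrimination_index(novel_s, familiar_s):
    """(novel - familiar) / (novel + familiar).

    ~0 means no preference (no memory of the familiar object);
    values above 0 indicate the familiar object was remembered.
    """
    return (novel_s - familiar_s) / (novel_s + familiar_s)

# Hypothetical exploration times (seconds) after the one-hour delay.
untreated = discrimination_index(novel_s=30.0, familiar_s=15.0)
drugged = discrimination_index(novel_s=22.0, familiar_s=20.0)

print(f"untreated rat: {untreated:.2f}")  # clear preference for novelty
print(f"drugged rat:   {drugged:.2f}")    # little preference
```

A GABA-modulating drug that impairs memory would be expected to push the index toward zero, while a compound without cognitive side effects would leave it intact.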
"We were very excited," Uslaner says. "Folks who take sleep medications need to be able to perform cognitive tasks when they awake, and this [compound] could help them do so without impairment."
Although DORA-22 has not yet been tested in humans, it holds tremendous promise for helping people suffering from sleep disorders, says Emmanuel Mignot, a sleep researcher with the Stanford University School of Medicine in Palo Alto, California. “This study is encouraging and exciting, because there’s good reason to believe it would work differently from what we’ve used in the past,” says Mignot, who helped discover the link between orexin (or its absence) and narcolepsy. “Not every drug works for everyone, so it’s really, really good news to have a potential new drug on the horizon.”
Researchers at Washington University School of Medicine in St. Louis have identified a new set of genetic markers for Alzheimer’s that point to a second pathway through which the disease develops.

Much of the genetic research on Alzheimer’s centers on amyloid-beta, a key component of brain plaques that build up in the brains of people with the disease.
In the new study, the scientists identified several genes linked to the tau protein, which is found in the tangles that develop in the brain as Alzheimer’s progresses and patients develop dementia. The findings may help provide targets for a different class of drugs that could be used for treatment.
The researchers report their findings online April 24 in the journal Neuron.
“We measured the tau protein in the cerebrospinal fluid and identified several genes that are related to high levels of tau and also affect risk for Alzheimer’s disease,” says senior investigator Alison M. Goate, DPhil, the Samuel and Mae S. Ludwig Professor of Genetics in Psychiatry. “As far as we’re aware, three of these genes have no effect on amyloid-beta, suggesting that they are operating through a completely different pathway.”
A fourth gene in the mix, APOE, had been identified long ago as a risk factor for Alzheimer’s. It has been linked to amyloid-beta, but in the new study, APOE appears to be connected to elevated levels of tau. Finding that APOE is influencing more than one pathway could help explain why the gene has such a big effect on Alzheimer’s disease risk, the researchers say.
“It appears APOE influences risk in more than one way,” says Goate, also a professor of genetics and co-director of the Hope Center for Neurological Disorders. “Some of the effects are mediated through amyloid-beta and others by tau. That suggests there are at least two ways in which the gene can influence our risk for Alzheimer’s disease.”
The new research by Goate and her colleagues is the largest genome-wide association study (GWAS) yet on tau in cerebrospinal fluid. The scientists analyzed points along the genomes of 1,269 individuals who had undergone spinal taps as part of ongoing Alzheimer’s research.
Whereas amyloid is known to collect in the brain and affect brain cells from the outside, the tau protein usually is stored inside cells and moves into the spinal fluid when cells are damaged or die. Elevated tau has been linked to several forms of non-Alzheimer’s dementia, and first author Carlos Cruchaga, PhD, says that although amyloid plaques are a key feature of Alzheimer’s disease, it’s possible that excess tau has more to do with the dementia than plaques.
“We know there are some individuals with high levels of amyloid-beta who don’t develop Alzheimer’s disease,” says Cruchaga, an assistant professor of psychiatry. “We don’t know why that is, but perhaps it could be related to the fact that they don’t have elevated tau levels.”
In addition to APOE, the researchers found that a gene called GLIS3, and the genes TREM2 and TREML2 also affect both tau levels and Alzheimer’s risk.
Goate says she suspects changes in tau may be good predictors of advancing disease. As tau levels rise, she says people may be more likely to develop dementia. If drugs could be developed to target tau, they may prevent much of the neurodegeneration that characterizes Alzheimer’s disease and, in that way, help prevent or delay dementia.
The new research also suggests it may one day be possible to reduce Alzheimer’s risk by targeting both pathways.
“Since two mechanisms apparently exist, identifying potential drug targets along these pathways could be very useful,” she says. “If drugs that influence tau could be added to those that affect amyloid, we could potentially reduce risk through two different pathways.”
(Source: news.wustl.edu)
Shift of Language Function to Right Hemisphere Impedes Post-Stroke Aphasia Recovery
In a study designed to differentiate why some stroke patients recover from aphasia and others do not, investigators have found that a compensatory reorganization of language function to right hemispheric brain regions bodes poorly for language recovery. Patients who recovered from aphasia showed a return to normal left-hemispheric language activation patterns. These results, which may open up new rehabilitation strategies, are available in the current issue of Restorative Neurology and Neuroscience.
“Overall, approximately 30% of patients with stroke suffer from various types of aphasia, with this deficit most common in stroke with left middle cerebral artery territory damage. Some of the affected patients recover to a certain degree in the months and years following the stroke. The recovery process is modulated by several known factors, but the degree of the contribution of brain areas unaffected by stroke to the recovery process is less clear,” says lead investigator Jerzy P. Szaflarski, MD, PhD, of the Departments of Neurology at the University of Alabama and University of Cincinnati Academic Health Center.
For the study, 27 right-handed adults who suffered from a left middle cerebral artery infarction at least one year prior to study enrollment were recruited. After language testing, 9 subjects were considered to have normal language ability while 18 were considered aphasic. Patients underwent a battery of language tests as well as a semantic decision/tone decision cognitive task during functional MRI (fMRI) in order to map language function. MRI scans were used to determine stroke volume.
The authors found that linguistic performance was better in those who had stronger left-hemispheric fMRI signals while performance was worse in those who had stronger signal-shifts to the right hemisphere. As expected, they also found a negative association between the size of the stroke and performance on some linguistic tests. Right cerebellar activation was also linked to better post-stroke language ability.
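Hemispheric dominance in fMRI language studies of this kind is often summarized with a laterality index, LI = (L − R) / (L + R), where L and R count supra-threshold voxels (or total activation) in homologous left and right regions of interest. The snippet below is an illustrative sketch with invented voxel counts, not the authors' pipeline; it only shows how such an index distinguishes a left-dominant pattern from the rightward shift associated here with poorer recovery.

```python
# Hedged sketch of an fMRI laterality index. LI near +1 means strongly
# left-lateralized language activation; negative values indicate a
# rightward shift. Voxel counts below are hypothetical.

def laterality_index(left, right):
    """(L - R) / (L + R) over homologous left/right ROIs."""
    return (left - right) / (left + right)

recovered = laterality_index(left=850, right=150)  # left-dominant pattern
aphasic = laterality_index(left=300, right=700)    # right-shifted pattern

print(f"recovered-subject LI: {recovered:+.2f}")
print(f"aphasic-subject LI:   {aphasic:+.2f}")
```

On this convention, the study's finding reads as: more positive LI values went with better linguistic performance, more negative values with worse.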
The authors say that while a shift to the non-dominant right hemisphere can restore language function in children who have experienced left-hemispheric injury or stroke, for adults such a shift may impede recovery. For adults, it is the left hemisphere that is necessary for language function preservation and/or recovery.

Avoid impulsive acts by imagining future benefits
Why is it so hard for some people to resist the least little temptation, while others seem to possess incredible patience, passing up immediate gratification for a greater long-term good?
The answer, suggests a new brain imaging study from Washington University in St. Louis, lies in how effective people are at feeling good right now about all the future benefits that may come from passing up a smaller immediate reward. Researchers found that activity in two regions of the brain distinguished impulsive and patient people.
“Activity in one part of the brain, the anterior prefrontal cortex, seems to show whether you’re getting pleasure from thinking about the future reward you are about to receive,” explains study co-author Todd Braver, PhD, professor of psychology in Arts & Sciences. “People can relate to this idea that when you know something good is coming, just that waiting can feel pleasurable.”
The study, which was published in the first issue of the Journal of Neuroscience this year, was designed to examine what happens in the brain as people wait for a reward, especially whether people characterized as “impulsive” would show different brain responses than those considered “patient.”
The lead author of the study was Koji Jimura, then a postdoctoral researcher in Braver’s Cognitive Control and Psychopathology Laboratory, and now a research associate professor at the Tokyo Institute of Technology, in Japan.
Unlike previous research on delayed gratification that had people choose between hypothetical rewards of money over long delays (e.g., $500 now or $1,000 a year from now), this Washington University study presented participants with real rewards of squirts of juice that they chose to receive either immediately or after a delay of up to a minute.
“It’s kind of funny because we treated the people in our study like researchers that work with animals do, and we actually squirted juice into their mouths,” Braver says.
Results show that a brain region called the ventral striatum (VS) ramped up its activity in impulsive people as they got closer and closer to receiving their delayed reward. The VS activity of patient people, on the other hand, stayed more constant.
The researchers interpreted these different brain responses to mean that impulsive people initially did not find the prospect of waiting for a reward very appealing. However, as they approached the time they’d receive that reward, they became more excited and their VS reflected that excitement.
“This gradual increase may reflect impatience or excessive anticipation of the upcoming reward in impulsive individuals,” says Jimura. This was unlike patient people, who were likely content with waiting for the reward from the start, as no changes in VS activity were observed for them.
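The "ramping" pattern described above can be summarized by fitting a line to VS signal sampled across the waiting period: an impulsive profile shows a positive slope as the reward approaches, while a patient profile stays near flat. This is an assumed, illustrative analysis with invented data, not taken from the paper.

```python
# Hedged sketch: summarize VS activity over the delay period by its
# least-squares slope. A positive slope = ramping anticipation
# (impulsive profile); near-zero slope = constant activity (patient
# profile). All signal values are invented.

def slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

seconds = [0, 15, 30, 45, 60]                  # time into the delay
impulsive_vs = [0.10, 0.25, 0.45, 0.70, 1.00]  # ramps toward the reward
patient_vs = [0.40, 0.42, 0.39, 0.41, 0.43]    # stays roughly constant

print(f"impulsive slope: {slope(seconds, impulsive_vs):.4f}")
print(f"patient slope:   {slope(seconds, patient_vs):.4f}")
```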
The most novel finding of the study concerned the anterior prefrontal cortex (aPFC). This is the part of the brain that helps you think about the future. Here, the researchers found that patient people showed heightened activity in the aPFC when they first started waiting for their reward, activity that then decreased as the time to receive the reward approached. Impulsive people didn’t show this brain activity pattern.
“The aPFC appears to allow you to create a mental simulation of the future. It helps you consider what it’ll be like getting the future reward. In this way, you can get access to the utility and satisfaction in the present,” says Braver.
By thinking about the future reward, patient people were able to gain what economists call “anticipatory utility.” While their reward was far away in time, they were giddy with anticipation in the present. Conversely, impulsive people weren’t thinking beyond the present and so did not feel pleasure when they were told they had to wait. Their excitement built only as they got closer to receiving their reward.
Overall this study suggests that people may be impulsive because they do not or cannot imagine the future, so they prefer rewards right away. This research could be useful for assessing the effects of clinical treatments for impulsivity problems, which can lead to issues such as problem gambling and substance abuse disorders. A similar brain imaging approach as was used in the Washington University study could allow clinicians to track the effects of an intervention on changes not only in impulsive behavior but also changes in patients’ brain responses.
“One possible treatment approach could be to enhance mental functions in aPFC, a brain region well-known to be associated with cognitive control,” says Jimura. By increasing cognitive control, impulsive patients could learn to reject their immediate impulses.
Impulsivity occurs not only in a clinical setting but also every day in our own lives. Applying his research to his personal life, Braver says, “When I’m successful at achieving long-term goals it’s from explicitly trying to activate that goal and imagining each decision as helping me achieve it, to keep me on track.” Perhaps adopting this strategy of focusing on the long-term could help us move past present distractions and move toward our future goals.
Either mad and bad or Jekyll and Hyde: media portrayals of schizophrenia
Stigma can take a heavy toll on people who suffer from mental illness. Being shunned, feared, devalued and discriminated against can impair recovery and deepen social isolation and distress. Many sufferers judge stigma to be more difficult to cope with than the symptoms of their illness.
Thankfully, there are grounds for hope. Australian researchers have shown that mental illness stigma, such as the unwillingness to interact with affected people, generally declined from 2003 to 2011. Some credit for this improvement must go to media campaigns by beyondblue and SANE, and to the willingness of many people to speak publicly about experiences that would once have been shamefully private.
The dark cloud inside this silver lining is schizophrenia, a serious condition that impairs thinking, emotion and motivation. While Australians’ attitudes towards depression have become more accepting, the stigma of schizophrenia has remained largely unchanged.
Misusing and misunderstanding
People with schizophrenia are still perceived as dangerous and unpredictable, and these perceptions have increased in recent years. Attitudes to people with schizophrenia have also worsened in the United States at the same time as attitudes to depressed people have improved.
Just as the media can take some credit for the declining stigma of other conditions, it must take some of the blame for the continuing stigma of schizophrenia. Media portrayals commonly associate it with violence and danger.
Schizophrenia is also often misused to refer to split personality or incoherence. This Jekyll-and-Hyde misconception persists despite countless corrections. One study of Italian newspapers, for instance, found that the term was employed in this way almost three times as often as it was used correctly to refer to people with the diagnosis or their illness.
But just how negative are current media depictions of schizophrenia? My students and I recently examined this question in a study that we published in the academic journal Psychosis. We located every story published in major national, state and territory online and print news media outlets in the year ending August 2012 that cited schizophrenia or schizophrenic.
We then counted how many stories misused these terms and coded how often the condition was linked to violence or presented in a stigmatising way.
Our results were striking. Almost half (47%) of stories linked schizophrenia to some form of violence, and 28% of these associated it with attempted or completed homicide. The schizophrenic person was identified as a perpetrator of violence six times more frequently than as its victim.
Schizophrenia was misused as a metaphor for split personality or internal contradiction in 13% of stories. And fully 46% of stories were coded as stigmatising.
It’s hardly surprising that the public’s views of the condition continue to be laced with fear and loathing if they usually find schizophrenia presented in the context of violent aggression or as a metaphor for internal contradiction.
Better ways
What can be done about all of this? For one thing, journalists and the general public need to become aware that schizophrenia doesn’t mean split personality and bears no resemblance to caricatures of craziness. This mistaken usage should be retired not because the language police say it’s offensive, but because it perpetuates a misunderstanding that hurts real people.
Journalists and editors also need to think carefully before linking schizophrenia to violent behaviour. Often the proposed link is dubious and speculative, and adds nothing important to the story. Just as violence supposedly committed by people experiencing mental illness is over-reported – producing an exaggerated sense of their dangerousness – their victimisation is often under-reported.
An equally important corrective would be to publish more stories that feature people with schizophrenia living well, present their everyday struggles and adversities or showcase promising treatments and research findings.
Coverage can be improved. Our study found that stories from broadsheet newspapers were less stigmatising than tabloid stories, and longer, more developed stories were less stigmatising than briefer ones.
This is not a matter of whitewashing the news. People with schizophrenia are indeed at a somewhat increased risk of committing violent offences (and of being their victims). They can behave in challenging ways. But the media landscape that our study surveyed is so tilted towards depicting schizophrenia as dangerous that it’s seriously unbalanced.
The news media can do better and, if they do, the stigma of schizophrenia may start to erode.