Neuroscience

Articles and news from the latest research reports.

Posts tagged science

14 notes

Take your time: Neurobiology sheds light on the superiority of spaced vs. massed learning

March 28, 2012 by Stuart Mason Dambrot

(Medical Xpress) — College and cramming – where there’s one, the other is rarely far behind. It has been recognized since the late 1800s, however, that repeated, spaced exposure to the same material leads to better retention than a single massed session. Nevertheless, the phenomenon’s neurobiological basis has remained poorly understood, although activity-dependent synaptic plasticity – notably long-term potentiation (LTP) of glutamatergic transmission – is believed to enable rapid storage of new information. Recently, researchers at the University of California, Irvine and the Scripps Research Institute in Jupiter, Florida determined that hippocampal activity can enhance LTP through theta burst stimulation (TBS) – but only when the affected synapses receive a second TBS after a long delay. The researchers describe mechanisms that maximize the synaptic changes encoding new memory by requiring long delays between episodes of learning-related TBS activity.

A second theta burst train expands the pool of F-actin-enriched spines. (A) Fluorescent phalloidin labeling in CA1 stratum radiatum. (Scale bar = 10 μm). (B) Counts of densely phalloidin-positive spines in slices collected 15 or 75 min after TBS1 (gray bars) or 15 min after TBS2 delayed by 60 min (black bar). (C) Traces show responses to two successive bursts separated by 200 ms (red for second response). (D) Counts of TBS1-induced phalloidin labeling for vehicle (gray) and CX614-treated (blue) slices. (E) Pretreatment with CX614 (blue line) caused a 70% increase in the magnitude of LTP induced by TBS1; this was accompanied by a loss of TBS2-induced potentiation. Image Copyright © 2012 PNAS, doi: 10.1073/pnas.1120700109

Gavin Rumbaugh (Scripps Research Institute) discussed the challenges he, Gary Lynch (University of California, Irvine) and their team encountered in the study. “The field is trying to understand the neurobiology of new learning and, in particular, how learning induces an even more complex biology to keep new information in our neural circuits,” Rumbaugh tells Medical Xpress. “Over the past decade, it has become clear that plasticity at individual synapses is one way that neural circuits store information. However, it remains unclear how the properties of synapses influence key aspects of learning and memory.”

Read more …

Filed under science neuroscience psychology brain

2 notes

Treatments to reduce anesthesia-induced injury in children show promise in animal studies

March 28, 2012

Recent clinical studies have shown that general anesthesia can be harmful to infants, presenting a dilemma for both doctors and parents. But new research at Wake Forest Baptist Medical Center may point the way to treatment options that protect very young children against the adverse effects of anesthesia.

As detailed in a study published in the March 23 online edition of the journal Neuroscience, Wake Forest Baptist scientists explored a number of strategies designed to prevent anesthesia-induced damage to the brain in infants.

Using an animal model, the researchers tested the effectiveness of a fragment of a neuroprotective protein called ADNP, as well as vitamin D3, a low-level dose of anesthetic and aspirin. They found that three of the four strategies tested protected the brain from injury induced by 20 mg ketamine, a commonly used general anesthetic.

"What didn’t work was aspirin, which was a surprise because aspirin is known to protect the brain from injury," said Christopher P. Turner, Ph.D., assistant professor of neurobiology and anatomy at Wake Forest Baptist and lead author of the study. "In fact, in our study aspirin actually increased the severity of injury from the anesthesia, possibly because it prevents the generation of substances that may be neuroprotective."

Turner and his team studied rats at ages equivalent to children from birth to age 4.

In separate tests, the rodents were injected with: NAP, a peptide fragment of activity-dependent neuroprotective protein (ADNP), 15 minutes before ketamine was administered; two 20-mg doses of vitamin D3, at 24 hours and at 15 minutes before 20 mg of ketamine; a non-toxic (5-mg) dose of ketamine 24 hours before a toxic 20-mg dose of ketamine was administered; and a 30-mg dose of aspirin 15 minutes before exposure to ketamine.

The Turner lab found that NAP, vitamin D3 and prior exposure to low (non-toxic) ketamine could all prevent injury from exposure to a toxic (20 mg) level of ketamine. However, aspirin appeared to enhance ketamine-induced injury.

"We designed our studies to give doctors several possible treatment options because not all of these strategies may work in clinical applications," Turner said. "However, because vitamin D3 is already in clinical use, our findings show that it is quite promising with few risks. Further, NAP is currently in clinical trials to diminish the severity of other types of brain injury, so we feel this discovery represents a breakthrough for anesthesia-induced neurotoxicity. However, there may be a critical window of efficacy for NAP, which we need to explore further.

"Of all the approaches that our team studied, using a low dose of ketamine may be both the simplest and most cost-effective, as it suggests children can be pre-treated with the same anesthesia that will be used when they undergo general surgery," Turner added. "In essence, a low-level dose of ketamine primes the child’s brain so that the second, higher dose is not as lethal, much like an inoculation."

Provided by Wake Forest Baptist Medical Center

Source: medicalxpress.com

Filed under science neuroscience

14 notes

Coffee, other stimulant drugs may cause high achievers to slack off: research

March 28, 2012

(Medical Xpress) — While stimulants may improve unengaged workers’ performance, a new University of British Columbia study suggests that for others, caffeine and amphetamines can have the opposite effect, causing workers with higher motivation levels to slack off.

The study – published online today in the journal Neuropsychopharmacology – explored the impact of stimulants on “slacker” rats and “worker” rats, and sheds light on the long-standing question of why stimulants affect people differently. It also suggests that patients being treated with stimulants for a range of illnesses may benefit from more personalized treatment programs.

“Every day, millions of people use stimulants to wake up, stay alert and increase their productivity – from truckers driving all night to students cramming for exams,” says Jay Hosking, a PhD candidate in UBC’s Dept. of Psychology, who led the study. “These findings suggest that some stimulants may actually have an opposite effect for people who naturally favour the difficult tasks of life that come with greater rewards.”

Hosking says some individuals are more willing to concentrate and exert effort to achieve their goals than others. However, little is known about the brain mechanisms determining how much cognitive effort one will expend in decision-making for accomplishing tasks.

Hosking and study co-author Catharine Winstanley, a professor in UBC’s Dept. of Psychology, found that rats – like humans – show varying levels of willingness to expend high or low degrees of mental effort to obtain food rewards. When presented with stimulants, the “slacker” rats that typically avoided challenges worked significantly harder when given amphetamines, while “worker” rats that typically embraced challenges were less motivated by caffeine or amphetamine.

While more research is needed to understand the brain mechanisms at work, the study suggests that the amount of mental effort people devote to achieving their goals may play a role in determining how stimulant drugs affect them, Hosking says.

Winstanley, a Michael Smith Foundation for Health Research scholar, says people with psychiatric illnesses, brain injuries and Attention Deficit Hyperactivity Disorder (ADHD) may benefit from treatment programs with greater personalization, noting that patients often use stimulants to counter drowsiness and fatigue from their conditions and treatments, with mixed results.

Provided by University of British Columbia

Source: medicalxpress.com

Filed under science neuroscience psychology brain

4 notes

Blocking ‘Oh-Glick-Nack’ May Improve Long-Term Memory

ScienceDaily (Mar. 27, 2012) — Just as the familiar sugar in food can be bad for the teeth and waistline, another sugar has been implicated as a health menace, and blocking its action may have benefits that include improving long-term memory in older people and treating cancer.

Blocking the action of a sugar could boost memory and even fight cancer. The neuron on the left has CREB with O-GlcNAc and is short. The neuron on the right does not have that form of CREB and is long. (Credit: Linda Hsieh-Wilson, Ph.D.)

Progress toward finding such a blocker for the sugar — with the appropriately malicious-sounding name “oh-glick-nack” — was the topic of a report presented at the 243rd National Meeting & Exposition of the American Chemical Society (ACS) in San Diego on March 27.

Linda Hsieh-Wilson, Ph.D., explained that the sugar is not table sugar (sucrose), but one of many other substances produced in the body’s cells that qualify as sugars from a chemical standpoint. Named O-linked beta-N-acetylglucosamine — or “O-GlcNAc” — it helps in orchestrating health and disease at their origins, inside the billions of cells that make up the body. O-GlcNAc does so by attaching to proteins that allow substances to pass in and out of the nucleus of cells, for instance, and helping decide whether certain genes are turned on or off. In doing so, O-GlcNAc sends signals that may be at the basis of cancer, diabetes, Alzheimer’s disease and other disorders. Research suggests, for instance, that proteins loaded up with too much O-GlcNAc can’t function normally.

At the ACS meeting, Hsieh-Wilson described how research in her lab at the California Institute of Technology and Howard Hughes Medical Institute implicate O-GlcNAc in memory loss and cancer. The research emerged from Hsieh-Wilson’s use of advanced lab tools for probing a body process that involves attachment of sugars like O-GlcNAc to proteins. Called protein glycosylation, it helps nerves and other cells communicate with each other in ways that keep the body coordinated and healthy. When O-GlcNAc is attached to a protein, that binding process is known as O-GlcNAc glycosylation.

Hsieh-Wilson’s team screened the entire mammalian brain for all O-GlcNAc-glycosylated proteins, using a new process that her laboratory developed. They identified more than 200 proteins bearing O-GlcNAc attachments or tags, many for the first time. The research was done in mice, stand-ins for humans in research that cannot be done on people. Some of the proteins carrying O-GlcNAc were involved in regulating processes like drug addiction and securing long-term storage of memories.

O-GlcNAc’s effects on one particular protein, CREB, got the scientists’ attention. CREB is a key substance that turns on and regulates the activity of genes. Many of the genes in cells are inactive at any given moment. Substances like CREB, termed transcription factors, turn genes on. Hsieh-Wilson found that when O-GlcNAc attached to CREB, CREB’s ability to turn on genes was impaired. When the researchers blocked O-GlcNAc from binding CREB, the mice developed long-term memories faster than normal mice.

Could blocking O-GlcNAc boost long-term memory in humans?

"We’re far from understanding what happens in humans," Hsieh-Wilson emphasized. "Completely blocking O-GlcNAc might not be desirable. Do you really want to sustain all memories long-term, even of events that are best forgotten? How would blocking the sugar from binding to other proteins affect other body processes? There are a lot of unanswered questions. Nevertheless, this research could eventually lead to ways to improve memory."

In a related study, Hsieh-Wilson found that O-GlcNAc interacted with another protein in ways that encourage the growth of cancer cells, suggesting that blocking its attachment might protect against cancer or slow its growth. And indeed, in mouse experiments, blocking O-GlcNAc resulted in much smaller tumors.

Again, a treatment for humans based on this discovery is far in the future, but the study singles out O-GlcNAc as a potential new target for developing anti-cancer drugs.

Source: Science Daily

Filed under science neuroscience brain psychology memory

65 notes

Creativity and human reasoning during decision-making

March 27, 2012

A hallmark of human intelligence is the ability to efficiently adapt to uncertain, changing and open-ended environments. In such environments, efficient adaptive behavior often requires considering multiple alternative behavioral strategies, adjusting them, and possibly inventing new ones. These reasoning, learning and creative abilities involve the frontal lobes, which are especially well developed in humans compared to other primates. However, how the frontal function decides to create new strategies and how multiple strategies can be monitored concurrently remain largely unknown.

In a new study, published March 27 in the online, open-access journal PLoS Biology, Anne Collins and Etienne Koechlin of the Ecole Normale Supérieure and the Institut National de la Santé et de la Recherche Médicale, France, examine frontal lobe function using behavioral experiments and computational models of human decision-making. They find that human frontal function concurrently monitors no more than three or four strategies, but favors creativity, i.e., the exploration and creation of new strategies, whenever none of the monitored strategies appears reliable enough.

The researchers asked one hundred participants to find “3-digit pin codes” by trial and error under a variety of conditions. They then developed a computational model that predicted the responses participants produced, and which revealed that participants made their choices by mentally constructing and concurrently monitoring up to three distinct behavioral strategies, flexibly associating digits, motor responses and expected auditory feedback.
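The strategy-monitoring account above can be caricatured in a few lines of code. The following is a toy sketch, not the authors’ computational model: the capacity limit, the reliability threshold and the update rules are all invented for illustration. An agent hunts for a hidden 3-digit code, keeps at most three candidate strategies under monitoring, retests a near-miss once, and otherwise creates a new strategy whenever nothing monitored looks reliable.

```python
import random

# Toy illustration only -- NOT the authors' model. All numeric choices
# (threshold 0.5, decay 0.5, partial-credit score) are assumptions.
CAPACITY = 3  # hypothetical monitoring limit suggested by the study

def search_code(secret, trials=20000, seed=0):
    rng = random.Random(seed)
    monitored = []  # [guess, reliability] pairs, at most CAPACITY of them
    for t in range(1, trials + 1):
        reliable = [s for s in monitored if s[1] >= 0.5]
        if reliable:
            # exploit: retest the most reliable monitored strategy
            guess = max(reliable, key=lambda s: s[1])[0]
        else:
            # creativity: no reliable strategy, so invent a new one
            guess = tuple(rng.randint(0, 9) for _ in range(3))
            monitored.append([guess, 0.5])
            monitored.sort(key=lambda s: s[1])
            monitored = monitored[-CAPACITY:]  # forget the least reliable
        if guess == secret:
            return t  # number of trials needed
        # feedback: first test scores by matching digits, later
        # failures halve reliability, pushing the agent to explore
        hits = sum(a == b for a, b in zip(guess, secret))
        for s in monitored:
            if s[0] == guess:
                s[1] = hits / 3 if s[1] == 0.5 else s[1] * 0.5
    return None

print(search_code((4, 1, 7)))
```

On any fixed seed the search is deterministic; the point is only the shape of the loop: monitor a handful of strategies, retest the promising ones, and explore when nothing monitored is reliable.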

"This is a remarkable result, because the actual number of correct codes varied across sessions. This suggests that this capacity limit is a hard constraint of human higher cognition," said Dr. Koechlin. Consistent with this, performance was significantly better in sessions that included no more than three repeated codes.

Furthermore, the researchers found that the pattern of participants’ responses derived from a decision system that strongly favors the exploration of new behavioral strategies: “The results provide evidence that the human executive system favors creativity for compensating its limited monitoring capacity” explained Dr. Koechlin. “It favors the exploration of new strategies but restrains the monitoring and storage of uncompetitive ones. Interestingly, this ability to regulate creativity varied across participants and critically explains individual variations in performances. We believe our study may also help to understand the biological foundations of individual differences in decision-making and adaptive behavior”.

Provided by Public Library of Science

Source: medicalxpress.com

Filed under science neuroscience psychology brain

5 notes

Use It or Lose It: Mind Games Help Healthy Older People Too

ScienceDaily (Mar. 27, 2012) — Cognitive training involving puzzles, handicrafts and life skills is known to reduce the risk, and help slow the progress, of dementia amongst the elderly. A new study published in BioMed Central’s open access journal BMC Medicine shows that cognitive training can also improve the reasoning, memory, language and hand-eye coordination of healthy older adults.

It is estimated that by 2050 the number of people over 65 years old will have increased to 1.1 billion worldwide, and that 37 million of these will suffer from dementia. Research has already shown that mental activity can reduce a person’s risk of dementia but the effect of mental training on healthy people is less well understood. To address this researchers from China have investigated the use of cognitive training as a defence against mental decline for healthy older adults who live independently.

To be recruited onto the trial, participants had to be between 65 and 75 years old and have good enough eyesight, hearing and communication skills to complete all parts of the training. The hour-long training sessions occurred twice a week for 12 weeks, and the subjects were given homework. Training consisted either of a multifaceted program tackling memory, reasoning, problem solving, map reading, handicrafts, health education and exercise, or of reasoning training only. The effect of booster training, provided six months later, was also tested.

The results of the study were positive. Profs Chunbo Li and Wenyuan Wu who led the research explained, “Compared to the control group, who received no training, both levels of cognitive training improved mental ability, although the multifaceted training had more of a long term effect. The more detailed training also improved memory, even when measured a year later and booster training had an additional improvement on mental ability scores.”

This study shows that cognitive training may help prevent mental decline amongst healthy older people and help them live independently for longer in their advancing years.

Source: Science Daily

Filed under science neuroscience psychology brain

9 notes

Genetic Risk and Stressful Early Infancy Join to Increase Risk for Schizophrenia

ScienceDaily (Mar. 26, 2012) — Working with genetically engineered mice and the genomes of thousands of people with schizophrenia, researchers at Johns Hopkins say they now better understand how both nature and nurture can affect one’s risks for schizophrenia and abnormal brain development in general.

The green neurons have reduced DISC1 protein. Red neurons have less effective GABA. (Credit: Johns Hopkins Medicine)

The researchers reported in the March 2 issue of Cell that a defect in a schizophrenia risk gene, combined with environmental stress right after birth, can lead to abnormal brain development and raise the likelihood of developing schizophrenia by nearly one and a half times.

"Our study suggests that if people have a single genetic risk factor alone or a traumatic environment in very early childhood alone, they may not develop mental disorders like schizophrenia," says Guo-li Ming, M.D., Ph.D., professor of neurology and member of the Institute for Cell Engineering at the Johns Hopkins University School of Medicine. "But the findings also suggest that someone who carries the genetic risk factor and experiences certain kinds of stress early in life may be more likely to develop the disease."

Pinpointing the cause or causes of schizophrenia has been notoriously difficult, owing to the likely interplay of multiple genes and environmental triggers, Ming says. Searching for clues at the molecular level, the Johns Hopkins team focused on the interaction of two factors long implicated in the disease: Disrupted-in-Schizophrenia 1 (DISC1) protein, which is important for brain development, and GABA, a brain chemical needed for normal brain function.

To find out how these factors affect brain development and disease susceptibility, the researchers first engineered mice to have reduced levels of DISC1 protein in one type of neuron in the hippocampus, a region of the brain involved in learning, memory and mood regulation. Through a microscope, they saw that newborn mouse neurons with reduced levels of DISC1 protein were similar in size and shape to those from mice with normal levels of DISC1. To change the function of the chemical messenger GABA, the researchers engineered the same neurons in mice to have more effective GABA. Those brain cells looked much different from normal neurons, with longer appendages or projections. Newborn mice engineered with both the more effective GABA and reduced levels of DISC1 showed the longest projections, suggesting, Ming said, that defects in both DISC1 and GABA together could change the physiology of developing neurons for the worse.

Meanwhile, other researchers at University of Calgary and at the National Institute of Physiological Sciences in Japan had shown in newborn mice that changes in environment and routine stress can impede GABA from working properly during development. In the next set of experiments, the investigators paired reducing DISC1 levels and stress in mice to see if it could also lead to developmental defects. To stress the mice, the team separated newborns from their mothers for three hours a day for ten days, then examined neurons from the stressed newborns and saw no differences in their size, shape and organization compared with unstressed mice. But when they similarly stressed newborn mice with reduced DISC1 levels, the neurons they saw were larger, more disorganized and had more projections than the unstressed mouse neurons. The projections, in fact, went to the wrong places in the brain.

Next, to see if their results in mice correlated with suspected human schizophrenia risk factors, the researchers compared the genetic sequences of 2,961 schizophrenia patients and healthy people from Scotland, Germany and the United States. Specifically, they determined whether particular DNA-letter variations in two genes, DISC1 and the gene for another protein, NKCC1, which controls the effect of GABA, were more likely to be found in schizophrenia patients than in healthy individuals. They paired 36 DNA-letter changes in DISC1 with two DNA-letter variations in NKCC1, one change per gene at a time, in all possible combinations. The results showed that if a person’s genome contained one specific combination of single DNA-letter changes, that person was 1.4 times more likely than people without these changes to develop schizophrenia. Having a single DNA-letter change in either one of these genes alone did not increase risk.
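For readers unfamiliar with such effect sizes, a “1.4 times more likely” figure of this kind is typically an odds ratio computed from a 2x2 case-control table. The sketch below uses made-up counts, not the study’s data, purely to show the arithmetic.

```python
# Illustrative arithmetic only -- these counts are invented, not the
# study's data.
def odds_ratio(cases_exposed, cases_unexposed,
               controls_exposed, controls_unexposed):
    """OR = (a/b) / (c/d): odds of carrying the variant combination
    in patients versus healthy controls."""
    return ((cases_exposed / cases_unexposed) /
            (controls_exposed / controls_unexposed))

# Hypothetical: 350 of 1,000 patients carry the specific DISC1 + NKCC1
# variant combination, versus 278 of 1,000 healthy controls.
or_combo = odds_ratio(350, 650, 278, 722)
print(round(or_combo, 2))  # close to the reported 1.4

# Either variant alone: identical carrier rates in patients and
# controls give an odds ratio of 1, i.e. no increase in risk.
or_single = odds_ratio(300, 700, 300, 700)
print(or_single)
```

The interesting feature of the study is exactly this contrast: the combination shifts the odds while each single change leaves them at 1.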

"Now that we have identified the precise genetic risks, we can rationally search for drugs that correct these defects," says Hongjun Song, Ph.D., co-author, professor of neurology and director of the Stem Cell Program at the Institute for Cell Engineering.

Source: Science Daily

Filed under science neuroscience psychology schizophrenia

6 notes

Chronic Stress Spawns Protein Aggregates Linked to Alzheimer’s

ScienceDaily (Mar. 26, 2012) — Repeated stress triggers the production and accumulation of insoluble tau protein aggregates inside the brain cells of mice, say researchers at the University of California, San Diego School of Medicine in a new study published in the March 26 Online Early Edition of the Proceedings of the National Academy of Sciences.

Exposing mice to 14 days of repeated stress resulted in an accumulation of insoluble phosphorylated tau protein aggregates in brain cells, visualized in this electron micrograph. (Credit: Image courtesy of Robert Rissman, UC San Diego)

The aggregates are similar to neurofibrillary tangles or NFTs, modified protein structures that are one of the physiological hallmarks of Alzheimer’s disease. Lead author Robert A. Rissman, PhD, assistant professor of neurosciences, said the findings may at least partly explain why clinical studies have found a strong link between people prone to stress and development of sporadic Alzheimer’s disease (AD), which accounts for up to 95 percent of all AD cases in humans.

"In the mouse models, we found that repeated episodes of emotional stress, which has been demonstrated to be comparable to what humans might experience in ordinary life, resulted in the phosphorylation and altered solubility of tau proteins in neurons," Rissman said. "These events are critical in the development of NFT pathology in Alzheimer’s disease."

The effect was most notable in the hippocampus, said Rissman, a region of the brain linked to the formation, organization and storage of memories. In AD patients, the hippocampus is typically the first region of the brain affected by tau pathology and the hardest-hit, with substantial cell death and shrinkage.

Not all forms of stress are equally threatening. In earlier research, Rissman and colleagues reported that acute stress — a single, passing episode — does not result in lasting, debilitating changes in the accumulation of phosphorylated tau. Acute stress-induced modifications in the cell are transient, he said, and on the whole probably beneficial.

"Acute stress may be useful for brain plasticity and helping to facilitate learning. Chronic stress and continuous activation of stress pathways may lead to pathological changes in stress circuitry. It may be too much of a good thing." As people age, perhaps their neuronal circuits do too, he said, becoming less robust and perhaps less capable of completely rebounding from the effects of stress.

"Age is the primary, known risk factor for Alzheimer’s disease. It may be that as we age, our neurons just aren’t as plastic as they once were and some succumb."

The researchers observed that stress cues impacted two key corticotropin-releasing factor receptors, suggesting a target for potential therapies. Rissman noted drugs already exist and are in human trials (for other conditions) that modulate the activity of these receptors.

"You can’t eliminate stress. We all need to be able to respond at some level to stressful stimuli. The idea is to use an antagonist molecule to reduce the effects of stress upon neurons. The stress system can still respond, but the response in the brain and hippocampus would be toned down so that it doesn’t result in harmful, permanent damage."

The authors dedicate this work to their longtime mentor and colleague, Dr. Wylie Vale, whose years of pioneering work deciphering and describing the stress system were fundamental to this paper. Vale passed away earlier this year at the age of 70.

Source: Science Daily

Filed under science neuroscience psychology alzheimer

5 notes

Does the Brain ‘Remember’ Antidepressants? More Proof for the Power of Placebo

ScienceDaily (Mar. 26, 2012) — Individuals with major depressive disorder (MDD) often undergo multiple courses of antidepressant treatment during their lives. This is because the disorder can recur despite treatment and because finding the right medication for a specific individual can take time.

While the relationship between prior treatment and the brain’s response to subsequent treatment is unknown, a new study by UCLA researchers suggests that how the brain responds to antidepressant medication may be influenced by its “memory” of past antidepressant exposure.

Interestingly, the researchers used a harmless placebo as the key to tracking the footprints of prior antidepressant use.

Aimee Hunter, the study’s lead author and an assistant professor of psychiatry at UCLA’s Semel Institute for Neuroscience and Human Behavior, and colleagues showed that a simple placebo pill, made to look like actual medication for depression, can “trick” the brain into responding in the same manner as the actual medication.

The report was published online March 23 in the journal European Neuropsychopharmacology.

The investigators examined changes in brain function in 89 depressed persons during eight weeks of treatment, using either an antidepressant medication or a similar-looking placebo pill. They set out to compare the two treatments — medication versus placebo — but they also added a twist: They separately examined the data for subjects who had never previously taken an antidepressant and those who had.

The researchers focused on the prefrontal cortex, an area of the brain thought to be involved in planning complex cognitive behavior, personality expression, decision-making and moderating social behavior, all things depressed people wrestle with.

Brain changes were assessed using electroencephalograph (EEG) measures developed at UCLA by study co-authors Dr. Ian Cook, UCLA’s Miller Family Professor of Psychiatry, and Dr. Andrew Leuchter, a professor of psychiatry and director of the Laboratory of Brain, Behavior and Pharmacology at UCLA’s Semel Institute. The EEG measure, recorded from scalp electrodes, is linked to blood flow in the cerebral cortex, which suggests the level of brain activity.

The antidepressant medication given during the study appeared to produce slight decreases in prefrontal brain activity, regardless of whether subjects had received prior antidepressant treatment during their lifetime or not. (A decrease in brain activity is not necessarily a bad thing, the researchers note; with depression, too much activity in the brain can be as bad as too little.)

However, the researchers observed striking differences in the power of placebo, depending on subjects’ prior antidepressant use. Subjects who had never been treated with an antidepressant exhibited large increases in prefrontal brain activity during placebo treatment. But those who had used antidepressant medication in the past showed slight decreases in prefrontal activity — brain changes that were indistinguishable from those produced by the actual drug.

"The brain’s response to the placebo pill seems to depend on what happened previously — on whether or not the brain has ever ‘seen’ antidepressant medication before," said Hunter, who is a member of the placebo research team at the Laboratory of Brain, Behavior and Pharmacology. "If it has seen it before, then the brain’s signature ‘antidepressant-exposure’ response shows up."

According to Hunter, the effect looks conspicuously like a classical conditioning phenomenon, wherein prior exposure to the actual drug may have produced the specific prefrontal brain response and subsequent exposure to the cues surrounding drug administration — the relationship with the doctor or nurse, the medical treatment setting, the act of taking a prescribed pill and so forth — came to elicit a similar brain response through ‘conditioning’ or ‘associative learning.’

While medication can have a powerful effect on our physiology, said Hunter, “the behaviors and cues in the environment that are associated with taking medication can come to elicit their own effects. One’s personal treatment history is one of the many factors that influence the overall effects of treatment.”

Still, she noted, there are other possible explanations, and further research is needed to tease out changes in brain function that are related to antidepressant exposure, compared with brain changes that are related to clinical improvement during treatment.

Source: Science Daily

Filed under science neuroscience brain psychology

4 notes

Smokers Could Be More Prone to Schizophrenia

ScienceDaily (Mar. 26, 2012) — Smoking alters the impact of a schizophrenia risk gene. Scientists from the universities of Zurich and Cologne demonstrate that healthy smokers who carry this risk gene process acoustic stimuli in a similarly deficient way to patients with schizophrenia. Furthermore, the more a person smokes, the stronger the effect.

Schizophrenia has long been known to be heritable. However, because a melting pot of disorders with different genetic causes lies behind the manifestations of schizophrenia, research has yet to identify a single main gene responsible.

In order to study the genetic background of schizophrenia, research has until now mostly compared the frequency of particular risk genes between healthy and ill people. Pharmacopsychologist Professor Boris Quednow from the University Hospital of Psychiatry, Zurich, and Professor Georg Winterer’s workgroup at the University of Cologne have now adopted a novel approach. Using electroencephalography (EEG), the scientists studied the processing of simple acoustic stimuli (a sequence of similar clicks). When processing a particular stimulus, healthy people suppress the processing of other stimuli that are irrelevant to the task at hand. Patients with schizophrenia exhibit deficits in this kind of stimulus filtering, and their brains are thus probably inundated with too much information. As even psychiatrically healthy people filter stimuli with varying degrees of efficiency, individual stimulus processing can be associated with particular genes.

Smokers process stimuli less effectively

In a large-scale study involving over 1,800 healthy participants from the general population, Boris Quednow and Georg Winterer examined how far acoustic stimulus filtering is connected with a known risk gene for schizophrenia: the so-called “transcription factor 4” gene (TCF4). TCF4 is a protein that plays a key role in early brain development. As patients with schizophrenia often smoke, the scientists also studied the smoking habits of the test subjects.

The data collected show that psychiatrically healthy carriers of the TCF4 risk variant also filter stimuli less effectively, like people who suffer from schizophrenia. It turned out that it is primarily smokers carrying the risk gene who display less effective filtering of acoustic impressions, and this effect was all the more pronounced the more the people smoked. Non-smoking carriers of the risk gene, however, showed little additional impairment in stimulus processing. “Smoking alters the impact of the TCF4 gene on acoustic stimulus filtering,” says Boris Quednow, explaining this kind of gene-environment interaction. “Therefore, smoking might also increase the impact of particular genes on the risk of schizophrenia.”
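Paired-click filtering of this kind is conventionally summarized as a suppression ratio: the amplitude of the brain response to the second click divided by the response to the first. The sketch below uses hypothetical amplitudes, not values from this study, to show how such a ratio separates strong from deficient gating.

```python
# Hypothetical amplitudes (arbitrary units) -- not data from the study.
# Strong "sensory gating" suppresses the second response (low ratio);
# deficient filtering leaves it nearly as large as the first (ratio
# near 1).
def gating_ratio(s1_amplitude, s2_amplitude):
    return s2_amplitude / s1_amplitude

healthy = gating_ratio(4.0, 1.0)    # second response strongly suppressed
deficient = gating_ratio(4.0, 3.4)  # second response barely suppressed
print(healthy, deficient)
```

A gene or environmental factor such as smoking would then show up in a study like this one as a systematic shift of this ratio toward 1 in the affected group.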

The results could also be significant for predicting schizophrenic disorders and for new treatment approaches, says Quednow and concludes: “Smoking should also be considered as an important cofactor for the risk of schizophrenia in future studies.” A combination of genetic (e.g. TCF4), electrophysiological (stimulus filtering) and demographic (smoking) factors could help diagnose the disorder more rapidly or also define new, genetically more uniform patient subgroups.

Source: Science Daily

Filed under science neuroscience brain psychology schizophrenia
