Neuroscience

Articles and news from the latest research reports.



Individual differences in altruism explained by brain region involved in empathy

July 11, 2012

What can explain extreme differences in altruism among individuals, from Ebenezer Scrooge to Mother Teresa? It may all come down to variation in the size and activity of a brain region involved in appreciating others’ perspectives, according to a study published in the July 12th issue of the journal Neuron. The findings also provide a neural explanation for why altruistic tendencies remain stable over time.

The junction (yellow) between the parietal and the temporal lobes, in which the relative proportion of gray matter is significantly positively correlated with the propensity for altruistic behavior. Credit: University of Zurich

"This is the first study to link both brain anatomy and brain activation to human altruism,” says senior study author Ernst Fehr of the University of Zurich. “The findings suggest that the development of altruism through appropriate training or social practices might occur through changes in the brain structure and the neural activations that we identified in our study.”

Individuals who excel at understanding others’ intents and beliefs are more altruistic than those who struggle at this task. The ability to understand others’ perspectives has previously been associated with activity in a brain region known as the temporoparietal junction (TPJ). Based on these past findings, Fehr and his team reasoned that the size and activation of the TPJ would relate to individual differences in altruism.

In the new study, subjects underwent a brain imaging scan and played a game in which they had to decide how to split money between themselves and anonymous partners. Subjects who made more generous decisions had a larger TPJ in the right hemisphere of the brain compared with subjects who made stingy decisions.

Moreover, activity in the TPJ reflected each subject’s specific cutoff value for the maximal cost the subject was willing to endure to increase the partner’s payoff. Activity in the TPJ was higher during hard decisions—when the personal cost of an altruistic act was just below the cutoff value—than during easy decisions associated with a very low or very high cost.

"The structure of the TPJ strongly predicts an individual’s setpoint for altruistic behavior, while activity in this brain region predicts an individual’s acceptable cost for altruistic actions," says study author Yosuke Morishima of the University of Zurich. "We have elucidated the relationship between the hardware and software of human altruistic behavior."

Provided by Cell Press

Source: medicalxpress.com

Filed under science neuroscience brain psychology empathy emotion


Potential Cause of HIV-Associated Dementia Revealed

ScienceDaily (July 10, 2012) — Researchers at Georgetown University Medical Center appear to have solved the mystery of why some HIV-infected patients who are using antiretroviral therapy and show no signs of AIDS develop serious depression as well as profound problems with memory, learning, and motor function. The finding might also provide a way to test people with HIV to determine their risk for developing dementia.

They say the answer, published in the July 11 issue of the Journal of Neuroscience, may ultimately lead to a therapeutic solution that helps these patients as well as others suffering from brain ailments that appear to develop through the same pathway, including those that occur in the aged.

"We believe we have discovered a general mechanism of neuronal decline that even explains what happens in some elderly folks," says the study’s lead investigator, Italo Mocchetti, Ph.D., professor and vice chair of the department of neuroscience at Georgetown University Medical Center. "The HIV-infected patients who develop this syndrome are usually quite young, but their brains act old."

The research team found that even though HIV does not infect neurons, it tries to stop the brain from producing a protein growth factor — mature brain derived neurotrophic factor (mature BDNF) — that Mocchetti says acts like “food” for brain neurons. Reduced mature BDNF results in the shortening of the axons and their branches that neurons use to connect to each other, and when they lose this communication, the neurons die.

"The loss of neurons and their connections is profound in these patients," Mocchetti says. HIV-associated dementia occurs in two to three percent of HIV-infected patients using antiretroviral therapies, all of whom appear to be otherwise healthy, and in 30 percent of HIV-positive patients who are not on medication.

Mocchetti believes that HIV stops production of mature BDNF because that protein interferes with the ability of the virus to attack other brain cells. It does this through the potent gp120 envelope protein that sticks out from the viral shell — the same protein that hooks on to brain macrophages and microglial cells to infect them. “In earlier experiments, when we dumped gp120 into neuronal tissue culture, there was a 30-40 percent loss of neurons overnight. That makes gp120 a remarkable neurotoxin.”

This study is the product of years of work that has resulted in a string of publications. It began when Mocchetti and his colleagues were given a grant from the National Institute on Drug Abuse to determine whether there was a connection between the use of cocaine and morphine, and dementia. (A substantial number of HIV-positive patients have been or currently are intravenous drug users.)

They found that it was the virus that was responsible for the dementia, not the drugs, and so they set out to discover how the virus was altering neuronal function.

Their scientific break came when the researchers were able to study the blood of 130 women enrolled in the 17-year-old, nationwide WIHS (Women’s Interagency HIV Study, directed at Georgetown by Mary Young, M.D.), which has focused on the effects of HIV in infected women. In one seminal discovery, Mocchetti and colleagues found that when there was less BDNF in the blood, patients were at risk of developing brain abnormalities. He published this finding in the May 15, 2011, issue of AIDS.

In this study, Mocchetti, Alessia Bachis, Ph.D., and their colleagues studied the brains of deceased HIV-positive patients who had developed HIV-associated dementia. They found that neurons had shrunk and that mature BDNF had substantially decreased.

He and his colleagues then worked out the mechanism responsible for this destruction of neurons.

Normally, neurons release a long form of BDNF known as proBDNF, and then certain enzymes, including one called furin, cleave proBDNF to produce mature BDNF, which then nurtures brain neurons. When uncut, proBDNF is toxic, leading to “synaptic simplification,” or the shortening of axons. It does this by binding to a receptor, p75NTR, that contains a death domain.

"HIV interferes with that normal process of cleaving proBDNF, resulting in neurons primarily secreting a toxic form of BDNF," Mocchetti says. The same imbalance between mature BDNF and proBDNF occurs as we age, he says, although no one knows how that happens. "The link between depression and lack of mature BDNF is also known, as is the link to issues of learning and memory. That’s why I say HIV-associated dementia resembles the aging brain."

Loss of mature BDNF has also been suggested as a risk factor in chronic conditions such as Parkinson’s and Huntington’s diseases, Mocchetti says.

The findings suggest a possible therapeutic intervention, he adds. “One way would be to use a small molecule to block the p75NTR receptor that proBDNF uses to kill neurons. A small molecule like that could get through the blood-brain barrier.

"If this works in HIV-dementia, it may also work in other brain issues caused by proBDNF, such as aging," Mocchetti adds.

The finding also suggests that measuring proBDNF in HIV-positive patients may provide a biomarker of risk for development of dementia, he adds.

"This finding is extremely important for both basic scientists and physicians, because it suggests a new avenue to understand, and treat, a fairly widespread cause of dementia," Mocchetti says.

Source: Science Daily

Filed under science neuroscience brain psychology HIV dementia


Blood-brain barrier less permeable in newborns than adults after acute stroke

July 10, 2012

The ability of substances to pass through the blood-brain barrier is increased after stroke in adults, but not after neonatal stroke, according to a new study from UCSF that will be published July 11 in the Journal of Neuroscience.

The blood-brain barrier is selectively permeable and blocks unwanted molecules from entering the brain. This selectivity is achieved through the fine coordination of many transport systems in endothelial cells, which line the interior of blood vessels, and through communication between endothelial cells and several types of cells in the brain. When blood flow in an artery to the brain is blocked by a blood clot, as occurs in arterial stroke, brain energy metabolism is compromised, and ion and other transport systems malfunction, leading to blood-brain barrier disruption.

The new finding suggests, the researchers said, that drugs used to treat stroke need to be tailored to the specific makeup of the neonate blood-brain barrier.

"How the blood-brain barrier responds to stroke in adults and neonates currently is poorly understood,” said senior author Zinaida Vexler, PhD, director of research at the Neonatal Brain Disorders Center at the Department of Neurology at UCSF.

"The assumption has been that at birth the blood-brain barrier is immature and thus permeable and that a neonatal brain responds in the same way to injury as an adult brain. This would mean that, after a stroke, the blood-brain barrier is an open gate and different molecules could go in and out, like a floodgate,” she said. “But in neonatal stroke the situation is very different, and this study shows that the neonatal brain has the ability to protect itself by limiting blood-brain barrier permeability.”

In the study, the scientists examined the structural and functional aspects of the blood-brain barrier in live rats that had acute stroke, and found that the blood-brain barrier was markedly more intact in neonatal rats than in adult rats.

The study compared vascular responses to injury in an adult arterial stroke model and an age-appropriate model of neonatal arterial stroke using several blood-brain barrier permeability procedures. Injected molecules that remained in blood vessels under normal conditions leaked into the injured tissue of the adult rats, but the same molecules remained in vessels of neonatal injured rats within 24 hours after injury.

Importantly, the vessels remained intact for molecules of various sizes. The study also showed a different composition of several barrier structural proteins in neonates versus adults, as well as a differential response to stroke at both ages, findings that are likely to contribute to the higher resistance of the neonatal blood-brain barrier after stroke. The study also showed age-related differences in communication between circulating white blood cells and the blood-brain barrier. Neutrophils — a subtype of leukocytes — stuck to injured vasculature and entered the adult brain shortly after stroke, releasing toxic molecules and reactive oxidants and producing damage. In contrast, only a few neutrophils were able to enter the injured neonatal brain. However, pharmacologically altering the communication between neutrophils and injured vessels in the neonate made injury worse.

"This study is a very critical step towards developing therapeutics, but these findings are the tip of the iceberg and a lot is still to be learned," said Vexler. "We’re moving to characterize the potential for neonatal repair. Some brain damage can’t be diagnosed early, but might show up later. Now we are experimenting with postponing certain treatments or tweaking some signaling mechanisms to see if we can enhance the capacity of the immature brain to repair itself."

Provided by University of California, San Francisco

Source: medicalxpress.com

Filed under science neuroscience brain psychology stroke


Deaf Brain Processes Touch Differently: Lacking Sound Input, the Primary Auditory Cortex ‘Feels’ Touch

ScienceDaily (July 10, 2012) — People who are born deaf process the sense of touch differently than people who are born with normal hearing, according to research funded by the National Institutes of Health. The finding reveals how the early loss of a sense — in this case hearing — affects brain development. It adds to a growing list of discoveries that confirm the impact of experiences and outside influences in molding the developing brain.

(Credit: © James Steidl / Fotolia)

The study is published in the July 11 online issue of The Journal of Neuroscience.

The researchers, Christina M. Karns, Ph.D., a postdoctoral research associate in the Brain Development Lab at the University of Oregon, Eugene, and her colleagues, show that deaf people use the auditory cortex to process touch stimuli and visual stimuli to a much greater degree than occurs in hearing people. The finding suggests that since the developing auditory cortex of profoundly deaf people is not exposed to sound stimuli, it adapts and takes on additional sensory processing tasks.

"This research shows how the brain is capable of rewiring in dramatic ways," said James F. Battey, Jr., M.D., Ph.D., director of the National Institute on Deafness and Other Communication Disorders (NIDCD). "This will be of great interest to other researchers who are studying multisensory processing in the brain."

Previous research, including studies performed by the lab director, Helen Neville, Ph.D., has shown that people who are born deaf are better at processing peripheral vision and motion. Deaf people may process vision using many different brain regions, especially auditory areas, including the primary auditory cortex. However, no one had tackled whether vision and touch together are processed differently in deaf people, primarily because, in experimental settings, it is more difficult to produce the kind of precise tactile stimuli needed to answer this question.

Dr. Karns and her colleagues developed a unique apparatus that could be worn like headphones while subjects were in a magnetic resonance imaging (MRI) scanner. Flexible tubing, connected to a compressor in another room, delivered soundless puffs of air above the right eyebrow and to the cheek below the right eye. Visual stimuli — brief pulses of light — were delivered through fiber optic cables mounted directly below the air-puff nozzle. Functional MRI was used to measure responses to the stimuli in Heschl’s gyrus, the site of the primary auditory cortex in the human brain’s temporal lobe, as well as in other brain areas.

The researchers took advantage of an already known perceptual illusion in hearing people known as the auditory induced double flash, in which a single flash of light paired with two or more brief auditory events is perceived as multiple flashes of light. In their experiment, the researchers used a double puff of air as a tactile stimulus to replace the auditory stimulus, but kept the single flash of light. Subjects were also exposed to tactile stimuli and light stimuli separately and time-periods without stimuli to establish a baseline for brain activity.

Hearing people exposed to two puffs of air and one flash of light claimed only to see a single flash. However, when exposed to the same mix of stimuli, the subjects who were deaf saw two flashes. Looking at the brain scans of those who saw the double flash, the scientists observed much greater activity in Heschl’s gyrus, although not all deaf brains responded to the same degree. The deaf individuals with the highest levels of activity in the primary auditory cortex in response to touch also had the strongest response to the illusion.

"We designed this study because we thought that touch and vision might have stronger interactions in the auditory cortices of deaf people," said Dr. Karns. "As it turns out, the primary auditory cortex in people who are profoundly deaf focuses on touch, even more than vision, in our experiment."

There are several ways the finding may help deaf people. For example, if touch and vision interact more in the deaf, touch could be used to help deaf students learn math or reading. The finding also has the potential to help clinicians improve the quality of hearing after cochlear implants, especially among congenitally deaf children who are implanted after the ages of 3 or 4. These children, who have lacked auditory input since birth, may struggle with comprehension and speech because their auditory cortex has taken on the processing of other senses, such as touch and vision. These changes may make it more challenging for the auditory cortex to recover auditory processing function after cochlear implantation. Being able to measure how much the auditory cortex has been taken over by other sensory processing could offer doctors insights into the kinds of intervention programs that would help the brain retrain and devote more capacity to auditory processing.

Source: Science Daily

Filed under science neuroscience brain psychology auditory cortex


Preclinical development shows promise to treat hearing loss with Usher syndrome III

July 10, 2012

A new study published in the July 11 issue of the Journal of Neuroscience details the development of the first mouse model engineered to carry the most common North American mutation in the gene that causes Usher syndrome III (Clarin-1). Further, the research team from Case Western Reserve University School of Medicine used this new model to understand why mutation of Clarin-1 leads to hearing loss.

Usher syndrome is an incurable genetic disease and the most common cause of the dual sensory deficits of deafness and blindness. It affects an estimated 50,000 Americans and many more worldwide. Clinically, it is subdivided into types I–III based on the degree of deafness and the presence of balance disorder, and each type is associated with distinct genes. While the progression of the disease differs by type, all patients ultimately arrive at the same consequence. The focus of this study is Usher type III. More than a dozen genetic mutations are associated with Usher III, with the N48K mutation in Clarin-1 being the most prevalent among Usher III patients in North America. Since the N48K mutation originated in Europe, the results of this study will be of significance to a subset of Usher III patients in Europe as well.

"With the prospect of designing and exploring therapies for Usher III patients with the N48K mutation, this is a significant preclinical finding," says Kumar Alagramam, PhD, associate professor of otolaryngology–head & neck surgery, genetics, and neurosciences and senior author of the manuscript. "This key understanding of how deafness occurs in Usher III is based on three years of collaborative work."

This new study reports on the first mouse model that mimicked the N48K mutation in Usher III patients. The genetically engineered mouse developed hearing loss similar to clinical presentations observed in Usher III patients with N48K mutation. This model allowed researchers to understand the pathophysiology in fine detail, as there is no non-invasive way to evaluate soft tissue pathology in the human inner ear.

The new study explains why the N48K mutation in Clarin-1 leads to hearing loss: mislocalization of the mutant protein in the mechanosensory hair cells of the inner ear. Using this new Usher III model, researchers can now explore prospective therapeutics to rescue mutant protein localization and hearing. If successful, this approach could serve as a model for treating Usher I and II cases associated with missense mutations.

In 2009, Alagramam et al. reported on the first mouse model of Usher III. That first model carried a gene knockout mutation; the most recent model carries a missense mutation, the first model of its kind for Usher III.

Provided by Case Western Reserve University

Source: medicalxpress.com

Filed under science neuroscience brain psychology hearing


Recovery from pediatric brain injury a lifelong process, experts say

July 9, 2012

In the last ten years, a new understanding of pediatric brain injury and recovery has emerged. Professionals now understand that recovery may be a lifelong process for the child’s entire circle of family, friends, and healthcare providers. The latest efforts to advance medical and rehabilitative services and to move children from acute care and rehabilitation to community reintegration are discussed by leading experts in a recently published special issue of NeuroRehabilitation.

"Recovery extends well beyond the technical period of rehabilitation," say guest editors and noted authorities Peter D. Patrick, PhD, MS, Associate Professor Emeritus of the University of Virginia School of Medicine in Charlottesville, and Ronald C. Savage, EdD, Chairman, North American Brain Injury Society and International Pediatric Brain Injury Society. "Children, adolescents, and families struggle to regain the momentum of their life so as to reduce problems, increase opportunity, and support increased participation in work, play, home, and relationships."

Neural plasticity introduces unknown challenges in the care of the recovering brain, and the issue addresses the most challenging and demanding medical conditions that children may confront following severe brain injury. However, children do most of their recovery at home, in school, and in the community, beyond medical surveillance. “Family-centered” approaches to developing interventions are emerging. For example, Dr. Damith T. Woods and colleagues report on a novel telephone support program to help parents manage challenging behavior associated with brain injury.

Children and adolescents with brain injuries have difficulty adjusting to their injuries and altered abilities, and frequently suffer from low self-esteem and loss of confidence. A study by Carol A. Hawley finds that children with traumatic brain injury have significantly lower self-esteem than normal children, and recommends that rehabilitation strategies promote a sense of self-worth.

Re-entry into school is a major milestone of recovery and the issue highlights a number of efforts to help children improve and return to a positive developmental trajectory. An article by Beth Wicks describes an innovative program in Britain that looks at “education as rehabilitation,” translating successful adult vocational programs into educational rehabilitation programs for children. Lucia Willadino Braga and colleagues report on a program based on cooperative learning that helped preadolescents with acquired brain injury develop metacognitive strategies and improve self-concept, thereby helping empower the preadolescents in their social relationships.

"Over the years and in multiple places around the world, innovative and creative efforts have slowly revealed effective interventions for recovery," comment Dr. Patrick and Dr. Savage. "Increasingly the interventions are evidence-based. This issue is a contribution to the effort to improve outcomes for children and families."

Provided by IOS Press

Source: medicalxpress.com

Filed under science neuroscience brain brain injury psychology


Pediatric Brain Tumors Traced to Brain Stem Cells

ScienceDaily (July 9, 2012) — Scientists showed in mice that disabling a gene linked to a common pediatric tumor disorder, neurofibromatosis type 1 (NF1), made stem cells from one part of the brain proliferate rapidly. But the same genetic deficit had no effect on stem cells from another brain region.

The results can be explained by differences in the way stem cells from these regions of the brain respond to cancer-causing genetic changes.

NF1 is among the world’s most common genetic disorders, occurring in about one of every 3,000 births. It causes a wide range of symptoms, including brain tumors, learning disabilities and attention deficits.

Brain tumors in children with NF1 typically arise in the optic nerve and do not necessarily require treatment. If optic gliomas keep growing, though, they can threaten the child’s vision. By learning more about the many factors that contribute to NF1 tumor formation, scientists hope to develop more effective treatments.

"To improve therapy, we need to develop better ways to identify and group tumors based not just on the way they look under the microscope, but also on innate properties of their stem cell progenitors," says David H. Gutmann, MD, PhD, the Donald O. Schnuck Family Professor of Neurology.

The study appears July 9 in Cancer Cell. Gutmann also is the director of the Washington University Neurofibromatosis Center.

In the new study, researchers compared brain stem cells from two primary sources: the third ventricle, located at the midline of the brain, and the nearby lateral ventricles. Before birth and for a time afterward, both of these areas of the brain are lined with growing stem cells.

First author Da Yong Lee, PhD, a postdoctoral research associate, showed that the cells lining both ventricles are true stem cells capable of becoming nerve and support cells (glia) in the brain. Next, she conducted a detailed analysis of gene expression in both stem cell types.

"There are night-and-day differences between these two groups of stem cells," Gutmann says. "These results show that stem cells are not the same everywhere in the brain, which has real consequences for human neurologic disease."

The third ventricle is close to the optic chiasm, the point where the optic nerves cross and optic gliomas develop in NF1 patients. Lee and Gutmann postulated that stem cells from this ventricle might be the source of progenitor cells that can become gliomas in children with NF1.

To test the theory, they disabled the Nf1 gene in neural stem cells from the third and lateral ventricles in the mice. This same gene is mutated in patients with NF1, increasing their risk of developing tumors.

Lee found that loss of Nf1 activity had little effect on stem cells from the lateral ventricle, but stem cells from the third ventricle began to divide rapidly, a change that puts them closer to becoming tumors.

The third ventricle usually stops supplying stem cells to the brain shortly after birth. When researchers inactivated the Nf1 gene before the third ventricle closed, the mice developed optic gliomas. When they waited until the third ventricle had closed to inactivate the Nf1 gene, gliomas did not develop.

Gutmann plans further studies to determine whether all NF1-related optic gliomas form in cells descended from the third ventricle. He suspects that additional factors are necessary for optic gliomas to form in cooperation with Nf1 gene loss in third-ventricle stem cells.

"We have to recognize that cancers which appear very similar actually represent a collection of quite different diseases," he says. "Tumors are like us — they’re defined by where they live, what their families are like, the traumas they experience growing up, and a variety of other factors. If we can better understand the interplay of these factors, we’ll be able to develop treatments that are much more likely to succeed, because they’ll target what is unique about a specific patient’s tumor."

Source: Science Daily

Filed under science neuroscience brain psychology tumor


Better Treatment for Brain Cancer Revealed by New Molecular Insights

ScienceDaily (July 9, 2012) — Nearly a third of adults with the most common type of brain cancer develop recurrent, invasive tumors after being treated with a drug called bevacizumab. The molecular underpinnings behind these detrimental effects have now been published by Cell Press in the July issue of Cancer Cell. The findings reveal a new treatment strategy that could reduce tumor invasiveness and improve survival in these drug-resistant patients.

"Understanding how and why these tumors adopt this invasive behavior is critical to preventing this recurrence pattern and maximizing the benefits of bevacizumab," says study author Kan Lu of the University of California, San Francisco (UCSF).

Glioblastoma multiforme (GBM) is the most aggressive type of tumor originating in the brain. GBM tumors express high levels of vascular endothelial growth factor (VEGF), a protein that promotes the growth of new blood vessels that provide nutrients that allow tumors to expand. In 2009, the Food and Drug Administration approved bevacizumab, a VEGF inhibitor, for GBM patients who don’t respond to first-line therapies. Although the drug is initially effective, up to 30% of patients develop tumors that infiltrate deep into the brain, making surgery and treatment difficult.

To study how bevacizumab can lead to adverse effects, senior study author Gabriele Bergers of UCSF and her collaborators focused on hepatocyte growth factor (HGF), a protein that controls the growth and movement of cells, because they previously found a link between VEGF and HGF in GBM cells. In the new study, they found that VEGF inhibits the migration of GBM cells by decreasing HGF signaling through its receptor MET. Moreover, tumors were much less invasive — and survival improved — in mice with GBM tumors lacking both VEGF and MET rather than just VEGF alone. The results suggest that MET plays a critical role in GBM invasion when VEGF is blocked.

"These findings provide a rationale for therapeutically combining VEGF and MET inhibition so that patients can benefit from bevacizumab without developing more invasive tumors," Lu says. Because the VEGF and HGF/MET signaling pathways are active in a variety of tumors, this combined treatment strategy may also be applied to other types of cancer.

Source: Science Daily

Filed under science neuroscience psychology brain


Hormone Curbs Depressive-Like Symptoms in Stressed Mice

ScienceDaily (July 9, 2012) — A hormone with anti-diabetic properties also reduces depression-like symptoms in mice, researchers from the School of Medicine at the UT Health Science Center San Antonio reported July 9.

All types of current antidepressants, including tricyclics and selective serotonin reuptake inhibitors, increase the risk for type 2 diabetes. “The finding offers a novel target for treating depression, and would be especially beneficial for those depressed individuals who have type 2 diabetes or who are at high risk for developing it,” said the study’s senior author, Xin-Yun Lu, Ph.D., associate professor of pharmacology and psychiatry and member of the Barshop Institute for Longevity and Aging Studies at the UT Health Science Center.

The hormone, called adiponectin, is secreted by adipose tissue and sensitizes the body to the action of insulin, a hormone that lowers blood sugar. “We showed that adiponectin levels in plasma are reduced in a chronic social defeat stress model of depression, which correlates with the degree of social aversion,” Dr. Lu said.

Facing Goliath over and over

In the study, mice were exposed to 14 days of repeated social defeat stress. Each male mouse was introduced into the home cage of an unfamiliar, aggressive resident mouse for 10 minutes and physically defeated. After the defeat, the resident mouse and the intruder each were housed in half of the cage, separated by a perforated plastic divider that allowed visual, olfactory and auditory contact for the remainder of the 24-hour period. Each day, mice were moved to the cage of a new resident mouse and subjected to social defeat. Plasma adiponectin concentrations were determined after the last social defeat session. Defeated mice displayed lower plasma adiponectin levels.

Withdrawal, lost pleasure and helplessness

When adiponectin concentrations were reduced by deleting one allele of the adiponectin gene or by a neutralizing antibody, mice were more susceptible to stress-induced social withdrawal, anhedonia (lost capacity to experience pleasure) and learned helplessness.

Mice that were fed a high-fat diet (60 percent calories from fat) for 16 weeks developed obesity and type 2 diabetes. Administration of adiponectin to these mice and mice of normal weight produced antidepressant-like effects.

Possible innovative approach for depression

"These findings suggest a critical role of adiponectin in the development of depressive-like behaviors and may lead to an innovative therapeutic approach to fight depression," Dr. Lu said.

A novel approach would benefit thousands. “So far, only about half of the patients suffering from major depressive disorders are treated to the point of remission with antidepressant drugs,” Dr. Lu said. “The prevalence of depression in the diabetic population is two to three times higher than in the non-diabetic population. Unfortunately, the use of current antidepressants can worsen glycemic control in diabetic patients. Adiponectin, with its anti-diabetic activity, would serve as an innovative therapeutic target for depression treatments, especially for those individuals with diabetes or prediabetes and perhaps those who fail to respond to currently available antidepressants.”

Source: Science Daily

Filed under science neuroscience brain psychology depression

9 notes

Long-Term Hormone Treatment Increases Synapses in Female Rats’ Prefrontal Cortex

ScienceDaily (July 9, 2012) — A new study of aged female rats found that long-term treatment with estrogen and a synthetic progesterone known as MPA increased levels of a protein marker of synapses in the prefrontal cortex, a brain region known to suffer significant losses in aging.

The new findings appear to contradict the results of the Women’s Health Initiative, a long-term study begun in 1991 to analyze the effects of hormone therapy on a large sample of healthy postmenopausal women aged 50 to 79. Among other negative findings, the WHI found that long-term exposure to estrogen alone or to estrogen and MPA resulted in an increased risk of stroke and dementia. More recent research, however, suggests that starting hormone replacement therapy at the onset of menopause, rather than years or decades afterward, yields different results.

The new study, from researchers at the University of Illinois, is the first to look at the effects of long-term treatment with estrogen and MPA on the number of synapses in the prefrontal cortex of aged animals. The researchers describe their findings in a paper in the journal Menopause.

"The prefrontal cortex is the area of the human brain that loses the most volume with age," said U. of I. psychology professor and Beckman Institute affiliate Janice Juraska, who led the study with doctoral student Nioka Chisholm. "So understanding how anything affects the prefrontal cortex is important."

The prefrontal cortex, just behind the forehead in humans, governs what researchers call “executive function” — planning, strategic thinking, working memory (the ability to hold information in mind just long enough to use it), self-control and other functions that tend to decline with age.

Most studies of the effects of hormone treatments on the brain have focused on the hippocampus, a structure important to spatial navigation and memory consolidation. The studies tend to use young animals exposed to hormones for very brief periods of time (one or two days to a few weeks at the most). They have yielded mixed results, with most research in young female animals indicating an increase in hippocampal synapses and hippocampal function after exposure to estrogen and MPA.

"For some reason, a lot of researchers still look at the effects of hormones in young animals," Chisholm said. "And there’s a lot of evidence now saying that the aged brain is different; the effect of these hormones is not going to be the same."

The new study followed middle-aged rats exposed for seven months to estrogen alone, to estrogen in combination with MPA, or to no additional hormones, a span that more closely corresponds to the experience of women who start hormone therapy at the onset of menopause and continue into old age. The researchers removed the rats’ ovaries just prior to the hormone treatment (or lack of treatment) to mimic the changes that occur in humans during menopause.

"Our most important finding is that estrogen in combination with MPA can result in a greater number of synapses in the prefrontal cortex than (that seen) in animals that are not receiving hormone replacement," Chisholm said. "Estrogen alone marginally increased the synapses, but it took the combination with MPA to actually see the significant effect."
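The pattern Chisholm describes, a marginal increase with estrogen alone and a significant one only when MPA is added, amounts to a three-group comparison against untreated animals. A sketch with invented counts of a synaptic-marker measure (the group names and all numbers below are hypothetical, chosen only to mirror the reported direction of the effects):

```python
# Hypothetical illustration of the three-group design: counts of a
# synaptic protein marker per sampling frame in the prefrontal cortex.
# Values are invented; the study reports only the direction of effects.

from statistics import mean

counts = {
    "no_hormones":    [102, 98, 105, 99, 101, 97],
    "estrogen_alone": [106, 103, 108, 101, 105, 104],
    "estrogen_mpa":   [115, 112, 118, 110, 116, 113],
}

control = mean(counts["no_hormones"])
for group, values in counts.items():
    pct = 100.0 * (mean(values) - control) / control
    print(f"{group:>14}: mean={mean(values):.1f}  ({pct:+.1f}% vs. control)")
```

With numbers shaped like these, the estrogen-alone group sits only slightly above control while the estrogen-plus-MPA group shows the clearly larger increase, matching the finding that the combination was needed for a significant effect.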

"Our data indicate that re-examining the effects of estrogen and MPA, when first given to women around the time of menopause, is merited," Juraska said.

Source: Science Daily

Filed under science neuroscience brain psychology
