Neuroscience


July 2012

Alzheimer's plaques in PET brain scans identify future cognitive decline

July 11, 2012

Among patients with mild or no cognitive impairment, brain scans using a new radioactive dye can detect early evidence of Alzheimer’s disease that may predict future decline, according to a multi-center study led by researchers at Duke University Medical Center.


PET images using florbetapir dye to highlight beta-amyloid plaques show (A) a cognitively normal subject; (B) an amyloid-positive patient with Alzheimer’s disease; (C) a patient with mild cognitive impairment; and (D) a patient with mild cognitive impairment who progressed to dementia during the study. Credit: Slide courtesy of the journal Neurology.

The finding is published online July 11, 2012, in the journal Neurology, the medical journal of the American Academy of Neurology. It expands on smaller studies demonstrating that early detection of tell-tale plaques could be a predictive tool to help guide care and treatment decisions for patients with Alzheimer’s disease.

"Even at a short follow-up of 18 months, we can see how the presence of amyloid plaques affects cognitive function," said P. Murali Doraiswamy, M.D., professor of psychiatry at Duke, who co-led the study with R. Edward Coleman, M.D., professor of radiology at Duke. "Most people who come to the doctor with mild impairment really want to know the short-term prognosis and potential long-term effect."

Doraiswamy said such knowledge also has some pitfalls. There is no cure for Alzheimer’s disease, which afflicts 5.4 million people in the United States and is the sixth-leading cause of death among U.S. adults. But he said numerous drugs are being investigated, and identifying earlier disease would improve research into their potential benefits and speed new discoveries, while also enhancing care and treatment of current patients.

In the Neurology study, 151 people who had enrolled in a multi-center test of a new radioactive dye called florbetapir (Amyvid) were recruited to participate in a 36-month analysis. Of those participants, 69 had normal cognitive function at the start of the study, 51 had been diagnosed with mild impairment, and 31 had Alzheimer’s dementia.

All completed cognitive tests and underwent a brain scan using Positron Emission Tomography, or PET imaging. The technology uses radioactive tracers designed to highlight specific tissue to create a three-dimensional picture of an organ or a biological function.

The dye used in the study, florbetapir, was recently approved by the U.S. Food and Drug Administration for PET imaging of the brain to estimate beta-amyloid plaque density in patients who are being evaluated for cognitive impairment. It binds to the amyloid plaques that characterize Alzheimer’s disease, providing a window into the brain to see if the plaques have formed, and how extensively.

Patients in the study were reassessed with additional cognitive exams at 18 months and 36 months. At the 18-month point, patients with mild cognitive impairment who had PET evidence of plaque at the trial’s start worsened to a greater degree on cognitive tests than patients who had no evidence of plaque at the trial’s start. Twenty-nine percent of the plaque-positive patients in this group developed Alzheimer’s dementia, compared with 10 percent of those who started with no plaque.

Cognitively normal patients with a plaque-positive PET scan at the start of the study also showed more mental decline at 18 months compared to those who were negative for plaque.

The study additionally found that people with negative scans reverted from minimally impaired to normal more often than people with positive PET scans, suggesting that test anxiety or concentration problems could have affected their initial performance.

"For the most part we have been blind about who would progress and who wouldn’t, so this approach is a step toward having a biomarker that predicts risk of decline in people who are experiencing cognitive impairment," Doraiswamy said.

He said the study’s results provide initial data that need to be verified by additional research. The final 36-month data from the study have been collected and will be presented at the Alzheimer’s Association International Conference this week in Vancouver, Canada. Doraiswamy also cautioned that florbetapir is not currently approved to predict the development of dementia or other neurologic conditions, and stressed that it should not be used as a screening tool in otherwise normal or minimally impaired people. Likewise, a positive scan is not by itself diagnostic of Alzheimer’s.

Provided by Duke University Medical Center

Source: medicalxpress.com

Jul 12, 2012 · 18 notes
#science #neuroscience #brain #psychology #alzheimer #neuroimaging
Individual differences in altruism explained by brain region involved in empathy

July 11, 2012

What can explain extreme differences in altruism among individuals, from Ebenezer Scrooge to Mother Teresa? It may all come down to variation in the size and activity of a brain region involved in appreciating others’ perspectives, according to a study published in the July 12th issue of the journal Neuron. The findings also provide a neural explanation for why altruistic tendencies remain stable over time.


The junction (yellow) between the parietal and the temporal lobes, in which the relative proportion of gray matter is significantly positively correlated with the propensity for altruistic behavior. Credit: University of Zurich

"This is the first study to link both brain anatomy and brain activation to human altruism,” says senior study author Ernst Fehr of the University of Zurich. “The findings suggest that the development of altruism through appropriate training or social practices might occur through changes in the brain structure and the neural activations that we identified in our study.”

Individuals who excel at understanding others’ intents and beliefs are more altruistic than those who struggle at this task. The ability to understand others’ perspectives has previously been associated with activity in a brain region known as the temporoparietal junction (TPJ). Based on these past findings, Fehr and his team reasoned that the size and activation of the TPJ would relate to individual differences in altruism.

In the new study, subjects underwent a brain imaging scan and played a game in which they had to decide how to split money between themselves and anonymous partners. Subjects who made more generous decisions had a larger TPJ in the right hemisphere of the brain compared with subjects who made stingy decisions.

Moreover, activity in the TPJ reflected each subject’s specific cutoff value for the maximal cost the subject was willing to endure to increase the partner’s payoff. Activity in the TPJ was higher during hard decisions—when the personal cost of an altruistic act was just below the cutoff value—than during easy decisions associated with a very low or very high cost.

"The structure of the TPJ strongly predicts an individual’s setpoint for altruistic behavior, while activity in this brain region predicts an individual’s acceptable cost for altruistic actions," says study author Yosuke Morishima of the University of Zurich. "We have elucidated the relationship between the hardware and software of human altruistic behavior."

Provided by Cell Press

Source: medicalxpress.com

Jul 12, 2012 · 42 notes
#science #neuroscience #brain #psychology #empathy #emotion
H1N1 Vaccine Associated With Small but Significant Risk of Guillain-Barre Syndrome

July 10th, 2012

Guillain-Barre syndrome (GBS) is usually characterized by rapidly developing motor weakness and areflexia (the absence of reflexes). "The disease is thought to be autoimmune and triggered by a stimulus of external origin.

"In 1976-1977, an unusually high rate of GBS was identified in the United States following the administration of inactivated 'swine' influenza A(H1N1) vaccines. In 2003, the Institute of Medicine (IOM) concluded that the evidence favored acceptance of a causal relationship between the 1976 swine influenza vaccines and GBS in adults.

"Studies of seasonal influenza vaccines administered in subsequent years have found small or no increased risk," according to background information in the article. "In a more recent assessment of epidemiologic studies on seasonal influenza vaccines, experimental studies in animals, and case reports in humans, the IOM Committee to Review Adverse Effects of Vaccines concluded that the evidence was inadequate to accept or reject a causal relationship."


Analysis of recent H1N1 vaccination data indicated a small but significant risk of GBS following influenza A(H1N1) vaccinations.

Philippe De Wals, M.D., Ph.D., of Laval University in Quebec City, Canada, and colleagues conducted a study to assess the risk of GBS following pandemic influenza vaccine administration. In fall 2009, Quebec launched an immunization campaign against the 2009 influenza A(H1N1) pandemic strain; by the end of the year, 4.4 million residents had been vaccinated. The study followed suspected and confirmed GBS cases over the 6-month period from October 2009 through March 2010, as reported by physicians (mostly neurologists) during active surveillance or identified in the provincial hospital summary discharge database. Immunization status was verified.

Over the 6-month period, 83 confirmed GBS cases were identified. Twenty-five confirmed cases had been vaccinated against 2009 influenza A(H1N1) 8 or fewer weeks before disease onset, with most (19/25) vaccinated 4 or fewer weeks before onset. Analysis of data indicated a small but significant risk of GBS following influenza A(H1N1) vaccination. The number of cases attributable to vaccination was approximately 2 per 1 million doses. The excess risk was observed only in persons 50 years of age or older.

“In Quebec, the individual risk of hospitalization following a documented influenza A(H1N1) infection was 1 per 2,500 and the risk of death was 1/73,000. The H1N1 vaccine was very effective in preventing infections and complications. It is likely that the benefits of immunization outweigh the risks,” the authors write.
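The authors’ risk-benefit comparison can be sketched numerically. This is only a back-of-envelope illustration using the figures quoted above, normalized to a per-million scale for side-by-side reading; note that the rates have different denominators (vaccine doses versus documented infections), so they are not directly comparable.

```python
# Figures as quoted in the article; per-million normalization is just a
# unit conversion to put the rates on a common scale for reading.
PER_MILLION = 1_000_000

# Vaccine-attributable GBS: approximately 2 cases per 1 million doses
gbs_per_million_doses = 2

# Risks following a documented influenza A(H1N1) infection (Quebec figures)
hospitalization_per_million_infections = PER_MILLION / 2_500   # 1 in 2,500
death_per_million_infections = PER_MILLION / 73_000            # 1 in 73,000

print(f"GBS attributable to vaccination: {gbs_per_million_doses} per million doses")
print(f"Hospitalization after infection: {hospitalization_per_million_infections:.0f} per million")
print(f"Death after infection:           {death_per_million_infections:.1f} per million")

# Hospitalization risk per infection is ~200x the attributable GBS risk per dose
ratio = hospitalization_per_million_infections / gbs_per_million_doses
```

On these numbers, hospitalization following infection (400 per million) dwarfs the vaccine-attributable GBS rate (2 per million doses), which is the arithmetic behind the authors’ conclusion that the benefits of immunization likely outweigh the risks.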

Source: Neuroscience News

Jul 11, 2012 · 13 notes
#science #neuroscience #disease #guillain-barre #H1N1
Potential Cause of HIV-Associated Dementia Revealed

ScienceDaily (July 10, 2012) — Researchers at Georgetown University Medical Center appear to have solved the mystery of why some patients infected with HIV, who are using antiretroviral therapy and show no signs of AIDS, develop serious depression as well as profound problems with memory, learning, and motor function. The finding might also provide a way to test people with HIV to determine their risk for developing dementia.

They say the answer, published in the July 11 issue of the Journal of Neuroscience, may ultimately lead to a therapeutic solution that helps these patients as well as others suffering from brain ailments that appear to develop through the same pathway, including those that occur in the aged.

"We believe we have discovered a general mechanism of neuronal decline that even explains what happens in some elderly folks," says the study’s lead investigator, Italo Mocchetti, Ph.D., professor and vice chair of the department of neuroscience at Georgetown University Medical Center. "The HIV-infected patients who develop this syndrome are usually quite young, but their brains act old."

The research team found that even though HIV does not infect neurons, it tries to stop the brain from producing a protein growth factor — mature brain derived neurotrophic factor (mature BDNF) — that Mocchetti says acts like “food” for brain neurons. Reduced mature BDNF results in the shortening of the axons and their branches that neurons use to connect to each other, and when they lose this communication, the neurons die.

"The loss of neurons and their connections is profound in these patients," Mocchetti says. HIV-associated dementia occurs in two to three percent of HIV-infected patients using antiretroviral therapies, all of whom appear to be otherwise healthy, and in 30 percent of HIV-positive patients who are not on medication.

Mocchetti believes that HIV stops production of mature BDNF because that protein interferes with the ability of the virus to attack other brain cells. It does this through the potent gp120 envelope protein that sticks out from the viral shell — the same protein that hooks on to brain macrophages and microglial cells to infect them. “In earlier experiments, when we dumped gp120 into neuronal tissue culture, there was a 30-40 percent loss of neurons overnight. That makes gp120 a remarkable neurotoxin.”

This study is the product of years of work that has resulted in a string of publications. It began when Mocchetti and his colleagues were given a grant from the National Institute on Drug Abuse to determine whether there was a connection between the use of cocaine and morphine, and dementia. (A substantial number of HIV-positive patients have been or currently are intravenous drug users.)

They found that it was the virus that was responsible for the dementia, not the drugs, and so they set out to discover how the virus was altering neuronal function.

Their scientific break came when the researchers were able to study the blood of 130 women who were enrolled in the 17-year-old, nationwide WIHS (Women’s Interagency HIV Study, directed at Georgetown by Mary Young, M.D.), which has focused on the effects of HIV in infected females. In one seminal discovery, Mocchetti and colleagues found that when there was less BDNF in the blood, patients were at risk of developing brain abnormalities. This finding was published in the May 15, 2011, issue of AIDS.

In this study, Mocchetti, Alessia Bachis, Ph.D., and their colleagues studied the brains of HIV-positive patients who had died and who had developed HIV-associated dementia. They found that neurons had shrunk and that mature BDNF had substantially decreased.

He and his colleagues then worked out the mechanism responsible for this destruction of neurons.

Normally, neurons release a long form of BDNF known as proBDNF, and then certain enzymes, including one called furin, cleave proBDNF to produce mature BDNF, which then nurtures brain neurons. When uncut, proBDNF is toxic, leading to “synaptic simplification,” or the shortening of axons. It does this by binding to a receptor, p75NTR, that contains a death domain.

"HIV interferes with that normal process of cleaving proBDNF, resulting in neurons primarily secreting a toxic form of BDNF," Mocchetti says. The same imbalance between mature BDNF and proBDNF occurs as we age, he says, although no one knows how that happens. "The link between depression and lack of mature BDNF is also known, as is the link to issues of learning and memory. That’s why I say HIV-associated dementia resembles the aging brain."

Loss of mature BDNF has also been suggested to be a risk factor in chronic diseases such as Parkinson’s and Huntington’s diseases, Mocchetti says.

The findings suggest a possible therapeutic intervention, he adds. “One way would be to use a small molecule to block the p75NTR receptor that proBDNF uses to kill neurons. A small molecule like that could get through the blood-brain barrier.

"If this works in HIV-dementia, it may also work in other brain issues caused by proBDNF, such as aging," Mocchetti adds.

The finding also suggests that measuring proBDNF in HIV-positive patients may provide a biomarker of risk for development of dementia, he adds.

"This finding is extremely important for both basic scientists and physicians, because it suggests a new avenue to understand, and treat, a fairly widespread cause of dementia," Mocchetti says.

Source: Science Daily

Jul 11, 2012 · 21 notes
#science #neuroscience #brain #psychology #HIV #dementia
Blood-brain barrier less permeable in newborns than adults after acute stroke

July 10, 2012

The ability of substances to pass through the blood-brain barrier is increased after adult stroke, but not after neonatal stroke, according to a new study from UCSF that will be published July 11 in the Journal of Neuroscience.

The blood-brain barrier is selectively permeable and blocks unwanted molecules from entering the brain. This selectivity is achieved through fine coordination of many transport systems in endothelial cells, which line the interior of blood vessels, and through communication between endothelial cells and several types of cells in the brain. When blood flow in an artery to the brain is blocked by a blood clot, as occurs in arterial stroke, brain energy metabolism is compromised, and ion and other transport systems malfunction, leading to blood-brain barrier disruption.

The new finding suggests, the researchers said, that drugs used to treat stroke need to be tailored to the specific makeup of the neonate blood-brain barrier.

"How the blood-brain barrier responds to stroke in adults and neonates currently is poorly understood,” said senior author Zinaida Vexler, PhD, director of research at the Neonatal Brain Disorders Center at the Department of Neurology at UCSF.

"The assumption has been that at birth the blood-brain barrier is immature and thus permeable and that a neonatal brain responds in the same way to injury as an adult brain. This would mean that, after a stroke, the blood-brain barrier is an open gate and different molecules could go in and out, like a floodgate,” she said. “But in neonatal stroke the situation is very different, and this study shows that the neonatal brain has the ability to protect itself by limiting blood-brain barrier permeability.”

In the study, the scientists examined the structural and functional aspects of the blood-brain barrier in live rats that had acute stroke, and found that the blood-brain barrier was markedly more intact in neonatal rats than in adult rats.

The study compared vascular responses to injury in an adult arterial stroke model and an age-appropriate model of neonatal arterial stroke using several blood-brain barrier permeability procedures. Injected molecules that remained in blood vessels under normal conditions leaked into the injured tissue of the adult rats, but the same molecules remained in vessels of neonatal injured rats within 24 hours after injury.

Importantly, the vessels remained intact for molecules of various sizes. The study also showed that several barrier structural proteins differ in composition between neonates and adults, and that they respond differently to stroke at the two ages, findings that likely contribute to the higher resistance of the neonatal blood-brain barrier after stroke.

The study also showed age-related differences in communication between circulating white blood cells and the blood-brain barrier. Neutrophils, a subtype of leukocytes, stuck to injured vasculature and entered the adult brain shortly after stroke, releasing toxic molecules and reactive oxidants and producing damage. In contrast, only a few neutrophils were able to enter the injured neonatal brain. However, pharmacologically altering the communication between neutrophils and injured vessels in the neonate made injury worse.

"This study is a very critical step towards developing therapeutics, but these findings are a tip of the iceberg and a lot is still to be learned," said Vexler. "We’re moving to characterize the potential for neonatal repair. Some brain damage can’t be diagnosed early, but might show up later. Now we are experimenting with postponing certain treatments or tweaking some signaling mechanisms to see if we can enhance the capacity of the immature brain to repair itself."

Provided by University of California, San Francisco

Source: medicalxpress.com

Jul 11, 2012 · 16 notes
#science #neuroscience #brain #psychology #stroke
Deaf Brain Processes Touch Differently: Lacking Sound Input, the Primary Auditory Cortex 'Feels' Touch

ScienceDaily (July 10, 2012) — People who are born deaf process the sense of touch differently than people who are born with normal hearing, according to research funded by the National Institutes of Health. The finding reveals how the early loss of a sense — in this case hearing — affects brain development. It adds to a growing list of discoveries that confirm the impact of experiences and outside influences in molding the developing brain.

(Image credit: © James Steidl / Fotolia)

The study is published in the July 11 online issue of The Journal of Neuroscience.

The researchers, Christina M. Karns, Ph.D., a postdoctoral research associate in the Brain Development Lab at the University of Oregon, Eugene, and her colleagues, show that deaf people use the auditory cortex to process touch stimuli and visual stimuli to a much greater degree than occurs in hearing people. The finding suggests that since the developing auditory cortex of profoundly deaf people is not exposed to sound stimuli, it adapts and takes on additional sensory processing tasks.

"This research shows how the brain is capable of rewiring in dramatic ways," said James F. Battey, Jr., M.D., Ph.D., director of the NIDCD. "This will be of great interest to other researchers who are studying multisensory processing in the brain."

Previous research, including studies performed by the lab director, Helen Neville, Ph.D., has shown that people who are born deaf are better at processing peripheral vision and motion. Deaf people may process vision using many different brain regions, especially auditory areas, including the primary auditory cortex. However, no one had tackled whether vision and touch together are processed differently in deaf people, primarily because in experimental settings it is more difficult to produce the kind of precise tactile stimuli needed to answer this question.

Dr. Karns and her colleagues developed a unique apparatus that could be worn like headphones while subjects were in a magnetic resonance imaging (MRI) scanner. Flexible tubing, connected to a compressor in another room, delivered soundless puffs of air above the right eyebrow and to the cheek below the right eye. Visual stimuli — brief pulses of light — were delivered through fiber optic cables mounted directly below the air-puff nozzle. Functional MRI was used to measure reactions to the stimuli in Heschl’s gyrus, the site of the primary auditory cortex in the human brain’s temporal lobe as well as other brain areas.

The researchers took advantage of an already known perceptual illusion in hearing people known as the auditory induced double flash, in which a single flash of light paired with two or more brief auditory events is perceived as multiple flashes of light. In their experiment, the researchers used a double puff of air as a tactile stimulus to replace the auditory stimulus, but kept the single flash of light. Subjects were also exposed to tactile stimuli and light stimuli separately and time-periods without stimuli to establish a baseline for brain activity.

Hearing people exposed to two puffs of air and one flash of light claimed only to see a single flash. However, when exposed to the same mix of stimuli, the subjects who were deaf saw two flashes. Looking at the brain scans of those who saw the double flash, the scientists observed much greater activity in Heschl’s gyrus, although not all deaf brains responded to the same degree. The deaf individuals with the highest levels of activity in the primary auditory cortex in response to touch also had the strongest response to the illusion.

"We designed this study because we thought that touch and vision might have stronger interactions in the auditory cortices of deaf people," said Dr. Karns. "As it turns out, the primary auditory cortex in people who are profoundly deaf focuses on touch, even more than vision, in our experiment."

There are several ways the finding may help deaf people. For example, if touch and vision interact more in the deaf, touch could be used to help deaf students learn math or reading. The finding also has the potential to help clinicians improve the quality of hearing after cochlear implants, especially among congenitally deaf children who are implanted after the ages of 3 or 4. These children, who have lacked auditory input since birth, may struggle with comprehension and speech because their auditory cortex has taken on the processing of other senses, such as touch and vision. These changes may make it more challenging for the auditory cortex to recover auditory processing function after cochlear implantation. Being able to measure how much the auditory cortex has been taken over by other sensory processing could offer doctors insights into the kinds of intervention programs that would help the brain retrain and devote more capacity to auditory processing.

Source: Science Daily

Jul 11, 2012 · 35 notes
#science #neuroscience #brain #psychology #auditory cortex
Preclinical development shows promise to treat hearing loss with Usher syndrome III

July 10, 2012

A new study published in the July 11 issue of the Journal of Neuroscience details the development of the first mouse model engineered to carry the most common North American mutation in the gene that causes Usher syndrome III (Clarin-1). The research team from Case Western Reserve University School of Medicine then used this new model to understand why mutation in Clarin-1 leads to hearing loss.

Usher syndrome is an incurable genetic disease and the most common cause of the dual sensory deficits of deafness and blindness. It affects an estimated 50,000 Americans and many more worldwide. Clinically, it is subdivided into types I-III based on the degree of deafness and the presence of balance disorder, and each type is associated with distinct genes. While the progression of the disease differs by type, all patients ultimately arrive at the same consequence. The focus of this study is Usher type III. More than a dozen genetic mutations are associated with Usher III, with the 'N48K' mutation in Clarin-1 being the most prevalent among Usher III patients in North America. Since the N48K mutation originated in Europe, the results of this study will be of significance to a subset of Usher III patients in Europe as well.

"With the prospect of designing and exploring therapies for Usher III patients with the N48K mutation, this is a significant preclinical finding," says Kumar Alagramam, PhD, associate professor of otolaryngology head & neck surgery, genetics, and neurosciences and senior author of the manuscript. "This key understanding of how deafness occurs in Usher III is based on three years of collaborative work."

This new study reports on the first mouse model that mimicked the N48K mutation in Usher III patients. The genetically engineered mouse developed hearing loss similar to clinical presentations observed in Usher III patients with N48K mutation. This model allowed researchers to understand the pathophysiology in fine detail, as there is no non-invasive way to evaluate soft tissue pathology in the human inner ear.

The new study explains why the N48K mutation in Clarin-1 leads to hearing loss: mislocalization of the mutant protein in the mechanosensory hair cells of the inner ear. Using this new Usher III model, researchers can now explore prospective therapeutics to rescue mutant protein localization and hearing. If successful, this approach could serve as a model for treating Usher I and II cases associated with missense mutations.

In 2009, Alagramam et al. reported on the first mouse model of Usher III. That first model was a gene knockout; the most recent model carries a missense mutation, the first model of its kind for Usher III.

Provided by Case Western Reserve University

Source: medicalxpress.com

Jul 11, 2012 · 6 notes
#science #neuroscience #brain #psychology #hearing
Recovery from pediatric brain injury a lifelong process, experts say

July 9, 2012

In the last ten years, a new understanding of pediatric brain injury and recovery has emerged. Professionals now understand that recovery may be a lifelong process for the child and the child’s entire circle of family, friends, and healthcare providers. The latest efforts to advance medical and rehabilitative services that move children from medical care and rehabilitation to community reintegration are discussed by leading experts in a recently published special issue of NeuroRehabilitation.

“Recovery extends well beyond the technical period of rehabilitation,” say guest editors and noted authorities Peter D. Patrick, PhD, MS, Associate Professor Emeritus of the University of Virginia School of Medicine in Charlottesville, and Ronald C. Savage, EdD, Chairman, North American Brain Injury Society and International Pediatric Brain Injury Society. “Children, adolescents, and families struggle to regain the momentum of their life so as to reduce problems, increase opportunity, and support increased participation in work, play, home, and relationships.”

Neural plasticity introduces unknown challenges in the care of the recovering brain, and the issue addresses the most challenging and demanding medical conditions that children may confront following severe brain injury. However, children do most of their recovery at home, in school, and in the community, beyond medical surveillance. “Family-centered” approaches to developing interventions are emerging. For example, Dr. Damith T. Woods and colleagues report on a novel telephone support program to help parents manage challenging behavior associated with brain injury.

Children and adolescents with brain injuries have difficulty adjusting to their injuries and altered abilities, and frequently suffer from low self-esteem and loss of confidence. A study by Carol A. Hawley finds that children with traumatic brain injury have significantly lower self-esteem than normal children, and recommends that rehabilitation strategies promote a sense of self-worth.

Re-entry into school is a major milestone of recovery and the issue highlights a number of efforts to help children improve and return to a positive developmental trajectory. An article by Beth Wicks describes an innovative program in Britain that looks at “education as rehabilitation,” translating successful adult vocational programs into educational rehabilitation programs for children. Lucia Willadino Braga and colleagues report on a program based on cooperative learning that helped preadolescents with acquired brain injury develop metacognitive strategies and improve self-concept, thereby helping empower the preadolescents in their social relationships.

"Over the years and in multiple places around the world, innovative and creative efforts have slowly revealed effective interventions for recovery," comment Dr. Patrick and Dr. Savage. "Increasingly the interventions are evidence-based. This issue is a contribution to the effort to improve outcomes for children and families."

Provided by IOS Press

Source: medicalxpress.com

Jul 11, 2012 · 9 notes
#science #neuroscience #brain #brain injury #psychology
Pediatric Brain Tumors Traced to Brain Stem Cells

ScienceDaily (July 9, 2012) — Scientists showed in mice that disabling a gene linked to a common pediatric tumor disorder, neurofibromatosis type 1 (NF1), made stem cells from one part of the brain proliferate rapidly. But the same genetic deficit had no effect on stem cells from another brain region.

The results can be explained by differences in the way stem cells from these regions of the brain respond to cancer-causing genetic changes.

NF1 is among the world’s most common genetic disorders, occurring in about one of every 3,000 births. It causes a wide range of symptoms, including brain tumors, learning disabilities and attention deficits.

Brain tumors in children with NF1 typically arise in the optic nerve and do not necessarily require treatment. If optic gliomas keep growing, though, they can threaten the child’s vision. By learning more about the many factors that contribute to NF1 tumor formation, scientists hope to develop more effective treatments.

"To improve therapy, we need to develop better ways to identify and group tumors based not just on the way they look under the microscope, but also on innate properties of their stem cell progenitors," says David H. Gutmann, MD, PhD, the Donald O. Schnuck Family Professor of Neurology.

The study appears July 9 in Cancer Cell. Gutmann also is the director of the Washington University Neurofibromatosis Center.

In the new study, researchers compared brain stem cells from two primary sources: the third ventricle, located in the midbrain, and the nearby lateral ventricles. Before birth and for a time afterward, both of these areas in the brain are lined with growing stem cells.

First author Da Yong Lee, PhD, a postdoctoral research associate, showed that the cells lining both ventricles are true stem cells capable of becoming nerve and support cells (glia) in the brain. Next, she conducted a detailed analysis of gene expression in both stem cell types.

"There are night-and-day differences between these two groups of stem cells," Gutmann says. "These results show that stem cells are not the same everywhere in the brain, which has real consequences for human neurologic disease."

The third ventricle is close to the optic chiasm, the point where the optic nerves cross and optic gliomas develop in NF1 patients. Lee and Gutmann postulated that stem cells from this ventricle might be the source of progenitor cells that can become gliomas in children with NF1.

To test the theory, they disabled the Nf1 gene in neural stem cells from the third and lateral ventricles in the mice. This same gene is mutated in patients with NF1, increasing their risk of developing tumors.

Lee found that loss of Nf1 activity had little effect on stem cells from the lateral ventricle, but stem cells from the third ventricle began to divide rapidly, a change that puts them closer to becoming tumors.

The third ventricle usually stops supplying stem cells to the brain shortly after birth. When researchers inactivated the Nf1 gene before the third ventricle closed, the mice developed optic gliomas. When they waited until the third ventricle had closed to inactivate the Nf1 gene, gliomas did not develop.

Gutmann plans further studies to determine whether all NF1-related optic gliomas form in cells descended from the third ventricle. He suspects that additional factors are necessary for optic gliomas to form in cooperation with Nf1 gene loss in third-ventricle stem cells.

"We have to recognize that cancers which appear very similar actually represent a collection of quite different diseases," he says. "Tumors are like us — they’re defined by where they live, what their families are like, the traumas they experience growing up, and a variety of other factors. If we can better understand the interplay of these factors, we’ll be able to develop treatments that are much more likely to succeed, because they’ll target what is unique about a specific patient’s tumor."

Source: Science Daily

Jul 11, 2012 · 5 notes
#science #neuroscience #brain #psychology #tumor
Better Treatment for Brain Cancer Revealed by New Molecular Insights

ScienceDaily (July 9, 2012) — Nearly a third of adults with the most common type of brain cancer develop recurrent, invasive tumors after being treated with a drug called bevacizumab. The molecular underpinnings behind these detrimental effects have now been published by Cell Press in the July issue of Cancer Cell. The findings reveal a new treatment strategy that could reduce tumor invasiveness and improve survival in these drug-resistant patients.

"Understanding how and why these tumors adopt this invasive behavior is critical to being able to prevent this recurrence pattern and maximizing the benefits of bevacizumab," says study author Kan Lu of the University of California, San Francisco (UCSF).

Glioblastoma multiforme (GBM) is the most aggressive type of tumor originating in the brain. GBM tumors express high levels of vascular endothelial growth factor (VEGF), a protein that promotes the growth of new blood vessels that provide nutrients that allow tumors to expand. In 2009, the Food and Drug Administration approved bevacizumab, a VEGF inhibitor, for GBM patients who don’t respond to first-line therapies. Although the drug is initially effective, up to 30% of patients develop tumors that infiltrate deep into the brain, making surgery and treatment difficult.

To study how bevacizumab can lead to adverse effects, senior study author Gabriele Bergers of UCSF and her collaborators focused on hepatocyte growth factor (HGF), a protein that controls the growth and movement of cells, because they previously found a link between VEGF and HGF in GBM cells. In the new study, they found that VEGF inhibits the migration of GBM cells by decreasing HGF signaling through its receptor MET. Moreover, tumors were much less invasive — and survival improved — in mice with GBM tumors lacking both VEGF and MET rather than just VEGF alone. The results suggest that MET plays a critical role in GBM invasion when VEGF is blocked.

"These findings provide a rationale for therapeutically combining VEGF and MET inhibition so that patients can benefit from bevacizumab without developing more invasive tumors," Lu says. Because the VEGF and HGF/MET signaling pathways are active in a variety of tumors, this combined treatment strategy may also be applied to other types of cancer.

Source: Science Daily

Jul 11, 2012 · 7 notes
#science #neuroscience #psychology #brain
Hormone Curbs Depressive-Like Symptoms in Stressed Mice

ScienceDaily (July 9, 2012) — A hormone with anti-diabetic properties also reduces depression-like symptoms in mice, researchers from the School of Medicine at the UT Health Science Center San Antonio reported July 9.

All types of current antidepressants, including tricyclics and selective serotonin reuptake inhibitors, increase the risk for type 2 diabetes. “The finding offers a novel target for treating depression, and would be especially beneficial for those depressed individuals who have type 2 diabetes or who are at high risk for developing it,” said the study’s senior author, Xin-Yun Lu, Ph.D., associate professor of pharmacology and psychiatry and member of the Barshop Institute for Longevity and Aging Studies at the UT Health Science Center.

The hormone, called adiponectin, is secreted by adipose tissue and sensitizes the body to the action of insulin, a hormone that lowers blood sugar. “We showed that adiponectin levels in plasma are reduced in a chronic social defeat stress model of depression, which correlates with the degree of social aversion,” Dr. Lu said.

Facing Goliath over and over

In the study mice were exposed to 14 days of repeated social defeat stress. Each male mouse was introduced to the home cage of an unfamiliar, aggressive resident mouse for 10 minutes and physically defeated. After the defeat, the resident mouse and the intruder mouse each were housed in half of the cage separated by a perforated plastic divider to allow visual, olfactory and auditory contact for the remainder of the 24-hour period. Mice were exposed to a new resident mouse cage and subjected to social defeat each day. Plasma adiponectin concentrations were determined after the last social defeat session. Defeated mice displayed lower plasma adiponectin levels.

Withdrawal, lost pleasure and helplessness

When adiponectin concentrations were reduced by deleting one allele of the adiponectin gene or by a neutralizing antibody, mice were more susceptible to stress-induced social withdrawal, anhedonia (lost capacity to experience pleasure) and learned helplessness.

Mice that were fed a high-fat diet (60 percent calories from fat) for 16 weeks developed obesity and type 2 diabetes. Administration of adiponectin to these mice and mice of normal weight produced antidepressant-like effects.

Possible innovative approach for depression

"These findings suggest a critical role of adiponectin in the development of depressive-like behaviors and may lead to an innovative therapeutic approach to fight depression," Dr. Lu said.

A novel approach would benefit thousands. “So far, only about half of the patients suffering from major depressive disorders are treated to the point of remission with antidepressant drugs,” Dr. Lu said. “The prevalence of depression in the diabetic population is two to three times higher than in the non-diabetic population. Unfortunately, the use of current antidepressants can worsen blood sugar control in diabetic patients. Adiponectin, with its anti-diabetic activity, would serve as an innovative therapeutic target for depression treatments, especially for those individuals with diabetes or prediabetes and perhaps those who fail to respond to currently available antidepressants.”

Source: Science Daily

Jul 10, 2012 · 27 notes
#science #neuroscience #brain #psychology #depression
Long-Term Hormone Treatment Increases Synapses in Female Rats' Prefrontal Cortex

ScienceDaily (July 9, 2012) — A new study of aged female rats found that long-term treatment with estrogen and a synthetic progestin known as MPA (medroxyprogesterone acetate) increased levels of a protein marker of synapses in the prefrontal cortex, a brain region known to suffer significant losses in aging.

The new findings appear to contradict the results of the Women’s Health Initiative, a long-term study begun in 1991 to analyze the effects of hormone therapy on a large sample of healthy postmenopausal women aged 50 to 79. Among other negative findings, the WHI found that long-term exposure to estrogen alone or to estrogen and MPA resulted in an increased risk of stroke and dementia. More recent research, however, suggests that starting hormone replacement therapy at the onset of menopause, rather than years or decades afterward, yields different results.

The new study, from researchers at the University of Illinois, is the first to look at the effects of long-term treatment with estrogen and MPA on the number of synapses in the prefrontal cortex of aged animals. The researchers describe their findings in a paper in the journal Menopause.

"The prefrontal cortex is the area of the human brain that loses the most volume with age," said U. of I. psychology professor and Beckman Institute affiliate Janice Juraska, who led the study with doctoral student Nioka Chisholm. "So understanding how anything affects the prefrontal cortex is important."

The prefrontal cortex, just behind the forehead in humans, governs what researchers call “executive function” — planning, strategic thinking, working memory (the ability to hold information in mind just long enough to use it), self-control and other functions that tend to decline with age.

Most studies of the effects of hormone treatments on the brain have focused on the hippocampus, a structure important to spatial navigation and memory consolidation. The studies tend to use young animals exposed to hormones for very brief periods of time (one or two days to a few weeks at the most). They have yielded mixed results, with most research in young female animals indicating an increase in hippocampal synapses and hippocampal function after exposure to estrogen and MPA.

"For some reason, a lot of researchers still look at the effects of hormones in young animals," Chisholm said. "And there’s a lot of evidence now saying that the aged brain is different; the effect of these hormones is not going to be the same."

The new study followed middle-aged rats exposed to estrogen alone, to no additional hormones, or to estrogen in combination with MPA for seven months, a time period that more closely corresponds to the experience of women who start hormone therapy at the onset of menopause and continue into old age. The researchers removed the rats’ ovaries just prior to the hormone treatment (or lack of treatment) to mimic the changes that occur in humans during menopause.

"Our most important finding is that estrogen in combination with MPA can result in a greater number of synapses in the prefrontal cortex than (that seen) in animals that are not receiving hormone replacement," Chisholm said. "Estrogen alone marginally increased the synapses, but it took the combination with MPA to actually see the significant effect."

"Our data indicate that re-examining the effects of estrogen and MPA, when first given to women around the time of menopause, is merited," Juraska said.

Source: Science Daily

Jul 10, 2012 · 9 notes
#science #neuroscience #brain #psychology
Study suggests poorer outcomes for patients with stroke hospitalized on weekends

July 9, 2012

A study of patients with stroke admitted to English National Health Service public hospitals suggests that patients who were hospitalized on weekends were less likely to receive urgent treatments and had worse outcomes, according to a report published Online First by Archives of Neurology.

Studies from other countries have suggested higher mortality in patients who were admitted to the hospital on weekends for a variety of medical conditions, a phenomenon known as “the weekend effect.” However, other studies have not identified an association between the day of admission and mortality rates due to stroke, so the debate over “the weekend effect” continues, according to the study background.

William L. Palmer, M.A., M.Sc., of Imperial College and the National Audit Office, and colleagues conducted a study of patients admitted to hospitals with stroke from April 2009 through March 2010, accounting for 93,621 admissions.

Performance across five of six measures was lower on weekends, with one of the largest disparities seen in rates of same-day brain scans (43.1 percent on weekends compared with 47.6 percent on weekdays). The seven-day in-hospital mortality rate for Sunday admissions was 11 percent, compared with a mean of 8.9 percent for weekday admissions, according to the study results.

"We calculated that approximately 350 potentially avoidable in-hospital deaths occur within seven days each year and that an additional 650 people could be discharged to their usual place of residence within 56 days if the performance seen on weekdays was replicated on weekends," the authors comment.

Provided by JAMA and Archives Journals

Source: medicalxpress.com

Jul 10, 2012 · 3 notes
#science #neuroscience #brain #psychology #stroke
Can sounds trick the brain into perceiving your body differently?

July 9, 2012

(Medical Xpress) — Have you ever found yourself paying attention to the sound of your footsteps when walking down a quiet corridor? Or perhaps you enjoy creating rhythmic patterns by tapping on a surface? Almost every bodily movement we make generates an impact sound, and a team of academics has been studying whether our perception of the physical dimensions of our body can be challenged by spatially altering the ‘action’ sounds we make.

image

Self-action sounds help us understand physical properties of objects and our own body. Picture by Antonio Caballero

The research team from Royal Holloway, University of London conducted various experiments to determine whether our action sounds influence the way we picture ourselves and whether these perceptions change when the sound is manipulated.

Dr Manos Tsakiris from Royal Holloway said: “These sounds provide an important source of information about the physical properties of the objects and the space around us, but they also inform us about the physical properties of one’s own body, although we are mostly not aware of this process.”

The study, Action sounds recalibrate perceived tactile distance, is published in Current Biology and shows that increasing the distance to the sound events produced when tapping on a surface with one’s arm influences subsequent judgments of the distance between two objects touching the arm. “Participants did not report feeling their own arm extended as a result of this test, possibly because it is difficult for someone to accept that the dimensions of their body can change from one minute to the next. However, the increase in reported distance between two points touching one’s arm does suggest an unconscious change in the way participants mentally represented their arm, as if they represented this arm as being longer,” Dr Tajadura-Jiménez explains.

The researchers hope the study could have clinical applications, for example in the treatment of chronic pain, or in motivating older people to move further or for longer than they previously thought possible by manipulating the action sounds they make.

Provided by Royal Holloway, University of London

Source: medicalxpress.com

Jul 10, 2012 · 43 notes
#science #neuroscience #brain #psychology #perception
Speeding up Huntington's research

July 9, 2012

(Medical Xpress) — Human brain cells showing aspects of Huntington’s Disease have been developed, opening up new research pathways for treating the fatal disorder.


An international consortium, including scientists from the School of Biosciences, has taken cells from Huntington’s Disease patients and generated human brain cells that develop aspects of the disease in the laboratory. The cells and the new technology will speed up research into understanding the disease and also accelerate drug discovery programs aimed at treating this terminal, genetic disorder.

Huntington’s Disease is an aggressive, neurodegenerative disorder which causes loss of co-ordination, psychiatric problems, dementia and death. Scientists have known the genetic cause of this disease for more than 20 years but research has been hampered by the lack of human brain cells with which to study the disease and screen for effective drugs.

The new breakthrough involves taking skin cells from patients with Huntington’s disease. The scientific team reprogrammed these cells into stem cells which were then turned into the brain cells affected by the disorder. The brain cells demonstrate characteristics of the disease and will allow the consortium to investigate the mechanisms that cause the brain cells to die.

Dr. Nicholas Allen, one of the lead investigators at the School of Biosciences, said: “This breakthrough allows us to generate brain cells with many of the hallmarks of this disease, within just a few weeks. This means that we can study both the normal physiology of these brain cells, and the pathological processes that lead to their death.”

The other Cardiff lead, Professor Paul Kemp, said: “Huntington’s Disease normally takes years to manifest in the human brain. Now we have a fast and reproducible model of this disease, offering fresh hope for the discovery of new therapies.”

The corresponding author of the paper, Professor Clive Svendsen, a UK scientist and now director of the Cedars-Sinai Regenerative Medicine Institute in the USA, said “This Huntington’s ‘disease in a dish’ will enable us for the first time to test therapies on human Huntington’s disease neurons. In addition to increasing our understanding of this disorder and offering a new pathway to identifying treatments, this study is remarkable because of the extensive interactions between a large group of scientists focused on developing this model. It’s a new way of doing trailblazing science.”

Director of the School of Biosciences, Professor Ole Petersen said: “This is an extremely important development and I am delighted to see colleagues from the School of Biosciences playing their part in this distinguished international team. I look forward to seeing future stages, when this new technique is put to work modeling the diseases and testing potential treatments.”

Provided by Cardiff University

Source: medicalxpress.com

Jul 10, 2012 · 12 notes
#science #neuroscience #brain #psychology #huntington
Training Improves Recognition of Quickly Presented Objects

ScienceDaily (July 9, 2012) — “Attentional blink” is the term psychologists use to describe our inability to recognize a second important object if we see it less than half a second after a first one. It always seemed impossible to overcome, but in a new paper in the Proceedings of the National Academy of Sciences, Brown University psychologists report they’ve found a way.

Until now, this has seemed an irreparable limitation of human perception: we struggle to perceive things that appear in very rapid succession, say, less than half a second apart. Psychologists call this deficit “attentional blink.” We’ll notice the first car spinning out in our path, but maybe not register the one immediately beyond it. It turns out we can learn to do better after all. In a new study, researchers now based at Brown University overcame the blink with a small amount of training of a kind that had never been tried before.

"A color change can be very conspicuous. If all items are black and white and all of a sudden a color item is shown, you pay attention to that." Credit: Mike Cohea/Brown University"Attention is a very important component of visual perception," said Takeo Watanabe, professor of cognitive, linguistic and psychological sciences at Brown. "One of the best ways to enhance our visual ability is to improve our attentional function."

Watanabe and his team were at Boston University when they performed experiments described in a paper published the week of July 9 in the Proceedings of the National Academy of Sciences. The bottom line of the research is that making the second target object a distinct color is enough to train people to switch their attention more quickly than they could before. After that, they can perceive a second target object presented as quickly as a fifth of a second later, even when it isn’t distinctly colored.


Jul 10, 2012 · 18 notes
#science #neuroscience #brain #psychology #memory #object recognition
Small Molecule May Play Big Role in Alzheimer's Disease

ScienceDaily (July 9, 2012) — Alzheimer’s disease is one of the most dreaded and debilitating illnesses one can develop. The disease currently afflicts 6.5 million Americans, a number the Alzheimer’s Association projects will rise to between 11 and 16 million, or 1 in 85 people, by 2050.

image

Cell death in the brain causes one to grow forgetful, confused and, eventually, catatonic. Recently approved drugs provide mild relief for symptoms but there is no consensus on the underlying mechanism of the disease.

"We don’t know what the problem is in terms of toxicity," said Joan-Emma Shea, professor of chemistry and biochemistry at the University of California, Santa Barbara (UCSB). "This makes the disease difficult to cure."

Accumulations of amyloid plaques have long been associated with the disease and were presumed to be its cause. These long knotty fibrils, formed from misfolded protein fragments, are almost always found in the brains of diseased patients. Because of their ubiquity, amyloid fibrils were considered a potential source of the toxicity that causes cell death in the brain. However, the quantity of fibrils does not correspond with the degree of dementia and other symptoms.

New findings support a hypothesis that the fibrils are a by-product of the disease rather than the toxic agent itself. This paradigm shift redirects inquiry toward smaller, intermediate molecules that form and dissipate quickly and are therefore difficult to detect in brain tissue.


Jul 10, 2012 · 14 notes
#science #neuroscience #brain #psychology #alzheimer
Nutrient mixture improves memory in patients with early Alzheimer's

July 10, 2012 by Anne Trafton

A clinical trial of an Alzheimer’s disease treatment developed at MIT has found that the nutrient cocktail can improve memory in patients with early Alzheimer’s. The results confirm and expand the findings of an earlier trial of the nutritional supplement, which is designed to promote new connections between brain cells.

image

A graphic depicting a synapse, a connection between brain cells. Graphic: Christine Daniloff

Alzheimer’s patients gradually lose those connections, known as synapses, leading to memory loss and other cognitive impairments. The supplement mixture, known as Souvenaid, appears to stimulate growth of new synapses, says Richard Wurtman, a professor emeritus of brain and cognitive sciences at MIT who invented the nutrient mixture.

“You want to improve the numbers of synapses, not by slowing their degradation — though of course you’d love to do that too — but rather by increasing the formation of the synapses,” Wurtman says.

To do that, Wurtman came up with a mixture of three naturally occurring dietary compounds: choline, uridine and the omega-3 fatty acid DHA. Choline can be found in meats, nuts and eggs, and omega-3 fatty acids are found in a variety of sources, including fish, eggs, flaxseed and meat from grass-fed animals. Uridine is produced by the liver and kidney, and is present in some foods as a component of RNA.

These nutrients are precursors to the lipid molecules that, along with specific proteins, make up brain-cell membranes, which form synapses. To be effective, all three precursors must be administered together.

Results of the clinical trial, conducted in Europe, appear in the July 10 online edition of the Journal of Alzheimer’s Disease. The new findings are encouraging because very few clinical trials have produced consistent improvement in Alzheimer’s patients, says Jeffrey Cummings, director of the Cleveland Clinic’s Lou Ruvo Center for Brain Health.

“Memory loss is the central characteristic of Alzheimer’s, so something that improves memory would be of great interest,” says Cummings, who was not part of the research team.

Plans for commercial release of the supplement are not finalized, according to Nutricia, the company testing and marketing Souvenaid, but it will likely be available in Europe first. Nutricia is the specialized health care division of the food company Danone, known as Dannon in the United States.

Making connections

Wurtman first came up with the idea of targeting synapse loss to combat Alzheimer’s about 10 years ago. In animal studies, he showed that his dietary cocktail boosted the number of dendritic spines, or small outcroppings of neural membranes, found in brain cells. These spines are necessary to form new synapses between neurons.

Following the successful animal studies, Philip Scheltens, director of the Alzheimer Center at VU University Medical Center in Amsterdam, led a clinical trial in Europe involving 225 patients with mild Alzheimer’s. The patients drank Souvenaid or a control beverage daily for three months.

That study, first reported in 2008, found that 40 percent of patients who consumed the drink improved in a test of verbal memory, while 24 percent of patients who received the control drink improved their performance.

The new study, performed in several European countries and overseen by Scheltens as principal investigator, followed 259 patients for six months. Patients, whether taking Souvenaid or a placebo, improved their verbal-memory performance for the first three months, but the placebo patients deteriorated during the following three months, while the Souvenaid patients continued to improve. For this trial, the researchers used more comprehensive memory tests taken from the neuropsychological test battery, often used to assess Alzheimer’s patients in clinical research.

Patients showed a very high compliance rate: About 97 percent of the patients followed the regimen throughout the study, and no serious side effects were seen.

Both clinical trials were sponsored by Nutricia. MIT has patented the mixture of nutrients used in the study, and Nutricia holds the exclusive license on the patent.

Brain patterns

In the new study, the researchers used electroencephalography (EEG) to measure how patients’ brain-activity patterns changed throughout the study. They found that as the trial went on, the brains of patients receiving the supplements started to shift from patterns typical of dementia to more normal patterns. Because EEG patterns reflect synaptic activity, this suggests that synaptic function increased following treatment, the researchers say.

Patients entering this study were in the early stages of Alzheimer’s disease, averaging around 25 on a scale of dementia that ranges from 1 to 30, with 30 being normal. A previous trial found that the supplement cocktail does not work in patients with Alzheimer’s at a more advanced stage. This makes sense, Wurtman says, because patients with more advanced dementia have probably already lost many neurons, so they can’t form new synapses.

A two-year trial involving patients who don’t have Alzheimer’s, but who are starting to show mild cognitive impairment, is now underway. If the drink seems to help, it could be used in people who test positive for very early signs of Alzheimer’s, before symptoms appear, Wurtman says. Such tests, which include PET scanning of the hippocampus, are now rarely done because there are no good Alzheimer’s treatments available.

Provided by Massachusetts Institute of Technology

Source: medicalxpress.com

Jul 10, 2012 · 26 notes
#science #neuroscience #brain #psychology #alzheimer #memory
What Makes Us Musical Animals

ScienceDaily (July 6, 2012) — In a forthcoming issue of Topics in Cognitive Science, researchers from the University of Amsterdam (UvA) argue that at least two seemingly trivial musical skills can be considered fundamental to the evolution of music: relative pitch — the skill to recognise a melody independent of its pitch level — and beat induction — the skill to pick up regularity (the beat) from a varying rhythm. Both are considered cognitive mechanisms essential to perceiving, making and appreciating music, and, as such, could be argued to be preconditions for the origin of music.

While it has recently become quite popular to approach the study of the origins of music from an evolutionary perspective, there is still little agreement on the idea that music is in fact an adaptation, that it influenced our survival, or that it made us sexually more attractive. Music appears to be of little use: it doesn’t quell our hunger, nor do we live a day longer because of it. So why argue that music is an adaptation? There are even researchers who claim that studying the evolution of cognition is virtually impossible (Lewontin, 1998; Bolhuis & Wynne, 2009).

Distinguishing between music and musicality

The alternative that Henkjan Honing and Annemie Ploeger of the UvA propose is, first, to distinguish between the notions of ‘music’ and ‘musicality’, with musicality defined as a natural, spontaneously developing trait based on and constrained by our cognitive system, and music as a social and cultural construct built on that very musicality. And second, to collect converging evidence from a variety of sources (e.g., psychological, physiological, genetic, phylogenetic and cross-cultural evidence) to show that a specific cognitive trait is indeed an adaptation.

Both relative pitch and beat induction are suggested as primary candidates for such cognitive traits, musical skills that are considered trivial by most humans, but that turn out to be quite special in the rest of the animal world.
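As a loose computational analogue of the two skills (a toy sketch only, not the cognitive models the researchers discuss; the melody and onset times below are made-up examples), relative pitch can be pictured as matching a melody by its interval pattern rather than its absolute pitches, and beat induction as recovering a dominant pulse from irregular note onsets:

```python
# Toy illustrations of relative pitch and beat induction.
from collections import Counter

def intervals(pitches):
    """Relative pitch: a melody is characterised by its successive
    intervals, so a transposed copy has the same signature."""
    return [b - a for a, b in zip(pitches, pitches[1:])]

def infer_beat_period(onsets, resolution=0.05):
    """Beat induction (crudely): take the most common quantised gap
    between note onsets as the underlying beat period, even when some
    beats carry no note."""
    gaps = Counter()
    for i, a in enumerate(onsets):
        for b in onsets[i + 1:]:
            gaps[round((b - a) / resolution)] += 1  # count gaps in grid steps
    steps = max((g for g in gaps if g * resolution > 0.1), key=gaps.get)
    return steps * resolution

tune = [60, 60, 67, 67, 69, 69, 67]  # MIDI pitches, in C
assert intervals(tune) == intervals([p + 5 for p in tune])  # same tune in F

# A syncopated rhythm built on a 0.5 s beat, with one beat left silent.
onsets = [0.0, 0.5, 1.0, 1.75, 2.0, 2.5, 3.0, 3.5]
print(infer_beat_period(onsets))  # recovers the 0.5 s pulse
```

The point of the sketch is only that both skills are pattern abstractions: the melody survives transposition, and the beat survives syncopation.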

Once these fundamental cognitive mechanisms are identified, it becomes possible to see how these might have evolved. In short: the study of the evolution of music cognition is conditional on a characterisation of the basic mechanisms that make up musicality.

Source: Science Daily

Jul 7, 2012 · 51 notes
#science #neuroscience #psychology #music #brain
Can You Hear Me Now? New Strategy Discovered to Prevent Hearing Loss

ScienceDaily (July 6, 2012) — If you’re concerned about losing your hearing because of noise exposure (earbud deafness syndrome), a new discovery published online in the FASEB Journal offers some hope. Scientists from Germany and Canada show that AMPK, a protein that protects cells during a lack of energy, also activates a channel protein in the cell membrane that allows potassium to leave the cell. This activity is important because the mechanism helps protect sensory cells in the inner ear from permanent damage following acoustic noise exposure.

This information could lead to new strategies and therapies to prevent and treat trauma resulting from extreme noise, especially in people with AMPK gene variants that may make them more vulnerable to hearing loss.

"Future research on the basis of the present study may lead to the development of novel strategies preventing noise-induced hearing loss or accelerating recovery from acoustic trauma," said Florian Lang, Ph.D., a researcher involved in the work from the Department of Physiology at the University of Tübingen, in Tübingen, Germany.

To make this discovery, Lang and colleagues compared two groups of mice. The first group was normal and the second lacked the AMPK protein. The hearing of the mice was tested by measuring sound-induced brain activity. All mice were exposed to well-defined noise causing an acoustic trauma and leading to hearing impairment. Prior to noise exposure, hearing ability was similar in normal mice and mice lacking AMPK. After exposure, the hearing of the normal mice mostly recovered within two weeks, but the hearing of the AMPK-deficient mice remained significantly impaired.

"When it comes to preventing hearing loss, keeping the volume down is still the best strategy, and this discovery doesn’t prevent loud music from beating on our ear drums," said Gerald Weissmann, M.D., Editor-in-Chief of the FASEB Journal. “This discovery does help explain why some people seem more likely to lose their hearing than others. At the same time, it also provides a target for new preventive strategies — and perhaps even a treatment — for earbud deafness syndrome.”

Source: Science Daily

Jul 7, 2012 · 22 notes
#science #neuroscience #brain #psychology #hearing
'Stoned' gene key to maintaining normal brain function

July 6, 2012

(Medical Xpress) — Scientists at the University of Liverpool have found that a protein produced by a gene first identified in fruitflies is responsible for communication between nerve cells in the brain.

image

Dr Stephen Royle: “This research is another step towards fully understanding the complexities of the human brain.”

The ‘stoned’ gene was discovered in fruitflies by scientists in the 1970s. When this gene was mutated, the flies had problems walking and flying, giving rise to the name ‘stoned’. The same gene was found in mammals some years later, but until now scientists have not known precisely what this gene is responsible for and why mutations in it cause problems with physical functions.

‘Packets of chemicals’

Scientists at Liverpool have found that the protein this gene expresses in mammals, called stonin2, is responsible for retrieving the ‘packets’ of chemicals that nerve cells in the brain release in order to communicate with each other. The gene’s failure to express this protein in the fruitfly studies would explain why the insects could not walk or fly normally.

The team used advanced techniques to inactivate stonin2 for short and long periods of time in animal cells grown in the laboratory. The cells used were from an area of the brain associated with learning and memory. They showed that without stonin2 the nerve cells could not retrieve the ‘packets’ needed to transport the chemicals required for communication between nerve cells.

Dr Stephen Royle, from the University’s Institute of Translational Medicine, explains: “Nerve cells in the brain communicate by releasing ‘packets’ of chemicals.  These ‘packets’ must be retrieved and refilled with chemicals so that they can be used once again. This recycling programme is very important for nerve cells to keep communicating with each other. 

“We have shown that a protein called stonin 2 is needed for the packets to be retrieved. There is currently no evidence to suggest that the gene which expresses this protein is mutated in human disease, but any failure in its function would be disastrous.  The research is another step towards fully understanding the complexities of the human brain.”

The research is published in the journal, Current Biology.

Provided by University of Liverpool

Source: medicalxpress.com

Jul 7, 2012 · 25 notes
#science #neuroscience #brain #genes #biology #fruitflies
Zebrafish Reveal Promising Process for Healing Spinal Cord Injury

ScienceDaily (July 6, 2012) — Yona Goldshmit, Ph.D., is a former physical therapist who worked in rehabilitation centers with spinal cord injury patients for many years before deciding to switch her focus to the underlying science.

"After a few years in the clinic, I realized that we don’t really know what’s going on," she said.

Now a scientist working with Peter Currie, Ph.D., at Monash University in Australia, Dr. Goldshmit is studying the mechanisms of spinal cord repair in zebrafish, which, unlike humans and other mammals, can regenerate their spinal cord following injury. On June 23 at the 2012 International Zebrafish Development and Genetics Conference in Madison, Wisconsin, she described a protein that may be a key difference between regeneration in fish and mammals.

One of the major barriers to spinal regeneration in mammals is a natural protective mechanism, which incongruously results in an unfortunate side effect. After a spinal injury, nervous system cells called glia are activated and flood the area to seal the wound to protect the brain and spinal cord. In doing so, however, the glia create scar tissue that acts as a physical and chemical barrier, which prevents new nerves from growing through the injury site.

One striking difference between the glial cells in mammals and fish is the resulting shape: mammalian glia take on highly branched, star-like arrangements that appear to intertwine into dense tissue. Fish glia cells, by contrast, adopt a simple elongated shape — called bipolar morphology — that bridges the injury site and appears to help new nerve cells grow through the damaged area to heal the spinal cord.

"Zebrafish don’t have so much inflammation and the injury is not so severe as in mammals, so we can actually see the pro-regenerative effects that can happen," Dr. Goldshmit explained.

Studies in mice have found that mammalian glia can take up the same elongated shape, but in response to the environment around the injury they instead mature into scar tissue that does not allow nerve regrowth.

Dr. Goldshmit and her colleagues have focused on a family of molecules called fibroblast growth factors (Fgf), which have shown some evidence of improving recovery in mice and humans with spinal cord damage. The Monash University group found that Fgf activity around the damage site promotes the bipolar glial shape and encourages nerve regeneration in zebrafish.

Preliminary results in mice show that Fgf injections near a spinal injury increase both the number of glia cells at the site and the elongated morphology. Their evidence suggests that Fgfs may work to create an environment more supportive of regeneration in mammals as well and could be a valuable therapeutic target.

Spinal injury patients usually have few options, Dr. Goldshmit emphasized, and development of new, biologically-based approaches will be critical.

"This is a nice example of how we can use the zebrafish model," she said. "When we learn from the zebrafish what to look at, we can find things that give us hope for finding therapeutic approaches for spinal cord injury in humans."

Source: Science Daily

Jul 7, 2012 · 14 notes
#science #neuroscience #spinal cord #zebrafish
Brain scanner, not joystick, is in human-robot future

July 6, 2012 by Nancy Owano

(Phys.org) — Talk of fMRI may not be entirely familiar to many people, but that could change with new events highlighting efforts to link up humans and machines. fMRI (functional magnetic resonance imaging) is a promising technology that can help humans move beyond joysticks and control robots via brain scanners instead. Now a research project exploring ways to develop robot surrogates with whom humans can interact has turned a corner: a university student successfully made his robot surrogate move around using fMRI technology. The experiment linked Israeli student Tirosh Shapira, in a lab at Bar-Ilan University, Israel, with a small robot in another lab far away at the Beziers Technology Institute in France.

Shapira merely had to think about moving his arms or legs and the robot, whose head-mounted camera streamed an image to a display in front of him, would do the same. If Shapira thought about moving forward or backward, the robot responded accordingly.

fMRI monitors blood flowing through the brain and can spot when areas associated with certain actions, such as movement, are in use. The fMRI read the student’s thoughts, which were translated via computer into commands relayed across the Internet to the robot in France.
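As a rough sketch of the pipeline the article describes, one can imagine a loop that classifies an activation pattern into a movement intention and serializes a command for the remote robot. Everything here — the region names, the activation templates, and the command format — is a hypothetical illustration, not the actual Bar-Ilan system.

```python
# Hypothetical sketch of the decode-and-relay loop described in the article:
# classify an fMRI activation pattern into a movement intention, then package
# the matching command for transmission to a remote robot. The classifier,
# region names, and packet format are all illustrative assumptions.
import json

def classify_intention(activation):
    """Map per-region activation levels to a movement command
    using a simple nearest-template rule."""
    templates = {
        "forward": {"left_motor": 0.2, "right_motor": 0.2, "premotor": 0.9},
        "left":    {"left_motor": 0.1, "right_motor": 0.9, "premotor": 0.3},
        "right":   {"left_motor": 0.9, "right_motor": 0.1, "premotor": 0.3},
    }
    def distance(template):
        return sum((activation[k] - v) ** 2 for k, v in template.items())
    return min(templates, key=lambda name: distance(templates[name]))

def make_packet(command):
    """Serialize the decoded command for relay over the network."""
    return json.dumps({"cmd": command}).encode()

# A volume with strong premotor activity decodes as "forward".
volume = {"left_motor": 0.25, "right_motor": 0.2, "premotor": 0.85}
cmd = classify_intention(volume)
print(cmd, make_packet(cmd))
```

In the real system the input would be thousands of voxel values per scan and the decoder would be trained per subject; the nearest-template rule above only stands in for that step.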

There is much more work to be done to advance this approach, however. The researchers seek to devise a different type of scanning: an fMRI scanner is an expensive piece of equipment, but the scientists believe that improvements in software might allow for a head-mounted device. Another research goal is to see whether they can get humans to speak via the robot. The robot itself will need modification, closer to the size and movement of a human and engineered with a wider range of movement, including hand gestures. In sum, according to the researchers, this experiment is only one of many steps ahead.

Medical applications for this technology are seen as promising, especially as scientists explore how patients with paralysis can interface with robots so that the patients can reconnect to the world. Another suggested application has been in the military, where robot surrogates rather than soldiers would be sent into battle.

Source: PHYS.ORG

Jul 7, 2012 · 15 notes
#science #neuroscience #brain #fMRI #robotics
Researchers decode molecular mechanism that sheds light on how trauma can become engraved in the brain

July 6, 2012


image

Scientists at the Universities of Bonn and Berlin have discovered a mechanism which stops the process of forgetting anxiety after a stress event. In experiments they showed that feelings of anxiety don’t subside if too little dynorphin is released into the brain. The results can help open up new paths in the treatment of trauma patients. The study has been published in the current edition of the Journal of Neuroscience.

Feelings of anxiety very effectively prevent people from getting into situations that are too dangerous. Those who have had a terrible experience initially tend to avoid the place of the tragedy out of fear. If no other oppressive situation arises, the symptoms of fear normally subside over time. “The memory of the terrible events is not simply erased,” states first author PD Dr. Andras Bilkei-Gorzo of the Institute for Molecular Psychiatry at the University of Bonn. “Rather, those affected learn through an active process that they no longer need to be afraid because the danger has passed.” But following extreme psychological stress resulting from wars, hostage-takings, accidents or catastrophes, chronic anxiety disorders can develop that do not subside even after months.

Body’s own dynorphin weakens fears

Why is it that in some people terrible events are deeply engraved in memory, while others seem after a while to have completely put aside any anxiety related to the incident? Scientists in the fields of psychiatry, molecular psychiatry and radiology at the University of Bonn are probing this issue. “We were able to demonstrate in a series of experiments that dynorphin plays an important role in weakening anxiety,” says Prof. Dr. Andreas Zimmer, Director of the Institute for Molecular Psychiatry at the University of Bonn. Dynorphins belong to the opioids, a substance group that also includes, for instance, the endorphins. The latter are released in the bodies of athletes and have analgesic and euphoric effects. Dynorphins do the reverse: they are known for putting a damper on emotional moods.

Mice with disabled gene exhibit persistent anxiety

The team working with Prof. Zimmer tested the exact impact of dynorphins on the brain using mice whose gene for the formation of this substance had been disabled. After being exposed to a brief and unpleasant electric shock, the animals exhibited persistent anxiety symptoms, even if they hadn’t been confronted with the negative stimulus over a longer time. Mice exhibiting a normal amount of released dynorphin were anxious to begin with as well, but the symptoms quickly subsided. “This behavior is the same in humans: If you burn your hand on the stove once, you don’t forget the incident that quickly,” explains Prof. Zimmer. “Learning vocabulary, on the other hand, typically tends to be more tedious because it’s not tied to emotions.”

Results are transferrable to people

Next, the researchers showed that these results can be transferred to people. “We took advantage of the fact that people exhibit natural variations of the dynorphin gene that lead to different levels of this substance being released in the brain,” reports Prof. Dr. Henrik Walter, Director of the Research Area Mind and Brain at the Psychiatric University Clinic at the Charité in Berlin, who previously performed research in this area at the University Clinic in Bonn. A total of 33 healthy participants were divided into two groups: one with genetically stronger dynorphin release and one with lower gene activity.

Unpleasant stimulus leads to stress reactions in the participants

Wearing video glasses, the participants watched blue and green squares appear and disappear while lying in a magnetic resonance imaging (MRI) scanner. Whenever the green square was visible, the scientists gave the participants an unpleasant stimulus on the hand using a laser. Increased sweating of the skin confirmed that these negative stimuli actually produced a stress reaction. At the same time, the researchers recorded the activity of various brain areas with the scanner. After this conditioning stage came part two of the experiment: the researchers showed the colored squares without any unpleasant stimuli and recorded how long the previously acquired stress reaction lasted. The next day the experiment was continued without the laser stimulus in order to monitor the longer-term development.

New paths in the treatment of trauma patients

It became apparent that, as in the mice, human participants with lower gene activity for dynorphin exhibited stress reactions lasting considerably longer than those who released considerably more. Moreover, the brain scans showed that the amygdala, a brain structure in the temporal lobes that processes emotional content, remained active even when, in later testing rounds, a green square was shown without the subsequent laser stimulus.

“After the negative laser stimulus stopped this amygdala activity gradually became weaker. This means that the acquired anxiety reaction to the stimulus was forgotten,” reports Prof. Walter. This effect was not as pronounced in the group with less dynorphin activity and prolonged anxiety. “But the ‘forgetting’ of acquired anxiety reactions isn’t a fading, but, rather, an active process which involves the ventromedial prefrontal cortex,” emphasizes Prof. Walter. To corroborate this, researchers found that in the group with less dynorphin activity there was reduced coupling between the prefrontal cortex and the amygdala. “In all likelihood dynorphins affect fear forgetting in a crucial way through this structure,” says Prof. Walter. The scientists now hope that by using the results they will be able to develop long-term approaches for new strategies when it comes to the treatment of trauma patients.

Provided by University of Bonn

Source: medicalxpress.com

Jul 7, 2012 · 38 notes
#science #neuroscience #brain #psychology #anxiety
Gene Linked to Facial, Skull and Cognitive Impairment Identified

ScienceDaily (July 5, 2012) — A gene whose mutation results in malformed faces and skulls as well as mental retardation has been found by scientists.

They looked at patients with Potocki-Shaffer syndrome, a rare disorder that can result in significant abnormalities such as a small head and chin and intellectual disability, and found the gene PHF21A was mutated, said Dr. Hyung-Goo Kim, molecular geneticist at the Medical College of Georgia at Georgia Health Sciences University.

The scientists confirmed PHF21A’s role by suppressing it in zebrafish, which developed head and brain abnormalities similar to those in patients. “With less PHF21A, brain cells died, so this gene must play a big role in neuron survival,” said Kim, lead and corresponding author of the study published in The American Journal of Human Genetics. They reconfirmed the role by giving the gene back to the malformed fish — studied for their adeptness at regeneration — which then became essentially normal. They also documented the gene’s presence in the craniofacial area of normal mice.

While giving the normal gene unfortunately can’t cure patients as it does zebrafish, the scientists believe the finding will eventually enable genetic screening and possibly early intervention during fetal development, including therapy to increase PHF21A levels, Kim said. It also provides a compass for learning more about face, skull and brain formation.

The scientists zeroed in on the gene by using a distinctive chromosomal break found in patients with Potocki-Shaffer syndrome as a starting point. Chromosomes — packages of DNA and protein — aren’t supposed to break, and when they do, it can damage genes in the vicinity.

"We call this breakpoint mapping and the breakpoint is where the trouble is," said Dr. Lawrence C. Layman, study co-author and Chief of the MCG Section of Reproductive Endocrinology, Infertility and Genetics. Damaged genes may no longer function optimally; in PHF21A’s case it’s about half the norm.

"When you see the chromosome translocation, you don’t know which gene is disrupted," Layman said. "You use the break as a focus then use a bunch of molecular techniques to zoom in on the gene." Causes of chromosomal breaks are essentially unknown but likely are environmental and/or genetic, Kim said.

Little was known about PHF21A other than its role in determining how tightly DNA is wound into a package with proteins called histones. How tightly DNA is wound determines whether proteins called transcription factors have the access needed to regulate gene expression, which is important, for example, when a gene needs to be expressed only at a specific time or in a specific tissue. PHF21A is believed to work primarily by suppressing other genes, for example, ensuring that genes that should be expressed only in brain cells don’t show up in other cell types, Kim said.

Next steps include using PHF21A as a sort of global positioning system to identify other “suppressor” genes it regulates, then screening patients to look for mutations in those genes as well. “We want to find other people with different genes causing the same problem,” Layman said, and they suspect the genes PHF21A interacts with or regulates are the most likely suspects. It’s too early to know what percentage of Potocki-Shaffer syndrome patients have the PHF21A mutation, Kim noted. “Now that we know the causative gene, we can sequence the gene in more patients and see if they have a mutation,” Layman said.

They also want to look at less-severe forms of mental deficiency, including autism, for potentially milder mutations of PHF21A. More than a dozen of the 25,000 human genes are known to cause craniofacial defects and mental retardation, which often occur together, Kim said.

Source: Science Daily

Jul 6, 2012 · 9 notes
#science #neuroscience #psychology #gene #genetic disorders
Music to My Eyes: Device Converting Images Into Music Helps Visually Impaired Find Things With Ease

ScienceDaily (July 5, 2012) — Sensory substitution devices (SSDs) use sound or touch to help the visually impaired perceive the visual scene surrounding them. The ideal SSD would assist not only in sensing the environment but also in performing daily activities based on this input, such as accurately reaching for a coffee cup or shaking a friend’s hand. In a new study, scientists trained blindfolded sighted participants to perform fast and accurate movements using a new SSD, called EyeMusic. Their results are published in the July issue of Restorative Neurology and Neuroscience.

image

Left: An illustration of the EyeMusic SSD, showing a user with a camera mounted on the glasses, and scalp headphones, hearing musical notes that create a mental image of the visual scene in front of him. He is reaching for the red apple in a pile of green ones. Top right: close-up of the glasses-mounted camera and headphones; bottom right: hand-held camera pointed at the object of interest. (Credit: Maxim Dupliy, Amir Amedi and Shelly Levy-Tzedek)

The EyeMusic, developed by a team of researchers at the Hebrew University of Jerusalem, employs pleasant musical tones and scales to help the visually impaired “see” using music. This non-invasive SSD converts images into a combination of musical notes, or “soundscapes.”

The device was developed by the senior author Prof. Amir Amedi and his team at the Edmond and Lily Safra Center for Brain Sciences (ELSC) and the Institute for Medical Research Israel-Canada at the Hebrew University. The EyeMusic scans an image and represents pixels at high vertical locations as high-pitched musical notes and low vertical locations as low-pitched notes according to a musical scale that will sound pleasant in many possible combinations. The image is scanned continuously, from left to right, and an auditory cue is used to mark the start of the scan. The horizontal location of a pixel is indicated by the timing of the musical notes relative to the cue (the later it is sounded after the cue, the farther it is to the right), and the brightness is encoded by the loudness of the sound.

The EyeMusic’s algorithm uses different musical instruments for each of the five colors: white (vocals), blue (trumpet), red (reggae organ), green (synthesized reed), yellow (violin); Black is represented by silence. Prof. Amedi mentions that “The notes played span five octaves and were carefully chosen by musicians to create a pleasant experience for the users.” Sample sound recordings are available at http://brain.huji.ac.il/em/.
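The mapping just described — pitch from vertical position, onset time from horizontal position, loudness from brightness, instrument from color, silence for black — can be sketched in a few lines of code. This is a minimal illustration of the scheme as the article describes it, not the actual EyeMusic implementation; the specific scale, note numbers, and function names are assumptions.

```python
# Illustrative sketch of an EyeMusic-style encoding: each pixel becomes a
# note whose pitch encodes vertical position, whose onset time encodes
# horizontal position (left-to-right scan), whose loudness encodes
# brightness, and whose instrument encodes color. All names and parameters
# here are assumptions, not taken from the actual device.

# Instrument per color, per the article; black maps to silence and is skipped.
INSTRUMENTS = {
    "white": "vocals",
    "blue": "trumpet",
    "red": "reggae organ",
    "green": "synthesized reed",
    "yellow": "violin",
}

# A pentatonic-style scale (MIDI note numbers) spanning five octaves, chosen
# so that simultaneous notes sound consonant -- an assumption standing in for
# the scale the EyeMusic's musicians actually picked.
SCALE = [36 + 12 * octave + step for octave in range(5) for step in (0, 2, 4, 7, 9)]

def encode_image(pixels, column_duration=0.1):
    """pixels: 2D grid of (color, brightness) tuples, row 0 = top.
    Returns a list of (onset_seconds, instrument, midi_note, loudness) events."""
    events = []
    n_rows = len(pixels)
    for x in range(len(pixels[0])):          # scan columns left to right
        onset = x * column_duration          # later onset = farther right
        for y in range(n_rows):
            color, brightness = pixels[y][x]
            if color == "black" or brightness == 0:
                continue                     # black is silence
            # Higher rows (smaller y) map to higher pitches on the scale.
            pitch = SCALE[(n_rows - 1 - y) * len(SCALE) // n_rows]
            events.append((onset, INSTRUMENTS[color], pitch, brightness))
    return events

# A 2x2 test image: white pixel top-left, green pixel bottom-right.
img = [[("white", 0.9), ("black", 0.0)],
       [("black", 0.0), ("green", 0.5)]]
for event in encode_image(img):
    print(event)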

"We demonstrated in this study that the EyeMusic, which employs pleasant musical scales to convey visual information, can be used after a short training period (in some cases, less than half an hour) to guide movements, similar to movements guided visually," explain lead investigators Drs. Shelly Levy-Tzedek, an ELSC researcher at the Faculty of Medicine, Hebrew University, Jerusalem, and Prof. Amir Amedi. "The level of accuracy reached in our study indicates that performing daily tasks with an SSD is feasible, and indicates a potential for rehabilitative use."

The study tested the ability of 18 blindfolded sighted individuals to perform movements guided by the EyeMusic, and compared those movements to those performed with visual guidance. At first, the blindfolded participants underwent a short familiarization session, where they learned to identify the location of a single object (a white square) or of two adjacent objects (a white and a blue square).

In the test sessions, participants used a stylus on a digitizing tablet to point to a white square located either in the north, the south, the east or the west. In one block of trials they were blindfolded (SSD block), and in the other block (VIS block) the arm was placed under an opaque cover, so they could see the screen but did not have direct visual feedback from the hand. The endpoint location of their hand was marked by a blue square. In the SSD block, they received feedback via the EyeMusic. In the VIS block, the feedback was visual.

"Participants were able to use auditory information to create a relatively precise spatial representation," notes Dr. Levy-Tzedek.

The study lends support to the hypothesis that the brain’s representation of space may not depend on the modality through which the spatial information is received, and that very little training is required to create a representation of space without vision, using sounds to guide fast and accurate movements. “SSDs may have great potential to provide detailed spatial information for the visually impaired, allowing them to interact with their external environment and successfully make movements based on this information, but further research is now required to evaluate the use of our device in the blind,” concludes Dr. Levy-Tzedek. These results demonstrate the potential application of the EyeMusic in performing everyday tasks — from accurately reaching for the red (but not the green!) apples in the produce aisle to, perhaps one day, playing a Kinect/Xbox game.

Source: Science Daily

Jul 6, 2012 · 134 notes
#science #neuroscience #brain #psychology #vision
How a protein meal tells your brain you are full

July 5, 2012

Feeling full involves more than just the uncomfortable sensation that your waistband is getting tight. Investigators reporting online on July 5th in the Cell Press journal Cell have now mapped out the signals that travel between your gut and your brain to generate the feeling of satiety after eating a protein-rich meal. Understanding this back and forth loop between the brain and gut may pave the way for future approaches in the treatment and/or prevention of obesity.

image

Credit: Duraffourd et al., Cell

Food intake can be modulated through mu-opioid receptors (MORs, which also bind morphine) on nerves found in the walls of the portal vein, the major blood vessel that drains blood from the gut. Specifically, stimulating the receptors enhances food intake, while blocking them suppresses intake. Investigators have now found that peptides, the products of digested dietary proteins, block MORs, curbing appetite. The peptides send signals to the brain that are then transmitted back to the gut to stimulate the intestine to release glucose, suppressing the desire to eat.

Mice that were genetically engineered to lack MORs did not carry out this release of glucose, nor did they show signs of ‘feeling full’ after eating high-protein foods. Giving them MOR stimulators or inhibitors did not affect their food intake, unlike in normal mice.

Because MORs are also present in the neurons lining the walls of the portal vein in humans, the mechanisms uncovered here may also take place in people.

"These findings explain the satiety effect of dietary protein, which is a long-known but unexplained phenomenon,” says senior author Dr. Gilles Mithieux of the Université de Lyon, in France. “They provide a novel understanding of the control of food intake and of hunger sensations, which may offer novel approaches to treat obesity in the future,” he adds.

Provided by Cell Press

Source: medicalxpress.com

Jul 6, 2012 · 42 notes
#science #neuroscience #brain #psychology #obesity #proteins
Diabetes Drug Makes Brain Cells Grow

ScienceDaily (July 5, 2012) — The widely used diabetes drug metformin comes with a rather unexpected and alluring side effect: it encourages the growth of new neurons in the brain. The study, reported in the July 6th issue of Cell Stem Cell, a Cell Press publication, also finds that these neural effects of the drug make mice smarter.

image

New research finds that the widely used diabetes drug metformin comes with a rather unexpected and alluring side effect: it encourages the growth of new neurons in the brain. (Credit: iStockphoto/Guido Vrola)

The discovery is an important step toward therapies that aim to repair the brain not by introducing new stem cells but by spurring those already present into action, says the study’s lead author, Freda Miller of the University of Toronto-affiliated Hospital for Sick Children. The fact that metformin is so widely used and so safe makes the news all the better.

Earlier work by Miller’s team highlighted a pathway known as aPKC-CBP for its essential role in telling neural stem cells where and when to differentiate into mature neurons. As it happened, others had found before them that the same pathway is important for the metabolic effects of the drug metformin, but in liver cells.

"We put two and two together," Miller says. If metformin activates the CBP pathway in the liver, they thought, maybe it could also do that in neural stem cells of the brain to encourage brain repair.

The new evidence lends support to that promising idea in both mouse brains and human cells. Mice taking metformin not only showed an increase in the birth of new neurons, but they were also better able to learn the location of a hidden platform in a standard maze test of spatial learning.

While it remains to be seen whether the very popular diabetes drug might already be serving as a brain booster for those who are now taking it, there are already some early hints that it may have cognitive benefits for people with Alzheimer’s disease. It had been thought those improvements were the result of better diabetes control, Miller says, but it now appears that metformin may improve Alzheimer’s symptoms by enhancing brain repair.

Miller says they now hope to test whether metformin might help repair the brains of those who have suffered brain injury due to trauma or radiation therapies for cancer.

Source: Science Daily

Jul 6, 2012 · 61 notes
#science #neuroscience #psychology #diabetes #brain
Brain Center for Social Choices Discovered: Poker-Playing Subjects Seen Weighing Whether to Bluff

ScienceDaily (July 5, 2012) — Although many areas of the human brain are devoted to social tasks like detecting another person nearby, a new study has found that one small region carries information only for decisions during social interactions. Specifically, the area is active when we encounter a worthy opponent and decide whether to deceive them.

image

(Credit: © wtamas / Fotolia)

A brain imaging study conducted by researchers at the Duke Center for Interdisciplinary Decision Science (D-CIDES) scanned human subjects with functional MRI while they played a simplified game of poker against computer and human opponents. Using computer algorithms to sort out how much information each area of the brain was processing, the team found that only one brain region — the temporal-parietal junction, or TPJ — carried information that was unique to decisions against the human opponent.

Some of the time, the subjects were dealt an obviously weak hand. The researchers wanted to see whether they could watch the player calculate whether to bluff his opponent. The brain signals in the TPJ told the researchers whether the subject would soon bluff against a human opponent, especially if that opponent was judged to be skilled. But against a computer, signals in the TPJ did not predict the subject’s decisions.
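The information-mapping analysis described above can be pictured roughly like this: train a decoder for each region, and say a region "carries information" about the bluff decision if the decoder beats chance on held-out trials. Everything below is synthetic and illustrative, not the study's pipeline: the voxel patterns, the region list, the signal strength, and the simple nearest-centroid decoder are all assumptions.

```python
# Toy region-by-region decoding on synthetic "voxel" data. Only the
# hypothetical TPJ pattern contains a bluff-related signal; V1 is pure noise.
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_voxels = 200, 50
bluff = rng.integers(0, 2, n_trials)  # 1 = subject bluffed on this trial

regions = {
    "TPJ": rng.normal(0, 1, (n_trials, n_voxels)) + 0.8 * bluff[:, None],
    "V1":  rng.normal(0, 1, (n_trials, n_voxels)),
}

def decode_accuracy(X, y, n_train=150):
    """Nearest-class-centroid decoder: fit on the first n_train trials,
    report accuracy on the remaining held-out trials."""
    Xtr, ytr = X[:n_train], y[:n_train]
    Xte, yte = X[n_train:], y[n_train:]
    c0 = Xtr[ytr == 0].mean(axis=0)
    c1 = Xtr[ytr == 1].mean(axis=0)
    pred = np.linalg.norm(Xte - c1, axis=1) < np.linalg.norm(Xte - c0, axis=1)
    return float((pred == yte.astype(bool)).mean())

accs = {name: decode_accuracy(X, bluff) for name, X in regions.items()}
for name, a in accs.items():
    print(f"{name}: held-out decoding accuracy = {a:.2f}")
```

The logic mirrors the finding: the TPJ decoder performs well above chance (0.5), while the no-signal region hovers near it.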

The TPJ is in a boundary area of the brain, and may be an intersection for two streams of information, said lead researcher McKell Carter, a postdoctoral fellow at Duke. It brings together a flow of attentional information and biological information, such as “is that another person?”

Carter observed that in general, participants paid more attention to their human opponent than their computer opponent while playing poker, which is consistent with humans’ drive to be social.

Throughout the poker game experiment, regions of the brain that are typically thought to be social in nature did not carry information specific to a social context. “The fact that all of these brain regions that should be specifically social are used in other circumstances is a testament to the remarkable flexibility and efficiency of our brains,” said Carter.

"There are fundamental neural differences between decisions in social and non-social situations," said D-CIDES Director Scott Huettel, the Hubbard professor of psychology & neuroscience at Duke and senior author of the study. "Social information may cause our brain to play by different rules than non-social information, and it will be important for both scientists and policymakers to understand what causes us to approach a decision in a social or a non-social manner.

"Understanding how the brain identifies important competitors and collaborators — those people who are most relevant for our future behavior — will lead to new insights into social phenomena like dehumanization and empathy," Huettel added.

Source: Science Daily

Jul 6, 2012 · 15 notes
#science #neuroscience #brain #psychology
Scientific Study Reveals That Individuals Cooperate According to Their Emotional State and Their Prior Experiences

ScienceDaily (July 4, 2012) — A study by researchers at Universidad Carlos III de Madrid and Universidad de Zaragoza has determined that when people decide whether to cooperate with others, they do not act on the prospect of their own reward, as had previously been believed; instead, they are influenced more by their mood at the time and by the number of people who have cooperated with them before.

This research builds on previous studies as well as on an experiment carried out by the Institute for Biocomputation and Physics of Complex Systems (BIFI) at the Universidad de Zaragoza, together with the Fundación Ibercivis and Universidad Carlos III de Madrid (UC3M); it is the largest real-time study of cooperation in society conducted to date. The experiment was run last December with 1,200 secondary-school students from Aragon, who interacted electronically in real time via a prototype of the social conflict known as the “Prisoner’s Dilemma.” In this game, the greatest benefit for the interacting individuals is produced when both collaborate; but if one collaborates and the other does not, the defector receives more than the cooperator. On occasion this allows an individual to take advantage of the cooperation of others, but if that tendency spreads, in the end no one cooperates and nobody obtains rewards.
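The payoff structure just described can be made concrete. This is a minimal sketch using the textbook values (T=5, R=3, P=1, S=0), not the actual stakes used in the Zaragoza experiment:

```python
# Toy Prisoner's Dilemma payoff table: (my move, their move) -> my payoff.
# Values follow the textbook convention T > R > P > S.
PAYOFF = {
    ("C", "C"): 3,  # R: both cooperate
    ("C", "D"): 0,  # S: I cooperate, they defect (the "sucker" payoff)
    ("D", "C"): 5,  # T: temptation to defect against a cooperator
    ("D", "D"): 1,  # P: both defect
}

# Whatever the opponent does, defecting pays me more...
for their_move in ("C", "D"):
    assert PAYOFF[("D", their_move)] > PAYOFF[("C", their_move)]

# ...yet mutual cooperation beats mutual defection. That is the dilemma:
assert PAYOFF[("C", "C")] > PAYOFF[("D", "D")]
print("Dilemma structure holds: T > R > P > S")
```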

After analyzing the data, the main conclusion drawn by the researchers is that, in a situation where cooperating with others is beneficial, the way the individuals involved are organized into one social structure or another is irrelevant. This finding contradicts what many researchers have held on the basis of theoretical studies.

In the experiment, the degree of cooperation in a network in which each subject interacts with four other individuals was compared to that in a network in which the number of connections varies between 2 and 16, that is, one more similar to a real social network. The results observed in the two networks were identical. “This happens because, contrary to what has been proposed in the majority of studies, people do not make their decisions based on the rewards obtained (by them or by their neighbors), but rather based on how many people have recently cooperated with them, as well as on their own mood at the time,” the researchers explained.
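One way to picture a decision rule driven by reciprocity and mood rather than by payoffs is a toy update like the following. The linear form and the coefficients are illustrative assumptions, not the rule fitted in the study:

```python
# Toy "moody conditional cooperation" rule: the probability of cooperating
# depends on the fraction of neighbours who just cooperated (reciprocity)
# and on whether the player itself cooperated last round (mood).
# Payoffs do not appear anywhere in the rule.
def p_cooperate(frac_neighbours_cooperated, cooperated_last_round):
    base = 0.2 + 0.6 * frac_neighbours_cooperated   # reciprocity term
    mood = 0.15 if cooperated_last_round else -0.15 # mood term
    return min(1.0, max(0.0, base + mood))

# A player in a "good mood" surrounded by cooperators is very likely to cooperate:
print(round(p_cooperate(1.0, True), 2))    # 0.95
# The same player after defecting, with no cooperating neighbours, rarely is:
print(round(p_cooperate(0.0, False), 2))   # 0.05
```

Because nothing in the rule refers to network position or payoffs, two differently wired networks with the same local cooperation levels would evolve similarly, which is consistent with the experiment's result.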

These results help explain how people make decisions, above all in contexts in which one has to decide between collaborating with others and taking advantage of them. “Understanding why we do one thing or another can help in designing incentives that induce people to cooperate,” the authors of the research pointed out. On the other hand, the fact that the networks are not important has implications for organizational design, for example. The experiment revealed that people will not cooperate more simply because they are organized in a certain way. In this respect, it can be inferred that we should be concerned not so much with the design of organizational structure as with motivating people individually to cooperate.

Ruling out network organization as an influence on people’s cooperation, and discovering that what matters is reciprocity, that is, cooperating in proportion to the cooperation received, will radically change the focus of the many researchers who are developing theories on the emergence of cooperation among individuals.

Source: Science Daily

Jul 5, 2012 · 32 notes
#science #neuroscience #brain #psychology
Skin patch improves attention span in stroke patients

July 4, 2012

(Medical Xpress) — Researchers at the UCL Institute of Neurology have found that giving the drug rotigotine as a skin patch can improve inattention in some stroke patients.

Hemi-spatial neglect, a severe and common form of inattention that can be caused by brain damage following a stroke, is one of the most debilitating symptoms, frequently preventing patients from living independently. When the right side of the brain has suffered damage, the patient may have little awareness of their left-hand side and have poor memory of objects that they have seen, leaving them inattentive and forgetful. Currently there are few treatment options.

The randomised controlled trial enrolled 16 patients who had suffered a stroke on the right-hand side of the brain and assessed whether giving the drug rotigotine improved their ability to attend to their left-hand side. The results showed that, even after just over a week of treatment, patients performed significantly better on attention tests when they received the drug than when they received the placebo.

Rotigotine acts by stimulating receptors on nerve cells for dopamine, a chemical normally produced within the brain.

Professor Masud Husain who led the study at the Institute of Neurology at UCL says: “Inattention can have a devastating effect on stroke patients and their families. It impacts on all aspects of their lives. If the results of our clinical trial are replicated in further, larger studies, we will have overcome a major hurdle towards providing a new treatment for this important consequence of stroke.

“Milder forms of inattention occur in other brain disorders, across all ages - from ADHD (attention deficit hyperactivity disorder) to Parkinson’s disease. Our findings show that it is possible to alter attention by using a drug that acts at specific receptors in the brain, and therefore have implications for understanding the mechanisms that might cause inattention in conditions other than stroke.”

Provided by University College London

Source: medicalxpress.com

Jul 5, 2012 · 7 notes
#science #neuroscience #brain #psychology #stroke
Artificial Cerebellum That Enables Robotic Human-Like Object Handling Developed

ScienceDaily (July 3, 2012) — University of Granada researchers have developed an artificial cerebellum (a biologically-inspired adaptive microcircuit) that controls a robotic arm with human-like precision. The cerebellum is the part of the human brain that controls the locomotor system and coordinates body movements.

To date, robot designers have achieved very precise movements, but these are performed at very high speed, require strong forces and consume considerable power. This approach cannot be applied to robots that interact with humans, as a malfunction might be potentially dangerous.

To solve this challenge, University of Granada researchers have implemented a new cerebellar spiking model that adapts to corrections and stores their sensorial effects; in addition, it records motor commands to predict the action or movement to be performed by the robotic arm. This cerebellar model allows the user to articulate a state-of-the-art robotic arm with extraordinary mobility.

Automatic Learning

The developers of the new cerebellar model have obtained a robot that performs automatic learning by extracting the input layer functionalities of the brain cortex. Furthermore, they have developed two control systems that enable accurate and robust control of the robotic arm during object handling.

The synergy between the cerebellum model and the automatic control system allows the robot to adapt to changing conditions, i.e. to interact safely with humans. The biologically-inspired architectures used in this model combine the error-training approach with predictive adaptive control.
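A common way to combine error-driven training with a feedback controller, often used in cerebellar-inspired robotics, is feedback-error learning: the output of a crude feedback controller serves as the teaching signal for an adaptive feedforward module, which gradually learns an inverse model of the plant. The one-dimensional toy plant, gains, and learning rate below are illustrative assumptions; the Granada model itself is a spiking network and far richer.

```python
# Minimal feedback-error-learning sketch on a static 1-D plant: y = plant_gain * u.
# The feedforward module learns a weight w such that u_ff = w * target drives
# the plant to the target, using the feedback command as the error signal.
plant_gain = 2.0          # unknown to the controller
target = 1.0
w = 0.0                   # learned feedforward weight
Kp, lr = 0.5, 0.1         # feedback gain, learning rate

for step in range(200):
    u_ff = w * target               # feedforward command from the learned model
    y = plant_gain * u_ff           # act and observe the plant output
    u_fb = Kp * (target - y)        # feedback correction for the residual error
    w += lr * u_fb * target         # cerebellar-style update from feedback error
    # (a full controller would also apply u_fb to the plant within each step)

print(f"learned weight: {w:.3f} (ideal 1/plant_gain = {1 / plant_gain:.3f})")
```

As learning proceeds, the feedback term shrinks toward zero and the feedforward module takes over, which is the sense in which such a controller "stores the sensorial effects of corrections."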

The designers of this model are Silvia Tolu, Jesús Garrido and Eduardo Ros Vidal, at the University of Granada Department of Computer Architecture and Technology, and the University of Almería researcher Richard Carrillo.

Source: Science Daily

Jul 5, 2012 · 39 notes
#science #neuroscience #brain #psychology
Childhood Adversity Increases Risk for Depression and Chronic Inflammation

ScienceDaily (July 3, 2012) — When a person injures their knee, it becomes inflamed. When a person has a cold, their throat becomes inflamed. This type of inflammation is the body’s natural and protective response to injury.

Interestingly, there is growing evidence that a similar process happens when a person experiences psychological trauma. Unfortunately, this type of inflammation can be destructive.

Previous studies have linked depression and inflammation, particularly in individuals who have experienced early childhood adversity, but overall, findings have been inconsistent. Researchers Gregory Miller and Steve Cole designed a longitudinal study in an effort to resolve these discrepancies, and their findings are now published in a study in Biological Psychiatry.

They recruited a large group of female adolescents who were healthy, but at high risk for experiencing depression. The volunteers were then followed for 2 ½ years, undergoing interviews and giving blood samples to measure their levels of C-reactive protein and interleukin-6, two types of inflammatory markers. Their exposure to childhood adversity was also assessed.

The researchers found that when individuals who suffered from early childhood adversity became depressed, their depression was accompanied by an inflammatory response. In addition, among subjects with previous adversity, high levels of interleukin-6 forecasted risk of depression six months later. In subjects without childhood adversity, there was no such coupling of depression and inflammation.

Dr. Miller commented on their findings: “What’s important about this study is that it identifies a group of people who are prone to have depression and inflammation at the same time. That group of people experienced major stress in childhood, often related to poverty, having a parent with a severe illness, or lasting separation from family. As a result, these individuals may experience depressions that are especially difficult to treat.”

Another important aspect to their findings is that the inflammatory response among the high-adversity individuals was still detectable six months later, even if their depression had abated, meaning that the inflammation is chronic rather than acute. “Because chronic inflammation is involved in other health problems, like diabetes and heart disease, it also means they have greater-than-average risk for these problems. They, along with their doctors, should keep an eye out for those problems,” added Dr. Miller.

"This study provides important additional support for the notion that inflammation is an important and often under-appreciated factor that compromises resilience after major life stresses. It provides evidence that these inflammatory states persist for long periods of time and have important functional correlates," said Dr. John Krystal, Editor of Biological Psychiatry.

Further research is necessary to extend the findings beyond female adolescents, particularly to individuals with more severe, long-term depression. However, findings such as these may eventually help doctors and clinicians better manage depression and medical illness in particularly vulnerable patients.

Source: medicalxpress.com

Jul 5, 2012 · 31 notes
#science #neuroscience #depression #brain #psychology
Molecular Clues to Link Between Childhood Maltreatment and Later Suicide

ScienceDaily (July 3, 2012) — Exposure to childhood maltreatment increases the risk for most psychiatric disorders as well as many negative consequences of these conditions. This new study, by Dr. Gustavo Turecki and colleagues at McGill University, Canada, provides important insight into one of the most extreme outcomes, suicide.

"In this study, we expanded our previous work on the epigenetic regulation of the glucocorticoid receptor gene by investigating the impact of severe early-life adversity on DNA methylation," explained Dr. Turecki. The glucocorticoid receptor is important because it is a brain target for the stress hormone cortisol.

The researchers studied brain tissue from people who had committed suicide, some of whom had a history of childhood maltreatment, and compared that tissue to tissue from people who had died from other causes. They found that particular variants of the glucocorticoid receptor were less likely to be present in the limbic system, the brain’s emotion circuit, in people who had committed suicide and were maltreated as children than in the other two groups.

This study also advances the understanding of how the altered pattern of glucocorticoid receptor regulation developed in the maltreated suicide completers. The authors found that the pattern of methylation of the gene coding for the glucocorticoid receptors was altered in those who completed suicide and who also had a history of abuse. These DNA methylation differences were associated with distinct gene expression patterns.

Since methylation is one way that genes are switched on or off for long periods of time, it appears that childhood adversity can produce long-lasting changes in the regulation of a key stress response system that may be associated with increased risk for suicide.

"Preventing suicide is a critical challenge for psychiatry. This study provides important new information about brain changes that may increase the risk of suicide," said Dr. John Krystal, Editor of Biological Psychiatry. "It is striking that early life maltreatment can produce these long-lasting changes in the control of specific genes in the brain. It is also troubling that the consequences of this process can be so dire. Thus, it is important that we continue to study these epigenetic processes that seem to underlie aspects of the lasting consequences of childhood adversity."

Source: Science Daily

Jul 5, 2012 · 34 notes
#science #neuroscience #brain #psychology
Adult Stem Cells from Bone Marrow Show Cell Replacement and Tissue Repair Potential in Animal Model

ScienceDaily (July 3, 2012) — Researchers from the University of Maryland School of Medicine report promising results from using adult stem cells from bone marrow in mice to help create tissue cells of other organs, such as the heart, brain and pancreas, a scientific step they hope may lead to potential new ways to replace cells lost in diseases such as diabetes, Parkinson’s or Alzheimer’s.

The research in collaboration with the University of Paris Descartes is published online in the June 29, 2012 edition of Comptes Rendus Biologies, a publication of the French Academy of Sciences.

"Finding stem cells capable of restoring function to different damaged organs would be the Holy Grail of tissue engineering," says lead author David Trisler, PhD, assistant professor of neurology at the University of Maryland School of Medicine.

He adds, “This research takes us another step in that process by identifying the potential of these adult bone marrow cells, or a subset of them known as CD34+ bone marrow cells, to be ‘multipotent,’ meaning they could transform and function as the normal cells in several different organs.”

University of Maryland researchers previously developed a special culturing system to collect a select sample of these adult stem cells in bone marrow, which normally makes red and white blood cells and immune cells. In this project, the team followed a widely recognized study model, used to prove the multipotency of embryonic stem cells, to prove that these bone marrow stem cells could make more than just blood cells. The investigators also found that the CD34+ cells had a limited lifespan and did not produce teratomas, tumors that sometimes form with the use of embryonic stem cells and adult stem cells cultivated from other methods that require some genetic manipulation.

"When taken at an early stage, we found that the CD34+ cells exhibited similar multipotent capabilities as embryonic stem cells, which have been shown to be the most flexible and versatile. Because these CD34+ cells already exist in normal bone marrow, they offer a vast source for potential cell replacement therapy, particularly because they come from a person’s own body, eliminating the need to suppress the immune system, which is sometimes required when using adults stem cells derived from other sources," explains Paul Fishman, MD, PhD, professor of neurology at the University of Maryland School of Medicine.

The researchers say that proving the potential of these adult bone marrow stem cells opens new possibilities for scientific exploration, but that more research will be needed to see how this science can be translated to humans.

Source: Science Daily

Jul 5, 2012 · 4 notes
#science #neuroscience #brain #parkinson #alzheimer
Why Current Strategies for Fighting Obesity Are Not Working

ScienceDaily (July 3, 2012) — As the United States confronts the growing epidemic of obesity among children and adults, a team of University of Colorado School of Medicine obesity researchers concludes that what the nation needs is a new battle plan — one that replaces the emphasis on widespread food restriction and weight loss with an emphasis on helping people achieve “energy balance” at a healthy body weight.

In a paper published in the July 3 issue of the journal Circulation, James O. Hill, PhD, and colleagues at the Anschutz Health and Wellness Center take on the debate over whether excessive food intake or insufficient physical activity causes obesity. They use the lens of energy balance, which combines food intake, energy expended through physical activity and energy (fat) storage, to advance the concept of a “regulated zone,” in which the mechanisms by which the body establishes energy balance are managed to overcome the body’s natural defense of its existing weight. This is accomplished by strategies that match food and beverage intake to a higher level of energy expenditure than is typical in America today, enabling the biological system that regulates body weight to work more effectively. Additional support for this concept comes from many studies showing that higher levels of physical activity are associated with low weight gain, whereas comparatively low levels of activity are linked to high weight gain over time.

"A healthy body weight is best maintained with a higher level of physical activity than is typical today and with an energy intake that matches," explained Hill, professor of pediatrics and medicine and executive director of the Anschutz Health and Wellness Center at the University of Colorado Anschutz Medical Campus and the lead author of the paper. "We are not going to reduce obesity by focusing only on reducing food intake. Without increasing physical activity in the population we are simply promoting unsustainable levels of food restriction. This strategy hasn’t worked so far and it is not likely to work in the future.

As Dr. Hill explains, “What we are really talking about is changing the message from ‘Eat Less, Move More’ to ‘Move More, Eat Smarter.’”

The authors argue that preventing excessive weight gain is a more achievable goal than treating obesity once it is present. Here, the researchers stress that closing the daily energy gap by 100 calories, through small increases in physical activity and small changes in food intake, would prevent weight gain in 90 percent of the adult population.

People who have a low level of physical activity have trouble achieving energy balance because they must constantly use food restriction to match energy intake to a low level of energy expenditure. Constant food restriction is difficult to maintain long-term and when it cannot be maintained, the result is positive energy balance (when the calories consumed are greater than the calories expended) and an increase in body mass, of which 60 percent to 80 percent is usually body fat. The increasing body mass elevates energy expenditure and helps reestablish energy balance. In fact, the researchers speculate that becoming obese may be the only way to achieve energy balance when living a sedentary lifestyle in a food-abundant environment.

Using an exhaustive review of the energy balance literature as their basis, the researchers also refuted the popular theory that escalating obesity rates can be attributed exclusively to two factors: the change in the American diet and the rise in overall energy intake without a compensatory increase in energy expenditure. Using rough estimates of increases in food intake and decreases in physical activity from 1971 to 2000, the researchers calculated that, were it not for the physiological processes that produce energy balance, American adults would have experienced a 30- to 80-fold increase in weight gain during that period, which demonstrates why it is not realistic to attribute obesity solely to caloric intake or physical activity levels. In fact, energy expenditure has dropped dramatically over the past century as our lives now require much less physical activity just to get through the day. The authors argue that this drop in energy expenditure was a necessary prerequisite for the current obesity problem, and that addressing it requires adding a greater level of physical activity back into our modern lives.
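The 30- to 80-fold figure comes from exactly this kind of naive, no-compensation arithmetic: converting a sustained daily energy surplus straight into fat mass, with none of the physiological pushback the authors describe. A sketch, assuming an illustrative 200 kcal/day surplus and the rough 7700 kcal-per-kg conversion (neither number is taken from the paper):

```python
# Deliberately naive "static" weight-gain estimate: a fixed daily surplus
# accumulated over decades with zero physiological compensation.
KCAL_PER_KG_FAT = 7700        # crude textbook conversion (~3500 kcal per lb)
daily_surplus_kcal = 200      # illustrative assumption, not a paper estimate
years = 29                    # roughly the 1971-2000 window discussed above

gain_kg = daily_surplus_kcal * 365 * years / KCAL_PER_KG_FAT
print(f"naive predicted gain: {gain_kg:.0f} kg")  # far beyond observed gains
```

That a modest sustained surplus implies a physiologically absurd gain of hundreds of kilograms is precisely the authors' point: compensatory regulation absorbs most of the imbalance, so obesity cannot be attributed to intake alone.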

"Addressing obesity requires attention to both food intake and physical activity, said co-author John Peters, PhD., assistant director of the Anschutz Health and Wellness Center. "Strategies that focus on either alone will not likely work."

In addition, the researchers conclude that food restriction alone is not effective in reducing obesity. Although caloric restriction produces weight loss, it also triggers hunger and the body’s natural defense of its existing weight, which leads to a lower resting metabolic rate and notable changes in how the body burns calories. As a result, daily energy requirements after weight loss can be reduced by 170 to 250 calories for a 10 percent weight loss and by 325 to 480 calories for a 20 percent weight loss. These findings provide insight into the weight-loss plateau and the common occurrence of regaining weight after completing a weight-loss regimen.

Recognizing that energy balance is a new concept for the public, the researchers call for educational efforts and new information tools that will teach Americans about energy balance and how food and physical activity choices affect it.

Source: Science Daily

Jul 5, 2012 · 25 notes
#science #neuroscience #obesity #psychology
Bees Can 'Turn Back Time,' Reverse Brain Aging

ScienceDaily (July 3, 2012) — Scientists at Arizona State University have discovered that older honey bees effectively reverse brain aging when they take on nest responsibilities typically handled by much younger bees. While current research on human age-related dementia focuses on potential new drug treatments, researchers say these findings suggest that social interventions may be used to slow or treat age-related dementia.

image

Old bees collect nectar and pollen. Most bees start doing this job when they are 3-4 weeks old, and after that they age very quickly. Their bodies and wings become worn and they lose the ability to learn new things. Most food-collector bees die after about 10 days. (Credit: Christofer Bang)

In a study published in the scientific journal Experimental Gerontology, a team of scientists from ASU and the Norwegian University of Life Sciences, led by Gro Amdam, an associate professor in ASU’s School of Life Sciences, presented findings that show that tricking older, foraging bees into doing social tasks inside the nest causes changes in the molecular structure of their brains.

"We knew from previous research that when bees stay in the nest and take care of larvae — the bee babies — they remain mentally competent for as long as we observe them," said Amdam. "However, after a period of nursing, bees fly out gathering food and begin aging very quickly. After just two weeks, foraging bees have worn wings, hairless bodies, and more importantly, lose brain function — basically measured as the ability to learn new things. We wanted to find out if there was plasticity in this aging pattern so we asked the question, ‘What would happen if we asked the foraging bees to take care of larval babies again?"

During experiments, scientists removed all of the younger nurse bees from the nest — leaving only the queen and babies. When the older, foraging bees returned to the nest, activity diminished for several days. Then, some of the old bees returned to searching for food, while others cared for the nest and larvae. Researchers discovered that after 10 days, about 50 percent of the older bees caring for the nest and larvae had significantly improved their ability to learn new things.

Amdam’s international team not only saw a recovery in the bees’ ability to learn, they also discovered a change in proteins in the bees’ brains. When comparing the brains of the bees that improved with those that did not, two proteins had noticeably changed. They found Prx6, a protein also found in humans that can help protect against dementia — including diseases such as Alzheimer’s — and they discovered a second protein, a documented “chaperone” that protects other proteins from being damaged when brain or other tissues are exposed to cell-level stress.

In general, researchers are interested in creating a drug that could help people maintain brain function, yet they may be facing up to 30 years of basic research and trials.

"Maybe social interventions — changing how you deal with your surroundings — is something we can do today to help our brains stay younger," said Amdam. "Since the proteins being researched in people are the same proteins bees have, these proteins may be able to spontaneously respond to specific social experiences."

Amdam suggests further studies are needed on mammals such as rats in order to investigate whether the same molecular changes that the bees experience might be socially inducible in people.

Source: Science Daily

Jul 4, 2012 · 33 notes
#science #neuroscience #brain #animals #psychology
Road-mapping the Asian brain

July 3, 2012

Scientists at The University of Nottingham are leading research that will develop the world’s first ‘atlas’ of the Asian brain.

Working in collaboration with colleagues in South Korea, the project aims to build a detailed picture of how the Asian brain develops normally, taking into account the differences and variations which occur from person to person.

The resulting road-map of the brain could be used to help doctors in countries like South Korea, Japan and China to develop new diagnostic tools for age-related neurodegenerative diseases such as Alzheimer’s, Parkinson’s and dementia, allowing them to spot illnesses at a much earlier stage, thereby improving treatment options and outcomes.

The two-year project will marry the expertise of Nottingham academics in advanced brain imaging techniques, including ultra high field magnetic resonance imaging (MRI), with the clinical expertise and specialist computer software development skills of researchers at Korea University in Seoul.

Stephen Jackson, Professor of Cognitive Neuroscience in the University’s School of Psychology, said: “Developing this atlas of the Asian brain will be a major step forward in furthering the field of neuroscience, which is developing rapidly in the East.

"We hope this two-year project will also act as a template for further UK-South Korean collaboration and knowledge transfer, which has been highlighted by Government as a strategic priority."

The project, initially funded with a Global Partnership Fund grant from the British Foreign and Commonwealth Office, will see the Nottingham academics working with colleagues in the College of Medicine, Biomedical Engineering, and Psychology at Korea University to scan the brains of healthy Asian adults using advanced MRI techniques.

Data from the hundreds of images produced will then be analysed and computer modelling techniques used to build up a detailed picture of how a normal Asian brain develops in adults, taking into account the slight variations that occur from person to person.

There are subtle differences in the size and genetics of the Asian brain compared to its Western cousin and the research will allow for the development of new diagnostic aids for age-related neuro-degenerative diseases which are specifically tailored to Asian patients.

The research will build on The University of Nottingham’s reputation as a world-leader in MRI research — the technique was invented there by Professor Sir Peter Mansfield, whose work jointly earned him the Nobel Prize for Medicine in 2003.

Biomedical imaging remains a strategic research priority for Nottingham through its Sir Peter Mansfield Magnetic Resonance Centre, which hosts the UK’s only 7 Tesla MRI scanner.

The University has recently established a UK Centre for Child Neuroimaging, a core theme of Nottingham’s Impact Campaign, the biggest fundraising campaign in The University of Nottingham’s 130 year history. It aims to raise £150m to transform research, enrich the student experience and enable the institution to make an even greater contribution to the global communities it serves.

The work to map the Asian brain will also involve collaboration with academics at other UK and European institutions, including University College London, the Institute of Neurology, Institute of Psychiatry, Imperial College and the University of Aachen in Germany.

The collaboration between The University of Nottingham and Korea University is the latest in a long-running relationship between the two higher education institutions, and follows the signing of a Memorandum of Understanding in 2009, along with 12 other universities in the Universitas 21 group, that aimed to offer postgraduate students international opportunities through a joint PhD programme.

Provided by University of Nottingham

Source: medicalxpress.com

Jul 4, 2012 · 18 notes
#science #neuroscience #brain #psychology
3-D Movies Linked to Increased Vision Symptoms

ScienceDaily (July 2, 2012) — Watching 3D movies can “immerse” you in the experience — but can also lead to visual symptoms and even motion sickness, reports the study “Stereoscopic Viewing and Reported Perceived Immersion and Symptoms,” published in the July issue of Optometry and Vision Science, the official journal of the American Academy of Optometry.

The journal is published by Lippincott Williams & Wilkins, a part of Wolters Kluwer Health.

Symptoms related to 3D viewing are affected by where you sit while watching, and even how old you are. “Younger viewers incurred higher immersion but also greater visual and motion sickness symptoms in 3D viewing,” according to the authors, led by Shun-nan Yang, PhD, of Pacific University College of Optometry, Forest Grove, Ore. “Both [problems] will be reduced if a farther distance and a wider viewing angle are adopted.”

Greater ‘Immersion’ in 3D Also Associated With Increased Symptoms

The researchers performed experiments in which adults, from young adult to middle-aged, were invited to watch a movie (Cloudy with a Chance of Meatballs) in 2D or 3D while sitting at different angles and distances. Visual and other symptoms were assessed — including the role of factors including age, seating position, and level of “immersion” in the movie.

Twenty-one percent of participants reported symptoms while watching the movie in 3D, compared with twelve percent during 2D viewing. For younger study participants, blurred vision, double vision, dizziness, disorientation, and nausea were all more frequent and severe when watching the movie in 3D.

3D viewing also led to a greater sense of immersion — “a greater sense of object motion and motion of the viewer in space” — compared to 2D viewing. Subjects sitting in more central or closer positions reported greater immersion as well as increased symptoms of motion sickness — that is, nausea. Sitting at an angle to the screen was associated with less immersion as well as reduced motion symptoms.

There were some differences by age, including a lower rate of blurred vision in older viewers (age 46 and older). Older viewers had more visual and motion sickness symptoms in 2D viewing, while younger viewers (age 24 to 34) had more symptoms in 3D viewing. The same age-related changes leading to lower rates of blurred vision in older viewers may also explain their lower rates of symptoms during 3D viewing.

As 3D movies become more common, including on home screens, there are reports of visual and other symptoms among 3D viewers. Vision and orientation symptoms related to 3D viewing may be related to a “mismatch” between focusing and converging the eyes. Anthony Adams, OD, PhD, Editor-in-Chief of Optometry and Vision Science, notes that “the technology for reducing mismatch between where the eyes converge and where they focus is likely to improve rapidly.”

The study identifies several factors associated with symptoms during 3D viewing. “3D viewing is quite specific in causing blurred vision and double vision, and the resultant symptoms are greater for younger adults,” Dr Yang and colleagues write. 3D produces a greater sense of immersion than 2D viewing, which leads to more symptoms of motion sickness — especially for younger adults and when viewing from a closer distance and a more direct angle.

The study will help optometrists and other eye care professionals in talking to patients about visual and other symptoms related to today’s sophisticated 3D video setups.

Source: Science Daily

Jul 4, 2012 · 12 notes
#science #neuroscience #brain #psychology #vision
Novel Mechanism and Potential Link Responsible for Huntington's Disease

ScienceDaily (July 2, 2012) — Using an in vitro cell model of Huntington’s disease (HD), researchers at Florida Atlantic University’s Charles E. Schmidt College of Medicine have discovered a novel mechanism and potential link between mutant huntingtin, cell loss and cell death or apoptosis in the brain, which is responsible for the devastating effects of this disease. Apoptosis has been proposed as one of the mechanisms leading to neuronal death in HD.

Dr. Jianning Wei, Ph.D., assistant professor of biomedical science in the Schmidt College of Medicine, has received a $428,694 grant from the National Institutes of Health (NIH) for a project titled “Regulation of BimEL phosphorylation in the pathogenesis of Huntington’s disease.” With this grant, she will further her research and investigation of the molecular and physiological functions of BimEL, a protein known to promote cell death, in a rodent HD model to better understand the pathogenesis of this disease and develop treatments and therapies to prevent or slow down its progression. Wei’s previous findings may also represent a universal mechanism in the pathogenesis of neurodegenerative diseases that are involved with protein misfolding and aggregation — a phenomenon that occurs in many highly debilitating disorders including neurodegenerative diseases.

HD is a fatal, inherited disease caused by abnormal repeats of a small segment in an individual’s DNA or genetic code. This mutation results in the production of malfunctioning proteins in the body, and the more repeats the protein contains, the worse the disease. A person who has the disease carries one normal copy of the gene and one mutated copy in his or her cells. Although the mutated forms of these genes are known for their devastating effects, their normal forms are critical for nerve function, embryonic development and other bodily processes. Similar mutations in other proteins are involved in several other neurodegenerative diseases.

"HD is a highly complex genetic, neurological disorder that causes certain nerve cells in the brain to waste away, and the underlying molecular mechanism of this disease still remains elusive," said Wei. "We are continuing our research to identify the pathways in the brain that are altered in response to mutant proteins, as well as to understand the cellular processes impacted by the disease in order to facilitate the development of effective pharmacological interventions."

Named after American physician George Huntington, HD is characterized by a selective loss of neurons in the brain and affects the basal ganglia, which governs motor control, cognition, learning and emotions. It also affects the outer surface of the brain, or cortex, which controls thought, perception, and memory. It is estimated that more than 250,000 Americans have HD or are at risk of inheriting the disease from an affected parent.

"The vital research that Dr. Wei and her colleagues are conducting at Florida Atlantic University will help to shed light on a very devastating and difficult disease for which there are currently no treatments available to stop or reverse its course," said Dr. David J. Bjorkman, M.D., M.S.P.H., dean of FAU’s Charles E. Schmidt College of Medicine.

Source: Science Daily

Jul 4, 2012 · 5 notes
#science #neuroscience #brain #psychology #huntington
Chronic Inflammation in the Brain Leads the Way to Alzheimer's Disease

ScienceDaily (July 2, 2012) — Research published July 2 in Biomed Central’s open access journal Journal of Neuroinflammation suggests that chronic inflammation can predispose the brain to develop Alzheimer’s disease.

To date it has been difficult to pin down the role of inflammation in Alzheimer’s disease (AD), especially because trials of NSAIDs appeared to have conflicting results. Although ADAPT (The Alzheimer’s Disease Anti-inflammatory Prevention Trial) was stopped early, recent results suggest that NSAIDs can help people with early stages of AD but that prolonged treatment is necessary to see benefit.

Researchers from the University of Zurich, in collaboration with colleagues from ETH Zurich and the University of Bern, investigated what impact immune system challenges (similar to having a severe viral infection) would have on the development of AD in mice. Results showed that a single infection before birth (during late gestation) was enough to induce long-term neurological changes and significant memory problems in old age.

These mice had a persistent increase in inflammatory cytokines, increased levels of amyloid precursor protein (APP), and altered cellular localization of Tau. If this immune system challenge was repeated during adulthood the effect was strongly exacerbated, resulting in changes similar to those seen for pathological aging.

Dr Irene Knuesel, who led this research, explained: “The AD-like changes within the brain of these mice occurred without an increase in amyloid β (Aβ). However, in mice genetically modified to produce the human version of Aβ, the viral-like challenge drastically increased the amount of Aβ at precisely the sites of inflammation-induced APP deposits. Based on the similarity between these APP/Aβ aggregates in mice and those found in human AD, it seems likely that chronic inflammation due to infection could be an early event in the development of AD.”

Source: Science Daily

Jul 4, 2012 · 4 notes
#science #neuroscience #brain #psychology #alzheimer
Years Before Diagnosis, Quality of Life Declines for Parkinson's Disease Patients

ScienceDaily (July 2, 2012) — Growing evidence suggests that Parkinson’s disease (PD) often starts with non-motor symptoms that precede diagnosis by several years. In the first study to examine patterns in the quality of life of Parkinson’s disease patients prior to diagnosis, researchers have documented declines in physical health, mental health, emotional health, and pain-related quality of life beginning several years before the onset of the disease and continuing thereafter.

Their results are reported in the latest issue of Journal of Parkinson’s Disease.

"We observed a decline in physical function in PD patients relative to their healthy counterparts beginning three years prior to diagnosis in men and seven and a half years prior to diagnosis in women," says lead investigator Natalia Palacios, PhD, Department of Nutrition, Harvard School of Public Health. "The decline continues at a rate that is five to seven times faster than the average yearly decline caused by normal aging in individuals without the disease."

The study included 51,350 male health professionals enrolled in the Health Professionals Follow-Up Study (HPFS) and 121,701 female registered nurses enrolled in the Nurses’ Health Study (NHS). In both ongoing studies, participants fill out biennial questionnaires about a variety of lifestyle characteristics and document the occurrence of major chronic disease. In the NHS study, questionnaires measured health-related quality of life in eight areas: physical functioning, role limitations due to physical problems, role limitations due to emotional problems, vitality, bodily pain, social functioning, mental health, and general health perceptions. In the HPFS, only physical functioning was assessed.

Researchers identified 454 men and 414 women with PD in the two cohorts. At 7.5 years prior to diagnosis, physical function among PD cases, in both men and women, was comparable to that in the overall cohort. A decline began approximately 3 years prior to diagnosis in men and approximately 7.5 years prior to diagnosis in women. Physical function continued to decline thereafter at a rate of 1.43 and 2.35 points per year in men and women, respectively. In comparison, the average yearly decline in individuals without PD was 0.23 in men and 0.42 in women. Other measures of quality of life, available only in women, declined in a similar pattern.
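The “five to seven times faster” claim can be checked directly from the rates quoted above. A minimal sketch (the values are points per year on the studies’ physical-function scales, as reported in the article):

```python
# Yearly physical-function decline rates quoted in the study (points/year).
pd_cases = {"men": 1.43, "women": 2.35}   # decline among PD patients
no_pd = {"men": 0.23, "women": 0.42}      # normal-aging decline, no PD

for sex in pd_cases:
    ratio = pd_cases[sex] / no_pd[sex]
    print(f"{sex}: decline in PD is {ratio:.1f}x the normal-aging rate")
# men: 6.2x, women: 5.6x, consistent with "five to seven times faster"
```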

Dr. Palacios notes that a strength of the study is the availability of prospective data on both PD patients and a healthy comparison group, and the ability to chart the deterioration in functioning and quality of life over the whole study follow-up, which included many years prior to diagnosis.

"This result provides support to the notion that the pathological process leading to PD may start several years before PD diagnosis," says Dr. Palacios. "Our hope is that, with future research, biological markers of the disease process may be recognizable in this preclinical phase."

Source: Science Daily

Jul 4, 2012 · 9 notes
#science #neuroscience #brain #psychology #parkinson
Premature Infants Do Feel Pain from Procedures: Physiological Markers for Neonate Pain Identified

ScienceDaily (July 2, 2012) — There was a time when a belief was widely held that premature neonates did not perceive pain. That belief has, of course, been refuted, but measurements of neonate pain tend to rely on inexact measures, such as alertness and the ability to react expressively to pain sensations. Researchers at Loma Linda University reported in The Journal of Pain that there is a significant relationship between procedural pain and detectable oxidative stress in neonates.

Previous studies have shown that an approach involving measurement of systemic biochemical reactions to pain offers the benefit of providing an objective method for measuring pain in premature neonates. Exposure to painful procedures often results in reductions in oxygen saturation and tachycardia, but few studies have quantified the effects of increased pain on oxygen consumption. No studies have examined the relationship between pain scores that reflect behavioral and physiological markers of pain and plasma markers of ATP utilization and oxidative stress.

In this study, 80 preterm neonates were evaluated. In about half, tape was removed from the skin following removal of catheters, and they were evaluated for oxidative stress by measuring uric acid and malondialdehyde (MDA) concentrations in plasma before and after the procedure. These subjects were compared with a control group that did not undergo tape removal. Pain scores were assessed using the Premature Infant Pain Profile. The data showed a significant relationship between procedural pain and MDA, a well-accepted marker of oxidative stress.

There were increases in MDA in preterm neonates exposed to the single painful procedure and not in the control group. Since premature neonates undergo several painful procedures a day, the researchers concluded that if exposure to multiple painful procedures is shown to contribute to oxidative stress, biochemical markers might be useful in evaluating mechanism-based interventions that could decrease adverse effects of painful procedures.

Source: Science Daily

Jul 4, 2012 · 6 notes
#science #neuroscience #brain #psychology #pain
Childless Women With Fertility Problems at Higher Risk of Hospitalization for Psychiatric Disorders

ScienceDaily (July 2, 2012) — While many small studies have shown a relationship between infertility and psychological distress, reporting a high prevalence of anxiety, mood disorders and depressive symptoms, few have studied the psychological effect of childlessness on a large population basis. Now, based on the largest cohort of women with fertility problems compiled to date, Danish investigators have shown that women who remained childless after their first investigation for infertility had more hospitalisations for psychiatric disorders than women who had at least one child following their investigation.

The results of the study were presented July 1 at the annual meeting of ESHRE (European Society of Human Reproduction and Embryology) by Dr Birgitte Baldur-Felskov, an epidemiologist from the Danish Cancer Research Center in Copenhagen.

Most studies of this kind have been based on single clinics and self-reported psychological effects. This study, however, was a nationwide follow-up of 98,737 Danish women investigated for infertility between 1973 and 2008, who were then cross-linked via Denmark’s population-based registries to the Danish Psychiatric Central Registry. This provided information on hospitalisations for psychiatric disorders, which were divided into an inclusive group of “all mental disorders,” and six discharge sub-groups which comprised “alcohol and intoxicant abuse,” “schizophrenia and psychoses,” “affective disorders including depression,” “anxiety, adjustment and obsessive compulsive disorder,” “eating disorders,” and “other mental disorders.”

All women were followed from the date of their initial fertility investigation until the date of psychiatric event, date of emigration, date of death, date of hospitalisation or 31st December 2008, whichever came first. Such studies, said Dr Baldur-Felskov, could only be possible in somewhere like Denmark, where each citizen has a personal identification number which can be linked to any or all of the country’s diagnostic registries.
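The censoring rule described above amounts to taking the earliest of several possible end-of-follow-up dates. A sketch of that logic (the dates in the usage example are hypothetical; missing events are represented as None):

```python
from datetime import date

def follow_up_end(psychiatric_event=None, emigration=None, death=None,
                  hospitalisation=None, study_end=date(2008, 12, 31)):
    """Return the earliest applicable end-of-follow-up date for one woman."""
    candidates = [d for d in (psychiatric_event, emigration, death,
                              hospitalisation, study_end) if d is not None]
    return min(candidates)

# A woman who emigrated in 2001 is censored at emigration;
# one with no events is followed to the study cutoff.
print(follow_up_end(emigration=date(2001, 5, 3)))  # 2001-05-03
print(follow_up_end())                             # 2008-12-31
```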

Results of the study showed that, over an average follow-up time of 12.6 years (representing 1,248,243 woman-years), 54% of the 98,737 women in the cohort did have a baby. Almost 5,000 women from the entire cohort were hospitalised for a psychiatric disorder, the most common discharge diagnosis being “anxiety, adjustment and obsessive compulsive disorders,” followed by “affective disorders including depression.”

However, women who remained childless after their initial fertility investigation had a statistically significant 18% higher risk of hospitalisation for all mental disorders than women who went on to have a baby; the risk was also significantly greater for alcohol/substance abuse (by 103%), schizophrenia (by 47%) and other mental disorders (by 43%). The study also showed that childlessness increased the risk of eating disorders by 47%, although this was not statistically significant.

However, the most commonly seen discharge diagnosis in the entire cohort (anxiety, adjustment and obsessive compulsive disorders) was not affected by fertility status.

Commenting on the study’s results, Dr Baldur-Felskov said: “Our study showed that women who remained childless after fertility evaluation had an 18% higher risk of all mental disorders than the women who did have at least one baby. These higher risks were evident for alcohol and substance abuse, schizophrenia and eating disorders, although the risk appeared lower for affective disorders including depression.

"The results suggest that failure to succeed after presenting for fertility investigation may be an important risk modifier for psychiatric disorders. This adds an important component to the counselling of women being investigated and treated for infertility. Specialists and other healthcare personnel working with infertile patients should also be sensitive to the potential for psychiatric disorders among this patient group."

Source: Science Daily

Jul 4, 2012 · 12 notes
#science #neuroscience #psychology #brain #disorders
Day Dreaming Good for You? Reflection Is Critical for Development and Well-Being

ScienceDaily (July 2, 2012) — As each day passes, the pace of life seems to accelerate — demands on productivity continue ever upward and there is hardly ever a moment when we aren’t, in some way, in touch with our family, friends, or coworkers. While moments for reflection may be hard to come by, a new article suggests that the long-lost art of introspection — even daydreaming — may be an increasingly valuable part of life.

The long-lost art of introspection — even daydreaming — may be an increasingly valuable part of life. (Credit: © HaywireMedia / Fotolia)

In the article, published in the July issue of Perspectives on Psychological Science, a journal of the Association for Psychological Science, psychological scientist Mary Helen Immordino-Yang and colleagues survey the existing scientific literature from neuroscience and psychological science, exploring what it means when our brains are ‘at rest.’

In recent years, researchers have explored the idea of rest by looking at the so-called ‘default mode’ network of the brain, a network that is noticeably active when we are resting and focused inward. Findings from these studies suggest that individual differences in brain activity during rest are correlated with components of socioemotional functioning, such as self-awareness and moral judgment, as well as different aspects of learning and memory. Immordino-Yang and her colleagues believe that research on the brain at rest can yield important insights into the importance of reflection and quiet time for learning.

"We focus on the outside world in education and don’t look much at inwardly focused reflective skills and attentions, but inward focus impacts the way we build memories, make meaning and transfer that learning into new contexts," says Immordino-Yang, a professor of education, psychology and neuroscience at the University of Southern California. "What are we doing in schools to support kids turning inward?"

Accumulated research suggests that the networks that underlie a focus inward versus outward likely are interdependent, and our ability to regulate and move between them probably improves with maturity and practice. While outward attention is essential for carrying out tasks and learning from classroom lessons, for example, the reflection and consolidation that may accompany mind wandering is equally important, fostering healthy development and learning in the longer term.

"Balance is needed between outward and inward attention, since time spent mind wandering, reflecting and imagining may also improve the quality of outward attention that kids can sustain," says Immordino-Yang.

She and her colleagues argue that mindful introspection can become an effective part of the classroom curriculum, providing students with the skills they need to engage in constructive internal processing and productive reflection. Research indicates that when children are given the time and skills necessary for reflecting, they often become more motivated, less anxious, perform better on tests, and plan more effectively for the future.

And mindful reflection is not just important in an academic context — it’s also essential to our ability to make meaning of the world around us. Inward attention is an important contributor to the development of moral thinking and reasoning and is linked with overall socioemotional well-being.

Immordino-Yang and her colleagues worry that the high attention demands of fast-paced urban and digital environments may be systematically undermining opportunities for young people to look inward and reflect, and that this could have negative effects on their psychological development. This is especially true in an age when social media seems to be a constant presence in teens’ day-to-day lives.

"Consistently imposing overly high-attention demands on children, either in school, through entertainment, or through living conditions, may rob them of opportunities to advance from thinking about ‘what happened’ or ‘how to do this’ to constructing knowledge about ‘what this means for the world and for the way I live my life,’” Immordino-Yang writes.

According to the authors, perhaps the most important conclusion to be drawn from research on the brain at rest is the fact that all rest is not idleness. While some might be inclined to view rest as a wasted opportunity for productivity, the authors suggest that constructive internal reflection is critical for learning from past experiences and appreciating their value for future choices, allowing us to understand and manage ourselves in the social world.

Source: Science Daily

Jul 4, 2012 · 52 notes
#science #neuroscience #psychology #brain