Neuroscience


Binge eating may lead to addiction-like behaviors

April 24, 2012

A history of binge eating — consuming large amounts of food in a short period of time — may make an individual more likely to show other addiction-like behaviors, including substance abuse, according to Penn State College of Medicine researchers. In the short term, this finding may shed light on the factors that promote substance abuse, addiction, and relapse. In the long term, it may help clinicians treat individuals suffering from this devastating disease.

"Drug addiction persists as a major problem in the United States," said Patricia Sue Grigson, Ph.D., professor, Department of Neural and Behavioral Sciences. "Likewise, excessive food intake, like binge eating, has become problematic. Substance abuse and binge eating are both characterized by a loss of control over consumption. Given the common characteristics of these two types of disorders, it is not surprising that the co-occurrence of eating disorders and substance abuse disorders is high. It is unknown, however, whether loss of control in one disorder predisposes an individual to loss of control in another."

Grigson and her colleagues found a link between bingeing on fat and the development of cocaine-seeking and -taking behaviors in rats, suggesting that conditions promoting excessive behavior toward one substance can increase the probability of excessive behavior toward another. They report their results in Behavioral Neuroscience.

The researchers used rats to test whether a history of binge eating on fat would augment addiction-like behavior toward cocaine by giving four groups of rats four different diets: normal rat chow; continuous ad lib access to an optional source of dietary fat; one hour of access to optional dietary fat daily; and one hour of access to dietary fat on Mondays, Wednesdays, and Fridays. All four groups also had unrestricted access to nutritionally complete chow and water. The researchers then assessed the rats' cocaine-seeking and -taking behaviors.

"Fat bingeing behaviors developed in the rats with access to dietary fat on Mondays, Wednesdays, and Fridays — the group with the most restricted access to the optional fat," Grigson said. 

This group tended to take more cocaine late in training, continued to try to get cocaine when signaled it was not available, and worked harder for cocaine as work requirements increased.

"While the underlying mechanisms are not known, one point is clear from behavioral data: A history of bingeing on fat changed the brain, physiology, or both in a manner that made these rats more likely to seek and take a drug when tested more than a month later," Grigson said. "We must identify these predisposing neurophysiological changes."

While the consumption of fat in and of itself did not increase the likelihood of subsequent addiction-like behavior for cocaine, the irregular binge-type manner in which the fat was eaten proved critical. Rats that had continuous access to fat consumed more fat than any other group, but were three times less likely to exhibit addiction-like behavior for cocaine than the group with access only on Mondays, Wednesdays, and Fridays.

"Indeed, while about 20 percent of those rats and humans exposed to cocaine will develop addiction-like behavior for the drug under normal circumstances, in our study, the probability of addiction to cocaine increased to approximately 50 percent for subjects with a history of having binged on fat," Grigson said.

Future studies will look more closely at how bingeing can lead to addiction-like behaviors — whether bingeing on sugar or a mixture of sugar and fat also promotes cocaine or heroin addiction, for example, and whether bingeing on a drug, in turn, increases the likelihood of bingeing on fat.

Provided by Pennsylvania State University

Source: medicalxpress.com

Anticonvulsant Drug Helps Marijuana Smokers Kick the Habit

ScienceDaily (Apr. 24, 2012) — Scientists at The Scripps Research Institute have found clinical evidence that the drug gabapentin, currently on the market to treat neuropathic pain and epilepsy, helps people to quit smoking marijuana (cannabis). Unlike traditional addiction treatments, gabapentin targets stress systems in the brain that are activated by drug withdrawal.

In a 12-week trial of 50 treatment-seeking cannabis users, those who took gabapentin used less cannabis, experienced fewer withdrawal symptoms such as sleeplessness, and scored higher on tests of attention, impulse-control, and other cognitive skills, compared to patients who received a placebo. If these results are confirmed by ongoing larger trials, gabapentin could become the first FDA-approved pharmaceutical treatment for cannabis dependence.

"A lot of other drugs have been tested for their ability to decrease cannabis use and withdrawal, but this is the first to show these key effects in a controlled treatment study," said Barbara J. Mason, the Pearson Family Chair and Co-Director of the Pearson Center for Alcoholism and Addiction Research at Scripps Research. "The other nice thing about gabapentin is that it is already widely prescribed, so its safety is less likely to be an issue."

Mason led the new gabapentin study, recently published online ahead of print by the journal Neuropsychopharmacology.

Stress Circuits

Addiction researchers have long known that recreational drugs hook users by disrupting the normal tuning of their brains’ reward and motivation circuitry. But as scientists at Scripps Research and other institutions have shown in animal studies, cannabis withdrawal after prolonged heavy use also leads to the long-term activation of basic stress circuits. “In human cannabis users who try to quit, this stress response is reflected in reports of drug craving, sleep disturbances, anxiety, irritability, and dysphoria, any one of which can motivate a person to return to using, because cannabis will quiet these symptoms,” said Mason.

A 2008 study by Pearson Center Co-Director George Koob and his colleagues found that gabapentin, an FDA-approved anticonvulsant drug that resembles the neurotransmitter GABA, can quiet this withdrawal-related activation in stress circuitry in alcohol-dependent rats. That finding motivated Mason to set up a pilot trial of gabapentin in cannabis-dependent individuals, whose withdrawal syndrome features a similar over-activation of stress circuits.

She and her colleagues recruited cannabis users with local newspaper and web ads headlined: “Smoking too much pot? We want to help you stop.” "We needed only 50 subjects, but we quickly got more than 700 queries from cannabis users who were eager to quit," Mason said. "Some people deny that cannabis can be addictive, but surveys show that between 16 and 25 percent of substance use treatment admissions around the world every year involve people with primary cannabis dependence."

Twice as Many Abstinent from Cannabis Use

The trial was based at Mason’s laboratory at The Scripps Research Institute. Half of the 50 recruits were randomly assigned to take 1,200 mg/day of gabapentin; the rest were given identical-looking placebo capsules. Over 12 weeks, Mason and her colleagues, including a medical team from the nearby Scripps Clinic, monitored the subjects with tests. Using standard behavioral therapy techniques, they also counseled the patients to stay off cannabis.

The subjects’ self-reports and more objective urine tests revealed that gabapentin, compared to placebo, significantly reduced their continuing cannabis use. “Urine metabolite readings indicate about twice as many of the gabapentin subjects had no new cannabis use during the entire study, and, in the last four weeks of the study, all of the gabapentin subjects who completed the study stayed abstinent,” Mason said.

Gabapentin also clearly reduced the reported symptoms of withdrawal such as sleep disturbances, drug cravings, and dysphoria. And even though gabapentin normally is thought of as a brain-quieting drug that can cause sleepiness as a side effect, there was some evidence that it sharpened cognition among the cannabis users. Seven gabapentin and ten placebo patients sat for tests of attention, impulse-control, and other executive functions just before the start of the trial and at week four. While the placebo patients tended to score lower after four weeks of attempted abstinence, the gabapentin patients generally scored higher.

Help Resisting Cravings

Addiction researchers now recognize that one of the effects of repeated drug use is the weakening of executive functions — which can happen through the over-activation of reward circuitry as well as by withdrawal-related stress. “That weakening of self-control-related circuits makes it even harder for people to resist drug cravings when they’re trying to quit, but gabapentin may help restore those circuits, by reducing stress and enabling patients to sleep better, so that they function better while awake,” Mason said.

She is now conducting a larger, confirmatory study of gabapentin in cannabis users, as well as a new study of a novel drug that targets the same stress circuitry.

"People in the treatment community have told me that they’re eager for these trial results to come out, because until now nothing has been shown to work against both relapse and withdrawal symptoms," Mason said.

Source: Science Daily

Study Points to Potential Treatment for Stroke

ScienceDaily (Apr. 24, 2012) — Stanford University School of Medicine neuroscientists have demonstrated, in a study published online April 24 in Stroke, that a compound mimicking a key activity of a hefty, brain-based protein is capable of increasing the generation of new nerve cells, or neurons, in the brains of mice that have had strokes. The mice also exhibited a speedier recovery of their athletic ability.

These results are promising, because the compound wasn’t administered to the animals until a full three days after they had suffered strokes, said the study’s senior author, Marion Buckwalter, MD, PhD, an assistant professor of neurology and neurological sciences. This means that the compound works not by limiting a stroke’s initial damage to the brain, but by enhancing recovery.

This is of critical significance, said Buckwalter, a practicing clinical neurologist who often treats recently arrived stroke patients in Stanford Hospital’s intensive care unit.

"No existing therapeutic agents today enhance recovery from stroke," Buckwalter said. "The only approved stroke drug, tissue plasminogen activator, can bust up clots that initially caused the stroke but does nothing to stimulate the restoration of brain function later." Furthermore, to be effective, tPA has to be given within four and a half hours after a stroke has occurred, she added. "In real life, many people don’t get to the hospital that quickly. They may live alone or have their stroke while sleeping, or they and the people close to them didn’t recognize the stroke’s symptoms well enough to realize they’d just had one."

Looking for an alternative, Buckwalter chose to focus on a compound called LM22A-4, which had shown promise in previous research. LM22A-4 is a small molecule whose bulk is less than one-seventieth that of the brain protein it mimics: brain-derived neurotrophic factor, a powerful and long-studied nerve growth factor. BDNF is critical during the development of the nervous system and known to be involved in important brain functions including memory and learning.

Stem-cell therapy, while an exciting prospect, is a relatively invasive and expensive way to replace lost or damaged tissue. A drug that could achieve similar results in such a delicate and complex organ as the brain would be a welcome development.


Evaluating the First Drug to Show Improvement in Subtype of Autism

ScienceDaily (Apr. 24, 2012) — In an important test of one of the first drugs to target core symptoms of autism, researchers at Mount Sinai School of Medicine are undertaking a pilot clinical trial to evaluate insulin-like growth factor-1 (IGF-1) in children who have SHANK3 deficiency (also known as 22q13 Deletion Syndrome or Phelan-McDermid Syndrome), a known cause of autism spectrum disorder (ASD).

This study builds on findings announced by the researchers in 2010, which showed that after two weeks of treatment with IGF-1 in a mouse model, deficits in nerve cell communication were reversed and deficiencies in adaptation of nerve cells to stimulation, a key part of learning and memory, were restored.

"This clinical trial is part of a paradigm shift to develop medications specifically to treat the core symptoms of autism, as opposed to medications that were developed for other purposes but were found to be beneficial for autism patients as well," said Joseph Buxbaum, PhD, Director of the Seaver Autism Center at Mount Sinai. "Our study will evaluate the impact of IGF-1 vs. placebo on autism-specific impairments in socialization and associated symptoms of language and motor disability."

The seven-month study, which begins this month, will be conducted under the leadership of the Seaver Autism Center Clinical Director Alex Kolevzon, MD, and will utilize a double-blind, placebo-controlled crossover design in children ages 5 to 17 years old with SHANK3 deletions or mutations. Patients will receive three months of treatment with active medication or placebo, separated by a four-week washout period. Future trials are planned to explore the utility of IGF-1 in ASD without SHANK3 deficiency.

The primary aim of the study is to target core features of ASD, including social withdrawal and language impairment, which will be measured using both behavioral and objective assessments. If preliminary results are promising, the goal is to expand the studies into larger, multi-centered efforts to include as many children as possible affected by this disorder.

IGF-1 is a US Food and Drug Administration-approved, commercially available compound that is known to promote neuronal cell survival as well as synaptic maturation and plasticity. Side effects of IGF-1 administration include low blood sugar, liver function abnormalities, and increased cholesterol and triglyceride levels. Study subjects will undergo rigorous safety screening before they are enrolled in the trial, and will be carefully monitored every two to four weeks with safety and efficacy assessments.

"We are excited that the researchers at the Seaver Autism Center are undertaking this pilot study to evaluate a possible treatment for SHANK3 deficiency, which may also help everyone with ASD," said Geraldine Bliss, Research Support Chair of the Phelan-McDermid Foundation. "This will be the first clinical trial in Phelan-McDermid Syndrome to emerge from convincing preclinical evidence in a model system."

The cause of autism has been debated for many years. Currently the best scientific evidence indicates that genetic mutations are the most likely culprits, acting either directly or indirectly, in upwards of 80 to 90 percent of individuals with ASDs. In the past few years, gene mutations and gene copy number variations have been identified that cause approximately 15 percent of cases of ASD. However, it is thought that hundreds of genes may be involved in causing autism.

One copy of the q13 portion of chromosome 22 is either missing or otherwise mutated in SHANK3 deficiency, also known as Phelan-McDermid Syndrome or 22q13 Deletion Syndrome (22q13DS). The area in question contains the gene SHANK3, and there is overwhelming evidence that it is the loss of one copy of SHANK3 that produces the neurological and behavioral aspects of the syndrome. The SHANK3 gene is key to the development of the human nervous system, and loss of SHANK3 can impair the ability of neurons to communicate with one another.

Source: Science Daily

Chronic Fatigue Syndrome Patients Had Reduced Activity in Brain’s 'Reward Center'

ScienceDaily (Apr. 24, 2012) — Chronic fatigue syndrome, a medical disorder characterized by extreme and ongoing fatigue with no other diagnosed cause, remains poorly understood despite decades of scientific study. Although researchers estimate that more than 1 million Americans are affected by this condition, the cause of chronic fatigue syndrome, a definitive way to diagnose it, and even its very existence remain in question. In a new study, researchers have found differing brain responses in people with this condition compared to healthy controls, suggesting an association between a biologic functional response and chronic fatigue syndrome.

The findings show that patients with chronic fatigue syndrome have decreased activation of an area of the brain known as the basal ganglia in response to reward. Additionally, the extent of this lowered activation was associated with each patient’s measured level of fatigue. The basal ganglia are at the base of the brain and are associated with a variety of functions, including motor activity and motivation. Diseases affecting basal ganglia are often associated with fatigue. These results shed more light on this mysterious condition, information that researchers hope may eventually lead to better treatments for chronic fatigue syndrome.

The study was conducted by Elizabeth R. Unger, James F. Jones, and Hao Tian of the Centers for Disease Control and Prevention (CDC), Andrew H. Miller and Daniel F. Drake of Emory University School of Medicine, and Giuseppe Pagnoni of the University of Modena and Reggio Emilia. An abstract of their study entitled, “Decreased Basal Ganglia Activation in Chronic Fatigue Syndrome Subjects is Associated with Increased Fatigue,” will be discussed at the meeting Experimental Biology 2012, being held April 21-25 at the San Diego Convention Center. The abstract is sponsored by the American Society for Investigative Pathology (ASIP), one of six scientific societies sponsoring the conference which last year attracted some 14,000 attendees.

More Fatigue, Less Activation

Dr. Unger says that she and her colleagues became curious about the role of the basal ganglia after previous studies by collaborators at Emory University showed that patients treated with interferon alpha, a common treatment for chronic hepatitis C and several other conditions, often experienced extreme fatigue. Further investigation into this phenomenon showed that basal ganglia activity decreased in patients who received this immune therapy. Since the fatigue induced by interferon alpha shares many characteristics with chronic fatigue syndrome, Unger and her colleagues decided to investigate whether the basal ganglia were also affected in this disorder.

The researchers recruited 18 patients with chronic fatigue syndrome, as well as 41 healthy volunteers with no symptoms of CFS. Each study participant underwent functional magnetic resonance imaging, a brain scan technique that measures activity in various parts of the brain by blood flow, while they played a simple card game meant to stimulate feelings of reward. The participants were each told that they’d win a small amount of money if they correctly guessed whether a preselected card was red or black. After making their choice, they were presented with the card while researchers measured blood flow to the basal ganglia during winning and losing hands.

The researchers showed that patients with chronic fatigue syndrome experienced significantly less change in basal ganglia blood flow between winning and losing than the healthy volunteers. When the researchers looked at scores for the Multidimensional Fatigue Inventory, a survey often used to document fatigue for chronic fatigue syndrome and various other conditions, they also found that the extent of a patient’s fatigue was tightly tied with the change in brain activity between winning and losing. Those with the most fatigue had the smallest change.

Results Suggest Role of Inflammation

Unger notes that the findings add to our understanding of biological factors that may play a role in chronic fatigue syndrome. “Many patients with chronic fatigue syndrome encounter a lot of skepticism about their illness,” she says. “They have difficulty getting their friends, colleagues, coworkers, and even some physicians to understand their illness. These results provide another clue into the biology of chronic fatigue syndrome.”

The study also suggests some areas of further research that could help scientists develop treatments for this condition in the future, she adds. Since the basal ganglia use the chemical dopamine as their major neurotransmitter, dopamine metabolism may play an important role in understanding and changing the course of this illness. Similarly, the difference in basal ganglia activation between the patients and healthy volunteers may be caused by inflammation, a factor now recognized as pivotal in a variety of conditions, ranging from heart disease to cancer.

Estimates from the CDC suggest that annual medical costs associated with chronic fatigue syndrome total about $14 billion in the United States. Annual losses to productivity because of lost work time range between $9 and $37 billion, with costs to individual households ranging between $8,000 and $20,000 per year.

Source: Science Daily

Prions in the Brain Eliminated by Homing Molecules

ScienceDaily (Apr. 24, 2012) — Toxic prions in the brain can be detected with self-illuminating polymers. The originators, at Linköping University in Sweden, have now shown that the same molecules can also render the prions harmless and potentially cure fatal nerve-destroying illnesses.

Linköping researchers and their colleagues at the University Hospital in Zürich tested the luminescent conjugated polymers, or LCPs, on tissue sections from the brains of mice that had been infected with prions. The results show that the number of prions, as well as their toxicity and infectivity, decreased drastically. This is the first time anyone has been able to demonstrate the possibility of treating illnesses such as mad cow disease and Creutzfeldt-Jakob disease with LCP molecules.

"When we see this effect on prion infections, we believe the same approach could work on Alzheimer’s disease as well," says Peter Nilsson, researcher in Bioorganic Chemistry funded by ERC, the European Research Council.

Along with professors Per Hammarström and Adriano Aguzzi and others, he is now publishing the results in The Journal of Biological Chemistry.

Prions are diseased forms of normally occurring proteins in the brain. When they clump together in large aggregates, nerve cells in the surrounding area are affected, which leads to serious brain damage and a quick death. Prion illnesses can be inherited, occur spontaneously or through infection, for example through infected meat — as was the case with mad cow disease.

The course of the illness is relentless as the prions fragment and replicate at an exponential rate. When researchers inserted the LCP molecules into their model system, the replication was arrested, possibly by stabilizing the prion aggregates.

The variable components in an LCP are various chemical subgroups attached onto the polymer. In the published study, eight different substances were tested, and all of them had a significant effect on the toxicity of the prions.

"Based on these results, we can now customise entirely new molecules with potentially even better effect. These are now being tested on animal models," Nilsson says.

Researchers want to go even further and test whether the molecules will work in fruit flies with an Alzheimer’s-like nerve disorder. Alzheimer’s is caused by what is known as amyloid plaque, which follows a course similar to, but slower than, that of prion diseases.

Source: Science Daily

Nano-Devices that Cross Blood-Brain Barrier Open Door to Treatment of Cerebral Palsy, Other Neurologic Disorders

April 23rd, 2012

A team of scientists from Johns Hopkins and elsewhere has developed nano-devices that successfully cross the blood-brain barrier and deliver a drug that tames brain-damaging inflammation in rabbits with cerebral palsy.


Schematic picture of a dendrimer with multiple branches that are tagged with drug molecules and imaging agents. Image adapted from press release image from Johns Hopkins.

A report on the experiments, conducted at Wayne State University in collaboration with the Perinatology Research Branch of the National Institute of Child Health and Human Development, before the lead and senior investigators moved to Johns Hopkins, is published in the April 18 issue of Science Translational Medicine.

For the study, researchers used tiny, manmade molecules laced with N-acetyl-L-cysteine (NAC), an anti-inflammatory drug used as an antidote in acetaminophen poisoning. The researchers precision-targeted brain cells gone awry to halt brain injury. In doing so, they improved the animals’ neurologic function and motor skills.

The new approach holds therapeutic potential for a wide variety of neurologic disorders in humans that stem from neuro-inflammation, including Alzheimer’s disease, stroke, autism and multiple sclerosis, the investigators say.

The scientists caution that the findings are a long way from human application, but that the simplicity and versatility of the drug-delivery system make it an ideal candidate for translation into clinical use.

“In crossing the blood-brain barrier and targeting the cells responsible for inflammation and brain injury, we believe we may have opened the door to new therapies for a wide variety of neurologic disorders that stem from an inflammatory response gone haywire,” says lead investigator Sujatha Kannan, M.D., now a pediatric critical-care specialist at Johns Hopkins Children’s Center.

Cerebral palsy (CP), estimated to occur in three out of 1,000 newborns, is a lifelong, often devastating disorder caused by infection or reduced oxygen to the brain before, during or immediately after birth. Current therapies focus on assuaging symptoms and improving quality of life, but can neither reduce nor reverse neurologic damage and loss of motor function.

Neuro-inflammatory damage occurs when two types of brain cells called microglia and astrocytes — normally deployed to protect the brain during infection and inflammation — actually damage it by going into overdrive and destroying healthy brain cells along with damaged ones.

Directly treating cells in the brain has long proven difficult because of the biological and physiological systems that have evolved to protect the brain from blood-borne infections. The quest to deliver the drug to the brain also involved developing a technique to get past the blood-brain barrier, spare healthy brain cells and deliver the anti-inflammatory drug exclusively inside the rogue cells.

To do all this, the scientists used a globular, tree-like synthetic molecule, known as a dendrimer. Its size — 2,000 times smaller than a red blood cell — renders it fit for travel across the blood-brain barrier. Moreover, the dendrimer’s tree-like structure allowed scientists to attach to it molecules of the anti-inflammatory drug NAC. The researchers tagged the drug-laced dendrimers with fluorescent tracers to monitor their journey to the brain and injected them into rabbits with cerebral palsy six hours after birth. Another group of newborn rabbits received an injection of NAC only.

Not only did the drug-loaded dendrimers make their way inside the brain but, once there, were rapidly swallowed by the overactive astrocytes and microglia.

“These rampant inflammatory cells, in effect, gobbled up their own poison,” Kannan says.

“The dendrimers not only successfully crossed the blood-brain barrier but, perhaps more importantly, zeroed in on the very cells responsible for neuro-inflammation, releasing the therapeutic drug directly into them,” says senior investigator Rangaramanujam Kannan, Ph.D., of the Center for Nanomedicine at the Johns Hopkins Wilmer Eye Institute.

Animals treated with dendrimer-borne NAC showed marked improvement in motor control and coordination within five days after birth, nearly reaching the motor skills of healthy rabbits. By comparison, rabbits treated with dendrimer-free NAC showed minimal, if any, improvement, even at doses 10 times higher than the dendrimer-borne version. Animals treated with the dendrimer-delivered drug also showed better muscle tone and less stiffness in the hind leg muscles, both hallmarks of CP.

Brain tissue analysis revealed that rabbits treated with dendrimer-borne NAC had notably fewer “bad” microglia — the inflammatory cells responsible for brain damage — as well as markedly lower levels of other inflammation markers. They also had better preserved myelin, the protein that sheaths nerves and is stripped or damaged in CP and other neurologic disorders. And even though CP is marked by neuron death in certain brain centers, animals that received dendrimer-borne NAC had a higher number of neurons in the brain regions responsible for coordination and motor control, compared with untreated animals and those treated with NAC only.

The findings suggest that the treatment not only reduces inflammation in the cells, but may also prevent cell damage and cell death, the researchers said. The Kannans, who are married, say they plan to follow some treated animals into adulthood to ensure the improvements are not temporary.

Source: Neuroscience News

Protein prevents DNA damage in the developing brain and might serve as a tumor suppressor

April 23, 2012

St. Jude Children’s Research Hospital scientists have rewritten the job description of the protein TopBP1 after demonstrating that it guards early brain cells from DNA damage. Such damage might foreshadow later problems, including cancer.

Researchers showed that cells in the developing brain require TopBP1 to prevent DNA strands from breaking as the molecule is copied prior to cell division. Investigators also reported that stem cells and immature cells known as progenitor cells involved at the beginning of brain development are more sensitive to unrepaired DNA damage than progenitor cells later in the process. Although more developmentally advanced than stem cells, progenitor cells retain the ability to become one of a variety of more specialized neurons.

"Such DNA strand breaks have great potential for creating mutations that push a normal cell toward malignancy," said Peter McKinnon, Ph.D., a St. Jude Department of Genetics member and the paper’s senior author. "When we selectively knocked out TopBP1 in mice, the amount of DNA damage we saw suggests that TopBP1 is likely to be a tumor suppressor. We are exploring that question now."

The work appeared in the April 22 online edition of the scientific journal Nature Neuroscience. The research builds on McKinnon’s interest in DNA repair systems, including the enzymes ATM and ATR, which are associated with a devastating cancer-prone neurodegenerative disease in children called ataxia telangiectasia, and a neurodevelopmental disorder called Seckel syndrome.

TopBP1 was known to activate ATR. Previous laboratory research by other investigators also suggested that activation made TopBP1 indispensable for DNA replication and cell proliferation. This study, however, showed that was not the case. Most progenitor cells in the embryonic mouse brain kept dividing after investigators switched off the TopBP1 gene. 


Gatekeeper of brain steroid signals boosts emotional resilience to stress

April 23, 2012

A cellular protein called HDAC6, newly characterized as a gatekeeper of steroid biology in the brain, may provide a novel target for treating and preventing stress-linked disorders, such as depression and post-traumatic stress disorder (PTSD), according to research from the Perelman School of Medicine at the University of Pennsylvania.

Glucocorticoids are natural steroids secreted by the body during stress. A small amount of these hormones helps with normal brain function, but their excess is a precipitating factor for stress-related disorders.

Glucocorticoids exert their effects on mood by acting on receptors in the nucleus of emotion-regulating neurons, such as those producing the neurotransmitter serotonin. For years, researchers have searched for ways to prevent deleterious effects of stress by blocking glucocorticoids in neurons. However, this has proved difficult to do without simultaneously interfering with other functions of these hormones, such as the regulation of immune function and energy metabolism.

In a recent Journal of Neuroscience paper, the lab of Olivier Berton, PhD, assistant professor of Psychiatry, shows how a regulator of glucocorticoid receptors may provide a path towards resilience to stress by modulating glucocorticoid signaling in the brain. The protein HDAC6, which is particularly enriched in serotonin pathways, as well as in other mood-regulatory regions in both mice and humans, is ideally distributed in the brain to mediate the effect of glucocorticoids on mood and emotions. HDAC6 likely does this by controlling the interactions between glucocorticoid receptors and hormones in these serotonin circuits.

Experiments that first alerted Berton and colleagues to a peculiar role of HDAC6 in stress adaptation came from an approach that reproduces certain clinical features of traumatic stress and depression in mice. The animals are exposed to brief bouts of aggression from trained “bully” mice. In most aggression-exposed mice this experience leads to the development of a lasting form of social aversion that can be treated by chronic administration of antidepressants. 


Brain surgery for epilepsy underutilized: study

April 23, 2012

Ten years ago, a landmark clinical trial in Canada demonstrated the unequivocal effectiveness of brain surgery for treating uncontrolled epilepsy, but since then the procedure has not been widely adopted; in fact, it is dramatically underutilized, according to a new study from the University of California, San Francisco (UCSF).

The study, published this month in the journal Neurology, showed that the number of Americans having the surgery has not changed in the decade since release of the effectiveness study, though surgical treatment is now uniformly encouraged by neurology and neurosurgery professional societies.

The U.S. Centers for Disease Control and Prevention estimates that 2 million Americans have epilepsy. Hundreds of thousands of these men, women and children suffer from uncontrolled seizures, yet nationally only a few hundred are treated surgically each year, with UCSF performing about 50 of the operations.

Among people who do have the operation, the study found, there are significant disparities by race and insurance status. White patients were more likely to have surgery than racial minorities, and privately insured patients were more likely to undergo surgery than those with Medicaid or Medicare.

"As a medical community, we are not practicing evidence-based medicine with regard to the treatment of patients who have epilepsy," said Edward Chang, MD, chief of adult epilepsy surgery in the UCSF Department of Neurological Surgery and the UCSF Epilepsy Center. "There are a lot of people who are taking medications and continuing to have seizures even though they can potentially be seizure-free."

A MODERN SURGERY FOR AN ANCIENT DISEASE

Epilepsy has been recognized as an important neurological condition since ancient times and its name means “seizures” in Greek. It can be inherited or it can be caused by anything that injures or irritates the brain. Hippocrates, the father of western medicine, described it in detail in his writings some 2,500 years ago, and it is believed to have afflicted many famous people throughout history, including Julius Caesar.

UCSF is one of the world’s leading institutions in epilepsy research and one of the few medical centers with top-ranking departments in all of the relevant areas: neurology, biomedical imaging, and neurosurgery.

Paul Garcia, MD, director of the clinical epilepsy program and a study co-author, said that most patients referred to UCSF for surgical evaluation have had uncontrolled seizures for many years despite trying several medications. Research has shown that after the first two medicines fail, it is uncommon for patients to gain complete seizure control with medical treatment alone. Without control over their seizures, patients are at risk for physical injuries or even dying. Furthermore, the seizures often interfere with normal life activities such as driving, studying and working.


Omega-3 fatty acids not associated with beneficial effects in multiple sclerosis: study

April 23, 2012

Omega-3 fatty acid supplements were not associated with beneficial effects on disease activity in patients with relapsing-remitting multiple sclerosis, according to a report of a randomized controlled trial published Online First by Archives of Neurology.

Multiple sclerosis is a chronic, incurable disease of the central nervous system that affects about 2.5 million people worldwide. Some patients use, or have tried, omega-3 fatty acids supplementation to control the disease because the essential fatty acids could theoretically have anti-inflammatory and neuroprotective effects in multiple sclerosis, the authors write in their study background.

Øivind Torkildsen, M.D., Ph.D., of Haukeland University Hospital, Bergen, Norway, and colleagues included 92 patients with multiple sclerosis in their double-blind, placebo-controlled trial to examine whether omega-3 fatty acid supplementation as a monotherapy (single therapy) or in combination with subcutaneous (under the skin) interferon beta-1a could reduce disease activity.

Half of the patients (46) were given omega-3 fatty acids (1,350 mg of eicosapentaenoic acid and 850 mg of docosahexaenoic acid daily), and the other half (46) were administered placebo. After six months, all patients received interferon beta-1a three times a week for another 18 months. Researchers used magnetic resonance imaging (MRI) to measure disease activity by the number of new T1-weighted gadolinium-enhancing lesions in the brain.

"The results from this study did not show any beneficial effects of ω-3 [omega-3] fatty acid supplementation on disease activity in multiple sclerosis as a monotherapy or in combination with interferon beta," the authors comment. They note their results were in contrast with two other studies reporting a possible positive effect.

The median number of new T1-weighted gadolinium-enhancing lesions was three in the omega-3 fatty acids group and two in the placebo group during the first six months, according to the study results. The results indicate no difference between the two groups in the number of relapses during the first six months of treatment or after 24 months. No differences were detected either in fatigue or quality-of-life scores.

However, the authors comment their data do not suggest that omega-3 fatty acid supplementation was harmful or that it interfered with interferon beta treatment, which they note can reduce disease activity in the relapsing-remitting course of the disease.

"The design of this study allowed us to compare the effect of ω-3 fatty acid supplementation both against placebo alone and in combination with interferon beta. As expected, the MRI disease activity was significantly reduced when interferon beta-1a was introduced," they conclude.

Provided by JAMA and Archives Journals

Source: medicalxpress.com

Clinical decline in Alzheimer's requires plaque and proteins

April 23, 2012

According to a new study, the neuron-killing pathology of Alzheimer’s disease (AD), which begins before clinical symptoms appear, requires the presence of both amyloid-beta (a-beta) plaque deposits and elevated levels of an altered protein called p-tau.

Without both, progressive clinical decline associated with AD in cognitively healthy older individuals is “not significantly different from zero,” reports a team of scientists at the University of California, San Diego School of Medicine in the April 23 online issue of the Archives of Neurology.

"I think this is the biggest contribution of our work," said Rahul S. Desikan, MD, PhD, research fellow and resident radiologist in the UC San Diego Department of Radiology and first author of the study. "A number of planned clinical trials – and the majority of Alzheimer’s studies – focus predominantly on a-beta. Our results highlight the importance of also looking at p-tau, particularly in trials investigating therapies to remove a-beta. Older, non-demented individuals who have elevated a-beta levels, but normal p-tau levels, may not progress to Alzheimer’s, while older individuals with elevated levels of both will likely develop the disease."

The findings also underscore the importance of p-tau as a target for new approaches to treating patients with conditions ranging from mild cognitive impairment (MCI) to full-blown AD. An estimated 5.4 million Americans have AD. It’s believed that 10 to 20 percent of Americans age 65 and older have MCI, a risk factor for AD. Some current therapies appear to delay clinical AD onset, but the disease remains irreversible and incurable.

"It may be that a-beta initiates the Alzheimer’s cascade," said Desikan. "But once started, the neurodegenerative mechanism may become independent of a-beta, with p-tau and other proteins playing a bigger role in the downstream degenerative cascade. If that’s the case, prevention with anti-a-beta compounds may prove efficacious against AD for older, non-demented individuals who have not yet developed tau pathology. But novel, tau-targeting therapies may help the millions of individuals who already suffer from mild cognitive impairment or Alzheimer’s disease." 


New guidelines: Treatments can help prevent migraine

April 23, 2012

Research shows that many treatments can help prevent migraine in certain people, yet few people with migraine who are candidates for these preventive treatments actually use them, according to new guidelines issued by the American Academy of Neurology. The guidelines, which were co-developed with the American Headache Society, will be announced at the American Academy of Neurology’s 64th Annual Meeting in New Orleans and published in the April 24, 2012, print issue of Neurology®, the medical journal of the American Academy of Neurology.

"Studies show that migraine is underrecognized and undertreated," said guideline author Stephen D. Silberstein, MD, FACP, FAHS, of Jefferson Headache Center at Thomas Jefferson University in Philadelphia and a Fellow of the American Academy of Neurology. "About 38 percent of people who suffer from migraine could benefit from preventive treatments, but fewer than a third of these people currently use them."

Unlike acute treatments, which are used to relieve the pain and associated symptoms of a migraine attack when it occurs, preventive treatments usually are taken every day to prevent attacks from occurring as often and to lessen their severity and duration when they do occur.

"Some studies show that migraine attacks can be reduced by more than half with preventive treatments," Silberstein said.

The guidelines, which reviewed all available evidence on migraine prevention, found that among prescription drugs, the seizure drugs divalproex sodium, sodium valproate and topiramate, along with the beta-blockers metoprolol, propranolol and timolol, are effective for migraine prevention and should be offered to people with migraine to reduce the frequency and severity of attacks. The seizure drug lamotrigine was found to be ineffective in preventing migraine.

The guidelines also reviewed over-the-counter and complementary treatments. They found that the herbal preparation Petasites, also known as butterbur, is effective in preventing migraine. Other treatments found to be probably effective are the nonsteroidal anti-inflammatory drugs fenoprofen, ibuprofen, ketoprofen, naproxen and naproxen sodium; subcutaneous histamine; and the complementary treatments magnesium, MIG-99 (feverfew) and riboflavin.

Silberstein noted that while people do not need a prescription from a physician for these over-the-counter and complementary treatments, they should still see their doctor regularly for follow-up. “Migraines can get better or worse over time, and people should discuss these changes in the pattern of attacks with their doctors and see whether they need to adjust their dose or even stop their medication or switch to a different medication,” said Silberstein. “In addition, people need to keep in mind that all drugs, including over-the-counter drugs and complementary treatments, can have side effects or interact with other medications, which should be monitored.”

Provided by American Academy of Neurology
Source: medicalxpress.com 

Neuroscientists discover key protein responsible for controlling nerve cell protection

April 22, 2012

A key protein, which may be activated to protect nerve cells from damage during heart failure or epileptic seizure, has been found to regulate the transfer of information between nerve cells in the brain. The discovery, made by neuroscientists at the University of Bristol and published in Nature Neuroscience and PNAS, could lead to new therapies for stroke and epilepsy.

The research team, led by Professor Jeremy Henley and Dr Jack Mellor from Bristol’s Medical School, has identified a protein, known as SUMO, responsible for controlling the chemical processes which reduce or enhance protection mechanisms for nerve cells in the brain.

These key SUMO proteins produce subtle responses to the brain’s activity levels to regulate the amount of information transmitted by kainate receptors, which are responsible for communication between nerve cells and whose activation can lead to epileptic seizures and nerve cell death.

Protein function is controlled by alterations to protein structure through processes that can be independent or interrelated, including phosphorylation, ubiquitination and SUMOylation. The present work shows that phosphorylation of kainate receptors on its own promotes their activity. However, phosphorylation also facilitates SUMOylation of kainate receptors, which reduces their activity. There is thus a dynamic and delicate interplay between phosphorylation and SUMOylation that regulates kainate receptor function.

This fine balance between phosphorylation and SUMOylation depends on brain activity levels: damaging activity, such as that occurring during stroke or epilepsy, enhances SUMOylation and therefore reduces kainate receptor function to protect nerve cells.

Dr Mellor, Senior Lecturer from the University’s School of Physiology and Pharmacology, said: “Kainate receptors are a somewhat mysterious but clearly very important group of proteins that are known to be involved in a number of diseases including epilepsy. However, we currently know little about what makes kainate receptors so important. Likewise, we also know that SUMO proteins play an important role in neuroprotection. These findings provide a link between SUMO and kainate receptors that increases our understanding of the processes that nerve cells use to protect themselves from excessive and abnormal activity.”

Professor Henley added: “This work is important because it gives a new perspective and a deeper understanding of how the flow of information between cells in the brain is regulated. The team has found that increasing the amount of SUMO attached to kainate receptors – which would reduce communication between the cells – could be a way to treat epilepsy by preventing over-excitation of the brain’s nerve cells.”

The research follows on from previous findings, published in Nature (447, 321-325), showing that SUMO proteins target the brain’s kainate receptors and alter their cellular location.

Provided by University of Bristol

Source: medicalxpress.com

Cocaine decreases activity of a protein necessary for normal functioning of the brain's reward system

April 22, 2012

New research from Mount Sinai Medical Center in New York reveals that repeated exposure to cocaine decreases the activity of a protein necessary for normal functioning of the brain’s reward system, thus enhancing the reward for cocaine use, which leads to addiction. Investigators were also able to block the ability of repeated cocaine exposure to induce addiction. The findings, published online April 22 in the journal Nature Neuroscience, provide the first evidence of how cocaine changes the shape and size of neurons in the brain’s reward center in a mouse model.

Repeated exposure to cocaine decreases the expression of a protein necessary for normal functioning of the brain’s reward system, thus enhancing the reward for cocaine use and driving addiction. Using a light-activated form of the protein in real time, a technique known as optogenetics, investigators were able to prevent repeated cocaine exposure from enhancing the brain’s reward response to the drug. Although the results are very early and many steps lie between mice and humans, the researchers say the finding opens the door to a new direction in treatment for cocaine addiction.

"There are virtually no medication regimens for cocaine addiction, only psychotherapy, and some early work with vaccines," said the study’s senior investigator, Eric Nestler, MD, PhD, Nash Family Professor of Neuroscience, Chairman of the Department of Neuroscience and Director of the Friedman Brain Institute at Mount Sinai School of Medicine. The protein, Rac1, is found in many cells in mice, rats, monkeys, and humans, and it is known to be involved in controlling the growth of nerve cells.

To test Rac1’s role, investigators either “knocked out,” or deleted, the gene responsible for Rac1 production, or injected a virus to enhance Rac1 expression.

"The research gives us new information on how cocaine affects the brain’s reward center and how it could potentially be repaired," said Dr. Nestler. "This is the first case in the brain in vivo where it’s been possible to control the activity of a protein inside nerve cells in real time. Our findings reveal new pathways and targets (a proof-of-principle study, really) for treatment of cocaine addiction."

Provided by The Mount Sinai Hospital / Mount Sinai School of Medicine

Source: medicalxpress.com

New Technique May Help Severely Damaged Nerves Regrow and Restore Function

ScienceDaily (Apr. 22, 2012) — Engineers at the University of Sheffield have developed a method of assisting nerves damaged by traumatic accidents to repair naturally, which could improve the chances of restoring sensation and movement in injured limbs.


Scanning electron microscopy images of the structures fabricated by (left) 2PP and (right) microreplication techniques. (Credit: Image courtesy of University of Sheffield)

In a collaborative study with Laser Zentrum Hannover (Germany) published April 23, 2012 in the journal Biofabrication, the team describes a new method for making medical devices called nerve guidance conduits or NGCs.

The method is based on laser direct writing, which enables the fabrication of complex structures from computer files via the use of CAD/CAM (computer aided design/manufacturing), and has allowed the research team to manufacture NGCs with designs that are far more advanced than previously possible.

Currently patients with severe traumatic nerve damage suffer a devastating loss of sensation and/or movement in the affected limb. The traditional course of action, where possible, is to surgically suture or graft the nerve endings together. However, reconstructive surgery often does not result in complete recovery.

"When nerves in the arms or legs are injured they have the ability to re-grow, unlike in the spinal cord; however, they need assistance to do this," said University of Sheffield Professor of Bioengineering, John Haycock. "We are designing scaffold implants that can bridge an injury site and provide a range of physical and chemical cues for stimulating this regrowth."

The new conduit is made from a biodegradable synthetic polymer material based on polylactic acid and has been designed to guide damaged nerves to re-grow through a number of small channels.

"Nerves aren’t just like one long cable, they’re made up of lots of small cables, similar to how an electrical wire is constructed," said lead author Dr Frederik Claeyssens, of the University’s Department of Materials Science and Engineering. "Using our new technique we can make a conduit with individual strands so the nerve fibres can form a similar structure to an undamaged nerve."

Once the nerve is fully regrown, the conduit biodegrades naturally. The team hopes that this approach will significantly increase recovery for a wide range of peripheral nerve injuries.

In laboratory experiments, nerve cells added to the polymer conduit grew naturally within its channelled structure and the research team is now working towards clinical trials.

"If successful, we anticipate these scaffolds will not just be applicable to peripheral nerve injury, but could also be developed for other types of nerve damage too. The technique of laser direct writing may ultimately allow production of scaffolds that could help in the treatment of spinal cord injury," said Dr Claeyssens.

"What’s exciting about this work is that not only have we designed a new method for making nerve guide scaffolds which support nerve growth, we´ve also developed a method of easily reproducing them through micromolding.

"This technology could make a huge difference to patients suffering severe nerve damage," he added.

Source: Science Daily

'Housekeeping' Mechanism for Brain Stem Cells Discovered

ScienceDaily (Apr. 22, 2012) — Researchers at Columbia University Medical Center (CUMC) have identified a molecular pathway that controls the retention and release of the brain’s stem cells. The discovery offers new insights into normal and abnormal neurologic development and could eventually lead to regenerative therapies for neurologic disease and injury. The findings, from a collaborative effort of the laboratories of Drs. Anna Lasorella and Antonio Iavarone, were published April 22 in the online edition of Nature Cell Biology.


Neural stem cells detaching from the vascular niche. (Credit: Anna Lasorella, CUMC /Nature Cell Biology)

The research builds on recent studies, which showed that stem cells reside in specialized niches, or microenvironments, that support and maintain them.

"From this research, we knew that when stem cells detach from their niche, they lose their identity as stem cells and begin to differentiate into specific cell types," said co-senior author Antonio Iavarone, MD, professor of Pathology and Neurology at CUMC.

"However, the pathways that regulate the interaction of stem cells with their niche were obscure," said co-senior author Anna Lasorella, MD, associate professor of Pathology and Pediatrics at CUMC and a member of the Columbia Stem Cell Initiative.

In the brain, the stem cell niche is located in an area adjacent to the ventricles, the fluid-filled spaces within the brain. Neural stem cells (NSCs) within the niche are carefully regulated, so that enough cells are released to populate specific brain areas, while a sufficient supply is kept in reserve.

In previous studies, Drs. Iavarone and Lasorella focused on molecules called Id (inhibitor of differentiation) proteins, which regulate various stem cell properties. They undertook the present study to determine how Id proteins maintain stem cell identity.

The team developed a genetically altered strain of mice in which Id proteins were silenced, or knocked down, in NSCs. In the absence of Id proteins, mice died within 24 hours of birth. Their brains showed markedly lowered NSC proliferative capacity, and their stem cell populations were reduced.

Studies of NSCs from this strain of mice revealed that Id proteins directly regulate the production of a protein called Rap1GAP, which in turn controls Rap1, one of the master regulators of cell adhesion. The researchers found that the Id-Rap1GAP-Rap1 pathway is critical for the adhesion of NSCs to their niche and for NSC maintenance. “There may be other pathways involved, but we believe this is the key pathway,” said Dr. Iavarone. “There is good reason to believe that it operates in other kinds of stem cells, and our labs are investigating this question now.”

"This is a new idea," added Dr. Lasorella. "Before this study, the prevailing wisdom was that NSCs are regulated by the niche components, conceivably through the release of chemical attractants such as cytokines. However, our findings suggest that stem cell identity relies on this mechanism."

More research needs to be done before the findings can be applied therapeutically, Dr. Iavarone said. “Multiple studies show that NSCs respond to insults such as ischemic stroke or neurodegenerative diseases. If we can understand how to manipulate the pathways that determine stem cell fate, in the future we may be able to control NSC properties for therapeutic purposes.”

"Another aspect," added Dr. Lasorella, "is to determine whether Id proteins also maintain stem cell properties in cancer stem cells in the brain. In fact, normal stem cells and cancer stem cells share properties and functions. Since cancer stem cells are difficult to treat, identifying these pathways may lead to more effective therapies for malignant brain tumors."

Stephen G. Emerson, MD, PhD, director of the Herbert Irving Comprehensive Cancer Center at NewYork-Presbyterian Hospital/Columbia University Medical Center, added that, “Understanding the pathway that allows stem cells to develop into mature cells could eventually lead to more effective, less toxic cancer treatments. This beautiful study opens up a wholly unanticipated way to think about treating brain tumors.”

Source: Science Daily

Single Neuron Observations Mark Steps in Alzheimer’s Disease

April 20, 2012

Multiple disease-related changes progress in parallel through distinct stages.


This schematic illustration shows the experimental arrangement for in vivo two-photon calcium imaging of stimulation-evoked neuronal activity in anesthetized mice. At left, an in vivo two-photon image of the visual cortex; the neurons are stained with the calcium indicator dye Oregon Green BAPTA-1 (green, OGB-1) and the astrocytes with Sulforhodamine 101 (yellow, SR101). At right, visual stimuli were projected on a screen placed in front of the eye of the mouse. (Image adapted from an original credited to the Konnerth lab, TU Muenchen.)

Studying a mouse model of Alzheimer’s disease, neuroscientists at the Technische Universitaet Muenchen have observed correlations between increases in both soluble and plaque-forming beta-amyloid – a protein implicated in the disease process – and dysfunctional developments on several levels: individual cortical neurons, neuronal circuits, sensory cognition, and behavior. Their results, published in Nature Communications, show that these changes progress in parallel and that, together, they reveal distinct stages in Alzheimer’s disease with a specific order in time.

In addition to its well known, devastating effects on memory and learning, Alzheimer’s disease can also impair a person’s sense of smell or vision. Typically these changes in sensory cognition only show themselves behaviorally when the disease is more advanced. A new study sheds light on what is happening in the brain throughout the disease process, specifically with respect to the part of the cerebral cortex responsible for integrating visual information. A team led by Prof. Arthur Konnerth, a Carl von Linde Senior Fellow of the TUM Institute for Advanced Study, has observed Alzheimer’s-related changes in the visual cortex at the single-cell level.

Using a technique called two-photon calcium imaging, the researchers recorded both spontaneous and stimulated signaling activity in cortical neurons of living mice: transgenic mice carrying mutations that cause Alzheimer’s disease in humans, and wild-type mice as a control group. By observing how neuronal signaling responded to a special kind of vision test – in which a simple grating pattern of light and dark bars moves in front of the mouse’s eye – the scientists could characterize the visual circuit as being more or less “tuned” to specific orientations and directions of movement.

Konnerth explains, “Like many Alzheimer’s patients, the diseased mice have impairments in their ability to discriminate visual objects. Our results provide important new insights on the cause that may underlie the impaired behavior, by identifying in the visual cortex a fraction of neurons with a strongly disturbed function.” And within this group, the researchers discovered, there are two subsets of neurons – both dysfunctional, but in completely different ways. One subset, thought to be the first neurons to degenerate, showed no activity at all; the other showed a pathologically high level of activity, rendering these neurons incapable of properly sensing objects in the mouse’s environment. “While around half of the neurons in the visual cortex were disturbed in one way or the other, roughly half responded normally,” notes Christine Grienberger, a doctoral candidate in Konnerth’s institute and first author of this paper. “That could have significant implications for future research in the field of Alzheimer’s disease, as our findings raise the question of whether future work only needs to target this population of neurons that are disturbed in their function.”

The in vivo single-neuron experiments were carried out for three age groups, corresponding to different stages of this progressive, degenerative disease. The results were correlated with other measurements, including soluble beta-amyloid levels and the density of beta-amyloid plaques in the brain tissue. The researchers’ findings show for the first time a progressive decline of function in cortical circuits. “An important conclusion from this study,” Konnerth says, “is that the Alzheimer’s disease-related changes on all levels – including behavior, cortical circuit dysfunction, and the density of amyloid plaques in diseased brains – progress in parallel in a distinct temporal order. In the future, the identification of such stages in patients may help researchers pinpoint stage-specific and effective therapies, with reduced levels of side effects.”

Source: Neuroscience News

Apr 24, 2012 · 2 notes
#science #neuroscience #brain #psychology
Mini-sensor Measures Magnetic Activity in Human Brain

April 20th, 2012

A miniature atom-based magnetic sensor developed by the National Institute of Standards and Technology (NIST) has passed an important research milestone by successfully measuring human brain activity.


NIST’s atom-based magnetic sensor, about the size of a sugar cube, can measure human brain activity. Inside the sensor head is a container of 100 billion rubidium atoms (not seen), packaged with micro-optics (a prism and a lens are visible in the center cutout). The light from a low-power infrared laser interacts with the atoms and is transmitted through the grey fiber-optic cable to register the magnetic field strength. The black and white wires are electrical connections. Image adapted from image by Knappe/NIST.

Experiments reported this week in Biomedical Optics Express verify the sensor’s potential for biomedical applications such as studying mental processes and advancing the understanding of neurological diseases.

NIST and German scientists used the NIST sensor to measure alpha waves in the brain associated with a person opening and closing their eyes as well as signals resulting from stimulation of the hand. The measurements were verified by comparing them with signals recorded by a SQUID (superconducting quantum interference device). SQUIDs are the world’s most sensitive commercially available magnetometers and are considered the “gold standard” for such experiments. The NIST mini-sensor is slightly less sensitive now but has the potential for comparable performance while offering potential advantages in size, portability and cost.

The study results indicate the NIST mini-sensor may be useful in magnetoencephalography (MEG), a noninvasive procedure that measures the magnetic fields produced by electrical activity in the brain. MEG is used for basic research on perceptual and cognitive processes in healthy subjects as well as screening of visual perception in newborns and mapping brain activity prior to surgery to remove tumors or treat epilepsy. MEG also might be useful in brain-computer interfaces.

MEG currently relies on SQUID arrays mounted in heavy helmet-shaped flasks containing cryogenic coolants because SQUIDs work best at 4 degrees above absolute zero, or minus 269 degrees Celsius. The chip-scale NIST sensor is about the size of a sugar cube and operates at room temperature, so it might enable lightweight and flexible MEG helmets. It also would be less expensive to mass produce than typical atomic magnetometers, which are larger and more difficult to fabricate and assemble.

“We’re focusing on making the sensors small, getting them close to the signal source, and making them manufacturable and ultimately low in cost,” says NIST co-author Svenja Knappe. “By making an inexpensive system you could have one in every hospital to test for traumatic brain injuries and one for every football team.”

The mini-sensor consists of a container of about 100 billion rubidium atoms in a gas, a low-power infrared laser and fiber optics for detecting the light signals that register magnetic field strength—the atoms absorb more light as the magnetic field increases. The sensor has been improved since it was used to measure human heart activity in 2010. NIST scientists redesigned the heaters that vaporize the atoms and switched to a different type of optical fiber to enhance signal clarity.

The brain experiments were carried out in a magnetically shielded facility at the Physikalisch-Technische Bundesanstalt (PTB) in Berlin, Germany, which has an ongoing program in biomagnetic imaging using human subjects. The NIST sensor measured magnetic signals of about 1 picotesla (one trillionth of a tesla). For comparison, the Earth’s magnetic field, at 50 millionths of a tesla, is 50 million times stronger. NIST scientists expect to boost the mini-sensor’s performance about tenfold by increasing the amount of light detected; calculations suggest an enhanced sensor could match the sensitivity of SQUIDs. NIST scientists are also working on a preliminary multi-sensor magnetic imaging system as a prelude to testing clinically relevant applications.
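The field-strength comparison above is easy to verify: a brain signal of about 1 picotesla against Earth's roughly 50-microtesla field gives the quoted 50-million-fold ratio.

```python
# Quick sanity check of the field-strength comparison in the article.
earth_field = 50e-6    # tesla: Earth's magnetic field, ~50 microtesla
brain_signal = 1e-12   # tesla: ~1 picotesla, as measured by the NIST sensor

ratio = earth_field / brain_signal
print(f"Earth's field is about {ratio:,.0f} times stronger")
```

The same scale gap is why MEG has traditionally required both extremely sensitive detectors and magnetically shielded rooms.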

Source: Neuroscience News

Apr 24, 2012 · 1 note
#science #neuroscience #brain
Experiment shows visual cortex in women quiets when viewing porn

April 20, 2012 by Bob Yirka 

(Medical Xpress) — Researchers from the University of Groningen Medical Centre in the Netherlands have found that, for women at least, watching pornographic videos tends to quiet the part of the brain most heavily involved in looking at and processing things in the immediate environment. This suggests that, during arousal, the brain treats the arousal itself as more important than processing what is actually being seen. The team has published a paper in The Journal of Sexual Medicine describing their findings.

To find out whether the primary visual cortex is essentially deactivated during sexual arousal in women, the team enlisted 12 volunteers: all premenopausal women between the ages of 18 and 47. Each was also taking oral birth control pills, which tend to flatten menstrual cycles and smooth out fluctuations in sexual desire and anxiety. Each was shown three videos: one with no sexual connotation, one with mild sexual content, and a third with full-on hard-core pornography. While the women watched the videos, their brain activity was monitored via PET scans, which work by measuring blood flow to the various brain regions. More blood flow is thought to indicate more brain work, so when the brain delegates tasks to different regions by sending them more blood, it is demonstrating that it finds certain activities more important than others.

The team found virtually no difference in brain activity across the women while they watched the first two videos. During the third, however, blood flow to the visual cortex was reduced in all of the volunteers, indicating that the brain had decided that focusing on arousal was more important than fixating on exactly what was occurring on the screen (or that women simply don’t want to see what is going on during sex). This is in direct contrast to most other visual activities, which tend to cause more blood to flow to the visual cortex to process all of the incoming information.

The researchers also suggest their findings help explain why women who exhibit symptoms of anxiety often report sexual problems, as high anxiety is often correlated with increased blood flow to the visual cortex, a result of the person reacting on a nearly constant basis to visual stimuli. They point out that, for people in general, the brain cannot be both anxious and aroused; it generally has to be one or the other, or neither.

Source: medicalxpress.com

Apr 24, 2012 · 7 notes
#science #neuroscience #brain #psychology
Researchers Show How Social Interaction and Teamwork Lead to Human Intelligence

April 19th, 2012

Scientists have discovered proof that the evolution of intelligence and larger brain sizes can be driven by cooperation and teamwork, shedding new light on the origins of what it means to be human.

Image adapted from a Trinity College Dublin image.

The study appears online in the journal Proceedings of the Royal Society B and was led by scientists at Trinity College Dublin: PhD student Luke McNally and Assistant Professor Dr Andrew Jackson of the School of Natural Sciences, in collaboration with Dr Sam Brown of the University of Edinburgh.

The researchers constructed computer models of artificial organisms, endowed with artificial brains, which played each other in classic games, such as the ‘Prisoner’s Dilemma’, that encapsulate human social interaction. They used 50 simple brains, each with up to 10 internal processing nodes and 10 associated memory nodes. The brains were pitted against each other in these classic games.

The games were treated as a competition, and just as real life favours successful individuals, the best of these digital organisms (defined by how highly they scored in the games, less a penalty for the size of their brains) were allowed to reproduce and populate the next generation of organisms.

By allowing the brains of these digital organisms to evolve freely in their model, the researchers were able to show that the transition to a cooperative society leads to the strongest selection for bigger brains. Bigger brains essentially did better as cooperation increased.
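The selection scheme described above can be sketched in a few dozen lines. This is a deliberately minimal stand-in, not the published model: the one-weight-per-node "brains", the payoff values, the mutation rates, and the brain-size cost are all invented for illustration.

```python
import random

# Illustrative parameters only -- the published model differs in detail.
POP, NODES_MAX, ROUNDS, GENS = 50, 10, 20, 60
BRAIN_COST = 0.1   # hypothetical fitness penalty per processing node

# Prisoner's Dilemma payoffs, keyed by (my move, their move); 1 = cooperate, 0 = defect.
PAYOFF = {(1, 1): 3, (1, 0): 0, (0, 1): 5, (0, 0): 1}

def new_brain(n):
    """A toy 'brain': n weights reading the opponent's last move, plus a bias."""
    return {"n": n,
            "w": [random.uniform(-1, 1) for _ in range(n)],
            "b": random.uniform(-1, 1)}

def move(brain, opp_last):
    """Cooperate (1) if the summed activation is positive, else defect (0)."""
    act = brain["b"] + sum(w * opp_last for w in brain["w"])
    return 1 if act > 0 else 0

def play(a, b):
    """Iterated game between two brains; returns a's total payoff."""
    ma, mb, total = 1, 1, 0
    for _ in range(ROUNDS):
        na, nb = move(a, mb), move(b, ma)
        total += PAYOFF[(na, nb)]
        ma, mb = na, nb
    return total

def mutate(parent):
    """Copy a brain with small weight noise; the node count can drift by one."""
    n = max(1, min(NODES_MAX, parent["n"] + random.choice([-1, 0, 1])))
    child = new_brain(n)
    for i in range(min(n, parent["n"])):
        child["w"][i] = parent["w"][i] + random.gauss(0, 0.1)
    child["b"] = parent["b"] + random.gauss(0, 0.1)
    return child

def evolve():
    """Score brains against rivals, penalize brain size, let the best half reproduce."""
    pop = [new_brain(random.randint(1, NODES_MAX)) for _ in range(POP)]
    for _ in range(GENS):
        rivals = random.sample(pop, 8)
        def fit(brain):
            return sum(play(brain, r) for r in rivals) - BRAIN_COST * brain["n"] * ROUNDS
        pop.sort(key=fit, reverse=True)
        pop = pop[:POP // 2] + [mutate(p) for p in pop[:POP // 2]]
    return sum(b["n"] for b in pop) / POP   # mean brain size after selection

random.seed(1)
print("mean brain size:", evolve())
```

Raising or lowering BRAIN_COST, or changing how much the payoffs reward mutual cooperation, shifts the equilibrium brain size; manipulations of this kind are what let the study link the emergence of cooperation to selection for larger brains.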

The social strategies that emerge spontaneously in these bigger, more intelligent brains show complex memory and decision making. Behaviours like forgiveness, patience, deceit and Machiavellian trickery all evolve within the game as individuals try to adapt to their social environment.

“The strongest selection for larger, more intelligent brains, occurred when the social groups were first beginning to start cooperating, which then kicked off an evolutionary Machiavellian arms race of one individual trying to outsmart the other by investing in a larger brain. Our digital organisms typically start to evolve more complex ‘brains’ when their societies first begin to develop cooperation,” explained Dr Andrew Jackson.

The idea that social interactions underlie the evolution of intelligence has been around since the mid-70s, but support for this hypothesis has come largely from correlative studies where large brains were observed in more social animals.  The authors of the current research provide the first evidence that mechanistically links decision making in social interactions with the evolution of intelligence. This study highlights the utility of evolutionary models of artificial intelligence in answering fundamental biological questions about our own origins.

“Our model differs in that we exploit the use of theoretical experimental evolution combined with artificial neural networks to actually prove that yes, there is an actual cause-and-effect link between needing a large brain to compete against and cooperate with your social group mates.”

“Our extraordinary level of intelligence defines mankind and sets us apart from the rest of the animal kingdom. It has given us the arts, science and language, and above all else the ability to question our very existence and ponder the origins of what makes us unique both as individuals and as a species,” concluded PhD student and lead author Luke McNally.

Source: Neuroscience News

Apr 24, 2012 · 4 notes
#science #neuroscience #brain #psychology
Researcher Says Distinct God Spot in the Brain Does Not Exist

April 19th, 2012

Study shows religious participation and spirituality processed in different cerebral regions.

Scientists have speculated that the human brain features a “God spot,” one distinct area of the brain responsible for spirituality. Now, University of Missouri researchers have completed research that indicates spirituality is a complex phenomenon, and multiple areas of the brain are responsible for the many aspects of spiritual experiences. Based on a previously published study that indicated spiritual transcendence is associated with decreased right parietal lobe functioning, MU researchers replicated their findings. In addition, the researchers determined that other aspects of spiritual functioning are related to increased activity in the frontal lobe.

“We have found a neuropsychological basis for spirituality, but it’s not isolated to one specific area of the brain,” said Brick Johnstone, professor of health psychology in the School of Health Professions. “Spirituality is a much more dynamic concept that uses many parts of the brain. Certain parts of the brain play more predominant roles, but they all work together to facilitate individuals’ spiritual experiences.”

In the most recent study, Johnstone studied 20 people with traumatic brain injuries affecting the right parietal lobe, the area of the brain situated a few inches above the right ear. He surveyed participants on characteristics of spirituality, such as how close they felt to a higher power and whether they felt their lives were part of a divine plan. He found that the participants with more significant injury to their right parietal lobe showed an increased feeling of closeness to a higher power.

“Neuropsychology researchers consistently have shown that impairment on the right side of the brain decreases one’s focus on the self,” Johnstone said. “Since our research shows that people with this impairment are more spiritual, this suggests spiritual experiences are associated with a decreased focus on the self. This is consistent with many religious texts that suggest people should concentrate on the well-being of others rather than on themselves.”

Johnstone says the right side of the brain is associated with self-orientation, whereas the left side is associated with how individuals relate to others. Although Johnstone studied people with brain injury, previous studies of Buddhist meditators and Franciscan nuns with normal brain function have shown that people can learn to minimize the functioning of the right side of their brains to increase their spiritual connections during meditation and prayer.

In addition, Johnstone measured the frequency of participants’ religious practices, such as how often they attended church or listened to religious programs. He measured activity in the frontal lobe and found a correlation between increased activity in this part of the brain and increased participation in religious practices.

“This finding indicates that spiritual experiences are likely associated with different parts of the brain,” Johnstone said.

Written by Brad Fischer

Source: Neuroscience News

Apr 24, 2012 · 5 notes
#science #neuroscience #brain #psychology
Changing brains for the better; article documents benefits of multiple practices

April 18, 2012

(Medical Xpress) — Practices like physical exercise, certain forms of psychological counseling and meditation can all change brains for the better, and these changes can be measured with the tools of modern neuroscience, according to a review article now online at Nature Neuroscience.

The study reflects a major transition in the focus of neuroscience from disease to well being, says first author Richard Davidson, professor of psychology at University of Wisconsin-Madison.

The brain is constantly changing in response to environmental factors, he says, and the article “reflects one of the first efforts to apply this conceptual framework to techniques to enhance qualities that we have not thought of as skills, like well-being. Modern neuroscience research leads to the inevitable conclusion that we can actually enhance well-being by training that induces neuroplastic changes in the brain.”

"Neuroplastic" changes affect the number, function and interconnections of cells in the brain, usually due to external factors.

Although the positive practices reviewed in the article were not designed using the tools and theories of modern neuroscience, “these are practices which cultivate new connections in the brain and enhance the function of neural networks that support aspects of pro-social behavior, including empathy, altruism, kindness,” says Davidson, who directs the Center for Investigating Healthy Minds at UW-Madison.

The review, co-written with Bruce McEwen of Rockefeller University, begins by considering how social stressors can harm the brain. The massive neglect of children in orphanages in Romania did not just have psychological impacts; it created measurable changes in their brains, Davidson says. “Such studies provide an important foundation for understanding the opposite effects of interventions designed to promote wellbeing.”

Davidson says his work has been shaped by his association with the Dalai Lama, who asked him in the 1990s, “Why can’t we use the same rigorous tools of neuroscience to investigate kindness, compassion and wellbeing?”

Davidson, who has explored the neurological benefits of meditation, says, “meditation is one of many different techniques, and not necessarily the best for all people. Cognitive therapy, developed in modern psychology, is one of the most empirically validated treatments for depression and counteracting the effects of stress.”

Overall, Davidson says, the goal is “to use what we know about the brain to fine-tune interventions that will improve well-being, kindness, altruism. Perhaps we can develop more targeted, focused interventions that take advantage of the mechanisms of neuroplasticity to induce specific changes in specific brain circuits.”

Brains change all the time, Davidson emphasizes. “You cannot learn or retain information without a change in the brain. We all know implicitly that in order to develop expertise in any complex domain, to become an accomplished musician or athlete, requires practice, and that causes new connections to form in the brain. In extreme cases, specific parts of the brain enlarge or contract in response to our experience.”

Scientific documentation for the benefits of brain training may have broader social impacts, says Davidson. “If you go back to the 1950s, the majority of middle-class citizens in Western countries did not regularly engage in physical exercise. It was because of scientific research that established the importance of physical exercise in promoting health and well-being that more people now engage in regular physical exercise. I think mental exercise will be regarded in a similar way 20 years from now.

"Rather than think of the brain as a static organ, or one that just degenerates with age, it’s better understood as an organ that is constantly reshaping itself, is being continuously influenced, wittingly or not, by the forces around us," says Davidson, author of the new book "The Emotional Life of Your Brain." "We can take responsibility for our own brains. They are not pawns to external influences; we can be more pro-active in shaping the positive influences on the brain."

Provided by University of Wisconsin-Madison 

Source: medicalxpress.com

Apr 18, 2012 · 12 notes
#science #neuroscience #brain #psychology
Brain changes may hamper decision-making in old age

April 17, 2012

(HealthDay) — The ability to make decisions in new situations declines with age, apparently because of changes in the brain’s white matter, a new imaging study says.


The researchers asked 25 adults, aged 21 to 85, to perform a learning task involving money and also undergo MRI brain scans.

They found that age-related declines in decision-making are associated with the weakening of two specific white-matter pathways that connect an area called the medial prefrontal cortex (located in the cerebral cortex) with two other areas deeper in the brain, called the thalamus and the ventral striatum.

The medial prefrontal cortex is involved in decision-making, the ventral striatum is involved in emotional and motivational aspects of behavior, and the thalamus is a highly connected relay center.

"The evidence that this decline in decision-making is associated with white-matter integrity suggests that there may be effective ways to intervene," study first author Gregory Samanez-Larkin, a postdoctoral fellow in Vanderbilt University’s psychology department and Institute of Imaging Science in Nashville, Tenn., said in a university news release. "Several studies have shown that white-matter connections can be strengthened by specific forms of cognitive training."

The study was published April 11 in the Journal of Neuroscience. 

Source: medicalxpress.com

Apr 18, 2012 · 3 notes
#science #neuroscience #brain #psychology
Brain Scans Can Predict Weight Gain and Sexual Activity, Research Shows

ScienceDaily (Apr. 17, 2012) — At a time when obesity has become epidemic in American society, Dartmouth scientists have found that functional magnetic resonance imaging (fMRI) brain scans may be able to predict weight gain. In a study published April 18, 2012, in The Journal of Neuroscience, the researchers demonstrated a connection between fMRI brain responses to appetite-driven cues and future behavior.


Raspberry cheesecake. The people whose brains responded more strongly to food cues were the people who went on to gain more weight six months later, researchers said. (Credit: © JJAVA / Fotolia)

"This is one of the first studies in brain imaging that uses the responses observed in the scanner to predict important, real-world outcomes over a long period of time," says Todd Heatherton, the Lincoln Filene Professor in Human Relations in the department of psychological and brain sciences and a coauthor on the study. "Using brain activity to predict a consequential behavior outside the scanner is pretty novel."

Using fMRI, the researchers targeted a region of the brain known as the nucleus accumbens, often referred to as the brain’s “reward center,” in a group of incoming first-year college students. While undergoing scans, the subjects viewed images of animals, environmental scenes, appetizing food items, and people. Six months later, their weight and responses to questionnaires regarding interim sexual behavior were compared with their previously recorded weight and brain scan data.

"The people whose brains responded more strongly to food cues were the people who went on to gain more weight six months later," explains Kathryn Demos, first author on the paper. Demos, who conducted the research as part of her doctoral dissertation at Dartmouth, is currently on the research faculty at the Warren Alpert Medical School of Brown University.

The correlation between strong food image brain responses and weight gain was also present for sexual images and activity. “Just as cue reactivity to food images was investigated as potential predictors of weight gain, cue reactivity to sexual images was used to predict sexual desire,” the authors report.

The paper stresses “material specificity,” noting that the participants who responded to food images gained weight but did not engage in more sexual behavior, and vice versa. The authors go on to say that none of the non-food images predicted weight gain.

Heatherton and William Kelley, associate professor of psychological and brain science and a senior author on the paper, have a longstanding interest in psychological theories of self-regulation, also called self-control or willpower.

"We seek to understand situations in which people face temptations and try to not act on them," says Kelley.

The researchers note that the first step toward controlling cravings may be an awareness of how much you are affected by specific triggers in the environment, such as the arrival of the dessert tray in a restaurant.

"You need to actively be thinking about the behavior you want to control in order to regulate it," remarks Kelley. "Self-regulation requires a lot of conscious effort."

Source: Science Daily

Apr 18, 2012 · 18 notes
#science #neuroscience #brain #psychology
Parkinson's Protein Causes Disease Spread in Animal Model

ScienceDaily (Apr. 17, 2012) — Last year, researchers from the Perelman School of Medicine at the University of Pennsylvania found that small amounts of a misfolded brain protein can be taken up by healthy neurons, replicating within them to cause neurodegeneration. The protein, alpha-synuclein (a-syn), is commonly found in the brain, but it forms characteristic clumps, called Lewy bodies, in neurons of patients with Parkinson’s disease (PD) and other neurodegenerative disorders. They found that abnormal forms of a-syn called fibrils acted as “seeds” that induced normal a-syn to misfold and form aggregates.


These images show the brainstem from a control animal (top) and an animal injected with pathologic alpha-synuclein. Brown spots are immunostaining using an antibody specifically recognizing an abnormal form of alpha-synuclein. (Credit: Kelvin C. Luk, Ph.D., Perelman School of Medicine, University of Pennsylvania.)

In earlier studies at other institutions, when fetal nerve cells were transplanted into the brains of PD patients, some of the transplanted cells developed Lewy bodies. This suggested that the corrupted form of a-syn could somehow be transmitted from diseased neurons to healthy ones.

Now, in a follow-up study published in the Journal of Experimental Medicine, the team, led by senior author Virginia M.-Y. Lee, PhD, director of the Center for Neurodegenerative Disease Research and professor of Pathology and Laboratory Medicine, showed that brain tissue from a PD mouse model, as well as synthetically produced a-syn fibrils, injected into young, symptom-free PD mice led to spreading of a-syn pathology. By three months after a single injection, neurons containing abnormal a-syn clumps were detected throughout the mouse brains. The inoculated mice died between 100 and 125 days post-inoculation, far short of their typical two-year life span.

"We think the spreading is via white-matter tracts through brain neural network connections," explains Lee. "This study will open new opportunities for novel Parkinson’s disease therapies."

One of the remaining questions is how, once inside a neuron, does the misfolded a-syn protein spread from cell to cell.

"It’s like a biochemical chain reaction," says first author Kelvin C. Luk, Ph.D., a research associate in the CNDR. Once inside the confines of a neuron, the misfolded a-syn recruits the normally shaped a-syn protein present in the cell, eventually causing it to misfold as well. This occurs along the axons and dendrites (the neuronal extensions that reach other neurons), leading to a dramatic accumulation of the abnormal protein. The misshapen a-syn then invades other neurons when it reaches the synapse, the small space between neurons.

This transmission process is remarkably similar to what is seen in prions, the protein agents responsible for conditions such as transmissible spongiform encephalopathies (mad cow disease). However, the researchers are quick to caution that there is no evidence that Parkinson’s or any related neurodegenerative disease is either infectious or acquired.

The accumulation of misfolded proteins is a fundamental pathogenic process in neurodegenerative diseases, but the factors that trigger aggregation of a-syn are poorly understood.

The Penn team saw that misfolded a-syn propagated along major central nervous system pathways, reaching regions far beyond injection sites. What’s more, they showed for the first time that synthetically produced a-syn fibrils are sufficient to initiate a vicious cycle of Lewy body formation and transmission of the misfolded a-syn in mice.

The study demonstrates just how the Parkinson’s disease protein can spread in a patient’s brain in terms of uptake into a healthy neuron, expansion within the cell, and finally release to a neighboring neuron.

"Knowing this mechanism allows for possible immunotherapies to interrupt the chain reaction by stopping the mutant protein from spreading at the synapse," says Lee.

"Shedding light on how a-synuclein contributes to Parkinson’s disease and related Lewy body disorders is of significant interest both for understanding these diseases and developing potential treatments," said Beth-Anne Sieber, Ph.D., of the National Institute of Neurological Disorders and Stroke (NINDS), part of the National Institutes of Health. "This study provides evidence for the progressive, pathological spread of a-synuclein through the brain."

Source: Science Daily

Apr 18, 2012
#science #neuroscience #brain #psychology #parkinson
Brain Network Reveals Disorders

April 13th, 2012
By Kay H. Brodersen 

Researchers at ETH Zurich and the University of Zurich identify a new method for reliably detecting the presence of pathophysiological changes in the brain.


Brain model (left) depicting brain activity stimulated by speech processing (yellow). The new method allows for the mathematical modeling of interactions between regions within the brain (right). The prism represents the transition, or “Generative Embedding.” Image adapted from a press image by Brodersen KH / ETH Zurich.

The new method was developed in order to gain a mechanistic understanding of schizophrenia and other spectrum disorders, which will lead to more accurate diagnoses and more effective treatments.

When mathematical genius John Nash was diagnosed with schizophrenia, the chance for recovery was slim. Medicine in the 1960s simply had no convincing explanations for his condition. Alarmingly, things don’t look much better nowadays: depression, addiction, schizophrenia, and other spectrum disorders remain among the toughest challenges for medicine. This is because they are caused by complicated and largely unknown interactions between genes and the environment. Different disease mechanisms may underlie similar, or even identical, symptoms. This means that the effect of any given drug may vary hugely across individuals, resulting in trial-and-error treatment. In addition, conditions whose biological basis is not well understood may be perceived as particularly stigmatizing.

Most spectrum disorders lack a physiological definition altogether; they are simply described in terms of particular symptoms. This is problematic when these symptoms are caused by different disease mechanisms. Conversely, existing disease classifications frequently group patients with disjoint symptoms under the same label: a person with delusions and disorganized thought, for instance, can be diagnosed with schizophrenia, just as can somebody else suffering from hallucinations and movement problems. Examples such as this one show that the development of more specific diagnoses and more effective treatments will require a mechanistic understanding of the pathophysiological processes underlying spectrum disorders.

One step in this direction has recently been made by Kay Henning Brodersen and Klaas Enno Stephan at ETH Zurich and the University of Zurich. Within the framework of the SystemsX.ch project ‘Neurochoice’, the two researchers investigate how insights gained from mathematical models of decision making and underlying brain function can be translated into clinical applications. “Put simply, we develop ‘mathematical microscopes’ that allow us to estimate physiological or computational quantities that cannot be measured directly,” says Klaas Enno Stephan, director of the newly founded Translational Neuromodeling Unit (TNU) in Zurich. “This allows us to obtain more accurate classifications and gain deeper mechanistic insights into the underlying condition than previous attempts.”

To demonstrate the plausibility of their idea, the two scientists collaborated with a clinical team led by Alex Leff at University College London. They analysed brain activity from two groups of participants: one group of stroke patients that suffered from language impairments; and one group of healthy volunteers. While undergoing functional magnetic resonance imaging (fMRI), participants were asked to passively listen to speech. A mathematical model was then used to assess, separately within each participant, how brain regions involved in speech processing interacted. Notably, none of the brain regions included in the model had been affected by the stroke in the patients.

The researchers then asked whether it was possible to automatically detect the presence of a remote lesion from patterns of brain connectivity in the healthy part of the brain. “Using our model of brain function, we were able to diagnose patients with an accuracy of 98%,” says Brodersen, first author of the study. “This became possible by tying together dynamic causal models of neuronal dynamics with mathematical techniques from machine learning and Bayesian inference.” In contrast to subtle spectrum disorders, of course, this initial proof-of-principle study concerned a rather salient clinical condition, that is, language impairments caused by a stroke. In the future, Stephan and Brodersen therefore plan to investigate whether their approach might work equally well for those diseases where contemporary medicine is struggling, such as schizophrenia, depression, and addiction. The two researchers hope that their approach will help dissect these spectrum disorders into pathophysiologically well-defined subgroups. Identifying such subgroups would provide an important step towards more specific diagnoses and may eventually predict the most effective treatment for an individual patient.
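The pipeline the quote sketches, fitting a generative model per subject and then classifying on the fitted parameters, can be illustrated with a toy example. Everything below is simulated and hypothetical: the group sizes, the six "connectivity" features, and the nearest-centroid classifier all stand in for the study's dynamic causal models and Bayesian machinery.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each subject is represented not by raw fMRI voxels but
# by a handful of fitted model parameters (here, six "connectivity strengths"
# between speech regions). All values are simulated for illustration.
controls = rng.normal(loc=0.5, scale=0.1, size=(20, 6))  # healthy listeners
patients = rng.normal(loc=0.2, scale=0.1, size=(20, 6))  # lesioned group
X = np.vstack([controls, patients])
y = np.array([0] * 20 + [1] * 20)  # 0 = control, 1 = patient

def nearest_centroid_loocv(X, y):
    """Leave-one-out accuracy of a minimal nearest-centroid classifier."""
    correct = 0
    for i in range(len(y)):
        keep = np.arange(len(y)) != i          # hold out subject i
        c0 = X[keep & (y == 0)].mean(axis=0)   # control centroid
        c1 = X[keep & (y == 1)].mean(axis=0)   # patient centroid
        pred = int(np.linalg.norm(X[i] - c1) < np.linalg.norm(X[i] - c0))
        correct += pred == y[i]
    return correct / len(y)

print(f"Leave-one-out accuracy: {nearest_centroid_loocv(X, y):.2f}")
```

The point of “generative embedding” is that the classification happens in this low-dimensional, mechanistically interpretable parameter space rather than on raw voxel data.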

Source: Neuroscience News

Apr 15, 2012
#science #neuroscience #psychology #brain
Research reveals development of the glial cell

April 11, 2012

The vast majority of cells in the brain are glial, yet our understanding of how they are generated, a process called gliogenesis, has remained enigmatic. Researchers at Baylor College of Medicine have identified a novel transcriptional cascade that controls these formative stages of gliogenesis and answered the longstanding question of how glial cells are generated from neural stem cells.

The findings appear in the current edition of Neuron.

"Most people are familiar with neurons, cells that process and transmit information in the brain. Glial cells, on the other hand, make-up about 80 percent of the cells in the brain and function by providing trophic support to neruons, participating in neurotransmission, myelin sheaths for axons, and comprise the blood brain barrier," said Dr. Benjamin Deneen, assistant professor of neuroscience at BCM. "Importantly, glia have been linked to numerous CNS pathologies, from brain tumors and spinal cord injury and several neurological disorders including, Retts Syndrome, ALS, and Multiple Sclerosis. Therefore deciphering how glial cells are generated is key to understanding brain function during health and disease."

As researchers began investigating glial development in chicks, they started by going backwards – examining what steps were needed before the glial cells matured. They discovered that glial cells are specified in neural stem cells when the transcription factor NFIA is induced.

Taking another step back in the transcriptional cascade, they looked for what triggered NFIA induction.

"By comparing mouse and chick regulatory sequences we were able to perform enhancer screening in the chick to identify regulatory elements with activity that resembled NFIA induction. This method allowed us to pinpoint Sox9," said Peng Kang, postdoctoral associate in the Center for Stem Cell and Regenerative Medicine at BCM. "Subsequently, we found that Sox9 doesn’t just induce NFIA expression, it also associates with NFIA, forming a complex."

The researchers discovered that, just after the initiation of gliogenesis, this complex co-regulates a subset of genes that play important roles in mitochondrial energy metabolism and glial precursor migration.

"Sox9 induces NFIA expression during glial initiation and then binds NFIA to drive lineage progression by cooperatively regulating a genetic program that controls cell migration and energy metabolism, two key processes associated with cellular differentiation," said Deneen. "We now need to ask what other proteins contribute to this process, and how does the nature of this complex evolve during astro-glial lineage progression."

Additionally, these findings may help researchers understand how certain brain tumors begin to form, as the same developmental processes and proteins are found in both adult and pediatric brain tumors. A more comprehensive understanding of how this regulatory cascade operates during development could eventually lead to better treatment targets for brain tumors.

Provided by Baylor College of Medicine

Source: medicalxpress.com

Apr 15, 2012
#science #neuroscience #brain #psychology
Distinct brain cells recognize novel sights

April 11, 2012

No matter what novel objects we come to behold, our brains effortlessly take us from an initial “What’s that?” to “Oh, that old thing” after a few casual encounters. In research that helps shed light on the malleability of this recognition process, Brown University neuroscientists have teased apart the potentially different roles that two distinct cell types may play.


In a study published in the journal Neuron, the researchers document that this kind of learning is based in the inferior temporal cortex (ITC), a brain area buried deep in the skull. Scientists already knew the area was important for visual recognition of familiar items, but they hadn’t figured out the steps required to move from novelty to familiarity, a process they refer to as “plasticity.”

"We know little about that because of the level at which this plasticity is taking place," said senior author David Sheinberg, professor of neuroscience and a member of the Brown Institute for Brain Science. "The inner workings made up of individual neurons make it very hard to actually track what’s going on at that level."

Working with two monkeys, in whom they monitored single neuron activity using tiny microelectrodes, Sheinberg and graduate student Luke Woloszyn tracked the firing patterns of individual neurons in the ITC while monkeys viewed 125 objects they had been trained to recognize and 125 others that they had never seen before.

The scientists found that the two major classes of cells found in the brain, excitatory and inhibitory, responded differently depending on what the monkeys saw. Excitatory neurons were especially active when the monkeys saw a preferred familiar object — the familiar image, out of the 125 such images, that the cell “liked” best. Although the particular preferred familiar image varied across the sample of neurons, almost every excitatory cell had at least one familiar image to which it responded more robustly than its preferred novel image, Sheinberg said. Inhibitory neurons, meanwhile, were much more active when the monkeys saw any novel image, independent of the object’s actual identity. 


Apr 15, 2012
#science #neuroscience #brain #psychology
Scientists find possible cause of movement defects in spinal muscular atrophy

April 11, 2012

(Medical Xpress) — An abnormally low level of a protein in certain nerve cells is linked to movement problems that characterize the deadly childhood disorder spinal muscular atrophy, new research in animals suggests.

Spinal muscular atrophy, or SMA, is caused when a child’s motor neurons – nerve cells that send signals from the spinal cord to muscles – produce insufficient amounts of what is called survival motor neuron protein, or SMN. This causes motor neurons to die, leading to muscle weakness and the inability to move.

Though previous research has established the disease’s genetic link to SMN in motor neurons, scientists haven’t yet uncovered how this lack of SMN does so much damage. Some children with the most severe form of the disease die before age 2.

A research team led by Ohio State University scientists showed in zebrafish that when SMN is missing – in cells throughout the body as well as in motor neurons specifically – levels of a protein called plastin 3 also decrease.

When the researchers added plastin 3 back to motor neurons in zebrafish that were genetically altered so they couldn’t produce SMN, the zebrafish regained most of their swimming ability – movement that had been severely limited by their reduced SMN. These findings tied the presence of plastin 3 – alone, without SMN – to the recovery of lost movement.

The recovery was not complete. Fish without SMN in their cells still eventually died, so the addition of plastin 3 alone is not a therapeutic option. But further defining this protein’s role increases understanding of how spinal muscular atrophy develops.

“What all is lost when SMN is lost? That’s something we’re still struggling with,” said Christine Beattie, associate professor of neuroscience at Ohio State and lead author of the study.

“We think part of the motor neuron defects that are seen in spinal muscular atrophy are caused by this decrease in plastin 3 we get when SMN is lowered. And when we add plastin 3 back to motor neurons we can rescue defects that are seen when SMN is decreased, suggesting that a decrease in plastin 3 is contributing to some of the disease’s characteristics.” 


Apr 15, 2012
#science #neuroscience #brain
Fragile X syndrome can be reversed in adult mouse brain

April 11, 2012

A recent study finds that a new compound reverses many of the major symptoms associated with Fragile X syndrome (FXS), the most common form of inherited intellectual disability and a leading cause of autism. The paper, published by Cell Press in the April 12 issue of the journal Neuron, describes the exciting observation that the FXS correction can occur in adult mice, after the symptoms of the condition have already been established.

Fragile X patients suffer from a complex set of neuropsychiatric symptoms of varying severity which include anxiety, hyperactivity, learning and memory deficits, low IQ, social and communication deficits, and seizures. Previous research has suggested that inhibition of mGlu5, a subtype of receptor for the excitatory neurotransmitter glutamate, may be useful for ameliorating many of the major symptoms of the disease.

The new study, a collaboration between a group at F. Hoffmann-La Roche Ltd. in Switzerland, led by Dr. Lothar Lindemann, and a group at the Picower Institute for Learning and Memory at the Massachusetts Institute of Technology, led by Dr. Mark Bear, used a newly developed mGlu5 inhibitor called CTEP to examine whether pharmacologic inhibition of mGlu5 could reverse FXS symptoms.

The researchers used a mouse model of FXS and administered CTEP after the brain had matured. “We found that even when treatment with CTEP was started in adult mice, it reduced a wide range of FXS symptoms, including learning and memory deficits and auditory hypersensitivity, as well as morphological changes and signaling abnormalities characteristic of the disease,” reports Dr. Lindemann.

Although the CTEP drug itself is not being developed for humans, the findings have significance for human FXS. “The most important implications of our study are that many aspects of FXS are not caused by an irreversible disruption of brain development, and that correction of the altered glutamate signaling can provide widespread therapeutic benefit,” explains Dr. Bear.

The researchers agree that future work may shed light on treatment of FXS in humans. “It will be of great interest to see whether treatment of FXS in human patients can be addressed in a similar broad fashion and with a similar magnitude as was suggested by our preclinical data,” conclude Dr. Lindemann and Dr. Bear. “We anticipate that disturbed signaling can be corrected with other small molecule therapies targeting mGlu5 that are currently being used in human clinical trials.”

Provided by Cell Press

Source: medicalxpress.com

Apr 15, 2012
#science #neuroscience #brain
'Brain-only' mutation causes epileptic brain size disorder

April 11, 2012

Scientists have discovered a mutation limited to brain tissue that causes hemimegalencephaly (HMG), a condition where one half of the brain is enlarged and dysfunctional, leading to intellectual disability and severe epilepsy. The research, published by Cell Press in the April 12 issue of Neuron, has broad significance as a potential model for other complex neuropsychiatric diseases that may also be caused by “brain-only” mutations.

Mutations can be inherited or occur spontaneously. Inherited mutations are present throughout all cells of the body, but some spontaneous mutations can occur during development and hence be limited to cells in some organs but not others. For some time it has been suspected that there might be neurological diseases that are caused by mutations limited to the brain, but this had not yet been definitively demonstrated as it is very difficult to study brain tissue.

"The striking asymmetry of the brain in individuals with HMG has long suggested that this disease may be caused by a spontaneous mutation restricted to one half of the brain and detectable by direct study of affected brain tissue," explains the study’s first author, Dr. Ann Poduri, from Children’s Hospital and Harvard Medical School.

Patients with HMG often have dozens of seizures per day, which so interferes with their cognitive development that doctors make the difficult decision to remove brain tissue in a desperate attempt to control the seizures. Fortunately, these operations are frequently successful in controlling seizures and allowing children to develop remarkably normally. Such operations provided brain tissue samples that were used by Dr. Poduri and her colleagues to identify mutations in the AKT3 gene in HMG brain tissue. Previous research has linked AKT3 with the control of brain size. The AKT3 mutations were restricted to the affected brain tissue, and were not evident in blood cells, suggesting that the mutation was spontaneous and not inherited.

"Our data suggest that spontaneous mutations resulting in abnormal activation of AKT3 contribute to overgrowth of one-half of the brain. The size and architecture of HMG may be determined in part by the stage at which the mutation occurs relative to the stage of brain development," concludes senior study author, Dr. Christopher Walsh from Children’s Hospital Boston, Howard Hughes Medical Institute, and Harvard Medical School. "It is also notable that, to our knowledge, this is the first disease attributed to mutations that are limited to brain tissue. There are other epilepsies and neuropsychiatric diseases that are associated with spontaneous mutations and are therefore also candidates for these sorts of ‘brain-only’ mutations."

The study was supported by the Howard Hughes Medical Institute, the National Institute of Neurological Disorders and Stroke, and the National Institute of Mental Health.

Provided by Cell Press

Source: medicalxpress.com

Apr 15, 2012
#science #neuroscience #brain
Data mining opens the door to predictive neuroscience

April 11, 2012

A discovery made using state-of-the-art informatics tools – that rules mined from gene expression patterns can predict a neuron’s characteristics – increases the likelihood that it will be possible to predict much of the fundamental structure and function of the brain without having to measure every aspect of it. That in turn makes the Holy Grail of modelling the brain in silico – the goal of the proposed Human Brain Project – a more realistic, less Herculean, prospect.


“It is the door that opens to a world of predictive biology,” says Henry Markram. Credit: EPFL

"It is the door that opens to a world of predictive biology," says Henry Markram, the senior author on the study, which is published this week in PLoS ONE.

Within a cortical column, the basic processing unit of the mammalian brain, there are roughly 300 different neuronal types. These types are defined both by their anatomical structure and by their electrical properties, and their electrical properties are in turn defined by the combination of ion channels they present—the tiny pores in their cell membranes through which electrical current passes, which make communication between neurons possible.

Scientists would like to be able to predict, based on a minimal set of experimental data, which combination of ion channels a neuron presents. They know that genes are often expressed together, perhaps because two genes share a common promoter—the stretch of DNA that allows a gene to be transcribed and, ultimately, translated into a functioning protein—or because one gene modifies the activity of another. The expression of certain gene combinations is therefore informative about a neuron’s characteristics, and Georges Khazen and co-workers hypothesised that they could extract rules from gene expression patterns to predict those characteristics.

They took a dataset that Prof Markram and others had collected a few years ago, in which they recorded the expression of 26 genes encoding ion channels in different neuronal types from the rat brain. They also had data classifying those types according to a neuron’s morphology, its electrophysiological properties and its position within the six, anatomically distinct layers of the cortex. They found that, based on the classification data alone, they could predict those previously measured ion channel patterns with 78 per cent accuracy. And when they added in a subset of data about the ion channels to the classification data, as input to their data-mining programme, they were able to boost that accuracy to 87 per cent for the more commonly occurring neuronal types.

"This shows that it is possible to mine rules from a subset of data and use them to complete the dataset informatically," says one of the study’s authors, Felix Schürmann. "Using the methods we have developed, it may not be necessary to measure every single aspect of the behaviour you’re interested in." Once the rules have been validated in similar but independently collected datasets, for example, they could be used to predict the entire complement of ion channels presented by a given neuron, based simply on data about that neuron’s morphology, its electrical behaviour and a few key genes that it expresses.

Researchers could also use such rules to explore the roles of different genes in regulating transcription processes. And importantly, if rules exist for ion channels, they are also likely to exist for other aspects of brain organisation. For example, the researchers believe it will be possible to predict where synapses are likely to form in neuronal networks, based on information about the ratio of neuronal types in that network. Knowledge of such rules could therefore usher in a new era of predictive biology, and accelerate progress towards understanding and modelling the brain.

Provided by Ecole Polytechnique Federale de Lausanne

Source: medicalxpress.com

Apr 15, 2012
#science #neuroscience #brain
Researchers use brain-injury data to map intelligence in the brain

April 10, 2012

Scientists report that they have mapped the physical architecture of intelligence in the brain. Theirs is one of the largest and most comprehensive analyses so far of the brain structures vital to general intelligence and to specific aspects of intellectual functioning, such as verbal comprehension and working memory.


A new study found that specific structures, primarily on the left side of the brain, are vital to general intelligence and executive function (the ability to regulate and control behavior). Brain regions that are associated with general intelligence and executive function are shown in color, with red indicating common areas, orange indicating regions specific to general intelligence, and yellow indicating areas specific to executive function. Credit: Aron Barbey

Their study, published in Brain: A Journal of Neurology, is unique in that it enlisted an extraordinary pool of volunteer participants: 182 Vietnam veterans with highly localized brain damage from penetrating head injuries.

"It’s a significant challenge to find patients (for research) who have brain damage, and even further, it’s very hard to find patients who have focal brain damage," said University of Illinois neuroscience professor Aron Barbey, who led the study. Brain damage – from stroke, for example – often impairs multiple brain areas, he said, complicating the task of identifying the cognitive contributions of specific brain structures.

But the very focal brain injuries analyzed in the study allowed the researchers “to draw inferences about how specific brain structures are necessary for performance,” Barbey said. “By studying how damage to particular brain regions produces specific forms of cognitive impairment, we can map the architecture of the mind, identifying brain structures that are critically important for specific intellectual abilities.”

The researchers took CT scans of the participants’ brains and administered an extensive battery of cognitive tests. They pooled the CT data to produce a collective map of the cortex, which they divided into more than 3,000 three-dimensional units called voxels. By analyzing multiple patients with damage to a particular voxel or cluster of voxels and comparing their cognitive abilities with those of patients in whom the same structures were intact, the researchers were able to identify brain regions essential to specific cognitive functions, and those structures that contribute significantly to intelligence.
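The voxel-wise logic of that comparison can be sketched in miniature. The lesion probabilities, score model, and t-like threshold below are all invented for illustration; the study's statistical treatment of more than 3,000 voxels was considerably richer.

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented lesion data: 100 patients, 50 voxels; damage to any of the three
# "critical" voxels 10-12 lowers the simulated test score by 10 points each.
n_patients, n_voxels = 100, 50
damaged = rng.random((n_patients, n_voxels)) < 0.15  # True = lesion hits voxel
critical = np.zeros(n_voxels, dtype=bool)
critical[10:13] = True
score = (100 - 10 * damaged[:, critical].sum(axis=1)
         + rng.normal(0, 2, n_patients))

# For each voxel, contrast patients with vs without damage there using a
# two-sample t-like statistic.
flagged = []
for v in range(n_voxels):
    hit, spared = score[damaged[:, v]], score[~damaged[:, v]]
    if hit.size < 5:
        continue  # too few lesions at this voxel for a comparison
    se = np.sqrt(hit.var(ddof=1) / hit.size + spared.var(ddof=1) / spared.size)
    if (spared.mean() - hit.mean()) / se > 3:
        flagged.append(v)
print("Voxels flagged as necessary for the task:", sorted(flagged))
```

Voxels where the damaged group scores reliably worse are flagged as necessary for the ability being tested, which is the inference the pooled CT maps allow.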

"We found that general intelligence depends on a remarkably circumscribed neural system," Barbey said. "Several brain regions, and the connections between them, were most important for general intelligence."

These structures are located primarily within the left prefrontal cortex (behind the forehead), left temporal cortex (behind the ear) and left parietal cortex (at the top rear of the head) and in “white matter association tracts” that connect them.

The researchers also found that brain regions for planning, self-control and other aspects of executive function overlap to a significant extent with regions vital to general intelligence.

The study provides new evidence that intelligence relies not on one brain region or even the brain as a whole, Barbey said, but involves specific brain areas working together in a coordinated fashion.

"In fact, the particular regions and connections we found support an emerging body of neuroscience evidence indicating that intelligence depends on the brain’s ability to integrate information from verbal, visual, spatial and executive processes," he said.

The findings will “open the door to further investigations into the biological basis of intelligence, exploring how the brain, genes, nutrition and the environment together interact to shape the development and continued evolution of the remarkable intellectual abilities that make us human,” Barbey said.

Provided by University of Illinois at Urbana-Champaign

Source: medicalxpress.com

Apr 10, 2012
#science #neuroscience #psychology #brain
Dreamless nights: Brain activity during nonrapid eye movement sleep

April 9, 2012 by Stuart Mason Dambrot      

(Medical Xpress) — The link between dreaming and rapid eye movement (REM) sleep is well understood – but the reason consciousness is reduced during nonrapid eye movement (NREM) sleep is not. Recently, scientists at the Cyclotron Research Centre at the University of Liège in Liège, Belgium, the Institut National de la Santé et de la Recherche Médicale at the Université Pierre et Marie Curie in Paris, and the Functional Neuroimaging Unit at the Montreal Geriatrics Institute investigated NREM sleep with the hypothesis that this phenomenon is associated with increased modularity of the brain’s functional activity during these periods. Using functional clustering – which estimates how integration is hierarchically organized within and across the constituent parts of a system – they found that in NREM sleep, hierarchically organized large-scale neural networks disaggregated into smaller independent modules. The researchers concluded that this difference could reduce the ability of the brain to integrate information, thereby accounting for the decreased consciousness experienced during NREM sleep.


(A) The six networks extracted during wakefulness. (B) Levels of brain hierarchical integration. (C) Increases in functional clustering ratio in the brain and the six networks (all significant with a probability >0.95). Networks: dATT, dorsal attentional; DM, default mode; EC, executive control; MOT, sensorimotor; SAL, salience; VIS, visual. Copyright © PNAS, doi: 10.1073/pnas.111113310
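The quantity at the heart of the analysis, integration within modules relative to integration between them, can be sketched under a Gaussian assumption, where the total integration of a set of signals is given by the log-determinant of their correlation matrix. The two-cluster structure and correlation values below are invented; the study's hierarchical analysis of six networks from EEG/fMRI data was far more involved.

```python
import numpy as np

rng = np.random.default_rng(3)

def integration(R):
    """Gaussian multi-information of a correlation matrix: -0.5 * log det R."""
    return -0.5 * np.log(np.linalg.det(R))

def functional_clustering_ratio(R, clusters):
    # Within-cluster integration divided by between-cluster integration.
    within = sum(integration(R[np.ix_(c, c)]) for c in clusters)
    between = integration(R) - within
    return within / between

def simulated_corr(clusters, r_in, r_out, n_regions=10, n_scans=2000):
    # Block correlation structure: r_in inside clusters, r_out across them.
    C = np.full((n_regions, n_regions), r_out)
    for c in clusters:
        C[np.ix_(c, c)] = r_in
    np.fill_diagonal(C, 1.0)
    ts = rng.multivariate_normal(np.zeros(n_regions), C, size=n_scans)
    return np.corrcoef(ts, rowvar=False)

clusters = [list(range(0, 5)), list(range(5, 10))]
R_wake = simulated_corr(clusters, r_in=0.4, r_out=0.3)   # integrated regime
R_sleep = simulated_corr(clusters, r_in=0.5, r_out=0.1)  # modular regime
print(f"FCR awake:  {functional_clustering_ratio(R_wake, clusters):.1f}")
print(f"FCR asleep: {functional_clustering_ratio(R_sleep, clusters):.1f}")
```

In the modular "sleep" regime the clusters exchange little information with each other, so the within-to-between ratio rises sharply, mirroring the increased functional clustering the researchers report.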

Led by Pierre Maquet at the Cyclotron Research Centre and Habib Benali at the Institut National de la Santé et de la Recherche Médicale, the team faced a fundamental challenge in framing their research. Maquet first notes that there is currently no consensus on what consciousness really is, let alone how it arises.

“For many years,” he explains to Medical Xpress, “Giulio Tononi put forward the hypothesis that consciousness is related to the ability of the brain to integrate information. Our objective was simply to test this hypothesis, using novel tools allowing for the computation of information exchange within the brain and a set of EEG/fMRI data recorded in the same individuals during wakefulness and deep NREM sleep.” The latter state, he adds, is arguably the condition associated with the most reduced conscious content in normal human volunteers.

Maquet notes that the team used methods devised by Benali. “These allow us to measure the hierarchical organization of integration – i.e., information. The data themselves,” he continues, “were acquired in Liège. Conducting simultaneous EEG/fMRI recordings in sleeping volunteers is not that easy.” Moreover, he notes, in practice, their findings are only one small step toward a better understanding of consciousness – and, for that matter, unconsciousness.

“The results were rather unexpected in that the amount of information exchanged in the brain actually increased during sleep. However, the patterns of exchange between regions were different than during wakefulness. Essentially, there was an increased information exchange within small clusters of mainly homologous brain areas whereas communication between clusters significantly decreased during sleep.” Thus, he points out, the data support their hypothesis.

The team has already defined the next steps in their research, says Maquet, who acknowledges that fMRI suffers from a rather sluggish signal. “The next step is to apply the methods to EEG, which has a much better time resolution.” He also states that it might be possible to transition to in silico modeling, and that there are attempts in this direction in some laboratories.

A key advantage of the team’s approach was relying on functional clustering rather than so-called total integration in neural network analysis. “This is a big question,” states Maquet. “We don’t know what is the information exchanged within clusters, and I don’t see any technique that could currently answer this question in humans. More generally,” he adds, “it is thought that NREM sleep is regulated by the homeostasis of synaptic strength, and perhaps by neuronal energy metabolism.” These assumptions, he concludes, are being studied in animal models.

Source: medicalxpress.com

Apr 10, 2012
#science #neuroscience #brain #psychology
New finding offers neurological support for Adam Smith's 'theories of morality'

April 9, 2012

The part of the brain we use when engaging in egalitarian behavior may also be linked to a larger sense of morality, researchers have found. Their conclusions, which offer scientific support for Adam Smith’s theories of morality, are based on experimental research published in the latest issue of the Proceedings of the National Academy of Sciences.

The study, coming seven months after the start of the Occupy Wall Street Movement, which has been aimed at addressing income inequality, was conducted by researchers from: New York University’s Wilf Family Department of Politics; the University of Toronto; the University of California, San Diego; the University of California, Davis; and the University of Nebraska, Lincoln.

Previous scholarship has established that two areas of the brain are active when we behave in an egalitarian manner—the ventromedial prefrontal cortex (vmPFC) and the insular cortex, which are two neurological regions previously shown to be related to social preferences such as altruism, reciprocity, fairness, and aversion to inequality. Less clear, however, is how these parts of the brain may also be connected to egalitarian behavior in a group setting.

To explore this possibility, the researchers conducted an experiment in which individuals played a game to gauge brain activity in decision-making. In the “random income game” participants in a group are randomly assigned a level of income and the group is assigned to one of three income distributions. Subjects are shown the income of all members of their group, including their own, on a computer screen. Individuals are then asked if they wish to pay a cost in order to increase or decrease the incomes of group members. Subjects are told they may keep the money they don’t give away to the others shown on their screen, so there is a strong incentive not to part with any of the money already allocated to them. Nonetheless, the researchers found that the study’s subjects frequently sought to reallocate resources so the money was more equally distributed among the group members.
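The incentive structure of the game can be sketched with a toy utility model. The inequality-averse utility below is an assumption for illustration (in the spirit of Fehr-Schmidt-style preferences), not the model used in the study, and the incomes and transfer costs are invented.

```python
# Toy model of the "random income game": a subject may pay to change others'
# incomes. We give the subject a simple inequality-averse utility (an
# illustrative assumption, not the study's model) and check whether paying
# to equalize can be worth it despite the personal cost.

def utility(own, others, alpha=0.8):
    """Own payoff minus a penalty proportional to mean absolute inequality."""
    incomes = [own] + list(others)
    mean = sum(incomes) / len(incomes)
    inequality = sum(abs(x - mean) for x in incomes) / len(incomes)
    return own - alpha * inequality

others = [5, 40]  # the two other group members' incomes
own = 20          # the subject's own income

# Option A: keep everything as allocated.
u_keep = utility(own, others)
# Option B: pay 2 units to move 10 units from the richest member to the
# poorest one (an invented exchange rate).
u_equalize = utility(own - 2, [15, 30])

print(f"utility if keeping: {u_keep:.2f}")
print(f"utility if equalizing: {u_equalize:.2f}")
```

With enough aversion to inequality, paying out of one's own pocket to flatten the distribution yields higher utility, which is consistent with the reallocations the subjects actually made.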

During this period, the researchers gauged the subjects’ neurological activity through functional magnetic resonance imaging (fMRI). As shown in previous studies, the researchers found significant activity in the brain’s vmPFC and insular cortex.

But to get at a more detailed understanding of neurological activity during these behaviors, they also examined whether activations in these areas were associated with two additional measures of egalitarian preferences elicited outside of the fMRI. As part of a survey, subjects were asked their level of agreement or disagreement to six questions, which included: “Our society should do whatever is necessary to make sure that everyone has an equal opportunity to succeed” and “This country would be better off if we worried less about how equal people are.” In addition, subjects completed a series of decision-making tasks asking them to split money with another anonymous person. The choices individuals make in this task are a measure of egalitarian behavior.

The researchers found that these two measures of egalitarian preferences were significantly associated with activations in the insular cortex, but not with the vmPFC.

This particular result is a potentially profound one as the insular cortex is also the part of the brain that processes the relationship of the individual with respect to her or his environment. In other words, egalitarian behavior may not exist in isolation, neurologically speaking, but, rather, be part of a larger process that stems from altruism and a sense of the larger social good.

In The Theory of Moral Sentiments, his 18th-century treatise, Adam Smith expressed this same perspective.

"Adam Smith contended that moral sentiments like egalitarianism derived from a ‘fellow-feeling’ that would increase with our level of sympathy for others, predicting not merely aversion to inequity, but also our propensity to engage in egalitarian behaviors," the researchers wrote. "The evidence here supports such an interpretation—our results suggest that it is the brain mechanisms involved in experiencing the emotional and social states of self and others that appear to be driving egalitarian behaviors. This conclusion is consistent with a broader view of the insular cortex as a neural substrate that processes the relationship of the individual with respect to his or her environment."

Provided by New York University 

Source: medicalxpress.com

Apr 10, 2012 · 3 notes
#science #neuroscience
Scientists Redraw the Blueprint of the Body’s Biological Clock

April 5th, 2012

Discovery of key link between circadian rhythms and metabolism may lead to new therapies for sleep disorders and diabetes.

The discovery of a major gear in the biological clock that tells the body when to sleep and metabolize food may lead to new drugs to treat sleep problems and metabolic disorders, including diabetes.

Scientists at the Salk Institute for Biological Studies, led by Ronald M. Evans, a professor in Salk’s Gene Expression Laboratory, showed that two cellular switches found in the nucleus of mouse cells, known as REV-ERB-α and REV-ERB-β, are essential for maintaining normal sleeping and eating cycles and for metabolism of nutrients from food.

The findings, reported March 29 in Nature, describe a powerful link between circadian rhythms and metabolism and suggest a new avenue for treating disorders of both systems, including jet lag, sleep disorders, obesity and diabetes.

“This fundamentally changes our knowledge about the workings of the circadian clock and how it orchestrates our sleep-wake cycles, when we eat and even the times our bodies metabolize nutrients,” says Evans. “Nuclear receptors can be targeted with drugs, which suggests we might be able to target REV-ERB-α and β to treat disorders of sleep and metabolism.”

Nurses, emergency personnel and others who work shifts that alter the normal 24-hour cycle of waking and sleeping are at much higher risk for a number of diseases, including metabolic disorders such as diabetes. To address this, scientists are trying to understand precisely how the biological clock works and uncover possible targets for drugs that could adjust the circadian rhythm in people with sleep disorders and circadian-associated metabolic disorders.

In mammals, the circadian timing system is orchestrated by a central clock in the brain and subsidiary clocks in most other organs. The master clock in the brain is set by light and determines the overall diurnal or nocturnal preference of an animal, including sleep-wake cycles and feeding behavior.

Scientists knew that two genes, BMAL1 and CLOCK, worked together at the core of the clock’s molecular machinery to activate the network of circadian genes. In this way, BMAL1 acts like the accelerator on a car, activating genes to rev up our physiology each morning so that we are alert, hungry and physically active.

Prior to this work REV-ERB-α and β were thought to play only a minor role in these cycles, possibly working together to slow CLOCK-BMAL1 activity to make minor adjustments to keep the clock running on time.

However, genetic studies of two genes with similar functions can be very difficult and thus the real importance of REV-ERB-α and β remained mysterious.

The Salk scientists got around this hurdle by developing mice in which both genes could be turned off in the liver at any point by giving them an estrogen derivative called tamoxifen. Now mice could develop normally to adulthood, at which point the scientists could turn off REV-ERB-α and REV-ERB-β in their livers—an organ crucial to maintaining the correct balance of sugar and fat in blood—to see what effects it had on circadian rhythms and metabolism.

“When we turned off both receptors, the animal’s biological clocks went haywire,” says Han Cho, first author on the paper and a postdoctoral researcher in Evans’s laboratory. “The mice started running on their exercise wheels when they should have been resting. This suggested REV-ERB-α and REV-ERB-β aren’t an auxiliary system that makes minor adjustments, but an integral part of the clock’s core mechanism. Without them, the clock can’t function properly.”

Digging more deeply into the clockworks, the Salk scientists mapped out the genes that the REV-ERBs control to keep the body operating on the right schedule, finding that they overlap with hundreds of the same genes controlled by CLOCK and BMAL1. This and other findings suggested that the REV-ERBs act as a brake on the genes BMAL1 activates.

“We thought that the core of the clock was an accelerator, and that all REV-ERB-α and REV-ERB-β did was to pull the foot off that pedal,” says Evans. “What we’ve shown is that these receptors act directly as a brake to slow clock activity. Now we’ve got an accelerator and a brake, each equally important in creating the daily rhythm of the clock.”
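The accelerator-and-brake picture can be caricatured in a toy model. This is not the paper's model; the rates and the simple relaxation rule below are assumptions made purely for illustration, but they show why losing the brake (as in the knockout mice) flattens the daily rhythm.

```python
# Toy discrete-time sketch of the accelerator/brake picture described above.
# Entirely hypothetical rates: BMAL1 ("accelerator") drives a clock-controlled
# gene up during the "morning"; REV-ERB ("brake") pushes it down in the
# "evening". Removing the brake leaves trough expression elevated.

def simulate(hours, rev_erb_active=True):
    expression = 0.0
    trace = []
    for t in range(hours):
        bmal1 = 1.0 if t % 24 < 12 else 0.2                          # accelerator
        rev_erb = 0.8 if (rev_erb_active and t % 24 >= 12) else 0.0  # brake
        drive = max(bmal1 - rev_erb, 0.0)
        expression += 0.5 * (drive - expression)  # relax toward the net drive
        trace.append(expression)
    return trace

normal = simulate(48)
knockout = simulate(48, rev_erb_active=False)

# Without the brake, the nightly trough stays elevated: the rhythm flattens.
print(min(normal[24:]), min(knockout[24:]))
```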

The scientists also found that the REV-ERBs control the activity of hundreds of genes involved in metabolism, including those responsible for controlling levels of fats and bile. The mice in which REV-ERB-α and REV-ERB-β were turned off had high levels of fat and sugar in their blood—common problems in people with metabolic disorders.

“This explains how our cellular metabolism is tied to daylight cycles determined by the movements of the sun and the earth,” says Satchidananda Panda, an associate professor in Salk’s Regulatory Biology Laboratory and co-author on the paper. “Now we want to find ways of leveraging this mechanism to fix a person’s metabolic rhythms when they are disrupted by travel, shift work or sleep disorders.”

Source: Neuroscience News

Apr 10, 2012 · 3 notes
#science #neuroscience #brain #psychology
Researchers Show How Embryonic Stem Cells Orchestrate Human Development

April 5th, 2012

Yale researchers show in detail how three genes within human embryonic stem cells regulate development, a finding that increases understanding of how to grow these cells for therapeutic purposes.

This process, described in the April 6 issue of the journal Cell Stem Cell, is different in humans than in mice, highlighting the importance of research using human embryonic stem cells.

“It is difficult to deduce from the mouse how these cells work in humans,” said Natalia Ivanova, assistant professor of genetics in the Yale Stem Cell Center and senior author of the study. “Human networks organize themselves quite differently.”

Embryonic stem cells form soon after conception and are special because each cell can become any type of cell in the body. Cells become increasingly specialized as development progresses, losing the ability to become other cell types — except for the renewal of a few new stem cells. Scientists want to understand the processes of self-renewal and differentiation in order to treat a host of diseases characterized by damaged cells such as Parkinson’s disease, spinal cord injury, heart disease, and Alzheimer’s.

Scientists have identified three genes active in early development — Nanog, Oct 4, and Sox 2 — as essential to maintaining the stem cell’s ability to self-renew and prevent premature differentiation into the “wrong” type of cells. Because of restrictions on the use of human embryonic stem cells, much of the investigation into how these genes work has been done in mice.

The new study shows that human embryonic cells operate in fundamentally different ways in humans than in mouse cells. In humans, for instance, Nanog pairs with Oct 4 to regulate differentiation of so-called neuro-ectoderm cells, a lineage that gives rise to neurons and other central nervous system cells. Sox 2, by contrast, inhibits the differentiation of mesoderm — a lineage that gives rise to muscles and many other tissue types. Oct 4 cooperates with the other genes and is crucial in the regulation of all four early cell lineages: ectoderm, mesoderm, and endoderm — which gives rise to gut and glands such as liver and pancreas — as well as the creation of new stem cells. The self-renewal of stem cells has been implicated in several forms of cancer.

Ivanova stresses that many other genes must be involved in regulation of these early developmental changes, and her lab is investigating that question now.

Source: Neuroscience News

Apr 10, 2012 · 3 notes
#science #neuroscience #brain
Apr 4, 2012 · 67 notes
#brain #science #neuroscience
Therapeutic approach for patients with severe depression

April 4, 2012

Brain pacemakers have a long-term effect in patients with the most severe depression. This has now been proven by scientists from the Bonn University Medical Center. Eleven patients took part in the study over a period of two to five years. A lasting reduction in symptoms of more than 50 percent was seen in nearly half of the subjects. The results are now being presented in the current edition of the journal Neuropsychopharmacology.

People with severe depression are constantly despondent, lacking in drive, withdrawn and no longer feel joy. Most suffer from anxiety and the desire to take their own life. Approximately one out of every five people in Germany suffers from depression in the course of his/her life – sometimes resulting in suicide. People with depression are frequently treated with psychotherapy and medication. “However, many patients are not helped by any therapy,” says Prof. Dr. Thomas E. Schläpfer from the Bonn University Medical Center for Psychiatry and Psychotherapy. “Many spend more than ten years in bed – not because they are tired, but because they have no drive at all and they are unable to get up.”

One possible alternative is “deep brain stimulation,” in which electrodes are implanted in the patient’s brain. The target point is the nucleus accumbens - an area of the brain known as the gratification center. There, a weak electrical current stimulates the nerve cells. Brain pacemakers of this type are often used today by neurosurgeons and neurologists to treat ongoing muscle tremors in Parkinson’s disease.

A 2009 study proved an antidepressive effect

In 2009, the Bonn scientists were able to establish that brain pacemakers also demonstrate an effect in the most severely depressed patients. Ten subjects who underwent implantation of electrodes in the nucleus accumbens all experienced relief of symptoms. Half of the subjects had a particularly noticeable response to the stimulation by the electrodes.

"In the current study, we investigated whether these effects last over the long term or whether the effects of the deep brain stimulation gradually weaken in patients," says Prof. Schläpfer. There are always relapses in the case of psychotherapy or drug treatment. Many patients had already undergone up to 60 treatments with psychotherapy, medications and electroconvulsive therapy, to no avail. "By contrast, in the case of deep brain stimulation, the clinical improvement continues steadily for many years." The scientists observed a total of eleven patients over a period of two to five years. "Those who initially responded to the deep brain stimulation are still responding to it even today," says the Bonn psychiatrist, summarizing the results. During the study, one patient committed suicide. "That is very unfortunate," says Prof. Schläpfer. "However, this cannot always be prevented in the case of patients with very severe depression."

The current study shows that the positive effects last for years

Even after a short amount of time, the study participants demonstrated an improvement in symptoms. “The intensity of the anxiety symptoms decreased and the subjects’ drive improved,” reports the psychiatrist. “After many years of illness, some were even able to work again.” With the current publication, the scientists have now demonstrated that the positive effects do not decrease over a longer period of time. “An improvement in symptoms was recorded for all subjects; for nearly half of the subjects, the extent of the symptoms was more than 50 percent below that of the baseline, even years after the start of treatment,” says Prof. Schläpfer. “There were no serious adverse effects of the therapy recorded.”

The long-term effect is now confirmed with the current study. How precisely the electrical stimulation is able to alter the function of the nucleus accumbens is not yet known. “Research is still needed in this area,” says Prof. Schläpfer. “Using imaging techniques, it was proven that the electrodes actually activate the nucleus accumbens.” The deep brain stimulation method may signify hope for people who suffer from the most severe forms of depressive diseases. “However, it will still take quite a bit of time before this therapeutic method becomes a part of standard clinical practice,” says the Bonn scientist.

Provided by University of Bonn 

Source: medicalxpress.com

Apr 4, 2012 · 18 notes
#science #neuroscience #brain #psychology #depression
Primitive consciousness emerges first as you awaken from anesthesia

April 4, 2012

Awakening from anesthesia is often associated with an initial phase of delirious struggle before the full restoration of awareness and orientation to one’s surroundings. Scientists now know why this may occur: primitive consciousness emerges first. Using brain imaging techniques in healthy volunteers, a team of scientists led by Adjunct Professor Harry Scheinin, M.D. from the University of Turku, Finland in collaboration with investigators from the University of California, Irvine, have now imaged the process of returning consciousness after general anesthesia. The emergence of consciousness was found to be associated with activations of deep, primitive brain structures rather than the evolutionary younger neocortex.


This image shows one returning from oblivion — imaging the neural core of consciousness. Positron emission tomography (PET) findings show that the emergence of consciousness after anesthesia is associated with activation of deep, phylogenetically old brain structures rather than the neocortex. Left: Sagittal (top) and axial (bottom) sections show activation in the anterior cingulate cortex (i), thalamus (ii) and the brainstem (iii) locus coeruleus/parabrachial area overlaid on magnetic resonance image (MRI) slices. Right: Cortical renderings show no evident activations. Credit: Turku PET Center

These results may represent an important step forward in the scientific explanation of human consciousness.

"We expected to see the outer bits of brain, the cerebral cortex (often thought to be the seat of higher human consciousness), would turn back on when consciousness was restored following anesthesia. Surprisingly, that is not what the images showed us. In fact, the central core structures of the more primitive brain structures including the thalamus and parts of the limbic system appeared to become functional first, suggesting that a foundational primitive conscious state must be restored before higher order conscious activity can occur" Scheinin said.

Twenty young healthy volunteers were put under anesthesia in a brain scanner using either of the anesthetic drugs dexmedetomidine or propofol. The subjects were then woken up while brain activity pictures were being taken. Dexmedetomidine is used as a sedative in the intensive care unit setting and propofol is widely used for induction and maintenance of general anesthesia. Dexmedetomidine-induced unconsciousness has a close resemblance to normal physiological sleep, as it can be reversed with mild physical stimulation or loud voices without requiring any change in the dosing of the drug. This unique property was critical to the study design, as it enabled the investigators to separate the brain activity changes associated with the changing level of consciousness from the drug-related effects on the brain. The state-related changes in brain activity were imaged with positron emission tomography (PET).

The emergence of consciousness, as assessed with a motor response to a spoken command, was associated with the activation of a core network involving subcortical and limbic regions that became functionally coupled with parts of frontal and inferior parietal cortices upon awakening from dexmedetomidine-induced unconsciousness. This network thus enabled the subjective awareness of the external world and the capacity to behaviorally express the contents of consciousness through voluntary responses.

Interestingly, the same deep brain structures, i.e. the brain stem, thalamus, hypothalamus and the anterior cingulate cortex, were activated also upon emergence from propofol anesthesia, suggesting a common, drug-independent mechanism of arousal. For both drugs, activations seen upon regaining consciousness were thus mostly localized in deep, phylogenetically old brain structures rather than in the neocortex.

The researchers speculate that because current depth-of-anesthesia monitoring technology is based on cortical electroencephalography (EEG) measurement (i.e., measuring electrical signals on the surface of the scalp that arise from the brain’s cortical surface), their results help to explain why these devices fail in differentiating the conscious and unconscious states and why patient awareness during general anesthesia may not always be detected. The results presented here also add to the current understanding of anesthesia mechanisms and form the foundation for developing more reliable depth-of-anesthesia technology.

The anesthetized brain provides new views into the emergence of consciousness. Anesthetic agents are clinically useful for their remarkable property of being able to manipulate the state of consciousness. When given a sufficient dose of an anesthetic, a person will lose the precious but mysterious capacity of being aware of one’s own self and the surrounding world, and will sink into a state of oblivion. Conversely, when the dose is lightened or wears off, the brain almost magically recreates a subjective sense of being as experience and awareness returns. The ultimate nature of consciousness remains a mystery, but anesthesia offers a unique window for imaging internal brain activity when the subjective phenomenon of consciousness first vanishes and then re-emerges. This study was designed to give the clearest picture so far of the internal brain processes involved in this phenomenon.

The results may also have broader implications. The demonstration of which brain mechanisms are involved in the emergence of the conscious state is an important step forward in the scientific explanation of consciousness. Yet, much harder questions remain. How and why do these neural mechanisms create the subjective feeling of being, the awareness of self and environment, the state of being conscious?

Provided by Academy of Finland

Source: medicalxpress.com

Apr 4, 2012 · 37 notes
#science #neuroscience #brain #psychology
Light Switch Added to Gene Tool Opens New View of Cell Development

ScienceDaily (Apr. 4, 2012) — University of Oregon scientists collaborating with an Oregon company that synthesizes antisense Morpholinos for genetic research have developed a UV light-activated on-off switch for the vital gene-blocking molecule. Based on initial testing in zebrafish embryos, the enhanced molecule promises to deliver new insights for developmental biologists and brain researchers.

The seven-member team describes the advancement in an open-access paper published in the May issue of the journal Development. UO neuroscientist Philip Washbourne, a professor of biology, says the paper is a “proof-of-concept” on an idea he began discussing with scientists at Gene Tools LLC in Philomath, Ore., about four years ago. Gene Tools was founded in the 1980s by James Summerton, who first invented Morpholino oligos. The company holds the exclusive license to distribute these molecules to researchers around the world.

Morpholinos are short-chain, artificially produced oligomers that bind to RNA in cells and block protein synthesis. For a decade, biologists have used them in zebra fish, mice and African clawed toads to study development, but they remained in the active, or on, position. Gene Tools created and introduced a light-sensitive linker, allowing researchers to control the molecule — even leaving one on in one cell and off in an adjacent cell — with a pinpoint UV laser beam.

Researchers in Washbourne’s lab — led by neuroscience research associate Alexandra Tallafuss — were challenged to give the new molecules a test run. They applied them to their work in zebra fish. “Now we can turn them on and off,” Washbourne said. “You can insert them and then manipulate them to learn just when a gene is important, and we learned two things right away.”

Researchers have known that if a gene known as “no tail” is blocked in development, zebra fish fail to grow tails. They now know that the no-tail gene does not need to produce protein for tail formation until about 10 hours, or very late, into an embryo’s development.

Secondly, the researchers looked at the gene sox10, which is vital in the formation of neural crest cells, which give rise to dorsal root ganglion cells — neurons that migrate out of the spinal cord — and pigment cells. “Again, we found that sox10 is not needed as early in development as theorized,” Washbourne said.

"These light-sensitive molecules significantly expand the power and precision of molecular genetic studies in zebrafish," said Robert Riddle, a program director at the National Institute of Neurological Disorders and Stroke (NINDS). "Researchers from many fields will be able to use these tools to explore the function of different genes in embryonic regions, specific cell types and at precise times in an animal’s lifespan."

The NINDS and National Institute of Child Health and Human Development, both at the National Institutes of Health, supported the research through grants to Washbourne and Eisen.

"This successful collaboration between our scientists and this Oregon-based company shows that commercial innovation can come quickly by jointly addressing common needs," said Kimberly Andrews Espy, vice president for research and innovation at the UO. "This is a remarkable example of turning a concept into a working tool that likely will benefit many researchers around the world."

Source: Science Daily

Apr 4, 2012
#science #neuroscience #brain #psychology
Positive Stress Helps Protect Eye from Glaucoma

April 3rd, 2012

Working in mice, scientists at Washington University School of Medicine in St. Louis have devised a treatment that prevents the optic nerve injury that occurs in glaucoma, a neurodegenerative disease that is a leading cause of blindness.

Researchers increased the resistance of optic nerve cells to damage by repeatedly exposing the mice to low levels of oxygen similar to those found at high altitudes. The stress of the intermittent low-oxygen environment induces a protective response called tolerance that makes nerve cells — including those in the eye — less vulnerable to harm.

The study, published online in Molecular Medicine, is the first to show that tolerance induced by preconditioning can protect against a neurodegenerative disease.

Stress is typically thought of as a negative phenomenon, but senior author Jeffrey M. Gidday, PhD, associate professor of neurological surgery and ophthalmology, and others have previously shown that the right kinds of stress, such as exercise and low-oxygen environments, can precondition cells and induce changes that make them more resistant to injury and disease.

Scientists previously thought tolerance in the central nervous system only lasted for a few days. But last year Gidday developed a preconditioning protocol that extended the effects of tolerance from days to months. By exposing mice to hypoxia, or low oxygen concentrations, several times over a two-week period, Gidday and colleagues triggered an extended period of tolerance. After preconditioning ended, the brain was protected from stroke damage for at least 8 weeks.

“Once we discovered tolerance could be extended, we wondered whether this protracted period of injury resistance could also protect against the slow, progressive loss of neurons that characterizes neurodegenerative diseases,” Gidday says.

To find out, Gidday turned to an animal model of glaucoma, a condition linked to increases in the pressure of the fluid that fills the eye. The only treatments for glaucoma are drugs that reduce this pressure; there are no therapies designed to protect the retina and optic nerves from harm.

Scientists classify glaucoma as a neurodegenerative disease based on how slowly and progressively it kills retinal ganglion cells. The bodies of these cells are located in the retina of the eye; their branches or axons come together in bundles and form the optic nerves. Scientists don’t know if damage begins in the bodies or axons of the cells, but as more and more retinal ganglion cells die, patients experience peripheral vision loss and eventually become blind.

For the new study, Yanli Zhu, MD, research instructor in neurosurgery, induced glaucoma in mice by tying off vessels that normally allow fluid to drain from the eye. This causes pressure in the eye to increase. Zhu then assessed how many cell bodies and axons of retinal ganglion cells were intact after three or 10 weeks.

The investigators found that normal mice lost an average of 30 percent of their retinal ganglion cell bodies after 10 weeks of glaucoma. But mice that received the preconditioning before glaucoma-inducing surgery lost only 3 percent of retinal ganglion cell bodies.

“We also showed that preconditioned mice lost significantly fewer retinal ganglion cell axons,” Zhu says.

Gidday is currently investigating which genes are activated or repressed by preconditioning. He hopes to identify the changes in gene activity that make cells resistant to damage.

“Previous research has shown that there are literally hundreds of survival genes built into our DNA that are normally inactive,” Gidday says. “When these genes are activated, the proteins they encode can make cells much less vulnerable to a variety of injuries.”

Identifying specific survival genes should help scientists develop drugs that can activate them, according to Gidday.

Neurologists are currently conducting clinical trials to see if stress-induced tolerance can reduce brain damage after acute injuries like stroke, subarachnoid hemorrhage or trauma.

Gidday hopes his new finding will promote studies of tolerance’s potential usefulness in animal models of Parkinson’s disease, Alzheimer’s disease and other neurodegenerative conditions.

“Neurons in the central nervous system appear to be hard-wired for survival,” Gidday says. “This is one of the first steps in establishing a framework for how we can take advantage of that metaphorical wiring and use positive stress to help treat a variety of neurological diseases.”

Source: Neuroscience News

Apr 4, 2012 · 1 note
#science #neuroscience #brain #vision
Study shows why some pain drugs become less effective over time

April 3, 2012

Researchers at the University of Montreal’s Sainte-Justine Hospital have identified how neural cells like those in our bodies are able to build up resistance to opioid pain drugs within hours. Humans have known about the usefulness of opioids, which are often derived from poppy plants, for centuries, but we have had very little insight into how they lose their effectiveness in the hours, days and weeks following the first dose.

"Our study revealed cellular and molecular mechanisms within our bodies that enables us to develop resistance to this medication, or what scientists call drug tolerance," lead author Dr. Graciela Pineyro explained. "A better understanding of these mechanisms will enable us to design drugs that avoid tolerance and produce longer therapeutic responses."

The research team looked at how drug molecules would interact with molecules called “receptors” that exist in every cell in our body. Receptors, as the name would suggest, receive “signals” from the chemicals that they come into contact with, and the signals then cause the various cells to react in different ways. They sit on the cell membrane and wait for corresponding chemicals known as receptor ligands to interact with them. “Until now, scientists have believed that ligands acted as ‘on-off’ switches for these receptors, all of them producing the same kind of effect with variations in the magnitude of the response they elicit,” Pineyro explained. “We now know that drugs that activate the same receptor do not always produce the same kind of effects in the body, as receptors do not always recognize drugs in the same way. Receptors will configure different drugs into specific signals that will have different effects on the body.”

Pineyro is attempting to tease apart the “painkilling” function of opioids from the part that triggers the mechanisms that enable tolerance to build up. “My laboratory and my work are mostly structured around rational drug design, and trying to define how drugs produce their desired and non-desired effects, so as to avoid the second,” Pineyro said. “If we can understand the chemical mechanisms by which drugs produce therapeutic and undesired side effects, we will be able to design better drugs.”

Once activated by a drug, receptors move from the surface of the cell to its interior, and once they have completed this ‘journey,’ they can either be destroyed or return to the surface to be used again through a process known as “receptor recycling.” By comparing two types of opioids – DPDPE and SNC-80 – the researchers found that the ligands that encouraged recycling produced less analgesic tolerance than those that didn’t. “We propose that the development of opioid ligands that favour recycling could be a way of producing longer-acting opioid analgesics,” Pineyro said.
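The recycling idea lends itself to a back-of-the-envelope sketch. The rates below are hypothetical and the model deliberately crude; it only illustrates the qualitative point that recycling internalized receptors back to the cell surface preserves the responsive pool, while degradation depletes it, which is a stand-in for tolerance.

```python
# Hedged sketch of receptor recycling vs. degradation. Rates are invented;
# only the qualitative contrast matters: a ligand that promotes recycling
# preserves the surface receptor pool far better over repeated dosing steps.

def surface_receptors(steps, recycle_fraction):
    """Return the surface receptor count after `steps` rounds of drug exposure."""
    surface = 100.0
    for _ in range(steps):
        internalized = 0.2 * surface                 # drug drives receptors inward
        surface -= internalized
        surface += recycle_fraction * internalized   # recycled pool returns
        # the non-recycled remainder is degraded and lost
    return surface

high_recycling = surface_receptors(20, recycle_fraction=0.9)
low_recycling = surface_receptors(20, recycle_fraction=0.1)

print(round(high_recycling, 1), round(low_recycling, 1))
# → 66.8 1.9
```

With most receptors recycled, two-thirds of the surface pool survives 20 exposure rounds; with most degraded, almost nothing does, so the same dose produces a far weaker response.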

Provided by University of Montreal

Source: medicalxpress.com

Apr 4, 2012 · 4 notes
#science #neuroscience #brain #psychology
Early warning system for seizures could cut false alarms

April 3, 2012

Epilepsy affects 50 million people worldwide, but in a third of these cases, medication cannot keep seizures from occurring. One solution is to shoot a short pulse of electricity to the brain to stamp out the seizure just as it begins to erupt. But brain implants designed to do this have run into a stubborn problem: too many false alarms, triggering unneeded treatment. To solve this, Johns Hopkins biomedical engineers have devised new seizure detection software that, in early testing, significantly cuts the number of unneeded pulses of current that an epilepsy patient would receive.


Sridevi Sarma’s research focuses on a system with three components: electrodes implanted in the brain, which are connected by wires to a neurostimulator or battery pack, and a sensing device, also located in the brain implant, which detects when a seizure is starting and activates the current to stop it. Credit: Greg Stanley/JHU

Sridevi V. Sarma, an assistant professor of biomedical engineering, is leading this effort to improve anti-seizure technology that sends small amounts of current into the brain to control seizures.

"These devices use algorithms — a series of mathematical steps — to figure out when to administer the treatment," Sarma said. "They’re very good at detecting when a seizure is about to happen, but they also produce lots of false positives, sometimes hundreds in one day. If you introduce electric current to the brain too often, we don’t know what the health impacts might be. Also, too many false alarms can shorten the life of the battery that powers the device, which must be replaced surgically."
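
The study does not publish the detection algorithm itself, but the trade-off Sarma describes can be illustrated with a generic, hypothetical sketch: a naive energy-threshold detector fires on any brief artifact, while requiring the signal to stay above threshold across consecutive windows (a simple persistence rule) suppresses one-off spikes. All names, thresholds, and signal values here are invented for illustration; this is not the software from the study.

```python
# Hypothetical sketch of window-based seizure detection, illustrating why
# naive threshold detectors produce false alarms and how a persistence
# rule can reduce them. Not the algorithm from the Epilepsy & Behavior study.

def detect_events(signal, window=8, threshold=4.0):
    """Flag each non-overlapping window whose mean squared amplitude
    (a crude 'energy' measure) exceeds the threshold."""
    alarms = []
    for start in range(0, len(signal) - window + 1, window):
        chunk = signal[start:start + window]
        energy = sum(x * x for x in chunk) / window
        if energy > threshold:
            alarms.append(start)
    return alarms

def detect_persistent(signal, window=8, threshold=4.0):
    """Keep only alarms where the *next* window is also above threshold,
    so brief artifacts confined to a single window are discarded."""
    alarms = set(detect_events(signal, window, threshold))
    return sorted(a for a in alarms if a + window in alarms)

# Synthetic trace: quiet baseline, one brief noise spike, quiet again,
# then a sustained seizure-like burst spanning two windows.
baseline = [0.5, -0.4, 0.3, -0.5, 0.4, -0.3, 0.5, -0.4]
spike    = [0.4, -0.3, 6.0, -0.4, 0.3, -0.5, 0.4, -0.3]   # single-sample artifact
burst    = [3.0, -3.2, 3.1, -2.9, 3.3, -3.0, 3.1, -3.2] * 2
signal = baseline + spike + baseline + burst

print(detect_events(signal))      # [8, 24, 32] — the artifact trips an alarm
print(detect_persistent(signal))  # [24] — only the sustained burst survives
```

The persistence rule buys fewer false alarms at the cost of detection latency (one extra window before the alarm fires), which is exactly the kind of trade-off a responsive implanted device must balance.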

Her new software was tested on real-time brain activity recordings collected from four patients with drug-resistant epilepsy who experienced seizures while being monitored. In a study published recently in the journal Epilepsy & Behavior, Sarma’s team reported that its system yielded superior results, including flawless detection of actual seizures and up to 80 percent fewer alarms when a seizure was not occurring. Although the testing was not conducted on patients in a clinical setting, the results were promising.


Apr 4, 2012
#science #neuroscience #brain #psychology #epilepsy
Activity in Brain Networks Related to Features of Depression

ScienceDaily (Apr. 3, 2012) — Depressed individuals with a tendency to ruminate on negative thoughts, i.e. to repeatedly think about particular negative thoughts or memories, show different patterns of brain network activation compared to healthy individuals, scientists report in a new study in Biological Psychiatry.

The risk for depression is increased in individuals with a tendency toward negative rumination, but patterns of autobiographical memory may also be predictive of depression.

When asked to recall specific events, some individuals have a tendency to recall broader categories of events instead of specific events. This is termed overgeneral memory and, like those who tend to ruminate, these individuals also have a higher risk of developing depression.

These self-referential activities engage a network of brain regions called the default mode network, or DMN. Prior studies using imaging techniques have already shown that the DMN activates abnormally in individuals with depression, but the relationship between DMN activity and depressive ruminations was not clear.

In this new report, Dr. Shuqiao Yao of Central South University in Hunan, China and colleagues evaluated DMN functional connectivity in untreated young adults experiencing their first episode of major depression and healthy volunteers. Each participant underwent a brain scan and completed tests to measure their levels of rumination and overgeneral memory.

As expected, the depressed patients exhibited higher levels of rumination and overgeneral memory than did the control subjects. The researchers also observed increased functional connectivity in the anterior medial cortex regions and decreased functional connectivity in the posterior medial cortex regions in depressed patients compared with control subjects.

Among the depressed subjects, an interesting pattern of dissociation emerged. The increased connectivity in anterior regions was positively associated with rumination, while the decreased connectivity in posterior regions was negatively associated with overgeneral memory.

Dr. Yao commented on the importance of these findings: “In the future, resting-state network activity in the brain will provide useful models for investigating network features of cognitive dysfunction in psychopathology.”

"As we dig deeper in brain imaging studies, we are becoming increasingly interested in the activity of brain circuits rather than single brain regions," said Dr. John Krystal, Editor of Biological Psychiatry. “Although it is a more complicated process, studying brain circuits may provide greater insight into symptoms, such as depressive ruminations. The current study nicely illustrates how altered activity at different sites within a brain network may be related to different features of depression.”

Source: Science Daily

Apr 3, 2012
#science #neuroscience #brain #psychology #depression
Northwestern study compares endovascular brain aneurysm repair devices

April 3, 2012

Approximately 6 million Americans have brain aneurysms, a condition that occurs when a weak or thin spot develops on a blood vessel in the brain, causing it to balloon. Often, these do not cause symptoms and go undetected, but every year an estimated 30,000 Americans experience a ruptured aneurysm that bleeds into the brain, causing a life-threatening injury. Immediate medical treatment is necessary to prevent stroke, nerve damage or death, and includes surgery or coiling. Coiling is an approach that blocks blood flow to the aneurysm by filling it with platinum coils. While less invasive than surgery, the likelihood of future aneurysm recurrence and subsequent treatment is higher with coiling. In an effort to lower the risk for repeat aneurysm treatment after coiling, Northwestern Medicine researchers are examining a new type of gel-coated coil to determine if it is more effective than the standard bare coils in preventing aneurysm recurrence.

Aneurysms can be a very serious health threat, according to Bernard R. Bendok, MD, a neurosurgeon at Northwestern Memorial Hospital, who is the principal investigator for the new generation Hydrogel Endovascular Aneurysm Treatment Trial (HEAT). “When an aneurysm needs treatment, it is important to perform the safest, most effective and most durable treatment. This clinical research trial, called HEAT, will help us determine whether bare platinum coils, which have been used for years, or the newer gel-coated coils are more effective long-term,” said Bendok, who is also an associate professor of neurological surgery and radiology at Northwestern University Feinberg School of Medicine.

Coiling involves inserting a catheter into an artery and threading it through the body using live x-rays as a guide to the site of the aneurysm. Coils are passed through the catheter and released into the aneurysm filling it to block blood from entering. Blood clots then form around the coil preventing the vessels from rupturing or leaking and destroying the aneurysm.

"Coils are not always able to fill the aneurysm completely, which leaves dead space in the aneurysm. This space has been associated with a higher rate of aneurysm recurrence," explained Bendok. "The new coils are made with platinum and a hydrogel that expands over time to eliminate the space between the coils, potentially limiting the need for future treatment."

HEAT is an international randomized study that seeks to determine how the gel-coated coils measure up to the standard option in preventing future aneurysm recurrence. Northwestern is the lead site for the trial. Patients may be eligible for the trial if they are between the ages of 18 and 75 years with aneurysms 3 to 14 mm in size that are amenable to coiling. An estimated 30 sites around the world are expected to join the trial, which has an enrollment goal of 600 participants.

Aneurysms affect about one percent of the adult population. Understanding symptoms and risk factors can be potentially lifesaving. Small aneurysms may not be associated with symptoms, but a larger, growing aneurysm may cause pressure on tissues and nerves, leading to symptoms including headache, pain above and behind the eye, a dilated pupil, double vision, and weakness, numbness or paralysis on one side of the face or body.

"In many cases, brain aneurysms remain silent until there’s a major problem," said Bendok. "Most are not found until they rupture or are found incidentally on brain images taken to assess another condition. The number one sign to look for is a sudden and extremely severe headache. If this occurs, one should seek immediate medical attention."

Other indicators that a person may have a ruptured aneurysm include double vision, nausea, vomiting, stroke-like symptoms, stiff neck, loss of consciousness and in some cases, seizure and changes in memory. Risk factors include hypertension, alcohol and drug abuse, and smoking. Aneurysms can be influenced by genetic factors and family history may be an indication for screening. People with certain hereditary diseases including connective tissue disorders or polycystic kidney disease can have a higher occurrence. Other associations include arteriovenous malformation (AVM) and blockage of certain blood vessels in the brain. Women are more likely than men to have brain aneurysms. It’s estimated about 10 in every 100,000 people will experience a ruptured aneurysm each year.

"Brain aneurysm rupture can be very devastating," said H. Hunt Batjer, MD, chairman of the department of neurological surgery at Northwestern Memorial and Michael J. Marchese Professor of neurological surgery at the Feinberg School. "It’s important to know what to look for and who might be at increased risk for aneurysm disease. While current treatments are effective, trials like HEAT have the potential to advance the art and science of brain aneurysm treatment and lead to even better treatment options in the future."

Provided by Northwestern Memorial Hospital

Source: medicalxpress.com

Apr 3, 2012
#science #neuroscience #brain #psychology