Neuroscience

Articles and news from the latest research reports.

155 notes

2 dimensions of value: Dopamine neurons represent reward but not aversiveness
To make decisions, we need to estimate the value of sensory stimuli and motor actions, their “goodness” and “badness.” We can imagine that good and bad are two ends of a single continuum, or dimension, of value. This would be analogous to the single dimension of light intensity, which ranges from dark on one end to bright light on the other, with many shades of gray in between. Past models of behavior and learning have been based on a single continuum of value, and it has been proposed that a particular group of neurons (brain cells) that use dopamine as a neurotransmitter (chemical messenger) represents this single dimension of value, signaling both good and bad.

The experiments reported here show that dopamine neurons are sensitive to the value of reward but not punishment (such as the aversiveness of a bitter taste). This demonstrates that reward and aversiveness are represented as two distinct dimensions (or categories) in the brain. “Reward” refers to the category of good things (food, water, sex, money, etc.), and “punishment” to the category of bad things (stimuli associated with bodily harm that cause pain or other unpleasant sensations or emotions).

Rather than having one neurotransmitter (dopamine) represent a single dimension of value, the present results imply the existence of four neurotransmitters to represent two dimensions of value. Dopamine signals evidence for reward (“gains”), and some other neurotransmitter presumably signals evidence against reward (“losses”). Likewise, there should be a neurotransmitter for evidence of danger and another for evidence of safety. Interestingly, three other neurotransmitters are analogous to dopamine in many respects (serotonin, norepinephrine, and acetylcholine), and it is possible that they represent the other three value signals.
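The difference between the two coding schemes can be sketched with a toy prediction-error update. This is a minimal illustration of the one-dimension versus two-dimension idea, not a model from the study; every function name and number below is an assumption:

```python
# Toy contrast between a one-dimensional value code and a
# two-dimensional (reward vs. aversiveness) code.
# Illustrative sketch only; not the study's actual model.

def one_dim_update(value, outcome, lr=0.1):
    # Single continuum: good (+) and bad (-) share one scalar estimate.
    return value + lr * (outcome - value)

def two_dim_update(reward_est, aversive_est, reward, aversion, lr=0.1):
    # Separate channels: each dimension tracks only its own evidence,
    # consistent with dopamine responding to reward but not punishment.
    reward_est += lr * (reward - reward_est)
    aversive_est += lr * (aversion - aversive_est)
    return reward_est, aversive_est

# Repeatedly experiencing a bitter taste (aversive, zero reward):
v = 0.0
for _ in range(20):
    v = one_dim_update(v, outcome=-1.0)  # badness becomes "negative value"

r, a = 0.0, 0.0
for _ in range(20):
    r, a = two_dim_update(r, a, reward=0.0, aversion=1.0)
# r stays at 0 (no reward evidence), while a rises toward 1 (danger evidence)
```

In the two-channel scheme, a dopamine-like signal (`r`) is untouched by punishment, which is the pattern the experiments report.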

Filed under neurons neurotransmitters dopamine reward-punishment neuroscience science

78 notes

Burnt-sugar derivative reduces muscle wasting in fly and mouse models of muscular dystrophy
A trace substance in caramelized sugar, when purified and given in appropriate doses, improves muscle regeneration in a mouse model of Duchenne muscular dystrophy. The findings are published today (Aug. 1) in the journal Skeletal Muscle.

Morayma Reyes, professor of pathology and laboratory medicine, and Hannele Ruohola-Baker, professor of biochemistry and associate director of the Institute for Stem Cell and Regenerative Medicine, headed the University of Washington team that made the discovery. The first authors of the paper were Nicholas Ieronimakis, UW Department of Pathology; and Mario Pantoja, UW Department of Biochemistry.

They explained that the mice in their study, like boys with the gender-linked inherited disorder, are missing the gene that produces dystrophin, a muscle-repair protein. Neither the mice nor the affected boys can replace enough of their routinely lost muscle cells. In people, muscle weakness begins when the boys are toddlers, and progresses until, as teens, they can no longer walk unaided. During early adulthood, their heart and respiratory muscles weaken. Even with ventilators to assist breathing, death usually ensues before age 30. No cure or satisfactory treatment is available. Prednisone drugs relieve some symptoms, but at the cost of severe side effects.

The disabling, then lethal, nature of the rare disease in young men presses scientists to search for better therapeutic agents. Reyes and Ruohola-Baker are seeking ways to suppress the disorder’s characteristic functional and structural muscle defects.

Ruohola-Baker’s lab originally identified the sphingosine 1-phosphate (S1P) pathway as a critical player in ameliorating muscular dystrophy in flies. Her lab did this through a large genetic suppressor screen using the fruit fly, Drosophila melanogaster. Sphingosine 1-phosphate is found in the cells of most living beings from yeasts to mammals. Named after the enigmatic sphinx, this cell signal is important in many activities of living cells, from migration to proliferation. The multi-talented, bioactive lipid is essential, Reyes said, in turning stem cells into specific types of cells, in regenerating damaged tissue, and in inhibiting cell death. Without cell receptors for sphingosine 1-phosphate, an embryo would fail to develop.

Other scientists had observed that levels of sphingosine 1-phosphate are lower in the muscles of mice with the muscular dystrophy mutation, and that certain cell repair pathways involving this signal are impaired. However, sphingosine 1-phosphate couldn’t be administered as a drug because it is rapidly used up.

Instead, Reyes and Ruohola-Baker sought to prevent the sphingosine 1-phosphate occurring naturally in the body from degrading. A fruit fly model of Duchenne muscular dystrophy allowed Ruohola-Baker’s lab to rapidly score small molecule therapy candidates for raising the level of sphingosine 1-phosphate. Flies with the genetic defect act normally after they hatch and fly around, but in a few weeks, due to muscle degeneration, they are flightless. By using insect activity monitors, the scientists assessed the effects of drug and gene therapy candidates on the flies’ ability to move.
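A screen like this ultimately boils down to comparing movement scores between treated and untreated flies. The sketch below shows the general shape of such a comparison; the function names and counts are invented for illustration, not the lab's actual pipeline:

```python
# Hypothetical sketch of scoring flies' movement from activity-monitor
# counts. All names and numbers are illustrative assumptions.

def activity_score(beam_crossings, days):
    """Mean daily beam crossings; degenerating flies trend toward zero."""
    return sum(beam_crossings) / days

def improves_mobility(untreated, treated, days):
    # A candidate "passes" the screen if treated dystrophic flies
    # retain more activity than untreated ones over the same period.
    return activity_score(treated, days) > activity_score(untreated, days)

# Illustrative daily counts over 14 days: untreated flies lose mobility,
# treated flies decline more slowly.
untreated = [50, 42, 35, 30, 22, 15, 10, 8, 5, 3, 2, 1, 0, 0]
treated   = [52, 48, 45, 44, 40, 38, 35, 33, 30, 28, 27, 25, 24, 22]
print(improves_mobility(untreated, treated, days=14))  # True under these numbers
```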

This screening tool led to the discovery that a small molecule with a long name, 2-acetyl-4(5)-tetrahydroxybutyl imidazole, or THI for short, blocks an enzyme that breaks down sphingosine 1-phosphate.

“It’s interesting to note that THI is a trace component of Caramel Color III, which the U.S. Food and Drug Administration categorizes as ‘generally recognized as safe’,” said Reyes. The substance is also found in very tiny amounts in burnt sugar, brown sugar, beer, cola and some candies.

The researchers added a purified, concentrated form of THI to the food of young flies with the muscular dystrophy-like mutation. They confirmed that the THI alleviated muscle wasting in the flies. A few other drugs, including a THI derivative and an unrelated drug now in clinical trials for rheumatoid arthritis, also showed beneficial effects in fruit flies.

The study of THI then switched from insects to mammals. Reyes’ lab began by treating old dystrophic mice with direct injections of THI. Later, the researchers simply added the compound to the drinking water in the habitats of young dystrophic mice. These mice were comparable in developmental stage to human teens who carry the muscular dystrophy genetic variation.

“We observed that treatment with THI significantly increased muscle fiber size and muscle-specific force in our affected mice,” Reyes said. “We also saw that other hallmarks of impaired muscle regeneration – fat deposits and fibrosis [scar tissue] accumulation – were also lower in the THI-treated mice.”

The research team linked the desired regenerative effects in the mice to the response of muscle-forming cells and the subsequent regrowth of muscle fibers. A type of sphingosine 1-phosphate, and cell receptors for it, also were observed in the cells in the regenerating muscle fibers. The researchers proposed that sphingosine 1-phosphate turned up the dial on the regulators for the biochemical pathways that mediate skeletal muscle mass and muscle function.

Now that they have shown proof-of-concept, the researchers hope to conduct additional animal studies on THI and other compounds that protect the body’s supply of sphingosine 1-phosphate necessary for muscle cell regeneration. If THI continues to show promise as a nutraceutical or food-based drug, medical scientists will head into pre-clinical studies of effectiveness and safety before advancing to human trials. In addition to muscular dystrophy treatment research, similar studies might also be conducted in the future on loss of muscle strength during normal or accelerated aging.

While excited about the preliminary findings, the scientists cautioned that they are still at the very earliest stages of research, and that much more work needs to be done before any conclusions can be drawn about the potential of THI as a muscular dystrophy treatment.

Filed under muscular dystrophy duchenne muscular dystrophy dystrophin genetics neuroscience science

45 notes

Speedier scans reveal new distinctions in resting and active brain

A boost in the speed of brain scans is unveiling new insights into how brain regions work with each other in cooperative groups called networks.

Scientists at Washington University School of Medicine in St. Louis and the Institute of Technology and Advanced Biomedical Imaging at the University of Chieti, Italy, used the quicker scans to track brain activity in volunteers at rest and while they watched a movie.

“Brain activity occurs in waves that repeat as slowly as once every 10 seconds or as rapidly as once every 50 milliseconds,” said senior researcher Maurizio Corbetta, MD, the Norman J. Stupp Professor of Neurology. “This is our first look at these networks where we could sample activity every 50 milliseconds, as well as track slower activity fluctuations that are more similar to those observed with functional magnetic resonance imaging (fMRI). This analysis performed at rest and while watching a movie provides some interesting and novel insights into how these networks are configured in resting and active brains.”

Understanding how brain networks function is important for better diagnosis and treatment of brain injuries, according to Corbetta.

The study appears online in Neuron.

Researchers know of several resting-state brain networks, which are groups of different brain regions whose activity levels rise and fall in sync when the brain is at rest. Scientists used fMRI to locate and characterize these networks, but the relative slowness of this approach limited their observations to activity that changes every 10 seconds or so. A surprising result from fMRI was that the spatial pattern of activity (or topography) of these brain networks is similar at rest and during tasks.

In contrast, a faster technology called magnetoencephalography (MEG) can detect activity at the millisecond level, letting scientists examine waves of activity in frequencies from slow (0.1-4 cycles per second) to fast (greater than 50 cycles per second).

“Interestingly, even when we looked at much higher temporal resolution, brain networks appear to fluctuate on a relatively slow time scale,” said first author Viviana Betti, PhD, a postdoctoral researcher at Chieti. “However, when the subjects went from resting to watching a movie, the networks appeared to shift the frequency channels in which they operate, suggesting that the brain uses different frequencies for rest and task, much like a radio.”
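The “frequency channels” idea can be illustrated by restricting two signals to a band and asking whether their band-limited power fluctuates together, which is roughly what MEG correlation analyses measure. This numpy-only sketch uses synthetic signals; the band edges, window length and signal construction are arbitrary choices for illustration, not the study's parameters:

```python
# Sketch: correlate band-limited power between two synthetic signals.
# Band edges, window length and signals are illustrative choices,
# not the study's actual analysis parameters.
import numpy as np

def band_power(x, fs, lo, hi, win=200):
    """Average power of x within [lo, hi] Hz, in consecutive windows."""
    X = np.fft.rfft(x)
    f = np.fft.rfftfreq(len(x), 1 / fs)
    X[(f < lo) | (f > hi)] = 0              # keep only in-band components
    xb = np.fft.irfft(X, n=len(x))          # band-limited signal
    p = xb ** 2
    return p[: len(p) // win * win].reshape(-1, win).mean(axis=1)

fs = 1000                                   # 1 kHz sampling
t = np.arange(0, 10, 1 / fs)                # 10 s of data
rng = np.random.default_rng(0)
env = 1 + 0.5 * np.sin(2 * np.pi * 0.2 * t) # shared slow (0.2 Hz) modulation

# Two "regions" carrying a 60 Hz rhythm whose amplitude waxes and wanes together:
a = env * np.sin(2 * np.pi * 60 * t) + 0.1 * rng.standard_normal(t.size)
b = env * np.sin(2 * np.pi * 60 * t + 1.0) + 0.1 * rng.standard_normal(t.size)

pa = band_power(a, fs, 50, 70)              # gamma-band power envelopes
pb = band_power(b, fs, 50, 70)
r = np.corrcoef(pa, pb)[0, 1]               # high: the envelopes covary
```

Slow covariation of band-limited power, as between `pa` and `pb` here, is what lets a fast technique like MEG characterize network correlations separately in each frequency band.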

In the study, the scientists asked one group of volunteers to either rest or watch the movie during brain scans. A second group was asked to watch the movie and look for event boundaries, moments when the plot, characters or other elements of the story changed. They pushed a button when they noticed these changes.

As in previous studies, most subjects recognized similar event boundaries in the movie. The MEG scans showed that communication between brain regions was altered near these boundaries, especially within networks in the visual cortex.

“This gives us a hint of how cognitive activity dynamically changes the resting-state networks,” Corbetta said. “Activity locks and unlocks in these networks depending on how the task unfolds. Future studies will need to track resting-state networks in different tasks to see how correlated activity is dynamically coordinated across the brain.”

(Source: news.wustl.edu)

Filed under brain injury brain mapping neuroimaging brain networks brain activity neuroscience science

128 notes

New Insight Into How Brain ‘Learns’ Cocaine Addiction
A team of researchers says it has solved the longstanding puzzle of why a key protein linked to learning is also needed to become addicted to cocaine. Results of the study, published in the Aug. 1 issue of the journal Cell, describe how the learning-related protein works with other proteins to forge new pathways in the brain in response to a drug-induced rush of the “pleasure” molecule dopamine. By adding important detail to the process of addiction, the researchers, led by a group at Johns Hopkins, say the work may point the way to new treatments.

“The broad question was why and how cocaine strengthened certain circuits in the brain long term, effectively re-wiring the brain for addiction,” says Paul Worley, M.D., a professor in the Solomon H. Snyder Department of Neuroscience at the Johns Hopkins University School of Medicine. “What we found in this study was how two very different types of systems in the brain work together to make that happen.” Cocaine addiction, experts say, is among the strongest of addictions.

Worley did not come to the problem as an addiction researcher, but as an expert in a group of genes known as immediate early genes, which rapidly ramp up production in neurons when the brain is exposed to new information. In 2001, he said, a European group led by François Conquet of GlaxoSmithKline reported that deleting mGluR5, a protein complex that responds to the common brain-signaling molecule glutamate, made mice unresponsive to cocaine. “That finding came out of the blue,” says Worley, who knew mGluR proteins for their interactions with immediate early genes. “I never would have thought this type of protein was linked to dopamine and addiction, because the functions for it that we knew about up to that point were completely unrelated. That’s what scientists love: when you’re pretty sure something is right, but you don’t have a clue why.”

The finding set Worley’s research group on a long search for an explanation. Eventually, in addition to studying the effects of altering genes for the relevant proteins in mice, they partnered with experts in measuring the brain’s electrical signals and in a biophysical technique that detects when chemical bonds are rotated within protein molecules. Using different types of experiments, they pieced together a complex story of how dopamine released in response to cocaine works together with mGluR5 and immediate early genes to switch cells into synapse-strengthening mode.

“The process we identified explains how cocaine exposure can co-opt normal mechanisms of learning to induce addiction,” Worley says. Knowing the details of the mechanism may help researchers identify targets for potential drugs to treat addiction, he adds.
(Image: Milos Jokic)

Filed under addiction cocaine addiction dopamine glutamate neuroplasticity synapses neuroscience science

71 notes

Brain chemistry changes in children with autism offer clues to earlier detection and intervention
Between ages 3 and 10, children with autism spectrum disorder exhibit distinct brain chemical changes that differ from children with developmental delays and those with typical development, according to a new study led by University of Washington researchers.

The finding that early brain chemical alterations tend to normalize during the course of development in children with ASD gives new insight into efforts to improve early detection and intervention. The findings were reported July 31 in the Journal of the American Medical Association Psychiatry.

“In autism, we found a pattern of early chemical alterations at the cellular level that over time resolved – a pattern similar to what others have seen with people who have had a closed head injury and then got better,” said Stephen R. Dager, a UW professor of radiology and adjunct professor of bioengineering and associate director of UW’s Center on Human Development and Disability.

Neva Corrigan, a senior research fellow in radiology, was first author, and Dager was corresponding author of the study, titled “Atypical Developmental Patterns of Brain Chemistry in Children with Autism Spectrum Disorder.”

“The brain developmental abnormalities we observed in the children with autism are dynamic, not static. These early chemical alterations may hold clues as to specific processes at play in the disorder and, even more exciting, these changes may hold clues to reversing these processes,” Dager said.

In the study, scientists compared brain chemistry among three groups of children: those with a diagnosis of ASD, those with a diagnosis of developmental delay, and those considered typically developing. The researchers used magnetic resonance spectroscopic imaging, a type of MRI, to measure tissue-based chemicals in three age groups: 3-4 years, 6-7 years and 9-10 years.

One of the chemicals measured, N-acetylaspartate (NAA), is thought to play an important role in regulating synaptic connections and myelination. Its levels are decreased in people with conditions such as Alzheimer’s, traumatic brain injury or stroke. Other chemicals examined in the study – choline, creatine, glutamine/glutamate and myo-inositol – help characterize brain tissue integrity and bioenergetic status.

A notable finding concerned changes in gray matter NAA concentration: In scans of the 3- to 4-year-olds, NAA concentrations were low in both the ASD and developmentally delayed groups. By 9 to 10 years, NAA levels in the children with ASD had caught up to the levels of the typically developing group, while low levels of NAA persisted in the developmentally delayed group.
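The pattern being described is a trajectory across the three age bins rather than a single low reading. The sketch below encodes that logic with invented, illustrative NAA values, not the study's measurements:

```python
# Hypothetical sketch: classify developmental trajectories of gray-matter
# NAA. The numbers are invented for illustration, not study data.
naa = {  # NAA relative to typically developing peers, by age bin
    "typical": {"3-4": 1.00, "6-7": 1.00, "9-10": 1.00},
    "asd":     {"3-4": 0.85, "6-7": 0.93, "9-10": 1.00},  # low, then catches up
    "delay":   {"3-4": 0.85, "6-7": 0.86, "9-10": 0.85},  # stays low
}

def trajectory(group, low=0.95):
    early = naa[group]["3-4"] < low
    late = naa[group]["9-10"] < low
    if early and not late:
        return "normalizes"        # the ASD pattern reported here
    if early and late:
        return "persistently low"  # the developmental-delay pattern
    return "typical"

print(trajectory("asd"))  # normalizes
```

Classifying by trajectory rather than by a single time point is what lets the study separate ASD from developmental delay even though both groups start out with similarly low NAA.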
“A substantial number of kids with early, severe autism symptoms make tremendous improvements. We’re only measuring part of the iceberg, but this is a glimmer that we might be able to find a more specific period of vulnerability that we can measure and learn how to do something more proactively,” said Annette Estes, a co-author of the study and director of the UW Autism Center. She is an associate professor of speech and hearing sciences.

Study co-author Dennis Shaw, a UW professor of radiology and director of MRI at Seattle Children’s, observed that the findings “parallel some of the early brain structural differences we and others have found on MRI that also appear to normalize over time in children with autism. These chemical findings will help to better establish the timing and mechanisms underlying genetic abnormalities known to be involved in at least some cases of autism.”

Dager and UW colleagues are currently using more advanced MRI methods to study infants at risk for ASD because of an older sibling with autism.

“We’re looking prospectively at these children starting at 6 months to determine if we can detect very early alterations in brain cell signaling or related cellular disruption that may precede early, subtle clinical symptoms of ASD.”

Despite the encouraging finding, science has yet to pinpoint the when, what and why of autism’s inception, an event often likened to the flipping of a switch. Discovering the earliest period that a child’s brain starts to develop a profile of ASD is crucial because, as the study acknowledged, “even a relatively brief period of abnormal signaling between glial cells and neurons during early development would likely have a lasting effect” on how a child’s brain network develops.

This study also suggests that developmental delay and autism spectrum disorder are distinct disorders having different underlying brain mechanisms and treatment considerations, Dager said.

“Autism appears to have a different pathophysiology and different early biological course than idiopathic developmental disorder. There are differences in their underlying biological processes; this supports the notion that ASD is different from developmental delay and challenges the notion that the increasing prevalence of autism merely reflects a re-categorization of symptoms between autism and intellectual disabilities.”


Dager and UW colleagues are currently using more advanced MRI methods to study infants at risk for ASD because of an older sibling with autism.

“We’re looking prospectively at these children starting at 6 months to determine if we can detect very early alterations in brain cell signaling or related cellular disruption that may precede early, subtle clinical symptoms of ASD.”

Despite the encouraging finding, science has yet to pinpoint the when, what and why of autism’s inception, an event often likened to the flipping of a switch. Discovering the earliest period that a child’s brain starts to develop a profile of ASD is crucial because, as the study acknowledged, “even a relatively brief period of abnormal signaling between glial cells and neurons during early development would likely have a lasting effect” on how a child’s brain network develops.

This study also suggests that developmental delay and autism spectrum disorder are distinct disorders having different underlying brain mechanisms and treatment considerations, Dager said.

“Autism appears to have a different pathophysiology and different early biological course than idiopathic developmental disorder. There are differences in their underlying biological processes; this supports the notion that ASD is different from developmental delay and challenges the notion that the increasing prevalence of autism merely reflects a re-categorization of symptoms between autism and intellectual disabilities.”

Filed under autism ASD choline neurodevelopmental disorders neuroimaging neuroscience science

123 notes

Stray prenatal gene network suspected in schizophrenia

Researchers have reverse-engineered the outlines of a disrupted prenatal gene network in schizophrenia, by tracing spontaneous mutations to where and when they likely cause damage in the brain. Some people with the brain disorder may suffer from impaired birth of new neurons, or neurogenesis, in the front of their brain during prenatal development, suggests the study, which was funded by the National Institutes of Health.

“Processes critical for the brain’s development can be revealed by the mutations that disrupt them,” explained Mary-Claire King, Ph.D., University of Washington (UW), Seattle, a grantee of NIH’s National Institute of Mental Health (NIMH). “Mutations can lead to loss of integrity of a whole pathway, not just of a single gene. Our results implicate networked genes underlying a pathway responsible for orchestrating neurogenesis in the prefrontal cortex in schizophrenia.”

King, and collaborators at UW and seven other research centers participating in the NIMH genetics repository, report on their discovery Aug. 1, 2013 in the journal Cell.

“By linking genomic findings to functional measures, this approach gives us additional insight into how early development differs in the brain of someone who will eventually manifest the symptoms of psychosis,” said NIMH Director Thomas R. Insel, M.D.

Earlier studies had linked spontaneous mutations to non-familial schizophrenia and traced them broadly to genes involved in brain development, but little was known about convergent effects on pathways. King and colleagues set out to explore causes of schizophrenia by integrating genomic data with newly available online transcriptome resources that show where in the brain and when in development genes turn on. They compared spontaneous mutations in 105 people with schizophrenia with those in 84 unaffected siblings, in families without previous histories of the illness.

Unlike most other genes, expression levels of many of the 50 mutation-containing genes that form the suspected network were highest early in fetal development, tapered off by childhood, but conspicuously increased again in early adulthood – just when schizophrenia symptoms typically first develop. This adds to evidence supporting the prevailing neurodevelopmental model of schizophrenia. The implicated genes play important roles in migration of cells in the developing brain, communication between brain cells, regulation of gene expression, and related intracellular workings.

Having an older father increased the likelihood of spontaneous mutations for both affected and unaffected siblings. Yet affected siblings were modestly more likely to have mutations predicted to damage protein function. Such damaging mutations were estimated to account for 21 percent of schizophrenia cases in the study sample. The mutations tend to be individually rare; only one gene harboring damaging mutations turned up in more than one of the cases, and several patients had damaging mutations in more than one gene.

The networks formed by genes harboring these damaging mutations were found to vary in connectivity, based on the extent to which their proteins are co-expressed and interact. The network formed by genes harboring damaging mutations in schizophrenia had significantly more nodes, or points of connection, than networks modeled from unaffected siblings. By contrast, the network of genes harboring non-damaging mutations in affected siblings had no more nodes than similar networks in unaffected siblings.

When the researchers compared such network connectivity across different brain tissues and different periods of development, they discovered a notable difference between affected and unaffected siblings: Genes harboring damaging mutations that are expressed together in the fetal prefrontal cortex of people with schizophrenia formed a network with significantly greater connectivity than networks modeled from genes harboring similar mutations in their unaffected siblings at that time in development.

The study results are consistent with several lines of evidence implicating the prefrontal cortex in schizophrenia. The prefrontal cortex organizes information from other brain regions to coordinate executive functions like thinking, planning, attention span, working memory, problem-solving, and self-regulation. The findings suggest that impairments in such functions – often beginning before the onset of symptoms in early adulthood, when the prefrontal cortex fully matures – appear to be early signs of the illness.

The study demonstrates how integrating genomic data and transcriptome analysis can help to pinpoint disease mechanisms and identify potential treatment targets. For example, the mutant genes in the patients studied suggest the possible efficacy of medications targeting glutamate and calcium channel pathways, say the researchers.

"These results are striking, as they show that the genetic architecture of schizophrenia cannot be understood without an appreciation of how genes work in temporal and spatial networks during neurodevelopment," said Thomas Lehner, Ph.D., chief of the NIMH Genomics Research Branch.

Filed under schizophrenia brain development neurogenesis neurons prefrontal cortex neuroscience science

134 notes

Re-learning how to see: researchers find crucial on-off switch in visual development

A new discovery by a University of Maryland-led research team offers hope for treating “lazy eye” and other serious visual problems that are usually permanent unless they are corrected in early childhood.

Amblyopia afflicts about three percent of the population, and is a widespread cause of vision loss in children. It occurs when both eyes are structurally normal, but mismatched – either misaligned, or differently focused, or unequally receptive to visual stimuli because of an obstruction such as a cataract in one eye.

During the so-called “critical period” when a young child’s brain is adapting very quickly to new experiences, the brain builds a powerful neural network connecting the stronger eye to the visual cortex. But the weaker eye gets less stimulation and develops fewer synapses, or points of connection between neurons. Over time the brain learns to ignore the weaker eye. Mild forms of amblyopia such as “lazy eye” result in problems with depth perception. In the most severe form, deprivation amblyopia, a cataract blocks light and starves the eye of visual experiences, significantly altering synaptic development and seriously impairing vision.

Because brain plasticity declines rapidly with age, early diagnosis and treatment of amblyopia is vital, said neuroscientist Elizabeth M. Quinlan, an associate professor of biology at UMD. If the underlying cause of amblyopia is resolved early enough, the child’s vision can recover to normal levels. But if the treatment comes after the end of the critical period and the loss of synaptic plasticity, the brain cannot relearn to see with the weaker eye.

“If a child is born with a cataract and it is not removed very early in life, very little can be done to improve vision,” Quinlan said. “The severe amblyopia that results is the most difficult to treat. For that reason, science has the most to gain by a better understanding of the underlying mechanisms.”

Quinlan, who specializes in studying how communication through the brain’s circuits changes over the course of a lifetime, wanted to find out what process controls the timing of the critical period of synaptic plasticity. If researchers could find the neurological on-off switch for the critical period, she reasoned, clinicians could use the information to successfully treat older children and adults.

Researchers in Quinlan’s University of Maryland lab teamed up with the laboratory of Alfredo Kirkwood at Johns Hopkins University to address two questions: What are the age boundaries of the critical period for synaptic plasticity, when it comes to determining eye dominance? And what developmental processes are involved?

Experiments in rodents suggested the timing of the critical period is controlled by a specific class of inhibitory neurons, which come into play after a visual stimulus activates excitatory neurons that link the eye to the visual cortex. The inhibitory neurons act as signal controllers, affecting the interactions between excitatory neurons and synapses.

“The generally accepted view has been that as the inhibitory neurons develop, synaptic plasticity declines, which was thought to occur at about five weeks of age in rodents,” roughly equivalent to five years of age in humans, Quinlan said. But in earlier experiments, Quinlan and Kirkwood found no correlation between the development of these inhibitory neurons and the loss of plasticity. In fact, they found the visual circuitry in rodents was highly adaptable at ages beyond five weeks.

In their latest research the UMD-led team looked “one synapse upstream from these inhibitory neurons,” Quinlan said, studying the control of that synapse by a protein called NARP (Neuronal Activity-Regulated Pentraxin). Working with two sets of mice – one group genetically similar to wild mice and another that lacked the NARP gene - the researchers covered one eye in each animal to simulate conditions that produce amblyopia.

The mice that were genetically similar to wild mice developed amblyopia, with characteristic dominance of the normal eye over the deprived eye. But the mice that lacked NARP did not develop amblyopia, regardless of age or the length of time one eye was deprived of stimulation.

The study, published in the current issue of the peer-reviewed journal Neuron, demonstrated that only one specific class of synapses was affected by the absence of NARP. Without NARP, the mice simply had no critical period in which the brain circuitry was weakened in response to blocked vision in one eye, Quinlan said. Except for the lack of this plasticity, their vision was normal.

“It’s remarkable how specific the deficit is,” Quinlan said. Without the NARP protein, “these animals develop normal vision. Their brain circuitry just isn’t plastic. We can completely turn off the critical period for plasticity by knocking out this protein.”

Since there are indications that NARP levels vary with age, the discovery raises hope that a treatment targeting NARP levels in humans could allow correction of amblyopia late in life, without affecting other aspects of vision.

Filed under vision visual development lazy eye amblyopia synaptic plasticity brain circuitry neurons neuroscience science

393 notes

ucsdhealthsciences:

Healthy brains require a balance of two energy sources – ATP and GTP – regulated by the gene AMPD2. A mutation in the gene can result in pontocerebellar hypoplasia, a neurodegenerative disease afflicting children. Illustration courtesy of Evgeny Onutchin, Buryat Studio

Potential Nutritional Therapy for Childhood Neurodegenerative Disease

Researchers at the University of California, San Diego School of Medicine have identified the gene mutation responsible for a particularly severe form of pontocerebellar hypoplasia, a currently incurable neurodegenerative disease affecting children. Based on results in cultured cells, they are hopeful that a nutritional supplement may one day be able to prevent or reverse the condition.

The study, from a team of international collaborators led by Joseph G. Gleeson, MD – Howard Hughes Medical Institute investigator and professor in the UCSD Departments of Neurosciences and Pediatrics and at Rady Children’s Hospital-San Diego, a research affiliate of UC San Diego – will be published in the August 1 issue of the journal Cell.

Pontocerebellar hypoplasia is a group of rare, related genetic neurological disorders characterized by abnormal development of the brain, resulting in disabilities in movement and cognitive function. Most patients do not survive to adulthood.

Gleeson and colleagues identified a specific gene mutation that causes pontocerebellar hypoplasia and linked it to an inability of brain cells to generate a form of energy required to synthesize proteins. Without this ability, neurons die, but the researchers also found that bypassing this block with a nutritional supplement restored neuronal survival.

“The goal is to one day use this supplement to prevent or reverse the course of neurodegeneration in humans, and thus cure this disease,” said Gleeson.

More here

87 notes

Anemia Linked to Increased Risk of Dementia

Anemia, or low levels of red blood cells, may increase the risk of dementia, according to a study published in the July 31, 2013, online issue of Neurology®, the medical journal of the American Academy of Neurology.

“Anemia is common in the elderly and occurs in up to 23 percent of adults ages 65 and older,” said study author Kristine Yaffe, MD, with the University of California – San Francisco and a member of the American Academy of Neurology. “The condition has also been linked in studies to an increased risk of early death.”

For the study, 2,552 older adults between the ages of 70 and 79 were tested for anemia and underwent memory and thinking tests over 11 years. Of those, 393 had anemia at the start of the study. By the end of the study, 445 participants, or about 18 percent, had developed dementia.

The research found that people who had anemia at the start of the study had a nearly 41 percent higher risk of developing dementia than those who were not anemic. The link remained after accounting for other factors, such as age, race, sex and education. Of the 393 people with anemia, 89, or 23 percent, developed dementia, compared with 366 of the 2,159 people without anemia, or 17 percent.
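The crude risks in the paragraph above can be recomputed directly from the reported counts; here is a quick sketch (note that the 41 percent figure comes from the study's adjusted analysis, so the unadjusted ratio below comes out lower):

```python
# Recompute the crude (unadjusted) dementia risks from the counts in the study.
# Note: the "nearly 41 percent higher risk" reported in the article reflects
# the authors' adjusted model; the raw ratio below is smaller.

anemic_total, anemic_dementia = 393, 89
nonanemic_total, nonanemic_dementia = 2159, 366

risk_anemic = anemic_dementia / anemic_total            # ~0.226 -> "23 percent"
risk_nonanemic = nonanemic_dementia / nonanemic_total   # ~0.170 -> "17 percent"
crude_rr = risk_anemic / risk_nonanemic                 # ~1.34 crude relative risk

print(f"anemic: {risk_anemic:.1%}  non-anemic: {risk_nonanemic:.1%}  "
      f"crude RR: {crude_rr:.2f}")
```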

“There are several explanations for why anemia may be linked to dementia. For example, anemia may be a marker for poor health in general, or low oxygen levels resulting from anemia may play a role in the connection. Reductions in oxygen to the brain have been shown to reduce memory and thinking abilities and may contribute to damage to neurons,” said Yaffe.

Filed under anemia dementia neurology neuroscience science

54 notes

New Therapy Improves Life Span in Melanoma Patients with Brain Metastases

In a retrospective study, Saint Louis University researchers have found that patients with melanoma brain metastases can be treated with large doses of interleukin-2 (HD IL-2), a therapy that triggers the body’s own immune system to destroy the cancer cells.

The study, recently published in Chemotherapy Research and Practice, reviews the cases of eight patients who underwent this therapy at Saint Louis University.

John Richart, M.D., associate professor of internal medicine at SLU and principal investigator of the study, first treated a patient with the disease using the HD IL-2 treatment in 1999.

"Traditionally, melanoma patients with brain metastases have not been considered for HD IL-2 because treatment was thought to be futile," Richart said. "Our study shows that having this condition does not exclude a patient from getting this treatment and can in fact improve the length of their life."

Melanoma, the most dangerous form of skin cancer, begins in the melanin-producing cells called melanocytes. In some melanoma patients, the cancer spreads to the brain, causing multiple tumors that are difficult to treat. According to the CDC, melanoma is the third most common cancer causing brain metastases in the U.S. Richart said the median overall survival of patients with melanoma brain metastases is approximately four months, whereas in this study it was 8.7 months.

During the treatment, patients are given an IV medication - a chemical the body naturally makes that stimulates the immune system to recognize and destroy melanoma cells - for a period of six days while they are admitted to the hospital and are closely monitored by doctors and nurses. A patient requires four such six-day admission cycles in order to complete the course of the treatment.

To be eligible for HD IL-2 treatment, melanoma patients with brain metastases must be in good health with good brain function – that is, they cannot have rapidly growing brain lesions or show any symptoms caused by brain lesions. In the past, melanoma patients with brain metastases were considered ineligible for this treatment because doctors thought it would cause life-threatening cerebral edema, a complication in which excess fluid accumulates in the brain, and neurotoxicity, or irreversible damage to the brain or the nervous system.

"In this review, we found that there were no episodes of treatment-related mortality. Our findings demonstrate that HD IL-2 can be considered as an option for patients with melanoma brain metastases," said Melinda Chu, M.D., a first year dermatology resident at SLU and first author of the study.

SLU is the only medical center in the region that provides this treatment.

"We need a highly skilled nursing staff for the HD-IL-2 program to be successful," Richart said. "Our nursing team at SLU is with each patient every step of the way, 24 hours a day. They help patients get through and continue the treatment."

Filed under interleukin-2 melanoma melanocytes cancer cells immune system brain neuroscience science
