Neuroscience

Articles and news from the latest research reports.

397 notes

Marijuana use in adolescence may cause permanent brain abnormalities

Regular marijuana use in adolescence, but not adulthood, may permanently impair brain function and cognition, and may increase the risk of developing serious psychiatric disorders such as schizophrenia, according to a recent study from the University of Maryland School of Medicine. Researchers hope that the study, published in Neuropsychopharmacology, a publication of the journal Nature, will help shed light on the potential long-term effects of marijuana use, particularly as lawmakers in Maryland and elsewhere contemplate legalizing the drug.

"Over the past 20 years, there has been a major controversy about the long-term effects of marijuana, with some evidence that use in adolescence could be damaging," says the study’s senior author Asaf Keller, Ph.D., Professor of Anatomy and Neurobiology at the University of Maryland School of Medicine. "Previous research has shown that children who started using marijuana before the age of 16 are at greater risk of permanent cognitive deficits, and have a significantly higher incidence of psychiatric disorders such as schizophrenia. There likely is a genetic susceptibility, and then you add marijuana during adolescence and it becomes the trigger."

"Adolescence is the critical period during which marijuana use can be damaging," says the study’s lead author, Sylvina Mullins Raver, a Ph.D. candidate in the Program in Neuroscience in the Department of Anatomy and Neurobiology at the University of Maryland School of Medicine. "We wanted to identify the biological underpinnings and determine whether there is a real, permanent health risk to marijuana use."

The scientists, including co-author Sarah Paige Haughwout, a research technician in Dr. Keller’s laboratory, began by examining cortical oscillations in mice. Cortical oscillations are rhythmic patterns of neuronal activity that are believed to underlie the brain’s various functions. These oscillations are markedly abnormal in schizophrenia and other psychiatric disorders. The scientists exposed young mice to very low doses of the active ingredient in marijuana for 20 days, and then allowed them to return to their siblings and develop normally.
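Oscillations like these are typically quantified as power within frequency bands of a recorded signal. A minimal sketch of that idea in Python follows; the 40 Hz test signal, sampling rate, and band edges are illustrative choices, not values from the study:

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Power of a signal within a frequency band, via the FFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / (fs * len(signal))
    mask = (freqs >= f_lo) & (freqs < f_hi)
    return psd[mask].sum()

# Simulated 1-second recording: a 40 Hz (gamma-band) oscillation plus noise.
fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(0)
lfp = np.sin(2 * np.pi * 40 * t) + 0.3 * rng.standard_normal(t.size)

gamma = band_power(lfp, fs, 30, 80)  # band containing the oscillation
beta = band_power(lfp, fs, 13, 30)   # neighboring band, mostly noise
print(gamma > beta)                  # the gamma band carries far more power
```

Comparing band powers between groups of recordings is one simple way an "altered oscillation" could show up in data like the study's.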

"In the adult mice exposed to marijuana ingredients in adolescence, we found that cortical oscillations were grossly altered, and they exhibited impaired cognitive abilities," says Ms. Raver. "We also found impaired cognitive behavioral performance in those mice. The striking finding is that, even though the mice were exposed to very low drug doses, and only for a brief period during adolescence, their brain abnormalities persisted into adulthood."

The scientists repeated the experiment, this time administering marijuana ingredients to adult mice that had never been exposed to the drug before. Their cortical oscillations and ability to perform cognitive behavioral tasks remained normal, indicating that it was only drug exposure during the critical period of adolescence that impaired cognition through this mechanism. The researchers took the next step in their studies, trying to pinpoint the mechanisms underlying these changes and the time period in which they occur.

"We looked at the different regions of the brain," says Dr. Keller. "The back of the brain develops first, and the frontal parts of the brain develop during adolescence. We found that the frontal cortex is much more affected by the drugs during adolescence. This is the area of the brain controls executive functions such as planning and impulse control. It is also the area most affected in schizophrenia."

Dr. Keller’s team believes that the results have implications for humans as well. They will continue to study the underlying mechanisms that cause these changes in cortical oscillations. “The purpose of studying these mechanisms is to see whether we can reverse these effects,” says Dr. Keller. “We are hoping we will learn more about schizophrenia and other psychiatric disorders, which are complicated conditions. These cognitive symptoms are not affected by medication, but they might be affected by controlling these cortical oscillations.”

Filed under adolescence marijuana brain function cognitive deficits psychiatric disorders cortical oscillations neuroscience science

69 notes

Scientists Find a Potential Cause of Parkinson’s Disease that Points to a New Therapeutic Strategy

Biologists at The Scripps Research Institute (TSRI) have made a significant discovery that could lead to a new therapeutic strategy for Parkinson’s disease.

The findings, recently published online ahead of print in the journal Molecular and Cellular Biology, focus on an enzyme known as parkin, whose absence causes an early-onset form of Parkinson’s disease. Precisely how the loss of this enzyme leads to the deaths of neurons has been unclear. But the TSRI researchers showed that parkin’s loss sharply reduces the level of another protein that normally helps protect neurons from stress.

“We now have a good model for how parkin loss can lead to the deaths of neurons under stress,” said TSRI Professor Steven I. Reed, who was senior author of the new study. “This also suggests a therapeutic strategy that might work against Parkinson’s and other neurodegenerative diseases.”

Genetic Clues

Parkinson’s is the world’s second-most common neurodegenerative disease, affecting about one million people in the United States alone. The disease is usually diagnosed after the appearance of the characteristic motor symptoms, which include tremor, muscle rigidity and slowness of movements. These symptoms are caused by the loss of neurons in the substantia nigra, a brain region that normally supplies the neurotransmitter dopamine to other regions that regulate muscle movements.

Most cases of Parkinson’s are considered “sporadic” and are thought to be caused by a variable mix of factors including advanced age, subtle genetic influences, chronic neuroinflammation and exposure to pesticides and other toxins. But between 5 and 15 percent of cases arise specifically from inherited gene mutations. Among these, mutations to the parkin gene are relatively common. Patients who have no functional parkin gene typically develop Parkinson’s-like symptoms before age 40.

Parkin belongs to a family of enzymes called ubiquitin ligases, whose main function is to regulate the levels of other proteins. They do so principally by “tagging” their protein targets with ubiquitin molecules, thus marking them for disposal by roving protein-breakers in cells known as proteasomes. Because parkin is a ubiquitin ligase, researchers have assumed that its absence allows some other protein or proteins to evade proteasomal destruction and thus accumulate abnormally and harm neurons. But since 1998, when parkin mutations were first identified as a cause of early-onset Parkinson’s, consensus about the identity of this protein culprit has been elusive.

“There have been a lot of theories, but no one has come up with a truly satisfactory answer,” Reed said.

Oxidative Stress

In 2005, Reed and his postdoctoral research associate (and wife) Susanna Ekholm-Reed decided to investigate a report that parkin associates with another ubiquitin ligase known as Fbw7. “We soon discovered that parkin regulates Fbw7 levels by tagging it with ubiquitin and thus targeting it for degradation by the proteasome,” said Ekholm-Reed.

Loss of parkin, they found, leads to rises in Fbw7 levels, specifically for a form of the protein known as Fbw7β. The scientists observed these elevated levels of Fbw7β in embryonic mouse neurons from which parkin had been deleted, in transgenic mice that were born without the parkin gene, and even in autopsied brain tissue from Parkinson’s patients who had parkin mutations.

Subsequent experiments showed that when neurons are exposed to harmful molecules known as reactive oxygen species, parkin appears to work harder at tagging Fbw7β for destruction, so that Fbw7β levels fall. Without the parkin-driven decrease in Fbw7β levels, the neurons become more sensitive to this “oxidative stress”—so that more of them undergo a programmed self-destruction called apoptosis. Oxidative stress, to which dopamine-producing substantia nigra neurons may be particularly vulnerable, has long been considered a likely contributor to Parkinson’s.

“We realized that there must be a downstream target of Fbw7β that’s important for neuronal survival during oxidative stress,” said Ekholm-Reed.

A New Neuroprotective Strategy

The research slowed for a period due to a lack of funding. But then, in 2011, came a breakthrough. Other researchers who were investigating Fbw7’s role in cancer reported that it normally tags a cell-survival protein called Mcl-1 for destruction. The loss of Fbw7 leads to rises in Mcl-1, which in turn makes cells more resistant to apoptosis. “We were very excited about that finding,” said Ekholm-Reed. The TSRI lab’s experiments quickly confirmed the chain of events in neurons: parkin keeps levels of Fbw7β under control, and Fbw7β keeps levels of Mcl-1 under control. Full silencing of Mcl-1 leaves neurons extremely sensitive to oxidative stress.

Members of the team suspect that this is the principal explanation for how parkin mutations lead to Parkinson’s disease. But perhaps more importantly, they believe that their discovery points to a broad new “neuroprotective” strategy: reducing the Fbw7β-mediated destruction of Mcl-1 in neurons, which should make neurons more resistant to oxidative and other stresses.

“If we can find a way to inhibit Fbw7β in a way that specifically raises Mcl-1 levels, we might be able to prevent the progressive neuronal loss that’s seen not only in Parkinson’s but also in other major neurological diseases, such as Huntington’s disease and ALS [amyotrophic lateral sclerosis],” said Reed.

Finding such an Mcl-1-boosting compound, he added, is now a major focus of his laboratory’s work.

(Source: scripps.edu)

Filed under neurodegenerative diseases parkinson's disease oxidative stress parkin gene dopamine neuroscience science

51 notes

Key Molecular Pathways Leading to Alzheimer’s Identified

Key molecular pathways that ultimately lead to late-onset Alzheimer’s disease, the most common form of the disorder, have been identified by researchers at Columbia University Medical Center (CUMC). The study, which used a combination of systems biology and cell biology tools, presents a new approach to Alzheimer’s disease research and highlights several new potential drug targets. The paper was published today in the journal Nature.

Much of what is known about Alzheimer’s comes from laboratory studies of rare, early-onset, familial (inherited) forms of the disease. “Such studies have provided important clues as to the underlying disease process, but it’s unclear how these rare familial forms of Alzheimer’s relate to the common form of the disease,” said study leader Asa Abeliovich, MD, PhD, associate professor of pathology and cell biology and of neurology in the Taub Institute for Research on Alzheimer’s Disease and the Aging Brain at CUMC. “Most important, dozens of drugs that ‘work’ in mouse models of familial disease have ultimately failed when tested in patients with late-onset Alzheimer’s. This has driven us, and other laboratories, to pursue mechanisms of the common form of the disease.”

Non-familial Alzheimer’s is complex; it is thought to be caused by a combination of genetic and environmental risk factors, each having a modest effect individually. Using so-called genome-wide association studies (GWAS), prior reports have identified a handful of common genetic variants that increase the likelihood of Alzheimer’s. A key goal has been to understand how such common genetic variants function to impact the likelihood of Alzheimer’s.

In the current study, the CUMC researchers identified key molecular pathways that link such genetic risk factors to Alzheimer’s disease. The work combined cell biology studies with systems biology tools, which are based on computational analysis of the complex network of changes in the expression of genes in the at-risk human brain.

More specifically, the researchers first focused on the single most significant genetic factor that puts people at high risk for Alzheimer’s, called APOE4 (found in about a third of all individuals). People with one copy of this genetic variant have a three-fold increased risk of developing late-onset Alzheimer’s, while those with two copies have a ten-fold increased risk. “In this study,” said Dr. Abeliovich, “we initially asked: If we look at autopsy brain tissue from individuals at high risk for Alzheimer’s, is there a consistent pattern?”

“Surprisingly, even in the absence of Alzheimer’s disease, brain tissue from individuals at high risk (who carried APOE4 in their genes) harbored certain changes reminiscent of those seen in full-blown Alzheimer’s disease,” said Dr. Abeliovich. “We therefore focused on trying to understand these changes, which seem to put people at risk. The brain changes we considered were based on ‘transcriptomics’—a broad molecular survey of the expression levels of the thousands of genes expressed in brain.”

Using the network analysis tools mentioned above, the researchers then identified a dozen candidate “master regulator” factors that link APOE4 to the cascade of destructive events that culminates in Alzheimer’s dementia. Subsequent cell biology studies revealed that a number of these master regulators are involved in the processing and trafficking of amyloid precursor protein (APP) within brain neurons. APP gives rise to amyloid beta, the protein that accumulates in the brain cells of patients with Alzheimer’s. In sum, the work ultimately connected the dots between a common genetic factor that puts individuals at high risk for Alzheimer’s, APOE4, and the disease pathology.

Among the candidate “master regulators” identified, the team further analyzed two genes, SV2A and RNF219. “We were particularly interested in SV2A, as it is the target of a commonly used anti-epileptic drug, levetiracetam,” said Dr. Abeliovich. “This suggested a therapeutic strategy. But more research is needed before we can develop clinical trials of levetiracetam for patients with signs of late-onset Alzheimer’s disease.”

The researchers evaluated the role of SV2A using human-induced neurons that carry the APOE4 genetic variant. (The neurons were generated by directed conversion of skin fibroblasts from individuals at high risk for Alzheimer’s, using a technology developed in the Abeliovich laboratory.) Treating neurons that harbor the APOE4 at-risk genetic variant with levetiracetam (which inhibits SV2A) led to reduced production of amyloid beta. The study also showed that RNF219 appears to play a role in APP processing in cells with the APOE4 variant.

Filed under alzheimer's disease neurodegenerative diseases amyloid precursor protein GWAS genes neurons neuroscience science

127 notes

Novel technology seen as new, more accurate way to diagnose and treat autism

Researchers at Indiana University School of Medicine and Rutgers University have developed a new quantitative screening method for the diagnosis and longitudinal tracking of autism in children after age 3. The studies are published as part of a special collection of papers in the open-access journal Frontiers in Neuroscience titled “Autism: The Movement Perspective.”

The technique involves tracking a person’s random movements in real time with a sophisticated computer program that produces 240 images a second and detects systematic signatures unique to each person. The traditional assessment for diagnosing autism involves primarily subjective opinions of a person’s social interaction, deficits in communication, and repetitive and restricted behaviors and interests.

The new screening tool is a collaboration between Jorge V. José, Ph.D., vice president of research at Indiana University and the James H. Rudy Distinguished Professor of Physics in the IU Bloomington College of Arts and Sciences; Elizabeth Torres, Ph.D., the principal investigator for the study and an assistant professor in the Department of Psychology in the School of Arts and Sciences at Rutgers University; and Dimitri Metaxas, Ph.D., a Distinguished Professor of computer science at Rutgers. The research was funded by a $670,000 grant from the National Science Foundation.

"This research may open doors for the autistic community by offering the option of a dynamic diagnosis at a much earlier age and possibly enabling the start of therapy sooner in the child’s development," said Dr. José, who also is a professor of cellular and integrative physiology at the Indiana University School of Medicine.

The new technique provides an earlier, more objective and more accurate diagnosis of autism. It accounts for changes in movement and movement sensing, enabling the identification of inherent capabilities in each child rather than just highlighting impairments of the child’s movement systems. It measures tiny fluctuations in movement as the individual moves through space and can determine the exact degree to which these patterns of motion differ from those of more typically developing individuals, and to what degree they can turn into predictive, reliable and anticipatory movements.

Even in nonverbal children and adults with autism, the method can diagnose autism subtypes, identify gender differences and track individual progress in development and treatment. The method may also be applied to infants.

Dr. José said that when the new technology captures a person’s movements, statistical properties of those movements, such as their speed and randomness, yield a quantitative measurement that can be applied to each individual.

“We can estimate the cognitive abilities of people just from the variability of how they move,” Dr. José said. “This may lead to a complementary way to develop therapies for autistic children at an early age.”
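One way such a movement-variability measure could work is sketched below: compute frame-to-frame speeds from tracked positions and summarize their dispersion. The statistic (coefficient of variation of speed) and the synthetic trajectories are illustrative stand-ins for the richer stochastic models the researchers use:

```python
import numpy as np

def speed_variability(positions, fs):
    """Coefficient of variation of frame-to-frame speed.

    positions: (n, 2) array of tracked coordinates sampled at fs Hz.
    A higher value indicates more erratic, less predictable movement.
    """
    velocity = np.diff(positions, axis=0) * fs  # per-frame displacement -> velocity
    speed = np.linalg.norm(velocity, axis=1)
    return speed.std() / speed.mean()

# Smooth reach vs. a jittery reach along the same path (synthetic data).
fs = 240.0  # the study's tracking rate of 240 frames per second
t = np.linspace(0, 1, int(fs))
smooth = np.column_stack([t, t ** 2])
rng = np.random.default_rng(1)
jittery = smooth + 0.01 * rng.standard_normal(smooth.shape)

print(speed_variability(jittery, fs) > speed_variability(smooth, fs))
```

Tracking such a statistic over repeated sessions is the kind of longitudinal, quantitative comparison the screening method aims to make possible.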

A second paper in the collection shows how the new method can be applied to interventions. The researchers say it could change the way autistic children learn and communicate by helping them develop self-motivation, rather than relying exclusively on the external cues and commands that are the basis of behavioral therapy for children with autism.

Torres and her team created a digital set-up that works much like a Wii. Children with autism were exposed to onscreen media — such as videos of themselves, cartoons, a music video or a favorite TV show — and learned to communicate what they like with a simple motion.

"Every time the children cross a certain region in space, the media they like best goes on," Dr. Torres said. "They start out randomly exploring their surroundings. They seek where in space that interesting spot is which causes the media to play, and then they do so more systematically. Once they see a cause and effect connection, they move deliberately. The action becomes an intentional behavior."

Researchers found that all 25 children in the study, most of whom were nonverbal, spontaneously learned how to choose their favorite media. They also retained this knowledge over time even without practice.

The children independently learned that they could control their bodies to convey and procure what they want. “Children had to search for the magic spot themselves,” Dr. Torres said. “We didn’t instruct them.”

Torres believes that traditional forms of therapy, which place more emphasis on socially acceptable behavior, can actually hinder children with autism by discouraging mechanisms they have developed to cope with their sensory and motor differences, which vary greatly from individual to individual.

It is too early to tell whether the research will translate into publicly available methods for therapy and diagnosis, Dr. Torres said. But she is confident that parents of children with autism would find it easy to adopt her computer-aided technique to help their children.

Filed under autism technology neuroscience science

103 notes

Neural Simulations Hint at the Origin of Brain Waves
At EPFL’s Blue Brain facilities, computer models of individual neurons are being assembled into neural circuits that produce electrical signals akin to brain waves. The results, published in the journal Neuron, are helping solve the mystery of how and why these signals arise in the brain.  
For almost a century, scientists have been studying brain waves to learn about mental health and the way we think. Yet the way billions of interconnected neurons work together to produce brain waves remains unknown. Now, scientists from EPFL’s Blue Brain Project in Switzerland, at the core of the European Human Brain Project, and the Allen Institute for Brain Science in the United States, show in the July 24th edition of the journal Neuron how a complex computer model is providing a new tool to solve the mystery.
The brain is composed of many different types of neurons, each of which carries electrical signals. Electrodes placed on the head or directly in brain tissue allow scientists to monitor the cumulative effect of this electrical activity, called electroencephalography (EEG) signals. But what is it about the structure and function of each and every neuron, and the way they network together, that gives rise to these electrical signals measured in a mammalian brain?
Modeling Brain Circuitry

The Blue Brain Project is working to model a complete human brain. For the moment, Blue Brain scientists study rodent brain tissue and characterize different types of neurons in excruciating detail, recording their electrical properties, shapes, sizes, and how they connect.
To answer the question of brain-wave origin, researchers at EPFL’s Blue Brain Project and the Allen Institute joined forces with the help of the Blue Brain modeling facilities. Their work is based on a computer model of a neural circuit the likes of which have never been seen before, encompassing an unprecedented amount of detail and simulating 12,000 neurons.
“It is the first time that a model of this complexity has been used to study the underlying properties of brain waves,” says EPFL scientist Sean Hill.
In observing their model, the researchers noticed that the electrical activity swirling through the entire system was reminiscent of brain waves measured in rodents. Because the computer model uses an overwhelming amount of physical, chemical and biological data, the supercomputer simulation allows scientists to analyze brain waves at a level of detail simply unattainable with traditional monitoring of live brain tissue.
“We need a computer model because it is impossible to relate the electrical activity of potentially billions of individual neurons and the resulting brain waves at the same time,” says Hill. “Through this view, we’re able to provide an interpretation, at the single-neuron level, of brain waves that are measured when tissue is actually probed in the lab.”
Finding brain wave analogs
Neurons are somewhat like tiny batteries, needing to be charged in order to fire off an electrical impulse known as a “spike”. It is through these “spikes” that neurons communicate with each other to produce thought and perception. To “recharge” a neuron, charged particles called ions must travel through minuscule ionic channels. These channels are like gates that regulate electrical current. Ultimately, the accumulation of multiple electrical signals throughout the entire circuit of neurons produces brain waves.
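That charging-and-spiking picture can be caricatured with a toy leaky integrate-and-fire circuit. This sketch is vastly simpler than the Blue Brain model and every constant in it is invented for the demonstration; it only shows how summed currents across a spiking circuit trace out a slow, rhythmic population signal.

```python
import numpy as np

# Toy circuit of leaky integrate-and-fire neurons (illustrative only).
rng = np.random.default_rng(1)
n_neurons, n_steps, dt = 200, 1000, 0.001          # 1 ms steps, 1 s total
tau, v_thresh, v_reset = 0.02, 1.0, 0.0

v = np.zeros(n_neurons)
population_signal = np.zeros(n_steps)
for step in range(n_steps):
    # Shared 10 Hz drive plus private noise, mimicking rhythmic input.
    drive = 1.2 + 0.5 * np.sin(2 * np.pi * 10.0 * step * dt)
    current = drive + rng.normal(0.0, 0.5, n_neurons)
    v += dt / tau * (current - v)                  # charge like a leaky battery
    spiked = v >= v_thresh
    v[spiked] = v_reset                            # fire a "spike", then recharge
    population_signal[step] = current.sum()        # summed circuit activity

# The summed signal oscillates near the frequency of the shared drive,
# the cartoon analog of a brain wave emerging from many neurons.
freqs = np.fft.rfftfreq(n_steps, dt)
spectrum = np.abs(np.fft.rfft(population_signal - population_signal.mean()))
print(freqs[spectrum.argmax()])
```

The real model adds what this sketch omits entirely: detailed neuron shapes and thousands of ion-channel parameters per cell, which is exactly what lets it relate single-neuron behavior to the recorded waves.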
The challenge for scientists in this study was to incorporate into the simulation the thousands of parameters, per neuron, that describe these electrical properties. Once they did that, they saw that the overall electrical activity in their model of 12,000 neurons was akin to observations of brain activity in rodents, hinting at the origin of brain waves.
“Our model is still incomplete, but the electrical signals produced by the computer simulation and what was actually measured in the rat brain have some striking similarities,” says Allen Institute scientist Costas Anastassiou.
Hill adds, “For the first time, we show that the complex behavior of ion channels on the branches of the neurons contributes to the shape of brain waves.”
There is still much work to be done in order to arrive at a complete simulation. While the model’s electrical signals are analogous to in vivo measurements, researchers warn that there are still many open questions as well as room to improve the model. For instance, the simulation is modeled on neurons that control the hind-limb, while in vivo data represent brain waves coming from neurons that have a similar function but control whiskers instead.
“Even so, the computer model we used allowed us to characterize, and more importantly quantify, key features of how neurons produce these signals,” says Anastassiou.
The scientists are currently studying similar brain wave phenomena in larger and more realistic neural circuits.
This computer model is drawing cellular biophysics and cognitive neuroscience closer together, in order to achieve the same goal: understanding the brain. But the two disciplines share neither the methods nor the scientific language. By simulating electrical brain activity and relating the behavior of single neurons to brain waves, the researchers aim to bridge this gap, opening the way to better tools for diagnosing mental disorders, and on a deeper level, offering a better understanding of ourselves.

Filed under Blue Brain project brain mapping brainwaves neural circuits neuroscience science

344 notes

Brain research shows psychopathic criminals do not lack empathy, but fail to use it automatically
Criminal psychopathy can be both repulsive and fascinating, as illustrated by the vast number of books and movies inspired by this topic. Offenders diagnosed with psychopathy pose a significant threat to society, because they are more likely to harm other individuals and to do so again after being released. A brain imaging study in the Netherlands shows individuals with psychopathy have reduced empathy while witnessing the pains of others. When asked to empathize, however, they can activate their empathy. This could explain why psychopathic individuals can be callous and socially cunning at the same time.
Why are psychopathic individuals more likely to hurt others? Individuals with psychopathy characteristically demonstrate reduced empathy with the feelings of others, which may explain why it is easier for them to hurt other people. However, what causes this lack of empathy is poorly understood. Scientific studies on psychopathic subjects are notoriously hard to conduct. “Convicted criminals with a diagnosis of psychopathy are confined to high-security forensic institutions in which state-of-the-art technology to study their brain, like magnetic resonance imaging, is usually unavailable”, explains Professor Christian Keysers, Head of the Social Brain Lab in Amsterdam, and senior author of a study on psychopathy appearing in the journal Brain this week. “Bringing them to scientific research centres, on the other hand, requires the kind of high-security transportation that most judicial systems are unwilling to finance.”
The Dutch judicial system, however, seems to be an exception. It joined forces with academia to promote a better understanding of psychopathy. As a result, criminals with psychopathy were transported to the Social Brain Lab of the University Medical Center in Groningen (The Netherlands). There, the team could use state-of-the-art high-field functional magnetic resonance imaging to peek into the brains of criminals with psychopathy while they viewed the emotions of others.
The study, which will appear on the 25th of July in the journal Brain (published by Oxford University Press) and is entitled “Reduced spontaneous but relatively normal deliberate vicarious representations in psychopathy”, included 18 individuals with psychopathy and a control group, and consisted of three parts. “All participants first watched short movie clips of two people interacting with each other, zoomed in on their hands. The movie clips showed one hand touching the other in a loving, a painful, a socially rejecting or a neutral way. At this stage, we asked them to look at these movies just as they would watch one of their favourite films”, Harma Meffert, the first author of the paper, explains. Meffert was a graduate student in the Social Brain Lab while the study was conducted, and is now a post-doctoral fellow at the National Institutes of Mental Health in Bethesda.
Next, the participants watched the same clips again. This time, however, the researchers prompted them explicitly to “empathise with one of the actors in the movie”, that is, they were requested to really try to feel what the actors in the movie were feeling.
"In the third and final part, we performed similar hand interactions with the participants themselves, while they were lying in the scanner, having their brain activity measured", adds Meffert. "We wanted to know to what extent they would activate the same brain regions while they were watching the hand interactions in the movies, as they would when they were experiencing these same hand interactions themselves."
Our brains are equipped with what scientists call a “mirror system”. For example, the motor cortex of the brain normally allows you to move your own body. Your so-called somatosensory cortex, when activated, makes you feel touch on your skin. Your insula, finally, when activated makes you feel emotions like pain or disgust. In recent decades, brain scientists have discovered that when people watch other people move their bodies, see those people being touched, or see them have emotions, these same brain regions are activated. In other words, the actions, touch or emotions of others become your own. This “mirror system” possibly constitutes a crucial part of our ability to empathize with other people, and it has previously been shown that the less you activate this system, the less empathy you report feeling for other people. It has been suggested that individuals with psychopathy might somehow suffer from a broken “mirror system”, resulting in a diminished ability to empathize with their victims.
As it turns out, however, the picture seems to be more complex. When asked to just watch the film clips, the individuals with psychopathy indeed did activate their mirror system less. “Regions involved in their own actions, emotions and sensations were less active than that of controls while they saw what happens in others”, summarizes Christian Keysers. “At first, this seems to suggest that psychopathic criminals might hurt others more easily than we do, because they do not feel pain, when they see the pain of their victims.”
As the second part of the study revealed, however, it’s not quite so simple. Instead of generally activating their mirror system less, individuals with psychopathy rather seem not to use this system spontaneously, but they can use it when asked to. “When explicitly asked to empathize, the differences between how strongly the individuals with and without psychopathy activate their own actions, sensations and emotions almost entirely disappeared in their empathic brain”, explains Valeria Gazzola, Assistant Professor at the UMCG and second author of the paper. “Psychopathy may not be so much the incapacity to empathize, but a reduced propensity to empathize, paired with a preserved capacity to empathize when required to do so”. The brain data suggest that, by default, psychopathic individuals feel less empathy than others. If they try to empathize, however, they can switch to ‘empathy mode’.
There might be two sides to these findings. The darker side is that reduced spontaneous empathy together with a preserved capacity for empathy might be the cocktail that makes these individuals so callous when harming their victims and at the same time so socially cunning when they try to seduce their victims. Whether individuals with psychopathy autonomously switch their empathy mode on and off depending on the requirements of a social situation, however, remains to be established. The brighter side is that the preserved capacity for empathy might be harnessed in therapy. Instead of having to create a capacity for empathy, therapies may need to focus on making the existing capacity more automatic, to prevent these individuals from further harming others. How to do so remains uncertain at this stage.

Filed under psychopathy empathy brain imaging brain activity somatosensory cortex psychology neuroscience science

108 notes

Epilepsy in a dish: Stem cell research reveals clues to disease’s origins and possible treatment
A new stem cell-based approach to studying epilepsy has yielded a surprising discovery about what causes one form of the disease, and may help in the search for better medicines to treat all kinds of seizure disorders.
The findings, reported by a team of scientists from the University of Michigan Medical School and colleagues, come from a technique that could be called “epilepsy in a dish”.
By turning skin cells of epilepsy patients into stem cells, and then turning those stem cells into neurons, or brain nerve cells, the team created a miniature testing ground for epilepsy. They could even measure the signals that the cells were sending to one another, through tiny portals called sodium channels.
In neurons derived from the cells of children who have a severe, rare genetic form of epilepsy called Dravet syndrome, the researchers report abnormally high levels of sodium current activity. They saw spontaneous bursts of communication and “hyperexcitability” that could potentially set off seizures. Neurons made from the skin cells of people without epilepsy showed none of this abnormal activity.
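The link between excess inward current and spontaneous firing can be illustrated with a single leaky model neuron. This is a cartoon of hyperexcitability in general, not a model of Nav1.1 biophysics or of the patient-derived neurons, and all numbers are invented.

```python
# Illustrative sketch: a leaky neuron driven by a constant inward
# ("sodium-like") current. A modest current leaves the cell silent;
# an elevated one makes it fire spontaneously and repeatedly.
def count_spikes(inward_current, n_steps=5000, dt=0.0001,
                 tau=0.01, v_thresh=1.0, v_reset=0.0):
    """Count threshold crossings of a leaky integrate-and-fire cell."""
    v, spikes = 0.0, 0
    for _ in range(n_steps):
        v += dt / tau * (inward_current - v)   # charge toward the drive
        if v >= v_thresh:                      # threshold crossed: a spike
            v = v_reset
            spikes += 1
    return spikes

quiet = count_spikes(inward_current=0.9)   # settles below threshold
hyper = count_spikes(inward_current=1.5)   # elevated "sodium" current
# The quiet cell never fires; the hyper cell fires dozens of times.
print(quiet, hyper)
```

In the toy model the mechanism is trivial, a larger drive crosses threshold repeatedly; in the patient-derived neurons, working out why the cells overcompensate for the channel loss is precisely the open question the study raises.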
They report their results online in the Annals of Neurology, and have further work in progress to create induced pluripotent stem cell lines from the cells of patients with other genetic forms of epilepsy. The work is funded by the National Institutes of Health, the American Epilepsy Society, the Epilepsy Foundation and U-M.
The new findings differ from what other scientists have seen in mice — demonstrating the importance of studying cells made from human epilepsy patients. Because the cells came from patients, they contained the hallmark seen in most patients with Dravet syndrome: a new mutation in SCN1A, the gene that encodes the crucial sodium channel protein called Nav1.1. That mutation reduces the number of channels to half the normal number in patients’ brains.
"With this technique, we can study cells that closely resemble the patient’s own brain cells, without doing a brain biopsy," says senior author and team leader Jack M. Parent, M.D., professor of neurology at U-M and a researcher at the VA Ann Arbor Healthcare System. "It appears that the cells are overcompensating for the loss of channels due to the mutation. These patient-specific induced neurons hold great promise for modeling seizure disorders, and potentially screening medications."
With the new paper, Parent, postdoctoral fellow Yu Liu, Ph.D. and their collaborators Lori Isom, Ph.D., professor of Pharmacology and of Molecular and Integrative Physiology at U-M, and Miriam Meisler, Ph.D., Distinguished University Professor of Human Genetics at U-M, report striking discoveries about what is happening at the cell level in the neurons of Dravet syndrome patients with a mutated SCN1A gene.
They also demonstrated that the effect is rooted in something that happens after the gene’s function is reduced by the mutation, though they don’t yet know how or why the nerve cells overcompensate for the partial loss of this channel.
And, they found that the neurons didn’t show the telltale signs of hyperexcitability in the first few weeks after they were made — consistent with the fact that children with Dravet syndrome often don’t suffer their first seizures until they are several months old.
"In addition, reproduction of the hyperactivity of epileptic neurons in these cell cultures demonstrates that there is an intrinsic change in the neurons that does not depend on input from circuits in the brain," says co-author Meisler.
A platform for testing medications
Many Dravet patients don’t respond to current epilepsy medications, making the search for new options urgent. Their lives are constantly threatened by the risk of SUDEP – sudden unexpected death in epilepsy – and they never outgrow their condition, which delays their development and often requires round-the-clock care.
"Working with patient families, and translating our sodium channel research to a pediatric disease, has made our basic science work much more immediate and critical," says Isom, who serves on the scientific advisory board of the Dravet Syndrome Foundation along with Meisler. Parent, who co-directs U-M’s Comprehensive Epilepsy Program, was recently honored by the foundation.
The team is now working toward screening specific compounds for seizure-calming potential in Dravet syndrome, by testing their impact on the cells in the “epilepsy in a dish” model. The National Institutes of Health has made a library of drugs that have been approved by the U.S. Food and Drug Administration available for researchers to use — potentially allowing older drugs to have a second life treating an entirely different disease from the one they were initially intended for.
Parent and his colleagues hope to identify drugs that affect certain aspects of sodium channels, to see if they can dampen the sodium currents and calm hyperexcitability. The team is exploring new techniques that can make this process faster, using microelectrodes and calcium-sensitive dyes. They also hope to use the model to study potential drugs for non-genetic forms of epilepsy.
Having a U-M team that includes experts in induced pluripotent stem cell biology, sodium channel physiology and epilepsy genetics expertise helps the research progress, Parent notes. “Epilepsy is a complicated brain network disease,” he says. “It takes team-based science to address it.”
Patients as part of the research team
The U-M team’s research wouldn’t be possible without the participation of patients with Dravet syndrome and other genetic forms of epilepsy, and their parents.
More than 100 of them have joined the International Ion Channel Epilepsy Patient Registry, which is based at U-M and Miami Children’s Hospital and co-funded by the Dravet Syndrome Foundation and the ICE Epilepsy Alliance. The researchers hope to be able to conduct clinical trials of potential drugs with participation by these patients and others.
Meanwhile, patients with other genetically based neurological diseases can also help U-M scientists discover more about their conditions, by taking part in other efforts to create induced neurons from skin cells. Parent and his team have worked with several other U-M faculty to create stem cell lines from skin cells provided by patients with other diseases including forms of ataxia and lysosmal storage disease.

Epilepsy in a dish: Stem cell research reveals clues to disease’s origins and possible treatment

A new stem cell-based approach to studying epilepsy has yielded a surprising discovery about what causes one form of the disease, and may help in the search for better medicines to treat all kinds of seizure disorders.

The findings, reported by a team of scientists from the University of Michigan Medical School and colleagues, use a technique that could be called “epilepsy in a dish”.

By turning skin cells of epilepsy patients into stem cells, and then turning those stem cells into neurons, or brain nerve cells, the team created a miniature testing ground for epilepsy. They could even measure the signals that the cells were sending to one another, through tiny portals called sodium channels.

In neurons derived from the cells of children who have a severe, rare genetic form of epilepsy called Dravet syndrome, the researchers report abnormally high levels of sodium current activity. They saw spontaneous bursts of communication and “hyperexcitability” that could potentially set off seizures. Neurons made from the skin cells of people without epilepsy showed none of this abnormal activity.

They report their results online in the Annals of Neurology, and have further work in progress to create induced pluripotent stem cell lines from the cells of patients with other genetic forms of epilepsy. The work is funded by the National Institutes of Health, the American Epilepsy Society, the Epilepsy Foundation and U-M.

The new findings differ from what other scientists have seen in mice — demonstrating the importance of studying cells made from human epilepsy patients. Because the cells came from patients, they contained the hallmark seen in most patients with Dravet syndrome: a new mutation in SCN1A, the gene that encodes the crucial sodium channel protein called Nav1.1. That mutation reduces the number of channels in patients’ brains to half the normal number.

"With this technique, we can study cells that closely resemble the patient’s own brain cells, without doing a brain biopsy," says senior author and team leader Jack M. Parent, M.D., professor of neurology at U-M and a researcher at the VA Ann Arbor Healthcare System. "It appears that the cells are overcompensating for the loss of channels due to the mutation. These patient-specific induced neurons hold great promise for modeling seizure disorders, and potentially screening medications."

With the new paper, Parent, postdoctoral fellow Yu Liu, Ph.D. and their collaborators Lori Isom, Ph.D., professor of Pharmacology and of Molecular and Integrative Physiology at U-M, and Miriam Meisler, Ph.D., Distinguished University Professor of Human Genetics at U-M, report striking discoveries about what is happening at the cell level in the neurons of Dravet syndrome patients with a mutated SCN1A gene.

They also demonstrated that the effect is rooted in something that happens after function of the gene is reduced due to the mutation, though they don’t yet know how or why the nerve cells overcompensate for partial loss of this channel.
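The link between a stronger sodium current and a more excitable neuron can be illustrated with a toy simulation. This is a minimal leaky integrate-and-fire sketch, not the authors' model: the `na_gain` parameter and all numerical values are invented, and the constant depolarizing drive merely stands in for total sodium current.

```python
def spike_count(na_gain, t_steps=2000, dt=0.1):
    """Toy leaky integrate-and-fire neuron (illustrative only).

    `na_gain` scales a constant depolarizing drive standing in for
    total sodium current; a higher gain produces more spontaneous spikes.
    """
    v, v_rest, v_thresh, v_reset = -65.0, -65.0, -50.0, -65.0
    tau = 10.0             # membrane time constant (ms)
    drive = 1.6 * na_gain  # depolarizing input (mV/ms), hypothetical scaling
    spikes = 0
    for _ in range(t_steps):
        v += dt * ((v_rest - v) / tau + drive)  # leak toward rest plus drive
        if v >= v_thresh:                       # threshold crossing = spike
            spikes += 1
            v = v_reset
    return spikes

baseline = spike_count(na_gain=1.0)  # "normal" sodium current
overcomp = spike_count(na_gain=1.5)  # overcompensated current fires more often
```

Even this crude sketch reproduces the qualitative point: scaling up the sodium-like drive pushes the neuron past threshold more often, the cellular signature of hyperexcitability.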

And, they found that the neurons didn’t show the telltale signs of hyperexcitability in the first few weeks after they were made — consistent with the fact that children with Dravet syndrome often don’t suffer their first seizures until they are several months old.

"In addition, reproduction of the hyperactivity of epileptic neurons in these cell cultures demonstrates that there is an intrinsic change in the neurons that does not depend on input from circuits in the brain," says co-author Meisler.

A platform for testing medications

Many Dravet patients don’t respond to current epilepsy medications, making the search for new options urgent. Their lives are constantly threatened by the risk of SUDEP (sudden unexplained death in epilepsy), and they never outgrow their condition, which delays their development and often requires round-the-clock care.

"Working with patient families, and translating our sodium channel research to a pediatric disease, has made our basic science work much more immediate and critical," says Isom, who serves on the scientific advisory board of the Dravet Syndrome Foundation along with Meisler. Parent, who co-directs U-M’s Comprehensive Epilepsy Program, was recently honored by the foundation.

The team is now working toward screening specific compounds for seizure-calming potential in Dravet syndrome by testing their impact on the cells in the “epilepsy in a dish” model. The National Institutes of Health has made a library of drugs already approved by the U.S. Food and Drug Administration available to researchers — potentially allowing older drugs to have a second life treating an entirely different disease from the one they were originally intended for.
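A screen of this kind boils down to ranking compounds by how much they quiet the cultures. The sketch below is hypothetical: the compound names, firing rates, and the 40% reduction cutoff are all invented for illustration, not taken from the study.

```python
# Hypothetical screening sketch: rank library compounds by how much they
# reduce spontaneous firing in "epilepsy in a dish" cultures.
BASELINE_HZ = 12.0  # untreated Dravet-neuron firing rate (made up)

measured = {           # firing rate (Hz) after applying each compound (made up)
    "compound_A": 11.5,
    "compound_B": 4.8,
    "compound_C": 12.4,
    "compound_D": 6.1,
}

def rank_hits(rates, baseline=BASELINE_HZ, min_reduction=0.4):
    """Return compounds cutting firing by at least `min_reduction`
    (as a fraction of baseline), best first."""
    reductions = {name: 1 - hz / baseline for name, hz in rates.items()}
    return sorted((n for n, r in reductions.items() if r >= min_reduction),
                  key=reductions.get, reverse=True)

hits = rank_hits(measured)  # compounds B and D clear the cutoff, B first
```

In practice each "firing rate" would come from microelectrode or calcium-imaging readouts of the cultures, as the article describes.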

Parent and his colleagues hope to identify drugs that affect certain aspects of sodium channels, to see if they can dampen the sodium currents and calm hyperexcitability. The team is exploring new techniques that can make this process faster, using microelectrodes and calcium-sensitive dyes. They also hope to use the model to study potential drugs for non-genetic forms of epilepsy.

Having a U-M team that includes experts in induced pluripotent stem cell biology, sodium channel physiology and epilepsy genetics helps the research progress, Parent notes. “Epilepsy is a complicated brain network disease,” he says. “It takes team-based science to address it.”

Patients as part of the research team

The U-M team’s research wouldn’t be possible without the participation of patients with Dravet syndrome and other genetic forms of epilepsy, and their parents.

More than 100 of them have joined the International Ion Channel Epilepsy Patient Registry, which is based at U-M and Miami Children’s Hospital and co-funded by the Dravet Syndrome Foundation and the ICE Epilepsy Alliance. The researchers hope to be able to conduct clinical trials of potential drugs with participation by these patients and others.

Meanwhile, patients with other genetically based neurological diseases can also help U-M scientists discover more about their conditions, by taking part in other efforts to create induced neurons from skin cells. Parent and his team have worked with several other U-M faculty to create stem cell lines from skin cells provided by patients with other diseases including forms of ataxia and lysosomal storage disease.

Filed under ataxia epilepsy epileptic seizures ion channels Dravet syndrome stem cells neuroscience science

217 notes

Brain picks out salient sounds from background noise by tracking frequency and time
New research reveals how our brains are able to pick out important sounds from the noisy world around us. The findings, published online today in the journal ‘eLife’, could lead to new diagnostic tests for hearing disorders. 
Our ears can effortlessly pick out the sounds we need to hear from a noisy environment - hearing our mobile phone ringtone in the middle of the Notting Hill Carnival, for example - but how our brains process this information (the so-called ‘cocktail party problem’) has been a longstanding research question in hearing science.
Researchers have previously investigated this using simple sounds such as two tones of different pitches, but now researchers at UCL and Newcastle University have used complicated sounds that are more representative of those we hear in real life. The team used ‘machine-like beeps’ that overlap in both frequency and time to recreate a busy sound environment and obtain new insights into how the brain solves this problem.
In the study, groups of volunteers were asked to identify target sounds from within this noisy background in a series of experiments.
Sundeep Teki, a PhD student from the Wellcome Trust Centre for Neuroimaging at UCL and joint first author of the study, said: “Participants were able to detect complex target sounds from the background noise, even when the target sounds were delivered at a faster rate or there was a loud disruptive noise between them.”
Dr Maria Chait, a senior lecturer at UCL Ear Institute and joint first author on the study, adds: “Previous models based on simple tones suggest that people differentiate sounds based on differences in frequency, or pitch. Our findings show that time is also an important factor, with sounds grouped as belonging to one object by virtue of being correlated in time.”
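The grouping-by-time idea can be sketched numerically: frequency channels whose amplitude envelopes rise and fall together are treated as one auditory object, while a channel with an uncorrelated time course is segregated. This is an illustrative toy, not the authors' analysis code; the envelopes and the 0.8 correlation threshold are invented.

```python
import numpy as np

t = np.linspace(0, 1, 1000)
target_env = (np.sin(2 * np.pi * 3 * t) > 0).astype(float)  # shared on/off pattern

ch1 = 1.0 * target_env   # two channels of the "figure": identical time course
ch2 = 0.7 * target_env   # different level, same timing
ch3 = (np.sin(2 * np.pi * 5 * t) > 0).astype(float)  # unrelated background channel

def coherent(a, b, threshold=0.8):
    """Pearson correlation of two envelopes; high values -> same object."""
    return np.corrcoef(a, b)[0, 1] > threshold

# ch1 and ch2 share a time course, so they group into one object;
# ch3 follows its own rhythm, so it is heard as separate background.
```

The point of the toy: channels 1 and 2 differ in frequency and level, yet their temporal correlation binds them, matching the finding that time, not just pitch, drives grouping.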
Professor Tim Griffiths, Professor of Cognitive Neurology at Newcastle University and lead researcher on the study, said: “Many hearing disorders are characterised by the loss of ability to detect speech in noisy environments. Disorders like this that are caused by problems with how the brain interprets sound information, rather than physical damage to the ear and hearing machinery, remain poorly understood.
"These findings inform us about a fundamental brain mechanism for detecting sound patterns and identifies a process that can go wrong in hearing disorders. We now have an opportunity to create better tests for these types of hearing problems."

Filed under auditory system hearing hearing disorders neuroscience science

115 notes

No Link Between Mercury Exposure and Autism-like Behaviors

The potential impact of exposure to low levels of mercury on the developing brain – specifically by women consuming fish during pregnancy – has long been the source of concern and some have argued that the chemical may be responsible for behavioral disorders such as autism. However, a new study that draws upon more than 30 years of research in the Republic of Seychelles reports that there is no association between pre-natal mercury exposure and autism-like behaviors.


“This study shows no evidence of a correlation between low level mercury exposure and autism spectrum-like behaviors among children whose mothers ate, on average, up to 12 meals of fish each week during pregnancy,” said Edwin van Wijngaarden, Ph.D., an associate professor in the University of Rochester Medical Center’s (URMC) Department of Public Health Sciences and lead author of the study which appears online today in the journal Epidemiology. “These findings contribute to the growing body of literature that suggest that exposure to the chemical does not play an important role in the onset of these behaviors.”

The debate over fish consumption has long created a dilemma for expecting mothers and physicians. Fish are high in beneficial nutrients such as selenium, vitamin E, lean protein, and omega-3 fatty acids; the latter are essential to brain development. At the same time, exposure to high levels of mercury has been shown to cause developmental problems, leading to claims that mothers who eat fish during pregnancy are exposing their unborn children to serious neurological impairment. Despite the fact that the developmental consequences of low level exposure remain unknown, some organizations, including the U.S. Food and Drug Administration, have recommended that pregnant women limit their consumption of fish.

The presence of mercury in the environment is widespread and originates from both natural sources such as volcanoes and as a byproduct of coal-fired plants that emit the chemical. Much of this mercury ends up being deposited in the world’s oceans where it makes its way into the food chain and eventually into fish. While the levels of mercury found in individual fish are generally low, concerns have been raised about the cumulative effects of a frequent diet of fish.

The Republic of Seychelles has proven to be an ideal location to examine the potential health impact of persistent low level mercury exposure. With a population of 87,000 people spread across an archipelago of islands in the Indian Ocean, fishing is both an important industry and a primary source of nutrition – the nation’s residents consume fish at a rate 10 times greater than the populations of the U.S. and Europe.

The Seychelles Child Development Study – a partnership between URMC, the Seychelles Ministries of Health and Education, and the University of Ulster in Ireland – was created in the mid-1980s to specifically study the impact of fish consumption and mercury exposure on childhood development. The program is one of the largest ongoing epidemiologic studies of its kind.

“The Seychelles study was designed to follow a population over a very long period of time and focus on relevant mercury exposure,” said Philip Davidson, Ph.D., principal investigator of the Seychelles Child Development Study and professor emeritus in Pediatrics at URMC.   “While the amount of fish consumed in the Seychelles is significantly higher than other countries in the industrialized world, it is still considered low level exposure.”

The autism study involved 1,784 children, adolescents, and young adults and their mothers. The researchers were first able to determine the level of prenatal mercury exposure by analyzing hair samples that had been collected from the mothers around the time of birth, a test which can approximate mercury levels found in the rest of the body including the growing fetus. 

The researchers then used two questionnaires to determine whether or not the study participants were exhibiting autism spectrum-like behaviors. The Social Communication Questionnaire was completed by the children’s parents and the Social Responsiveness Scale was completed by their teachers. These tests – which include questions on language skills, social communication, and repetitive behaviors – do not provide a definitive diagnosis, but they are widely used in the U.S. as an initial screening tool and may suggest the need for additional evaluation.

The mercury levels of the mothers were then matched with the test scores of their children and the researchers found that there was no correlation between prenatal exposure and evidence of autism-spectrum-like behaviors. This is similar to the result of previous studies of the nation’s children which have measured language skills and intelligence, amongst other outcomes, and have not observed any adverse developmental effects.
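"No correlation" here means a Pearson coefficient near zero between the mothers' exposure values and the children's behavioral scores. The arithmetic can be illustrated on made-up numbers, constructed so the linear correlation is exactly zero (the scores are a symmetric function of the centered exposure); none of these values come from the study.

```python
import numpy as np

exposure = np.array([1., 2., 3., 4., 5., 6., 7.])  # ppm in maternal hair (invented)
centered = exposure - exposure.mean()
scores = centered ** 2  # symmetric about the mean, so linear correlation is zero

r = np.corrcoef(exposure, scores)[0, 1]  # Pearson r, essentially 0 here
```

A real analysis would also adjust for covariates, but the core null result is this kind of near-zero coefficient computed across the full cohort.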

The study lends further evidence to an emerging belief that the “good” may outweigh the possible “bad” when it comes to fish consumption during pregnancy. Specifically, if mercury does adversely influence child development at these levels of exposure then the benefits of the nutrients found in the fish may counteract or perhaps even supersede the potential negative effects of the mercury. 

“This study shows no consistent association in children with mothers with mercury levels that were six to ten times higher than those found in the U.S. and Europe,” said Davidson. “This is a sentinel population and if it does not exist here, then it probably does not exist.”

“NIEHS has been a major supporter of research looking into the human health risks associated with mercury exposure,” said Cindy Lawler, Ph.D., acting branch chief at the National Institute of Environmental Health Sciences, part of National Institutes of Health. “The studies conducted in the Seychelles Islands have provided a unique opportunity to better understand the relationship between environmental factors, such as mercury, and the role they may play in the development of diseases like autism. Although more research is needed, this study does present some good news for parents.” 

Filed under ASD autism brain development mercury exposure neurobiology neuroscience science

98 notes

Researchers develop new approach for studying deadly brain cancer
Human glioblastoma multiforme, one of the most common, aggressive and deadly forms of brain cancer, is notoriously difficult to study. Scientists have traditionally studied cancer cells in petri dishes, which have none of the properties of the brain tissues in which these cancers grow, or in expensive animal models.
Now a team of engineers has developed a three-dimensional hydrogel that more closely mimics conditions in the brain. In a paper in the journal Biomaterials, the researchers describe the new material and their approach, which allows them to selectively tune up or down the malignancy of the cancer cells they study.
The new hydrogel is more versatile than other 3-D gels used for growing glioma (brain cancer) cells in part because it allows researchers to change individual parameters – the gel’s stiffness, for example, or the presence of molecular signals that can influence cancer growth – while minimally altering its other characteristics, such as porosity.
Being able to adjust these traits individually will help researchers tease out important features associated with the initial growth of a tumor as well as its response to clinical therapies, said University of Illinois chemical and biomolecular engineering professor Brendan Harley, who led the study with postdoctoral researcher Sara Pedron and undergraduate student Eftalda Becka. Harley is an affiliate of the Institute for Genomic Biology at Illinois.
The researchers found that they could increase or decrease the malignancy of glioma cells in their hydrogel simply by adding hyaluronic acid, a naturally occurring carbohydrate found in many tissues, especially the brain.
Hyaluronic acid (HA) is a key component of the extracellular matrix that provides structural and chemical support to cells throughout the body. HA contributes to cell proliferation and cell migration, and local changes in HA levels have been implicated in tumor growth.
“Hyaluronic acid is one of the major building blocks in the brain,” Harley said. “The structure of a newly forming brain tumor has some of this HA within it, but there’s also a lot of the HA in the brain surrounding the tumor.”
Previous studies have used hydrogels made out of nothing but hyaluronic acid to study gliomas, Harley said. “The problem there is that HA is structurally not very strong.” It also is difficult to adjust the amount of HA that the glioma cells are exposed to if their environment is 100 percent HA, he said.
In the new study, Pedron observed how glioma cells behaved in two different hydrogels – one based on methacrylated gelatin (GelMA) and the other using a more conventional polyethylene glycol (PEG) biomaterial. These two materials vary in one important trait: GelMA is a naturally derived material that contains adhesive sites that allow cells to latch onto it; synthetic PEG does not.
“The purpose of having these two systems was to isolate the effect of HA on glioma cells,” Pedron said. If changing HA levels produced different effects in different gels, that would indicate that the gels were contributing to those effects, she said.
Instead, Harley and Pedron found that additions of HA to glioma cells had “very similar” effects in both materials. Adding too little or too much HA led to reduced malignancy, while incorporating just enough HA led to significantly enhanced malignancy. This held true for multiple types of glioblastoma multiforme cells. This suggests that “it’s the HA itself that is likely the cause for this malignant change,” Harley said.
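The "too little or too much HA reduces malignancy, just enough enhances it" pattern is an inverted-U dose response, and finding the most potent concentration amounts to locating the peak of that curve. The sketch below uses invented numbers and a simple quadratic fit; it is not the authors' analysis, just an illustration of the idea.

```python
import numpy as np

# Synthetic inverted-U response: malignancy peaks at an intermediate HA level.
ha = np.array([0., 2., 4., 6., 8., 10.])   # HA concentration (arbitrary units, invented)
malignancy = -(ha - 5.0) ** 2 + 25.0       # made-up readout, peaking at ha = 5

a, b, c = np.polyfit(ha, malignancy, 2)    # fit a quadratic to the dose-response
peak_ha = -b / (2 * a)                     # vertex of the parabola = peak concentration
```

With a tunable hydrogel, each point on such a curve comes from a separate culture condition, which is exactly the kind of systematic sweep a 100%-HA gel cannot support.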
“If you have a material that allows you to selectively tune up or down malignancy, that will allow you to ask lots of questions about treatment methods for more malignant or less malignant forms of glioma. It also will allow scientists to try to get a response that’s closer to what you see in the body,” he said.
“If you talk to pathologists, they’ll say a biomaterial will never allow you to grow a full brain tumor, which is probably true,” Harley said. “But it’s realistic to think that a well-designed biomaterial will allow you to study aspects of glioma growth and treatment in a way that’s much richer than simply looking in a petri dish and much more accessible than trying to study tumor development within the brain itself.”

Filed under glioblastoma glioma brain cancer hyaluronic acid polyethylene glycol neuroscience science
