Posts tagged epigenetics

New technique can rapidly turn genes on and off, helping scientists better understand their function.
Although human cells have an estimated 20,000 genes, only a fraction of those are turned on at any given time, depending on the cell’s needs — which can change by the minute or hour. To find out what those genes are doing, researchers need tools that can manipulate their status on similarly short timescales.
That is now possible, thanks to a new technology developed at MIT and the Broad Institute that can rapidly start or halt the expression of any gene of interest simply by shining light on the cells.
The work is based on a technique known as optogenetics, which uses proteins that change their function in response to light. In this case, the researchers adapted the light-sensitive proteins to either stimulate or suppress the expression of a specific target gene almost immediately after the light comes on.
“Cells have very dynamic gene expression happening on a fairly short timescale, but so far the methods that are used to perturb gene expression don’t even get close to those dynamics. To understand the functional impact of those gene-expression changes better, we have to be able to match the naturally occurring dynamics as closely as possible,” says Silvana Konermann, an MIT graduate student in brain and cognitive sciences.
The ability to precisely control the timing and duration of gene expression should make it much easier to figure out the roles of particular genes, especially those involved in learning and memory. The new system can also be used to study epigenetic modifications — chemical alterations of the proteins that surround DNA — which are also believed to play an important role in learning and memory.
Konermann and Mark Brigham, a graduate student at Harvard University, are the lead authors of a paper describing the technique in the July 22 online edition of Nature. The paper’s senior author is Feng Zhang, the W.M. Keck Assistant Professor in Biomedical Engineering at MIT and a core member of the Broad Institute and MIT’s McGovern Institute for Brain Research.
Shining light on genes
The new system consists of several components that interact with each other to control the copying of DNA into messenger RNA (mRNA), which carries genetic instructions to the rest of the cell. The first is a DNA-binding protein known as a transcription activator-like effector (TALE). TALEs are modular proteins that can be strung together in a customized way to bind any DNA sequence.
Fused to the TALE protein is a light-sensitive protein called CRY2 that is naturally found in Arabidopsis thaliana, a small flowering plant. When light hits CRY2, it changes shape and binds to its natural partner protein, known as CIB1. To take advantage of this, the researchers engineered a form of CIB1 that is fused to another protein that can either activate or suppress gene copying.
After the genes for these components are delivered to a cell, the TALE protein finds its target DNA and wraps around it. When light shines on the cells, the CRY2 protein binds to CIB1, which is floating in the cell. CIB1 brings along a gene activator, which initiates transcription, or the copying of DNA into mRNA. Alternatively, CIB1 could carry a repressor, which shuts off the process.
A single pulse of light is enough to stimulate the protein binding and initiate DNA copying. The researchers found that pulses of light delivered every minute or so are the most effective way to achieve continuous transcription for the desired period of time. Within 30 minutes of light delivery, the researchers detected an uptick in the amount of mRNA being produced from the target gene. Once the pulses stop, the mRNA starts to degrade within about 30 minutes.
In this study, the researchers tried targeting nearly 30 different genes, both in neurons grown in the lab and in living animals. Depending on the gene targeted and how much it is normally expressed, the researchers were able to boost transcription by a factor of two to 200.
Karl Deisseroth, a professor of bioengineering at Stanford University and one of the inventors of optogenetics, says the most important innovation of the technique is that it allows control of genes that naturally occur in the cell, as opposed to engineered genes delivered by scientists.
“You could control, at precise times, a particular genetic locus and see how everything responds to that, with high temporal precision,” says Deisseroth, who was not part of the research team.
Epigenetic modifications
Another important element of gene-expression control is epigenetic modification. One major class of epigenetic effectors is chemical modification of the proteins, known as histones, that anchor chromosomal DNA and control access to the underlying genes. The researchers showed that they can also alter these epigenetic modifications by fusing TALE proteins with histone modifiers.
Epigenetic modifications are thought to play a key role in learning and forming memories, but this has not been very well explored because there are no good ways to disrupt the modifications, short of blocking histone modification of the entire genome. The new technique offers a much more precise way to interfere with modifications of individual genes.
“We want to allow people to prove the causal role of specific epigenetic modifications in the genome,” Zhang says.
So far, the researchers have demonstrated that some of the histone effector domains can be tethered to light-sensitive proteins; they are now trying to expand the types of histone modifiers they can incorporate into the system.
“It would be really useful to expand the number of epigenetic marks that we can control. At the moment we have a successful set of histone modifications, but there are a good deal more of them that we and others are going to want to be able to use this technology for,” Brigham says.
(Source: web.mit.edu)
These Decapitated Worms Regrow Old Memories Along with New Heads
It’s long been known that many species of worms have the remarkable ability to grow back body parts and even specific organs when they’ve been cut off. But new research by a pair of scientists from Tufts University has revealed that planarians—small creatures, often called flatworms, that can live in water or on land—are capable of regenerating something even more amazing.
The researchers, Tal Shomrat and Michael Levin, trained flatworms to travel across a rough surface to access food, then removed their heads. Two weeks later, after the heads grew back, the worms somehow regained their tendency to navigate across rough terrain, as the researchers recently documented in the Journal of Experimental Biology.
Interest in flatworm memories dates to the 1950s, when a series of strange experiments by Michigan biologist James McConnell indicated that worms could gain the ability to navigate a maze by being fed the ground-up remains of other flatworms that had been trained to run through the same maze. McConnell speculated that a type of genetic material called “memory RNA” was responsible for this phenomenon, and could be transferred between the organisms.
Subsequent research into planarian memory RNA exploited the fact that the worms could easily regenerate heads after decapitation. In some studies, the worms’ heads were cut off and then regenerated while they swam in RNA solutions; in others, as the Field of Science blog points out, worms that had already been trained to navigate a maze were tested after they were decapitated and their heads grew back.
Unfortunately, McConnell’s findings were largely discredited—critics pointed to sloppy research methods, and some even charged that planarians had no capacity for long-term memory—and research in this area lay dormant. Recently, though, Shomrat and Levin developed automated systems to train and test the worms, which would enable standardized and rigorous measures of how the organisms acquired and retained memories over time. And though memory RNA is still believed to be a myth, their recent research has confirmed that these worms’ memories do work in astoundingly bizarre ways.
The researchers’ computerized system dealt with the worms, from the species Dugesia japonica, in two groups of 72 each. One group was conditioned to live in a rough-bottomed petri dish, with the other in a smooth-bottomed one, for ten days. Both dishes were stocked with ample worm food (small pieces of beef liver), so each group was conditioned to learn that their particular surface meant “food is nearby.”
Next, each group was separately put into a rough-bottomed petri dish with food located only in one quadrant, along with a bright blue LED. Flatworms typically avoid light, so spending time in that quadrant meant that their expectation of food nearby trumped their aversion to light.
As a result of their conditioning, the worms who’d lived in rough containers were much quicker to flock to the lit quadrant. The researchers had the automated system’s video cameras track how long it took the worms to spend three straight minutes under the lights, and those reared in the rough dishes took an average of six minutes to reach this threshold, compared with about seven and a half minutes for the other group. This difference showed that the former group had been conditioned to associate rough surfaces with food, and explored these surfaces more readily.
Afterward, all worms were fully decapitated (every bit of brain was removed) and left alone to regrow their heads over the course of the next two weeks. When they were put back in the chamber with the rough surface, the group that had previously lived in the rough dishes—that is, whose previous heads had lived in the rough dishes—still ventured into the lit quadrant of the rough dish, and settled there for an extended period, more than a minute faster than the other group.
Incredible as it seems, some lingering memories of the rough-surface conditioning seem to have lived on in the bodies of these worms, even after their heads were chopped off. The biological explanation for this is unclear, as The Verge blog notes. Previous research confirmed that the worms’ behavior is controlled by their brains, but it’s possible that some of their memories may have been stored in their bodies, or that the training given to their initial heads somehow modified other parts of their nervous systems, which then altered how their new brains grew.
There’s also another sort of explanation. The researchers speculate that epigenetics—changes to the chemical structure of an organism’s DNA that alter the expression of genes—could play a role, perhaps encoding the memory (“rough floors = food”) permanently in the worms’ DNA.
In that case, this strange experiment would provide yet another surprising outcome. There may not be such a thing as “memory RNA” per se, but in speculating on the role of genetic material in the retention of these worms’ memories, McConnell may have been on the right track after all.
Unique Epigenomic Code Identified During Human Brain Development
Changes in the epigenome, including chemical modifications of DNA, can act as an extra layer of information in the genome, and are thought to play a role in learning and memory, as well as in age-related cognitive decline. The results of a new study by scientists at the Salk Institute for Biological Studies show that the landscape of DNA methylation, a particular type of epigenomic modification, is highly dynamic in brain cells during the transition from birth to adulthood. The finding helps explain how information in the genomes of brain cells is controlled from fetal development to adulthood. The brain is far more complex than any other organ in the body, and this discovery opens the door to a deeper understanding of how its intricate patterns of connectivity are formed.
“These results extend our knowledge of the unique role of DNA methylation in brain development and function,” says senior author Joseph R. Ecker, professor and director of Salk’s Genomic Analysis Laboratory and holder of the Salk International Council Chair in Genetics. “They offer a new framework for testing the role of the epigenome in healthy function and in pathological disruptions of neural circuits.”
A healthy brain is the product of a long process of development. The front-most part of our brain, called the frontal cortex, plays a key role in our ability to think, decide and act. The brain accomplishes all of this through the interaction of special cells such as neurons and glia. We know that these cells have distinct functions, but what gives these cells their individual identities? The answer lies in how each cell expresses the information contained in its DNA. Epigenomic modifications, such as DNA methylation, can control which genes are turned on or off without changing letters of the DNA alphabet (A-T-C-G), and thus help distinguish different cell types.
In this new study, published July 4 in Science, the scientists found that the patterns of DNA methylation undergo widespread reconfiguration in the frontal cortex of mouse and human brains during a time of development when synapses, or connections between nerve cells, are growing rapidly. The researchers identified the exact sites of DNA methylation throughout the genome in brains from infants through adults. They found that one form of DNA methylation is present in neurons and glia from birth. Strikingly, a second form of “non-CG” DNA methylation that is almost exclusive to neurons accumulates as the brain matures, becoming the dominant form of methylation in the genome of human neurons. These results help us to understand how the intricate DNA landscape of brain cells develops during the key stages of childhood.
The genetic code in DNA is made up of four chemical bases: adenine (A), guanine (G), cytosine (C), and thymine (T). DNA methylation typically occurs at so-called CpG sites, where C (cytosine) sits next to G (guanine) in the DNA alphabet. About 80 to 90 percent of CpG sites are methylated in human DNA. Salk researchers previously discovered that in human embryonic stem cells and induced pluripotent stem cells, a type of artificially derived stem cell, DNA methylation can also occur when G does not follow C, hence “non-CG methylation.” Originally, they thought that this type of methylation disappeared when stem cells differentiate into specific tissue types such as lung or fat cells. The current study finds this is not the case in the brain, where non-CG methylation appears after cells differentiate, usually during childhood and adolescence when the brain is maturing.
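The CpG / non-CG distinction is purely a matter of sequence context: which base follows each cytosine. A minimal sketch (the example sequence is invented for illustration) that tallies the two contexts in a DNA string:

```python
def cytosine_contexts(seq):
    """Classify each cytosine in a DNA sequence as CpG (followed by G)
    or non-CG (followed by A, C, or T)."""
    seq = seq.upper()
    counts = {"CpG": 0, "non-CG": 0}
    for i, base in enumerate(seq[:-1]):  # the last base has nothing after it
        if base == "C":
            if seq[i + 1] == "G":
                counts["CpG"] += 1
            else:
                counts["non-CG"] += 1
    return counts

# Toy sequence with two CpG sites and two non-CG cytosines
print(cytosine_contexts("ACGTCATCGGCA"))  # {'CpG': 2, 'non-CG': 2}
```

Genome-scale methylation callers make the same per-cytosine distinction, just across billions of bases rather than a toy string.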
By sequencing the genomes of mouse and human brain tissue as well as neurons and glia (from the frontal cortex of the brain) during early postnatal, juvenile, adolescent and adult stages, the Salk team found that non-CG methylation accumulates in neurons through early childhood and adolescence, and becomes the dominant form of DNA methylation in mature human neurons. “This shows that the period during which the neural circuits of the brain mature is accompanied by a parallel process of large-scale reconfiguration of the neural epigenome,” says Ecker, who is a Howard Hughes Medical Institute and Gordon and Betty Moore Foundation investigator.
The study provides the first comprehensive maps of how DNA methylation patterns change in the mouse and human brain during development, forming a critical foundation to now explore whether changes in methylation patterns may be linked to human diseases, including psychiatric disorders. Recent studies have demonstrated a possible role for DNA methylation in schizophrenia, depression, suicide and bipolar disorder. “Our work will let us begin to ask more detailed questions about how changes in the epigenome sculpt the complex identities of brain cells through life,” says co-first author Eran Mukamel, from Salk’s Computational Neurobiology Laboratory.
“The human brain has been called the most complex system that we know of in the universe,” says Ryan Lister, co-corresponding author on the new paper, previously a postdoctoral fellow in Ecker’s laboratory at Salk and now a group leader at The University of Western Australia. “So perhaps we shouldn’t be so surprised that this complexity extends to the level of the brain epigenome. These unique features of DNA methylation that emerge during critical phases of brain development suggest the presence of previously unrecognized regulatory processes that may be critically involved in normal brain function and brain disorders.”
At present, there is consensus among neuroscientists that many mental disorders have a neurodevelopmental origin and arise from an interaction between genetic predisposition and environmental influences (for example, early-life stress or drug abuse), the outcome of which is altered activity of brain networks. The building and shaping of these brain networks requires a long maturation process in which central nervous system cell types (neurons and glia) need to fine-tune the way they express their genetic code.
“DNA methylation fulfills this role,” says study co-author Terrence J. Sejnowski, a Howard Hughes Medical Institute Investigator, holder of the Francis Crick Chair and head of Salk’s Computational Neurobiology Laboratory. “We found that patterns of methylation are dynamic during brain development, in particular for non-CG methylation during early childhood and adolescence, which changes the way that we think about normal brain function and dysfunction.”
By disrupting the transcriptional expression of neurons, adds co-corresponding author M. Margarita Behrens, a staff scientist in the Computational Neurobiology Laboratory, “the alterations of these methylation patterns will change the way in which networks are formed, which could, in turn, lead to the appearance of mental disorders later in life.”
Gene switches make prairie voles fall in love
Epigenetic changes affect neurotransmitters that lead to pair-bond formation.
Love really does change your brain — at least, if you’re a prairie vole. Researchers have shown for the first time that the act of mating induces permanent chemical modifications in the chromosomes, affecting the expression of genes that regulate sexual and monogamous behaviour. The study is published today in Nature Neuroscience.
Prairie voles (Microtus ochrogaster) have long been of interest to neuroscientists and endocrinologists who study the social behaviour of animals, in part because this species forms monogamous pair bonds — essentially mating for life. The voles’ pair bonding, sharing of parental roles and egalitarian nest building in couples make them a good model for understanding the biology of monogamy and mating in humans.
Previous studies have shown that the neurotransmitters oxytocin and vasopressin play a major part in inducing and regulating the formation of the pair bond. Monogamous prairie voles are known to have higher levels of receptors for these neurotransmitters than do voles who have yet to mate; and when otherwise promiscuous montane voles (M. montanus) are dosed with oxytocin and vasopressin, they adopt the monogamous behaviour of their prairie cousins.
Because behaviour seemed to play an active part in changing the neurobiology of the animals, scientists suspected that epigenetic factors were involved. These are chemical modifications to the chromosomes that affect how genes are transcribed or suppressed, as opposed to changes in the gene sequences themselves.
Love potion
To look for signs of epigenetic agents at work in monogamous behaviour, neuroscientist Mohamed Kabbaj and his team at Florida State University in Tallahassee took voles that had been housed together for 6 hours but had not mated. The researchers injected drugs into the voles’ brains near a region called the nucleus accumbens, which is closely associated with the reinforcement of reward and pleasure. The drugs blocked the activity of an enzyme that normally keeps DNA tightly wound up and thus prevents the expression of genes.
The team found that the genes for the vasopressin and oxytocin receptors had been transcribed, and as a result the nucleus accumbens of the animals bore high levels of these receptors. Animals that had been permitted to mate also had high levels of vasopressin and oxytocin receptors, confirming the link between bond formation and gene activity.
“Mating activates this brain area which leads to partner preference — we can induce this same change in the brain with this drug,” Kabbaj explains.
Interestingly, the injection alone cannot induce the partner preference. “The drug by itself won’t do all these molecular changes — you need the context: it’s the drug plus the six hours of cohabitation,” says Kabbaj.
“This is a study I myself wanted to do years ago,” says Thomas Insel, who heads the US National Institute of Mental Health in Bethesda, Maryland. “If mating causes the release of the neuropeptide, how does this kick into a higher gear for the rest of the animal’s life? This study for me really is the first experimental demonstration that the epigenetic change would be necessary for the long-term change in behaviour.”
“This paper really shows that there is an epigenetic mechanism underlying pair bonds — we ourselves have looked for that and not found it,” says Alaine Keebaugh of Emory University in Atlanta, Georgia, who also studies the neuroscience of prairie voles.
Kabbaj says he hopes that the work could ultimately lead to an enhanced understanding of how epigenetic factors affect social behaviour in humans — not only in monogamy and pair bonding, but also in conditions such as autism and schizophrenia, which affect social interactions.
Alteration of two genes, detectable by simple blood test during pregnancy, foretold illness with 85 percent certainty in small study
Johns Hopkins researchers say they have discovered specific chemical alterations in two genes that, when present during pregnancy, reliably predict whether a woman will develop postpartum depression.
The epigenetic modifications, which alter the way genes function without changing the underlying DNA sequence, can apparently be detected in the blood of pregnant women during any trimester, potentially providing a simple way to foretell depression in the weeks after giving birth, and an opportunity to intervene before symptoms become debilitating.
The findings of the small study involving 52 pregnant women are described online in the journal Molecular Psychiatry.
“Postpartum depression can be harmful to both mother and child,” says study leader Zachary Kaminsky, Ph.D., an assistant professor of psychiatry and behavioral sciences at the Johns Hopkins University School of Medicine. “But we don’t have a reliable way to screen for the condition before it causes harm, and a test like this could be that way.”
It is not clear what causes postpartum depression, a condition marked by persistent feelings of sadness, hopelessness, exhaustion and anxiety that begins within four weeks of childbirth and can last weeks, several months or up to a year. An estimated 10 to 18 percent of all new mothers develop the condition, and the rate rises to 30 to 35 percent among women with previously diagnosed mood disorders. Scientists long believed the symptoms were related to the large drop-off in the mother’s estrogen levels following childbirth, but studies have shown that both depressed and nondepressed women have similar estrogen levels.
By studying mice, the Johns Hopkins researchers suspected that estrogen induced epigenetic changes in cells in the hippocampus, a part of the brain that governs mood. Kaminsky and his team then created a complicated statistical model to find the candidate genes most likely undergoing those epigenetic changes, which could be potential predictors for postpartum depression. That process resulted in the identification of two genes, known as TTC9B and HP1BP3, about which little is known save for their involvement in hippocampal activity.
Kaminsky says the genes in question may have something to do with the creation of new cells in the hippocampus and the ability of the brain to reorganize and adapt in the face of new environments — two elements important in mood. In some ways, he says, estrogen can behave like an antidepressant, so that when inhibited, it adversely affects mood.
The researchers later confirmed their findings in humans by looking for epigenetic changes to thousands of genes in blood samples from 52 pregnant women with mood disorders. Jennifer L. Payne, M.D., director of the Johns Hopkins Women’s Mood Disorders Center, collected the blood samples. The women were followed both during and after pregnancy to see who developed postpartum depression.
The researchers noticed that women who developed postpartum depression exhibited stronger epigenetic changes in those genes that are most responsive to estrogen, suggesting that these women are more sensitive to the hormone’s effects. Specifically, two genes were most highly correlated with the development of postpartum depression. TTC9B and HP1BP3 predicted with 85 percent certainty which women became ill.
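An "85 percent certainty" figure of this kind typically means the biomarkers correctly sorted most women into will/won't-develop groups. The toy sketch below is not the study's actual model — the methylation values, cutoff, and decision rule are all invented — but it shows the general shape of such a two-biomarker prediction and how its accuracy is scored:

```python
# Toy illustration of two-biomarker risk prediction (all numbers invented;
# this is NOT the study's actual model). Each tuple holds hypothetical
# methylation levels at the two candidate loci plus the observed outcome.
samples = [
    # (TTC9B level, HP1BP3 level, developed postpartum depression?)
    (0.90, 0.80, True),
    (0.85, 0.75, True),
    (0.40, 0.30, False),
    (0.35, 0.45, False),
    (0.70, 0.90, True),
    (0.50, 0.35, False),
    (0.80, 0.40, True),   # mixed signal: only one marker elevated
    (0.60, 0.60, False),  # borderline case the rule gets wrong
]

def predict(ttc9b, hp1bp3, threshold=1.1):
    """Flag risk when the combined methylation signal crosses a cutoff."""
    return ttc9b + hp1bp3 > threshold

correct = sum(predict(t, h) == outcome for t, h, outcome in samples)
accuracy = correct / len(samples)  # 7 of 8 correct in this toy data
print(f"{accuracy:.0%}")
```

A real analysis would fit the combination weights and cutoff from training data and validate them on held-out samples, rather than hand-picking them as done here.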
“We were pretty surprised by how well the genes were correlated with postpartum depression,” Kaminsky says. “With more research, this could prove to be a powerful tool.”
Kaminsky says the next step in research would be to collect blood samples from a larger group of pregnant women and follow them for a longer period of time. He also says it would be useful to examine whether the same epigenetic changes are present in the offspring of women who develop postpartum depression.
Evidence suggests that early identification and treatment of postpartum depression can limit or prevent debilitating effects. Alerting women to the condition’s risk factors — as well as determining whether they have a previous history of the disorder, other mental illness and unusual stress — is key to preventing long-term problems.
Research also shows, Kaminsky says, that postpartum depression not only affects the health and safety of the mother, but also her child’s mental, physical and behavioral health.
Kaminsky says that if his preliminary work pans out, he hopes a blood test for the epigenetic biomarkers could be added to the battery of tests women undergo during pregnancy, and inform decisions about the use of antidepressants during pregnancy. There are concerns, he says, about the effects of these drugs on the fetus and their use must be weighed against the potentially debilitating consequences to both the mother and child of foregoing them.
“If you knew you were likely to develop postpartum depression, your decisions about managing your care could be made more clearly,” he says.
(Source: hopkinsmedicine.org)
Scientists from King’s College London have identified patterns of epigenetic changes involved in autism spectrum disorder (ASD) by studying genetically identical twins who differ in autism traits.

The study, published in Molecular Psychiatry, is the largest of its kind and may shed light on the biological mechanism by which environmental influences regulate the activity of certain genes and in turn contribute to the development of ASD and related behaviour traits.
ASD affects approximately 1 in 100 people in the UK and involves a spectrum of disorders which manifest themselves differently in different people. People with ASD have varying levels of impairment across three common areas: deficits in social interactions and understanding, repetitive behaviour and interests, and impairments in language and communication development.
Evidence from twin studies shows there is a strong genetic component to ASD and previous studies suggest that genes that direct brain development may be involved in the disorder. In approximately 70% of cases, when one identical twin has ASD, so does the other. However, in 30% of cases, identical twins differ for ASD. Because identical twins share the same genetic code, this suggests non-genetic, or epigenetic, factors may be involved.
Epigenetic changes affect the expression or activity of genes without changing the underlying DNA sequence – they are believed to be one mechanism by which the environment can interact with the genome. Importantly, epigenetic changes are potentially reversible and may therefore provide targets for the development of new therapies.
The researchers studied an epigenetic mechanism called DNA methylation. DNA methylation acts to block the genetic sequences that drive gene expression, silencing gene activity. They examined DNA methylation at over 27,000 sites across the genome using samples taken from 50 identical twin pairs (100 individuals) from the UK Medical Research Council (MRC) funded Twins Early Development Study (TEDS): 34 pairs who differed for ASD or autism related behaviour traits, 5 pairs where both twins have ASD, and 11 healthy twin pairs.
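The power of the discordant-twin design is that each affected twin is compared against a genetically identical control, so within-pair methylation differences cannot be explained by DNA sequence. A minimal sketch of that comparison — the probe names and beta values below are invented for illustration, not data from the study:

```python
# Hypothetical within-pair comparison for ASD-discordant identical twins.
# Each probe has a methylation "beta" value (0 = unmethylated,
# 1 = fully methylated); all names and numbers are made up.
twin_pairs = [
    # (affected twin's betas, unaffected co-twin's betas)
    ({"site_A": 0.82, "site_B": 0.31, "site_C": 0.55},
     {"site_A": 0.61, "site_B": 0.29, "site_C": 0.54}),
    ({"site_A": 0.78, "site_B": 0.40, "site_C": 0.52},
     {"site_A": 0.60, "site_B": 0.33, "site_C": 0.50}),
]

def mean_within_pair_difference(pairs):
    """Average affected-minus-unaffected methylation difference per probe."""
    sites = pairs[0][0].keys()
    return {s: sum(aff[s] - unaff[s] for aff, unaff in pairs) / len(pairs)
            for s in sites}

diffs = mean_within_pair_difference(twin_pairs)
# site_A shows consistent hypermethylation in the affected twins
top_site = max(diffs, key=lambda s: abs(diffs[s]))
print(top_site, round(diffs[top_site], 3))
```

The actual study ran this kind of comparison across more than 27,000 probes with statistical tests and multiple-testing correction; the sketch only shows the core paired-difference idea.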
Dr Chloe Wong, first author of the study from King’s College London’s Institute of Psychiatry, says: “We’ve identified distinctive patterns of DNA methylation associated with both autism diagnosis and related behaviour traits, and increasing severity of symptoms. Our findings give us an insight into the biological mechanism mediating the interaction between gene and environment in autism spectrum disorder.”
DNA methylation at some genetic sites was consistently altered for all individuals with ASD, and differences at other sites were specific to certain symptom groups. The number of altered DNA methylation sites across the genome was also linked to the severity of autism symptoms, suggesting a quantitative relationship between the two. Additionally, some of the differences in DNA methylation markers were located in genetic regions that previous research has associated with early brain development and ASD.
Professor Jonathan Mill, lead author of the paper from King’s College London’s Institute of Psychiatry and the University of Exeter, says: “Research into the intersection between genetic and environmental influences is crucial because risky environmental conditions can sometimes be avoided or changed. Epigenetic changes are potentially reversible, so our next step is to embark on larger studies to see whether we can identify key epigenetic changes common to the majority of people with autism to help us develop possible therapeutic interventions.”
Dr Alycia Halladay, Senior Director of Environmental and Clinical Sciences from Autism Speaks who funded the research, says: “This is the first large-scale study to take a whole genome approach to studying epigenetic influences in twins who are genetically identical but have different symptoms. These findings open the door to future discoveries in the role of epigenetics – in addition to genetics – in the development of autism symptoms.”
(Source: kcl.ac.uk)

Epigenetics: Neurons remember because they move genes in space
How do neurons store information about past events? At the Nencki Institute of Experimental Biology of the Polish Academy of Sciences in Warsaw, a previously unknown mechanism of memory-trace formation has been discovered. It appears that at least some events are remembered thanks to… geometry.
Neurons are the most important cells of the nervous system. Scientists from the Nencki Institute of Experimental Biology of the Polish Academy of Sciences in Warsaw have shown that stimulating a neuron produces permanent changes in the arrangement of genes within the cell nucleus. This discovery, reported in the “Journal of Neuroscience”, one of the most prestigious journals in the field of neurobiology, is significant for better understanding both the processes underlying the mind and disorders of the nervous system, especially of the brain.
“While conducting experiments on rats after epileptic seizures we have observed that a gene may permanently move deeper into the neuron’s cell nucleus. Since modification of the geometrical structure of the nucleus leads to changes in gene expression, this is how the neuron remembers what happened”, explains Prof. Grzegorz Wilczyński from the Laboratory of Molecular and Systemic Neuromorphology at the Nencki Institute.
“Seq-ing” Insights into the Epigenetics of Neuronal Gene Regulation
The epigenetic control of neuronal gene expression patterns has emerged as an underlying regulatory mechanism for neuronal function, identity, and plasticity, in which short- to long-lasting adaptation is required to dynamically respond to and process external stimuli. To achieve a comprehensive understanding of the physiology and pathology of the brain, it is essential to understand the mechanisms that regulate the epigenome and transcriptome in neurons. Here, we review recent advances in the study of regulated neuronal gene expression, a field expanding dramatically as a result of powerful new methodologies based on next-generation sequencing. This flood of new information has already transformed our understanding of many biological processes, is now driving discoveries elucidating the molecular mechanisms of brain function in cognition, behavior, and disease, and may also inform the study of neuronal identity, diversity, and reprogramming.
Research helps explain early-onset puberty in females
New research from Oregon Health & Science University has provided significant insight into the reasons why early-onset puberty occurs in females. The research, which was conducted at OHSU’s Oregon National Primate Research Center, is published in the current early online edition of the journal Nature Neuroscience.
The paper explains how OHSU scientists are investigating the role of epigenetics in the control of puberty. Epigenetics refers to changes in gene activity linked to external factors that do not involve changes to the genetic code itself. The OHSU scientists believe improved understanding of these complex protein/gene interactions will lead to greater understanding of both early-onset (precocious) puberty and delayed puberty, and highlight new therapy avenues.
To conduct this research, scientists studied female rats, which, like their human counterparts, go through puberty early in life. These studies revealed that a group of proteins, called PcG proteins, regulate the activity of a gene called the Kiss1 gene, which is required for puberty to occur. When these PcG proteins diminish, Kiss1 is activated and puberty begins.
PcG proteins are produced by another set of genes that act as a biological switch during the embryonic stage of life. The role of these proteins is to turn off specific downstream genes at key developmental stages.
OHSU scientists found that both the activity of these “master” genes and their ability to turn off puberty are impacted by two forms of epigenetic control: a chemical modification of DNA known as DNA methylation, and changes in the composition of histones, a specialized set of proteins that modify gene activity by interacting with DNA.
Using this new information, researchers were then able to delay puberty in female rats. They accomplished this by increasing PcG protein levels in the hypothalamus of the brain using a targeted gene therapy approach so that Kiss1 activation failed to occur at the normal time in life. The hypothalamus is a region of the brain that controls reproductive development.
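The regulatory logic described above (PcG proteins repress Kiss1 until their levels fall, at which point Kiss1 switches on and puberty begins) can be sketched as a simple threshold switch. This is purely illustrative: the function name, units, and threshold value are hypothetical and not taken from the OHSU study.

```python
def kiss1_active(pcg_level, threshold=0.5):
    """Toy repression switch: Kiss1 turns on only once repressive PcG
    protein levels drop below a threshold. The threshold value here is
    hypothetical and purely illustrative."""
    return pcg_level < threshold

# As PcG declines with development, Kiss1 switches on and puberty begins;
# raising PcG (as in the targeted gene-therapy experiment) keeps it off.
for level in (0.9, 0.6, 0.4):
    print(f"PcG={level}: Kiss1 active? {kiss1_active(level)}")
```

The gene-therapy result in the next paragraph corresponds to pushing `pcg_level` back above the threshold, so the switch stays off past the normal time.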
"While it was always understood that an organism’s genes determine the timing of puberty, the role of epigenetics in this process has never been recorded until now," said Alejandro Lomniczi, Ph.D., a scientist in the Division of Neuroscience at the OHSU Oregon National Primate Research Center.
"Because epigenetic changes are driven by environmental, metabolic and cell-to-cell influences, these findings raise the possibility that a significant percentage of precocious and delayed puberty cases occurring in humans may be the result of environmental factors and other alterations in epigenetic control," said Sergio Ojeda, D.V.M, who is also a scientist in the Division of Neuroscience at the OHSU ONPRC.
"There is also much more to be learned about the way that epigenetic factors may link environmental factors such as nutrition, man-made chemicals, social interactions and other day-today influences to the timing and completion of normal puberty."
Scientists discover how epigenetic information could be inherited
New research reveals a potential mechanism by which parents’ experiences could be passed on through their offspring’s genes. The research was published in the journal Science.
Epigenetics is a system that turns our genes on and off. The process works by chemical tags, known as epigenetic marks, attaching to DNA and telling a cell to either use or ignore a particular gene.
The most common epigenetic mark is a methyl group. When these groups fasten to DNA through a process called methylation they block the attachment of proteins which normally turn the genes on. As a result, the gene is turned off.
Scientists have observed epigenetic inheritance: offspring may inherit altered traits as a result of their parents’ past experiences. For example, historical episodes of famine have resulted in health effects on the children and grandchildren of individuals who had restricted diets, possibly through inheritance of altered epigenetic marks caused by a restricted diet.
However, it is thought that between each generation the epigenetic marks are erased in cells called primordial germ cells (PGCs), the precursors to sperm and eggs. This ‘reprogramming’ allows all genes to be read afresh for each new person, leaving scientists to question how epigenetic inheritance could occur.
The new Cambridge study initially discovered how the DNA methylation marks are erased in PGCs, a question that has been under intense investigation over the past 10 years. The methylation marks are converted to hydroxymethylation which is then progressively diluted out as the cells divide. This process turns out to be remarkably efficient and seems to reset the genes for each new generation. Understanding the mechanism of epigenetic resetting could be exploited to deal with adult diseases linked with an accumulation of aberrant epigenetic marks, such as cancers, or in ‘rejuvenating’ aged cells.
However, the researchers, who were funded by the Wellcome Trust, also found that some rare methylation can ‘escape’ the reprogramming process and can thus be passed on to offspring – revealing how epigenetic inheritance could occur. This is important because aberrant methylation could accumulate at genes during a lifetime in response to environmental factors, such as chemical exposure or nutrition, and can cause abnormal use of genes, leading to disease. If these marks are then inherited by offspring, their genes could also be affected.