Posts tagged neuroscience
Study Shows a Solitary Mutation Can Destroy Critical ‘Window’ of Early Brain Development
Scientists from the Florida campus of The Scripps Research Institute (TSRI) have shown in animal models that the loss of a single copy of a gene during very early development can cause a lifetime of behavioral and intellectual problems.
The study, published this week in the Journal of Neuroscience, sheds new light on the early development of neural circuits in the cortex, the part of the brain responsible for functions such as sensory perception, planning and decision-making.
The research also pinpoints the mechanism responsible for the disruption of what are known as “windows of plasticity” that contribute to the refinement of the neural connections that broadly shape brain development and the maturing of perception, language, and cognitive abilities.
The key to normal development of these abilities is that the neural connections in the brain cortex—the synapses—mature at the right time.
In an earlier study, the team, led by TSRI Associate Professor Gavin Rumbaugh, found that in mice missing a single copy of the vital gene, certain synapses develop prematurely within the first few weeks after birth. This accelerated maturation dramatically increases “excitability”—how often brain cells fire—in the hippocampus, a part of the brain critical for memory. The delicate balance between excitation and inhibition is especially critical during early developmental periods. However, it remained a mystery how early maturation of brain circuits could lead to lifelong cognitive and behavioral problems.
The current study shows in mice that disruption of the synapse-regulating gene SYNGAP1—which in humans can cause a devastating form of intellectual disability and increase the risk of autism—induces early functional maturation of neural connections in two areas of the cortex. The disruption’s influence is widespread throughout the developing brain and appears to shorten these critical windows of plasticity.
“In this study, we were able to directly connect early maturation of synapses to the loss of an important plasticity window in the cortex,” Rumbaugh said. “Early maturation of synapses appears to make the brain less plastic at critical times in development. Children with these mutations appear to have brains that were built incorrectly from the ground up.”
The accelerated maturation also appeared to occur surprisingly early in the developing cortex. That, Rumbaugh added, would correspond to the first two years of a child’s life, when the brain is expanding rapidly. “Our goal now is to figure out a way to prevent the damage caused by SYNGAP1 mutations. We would be more likely to help that child if we could intervene very early on—before the mutation has done its damage,” he said.

Scientists discover previously unknown requirement for brain development
Scientists at the Salk Institute for Biological Studies have demonstrated that sensory regions in the brain develop in a fundamentally different way than previously thought, a finding that may yield new insights into visual and neural disorders.
In a paper published June 7, 2013, in Science, Salk researcher Dennis O’Leary and his colleagues have shown that genes alone do not determine how the cerebral cortex grows into separate functional areas. Instead, they show that input from the thalamus, the main switching station in the brain for sensory information, is crucially required.
O’Leary has done pioneering studies in “arealization,” the way in which the neo-cortex, the major region of cerebral cortex, develops specific areas dedicated to particular functions. In a landmark paper published in Science in 2000, he showed that two regulatory genes were critically responsible for the general pattern of the neo-cortex, and has since shown distinct roles for other genes in this process. In this new set of mouse experiments, his laboratory focused on the visual system, and discovered a new, unexpected twist to the story.
"In order to function properly, it is essential that cortical areas are mapped out correctly, and it is this architecture that was thought to be genetically pre-programmed," says O’Leary, holder of the Vincent J. Coates Chair in Molecular Neurobiology at Salk. "To our surprise, we discovered thalamic input plays an essential role far earlier in brain development."
Vision is relayed from the outside world into processing areas within the brain. The relay starts when light hits the retina, a thin layer of cells at the back of the eye that detects color and light levels and encodes the information as electrical and chemical signals. Through retinal ganglion cells, those signals are then sent into the Lateral Geniculate Nucleus (LGN), a structure in the thalamus.
In the next important step in the relay, the LGN routes the signals into the primary visual area (V1) in the neo-cortex, a multi-layered structure that is divided into functionally and anatomically distinct areas. V1 begins the process of extracting visual information, which is further carried out by “higher order” visual areas in the neo-cortex that are vitally important to visual perception. Like parts in a machine, the functions of these areas are both individual and integrated. Damage in one tiny area can lead to strange visual disorders in which a person may be able to see a moving ball, and yet not perceive it is in motion.
Current dogma holds that this basic architecture is entirely genetically determined, with environmental input only playing a role later in development. One of the most famous examples of this idea is the Nobel Prize-winning work of visual neuroscientists David Hubel and Torsten Wiesel, which showed that there is a “critical period” of sensitivity in vision. Their finding was commonly interpreted as a warning that without exposure to basic visual stimuli early in life, even an individual with a healthy brain will be unable to see correctly.
Later discoveries in neural plasticity more optimistically suggested that early deprivation can be overcome, and the brain can even sprout new neurons in specific areas. Nevertheless, this still reinforced the idea that environmental influences might modify neural architecture, but only genetics could establish how cortical areas would be laid out.
In their new study, however, O’Leary and the paper’s co-first authors, Shen-Ju Chou and Zoila Babot, post-doctoral researchers in O’Leary’s laboratory, show that genetics only provides a broad field in the neo-cortex for visual areas.
When they created mouse mutants that disconnected the link between thalamus and cortex but only after early cortical development was complete, they found that the primary and higher order visual areas failed to differentiate from one another as they should.
"Our new understanding is that genes only create a rough lay-out of cortical areas," explains O’Leary. "There must be thalamic input to develop the fine differentiation necessary for proper sensory processing."
Essentially, if the brain were a house, genes would determine which areas were bedrooms. Thalamic input provides the details, distinguishing what will be the master bedroom, a child’s bedroom, a guest bedroom and so on. “The size and location of areas within the overall cortex does not change, but without thalamic input from the LGN, the critical differentiation process that creates primary and higher order visual areas does not happen,” says O’Leary.
Given that most sensory modalities—sight, hearing, touch—route through the thalamus to the cortex, this experiment may suggest why, when someone lacks a sensory modality from birth, that individual has a harder time processing restored sensory input than someone who lost the sense later in life. But in addition, as O’Leary says, “More subtle changes in thalamic input in humans would also likely result in changes to the neo-cortex that could well have a substantial impact on the ability to process vision, or other senses, and lead to abnormal behavior.”
O’Leary says his lab plans to continue to explore the links between how cortical areas in the brain are established and various developmental disorders, such as autism.
(Image: Nucleus Medical Art, Inc.)
A synthetic compound is able to turn off “secondary” vacuum cleaners in the brain that take up serotonin, resulting in the “happy” chemical being more plentiful, scientists from the School of Medicine at The University of Texas Health Science Center San Antonio have discovered. Their study, released June 18 by The Journal of Neuroscience, points to novel targets to treat depression.
Serotonin, a neurotransmitter that carries chemical signals, is associated with feelings of wellness. Selective serotonin reuptake inhibitors (SSRIs) are commonly prescribed antidepressants that block a specific “vacuum cleaner” for serotonin (the serotonin transporter, or SERT) from taking up serotonin, resulting in more supply of the neurotransmitter in circulation in the extracellular fluid of the brain.
Delicate balance
"Serotonin is released by neurons in the brain," said Lyn Daws, Ph.D., professor of physiology and pharmacology in the School of Medicine. "Too much or too little may be a bad thing. It is thought that having too little serotonin is linked to depression. That’s why we think Prozac-type drugs (SSRIs) work, by stopping the serotonin transporter from taking up serotonin from extracellular fluid in the brain."
A problem with SSRIs is that many depressed patients experience modest or no therapeutic benefit. It turns out that, while SSRIs block the activity of the serotonin transporter, they don’t block other “vacuum cleaners.” “Until now we did not appreciate the presence of backup cleaners for serotonin,” Dr. Daws said. “We were not the first to show their presence in the brain, but we were among the first to show that they were limiting the ability of the SSRIs to increase serotonin signaling in the brain. The study described in this new paper is the first demonstration of enhancing the antidepressant-like effect of an SSRI by concurrently blocking these backup vacuum cleaners.”
Serotonin ceiling
Even if SERT activity is blocked, the backup vacuum cleaners (called organic cation transporters) keep a ceiling on how high the serotonin levels can rise, which likely limits the optimal therapeutic benefit to the patient, Dr. Daws said.
"Right now, the compound we have, decynium-22, is not an agent that we want to give to people in clinical trials," she said. "We are not there yet. Where we are is being able to use this compound to identify new targets in the brain for antidepressant activity and to turn to medicinal chemists to design molecules to block these secondary vacuum cleaners."
(Source: eurekalert.org)
Alzheimer’s disease protein controls movement in mice
Researchers in Berlin and Munich, Germany and Oxford, United Kingdom, have revealed that a protein well known for its role in Alzheimer’s disease controls muscle spindle development, and that mice lacking the protein, or treated with inhibitors of it, show impaired movement. The results, which are published in The EMBO Journal, suggest that drugs under development to target the beta-secretase-1 protein, which may be potential treatments for Alzheimer’s disease, might produce unwanted side effects related to defective movement.
Alzheimer’s disease is the most common form of dementia found in older adults. The World Health Organization estimates that approximately 18 million people worldwide have Alzheimer’s disease. The number of people affected by the disease may increase to 34 million by 2025. Scientists know that the protein beta-secretase-1 or Bace1, a protease enzyme that breaks down proteins into smaller molecules, is involved in Alzheimer’s disease. Bace1 cleaves the amyloid precursor protein and generates the damaging Abeta peptides that accumulate as plaques in the brain leading to disease. Now scientists have revealed in more detail how Bace1 works.
"Our results show that mice that lack Bace1 proteins or are treated with inhibitors of the enzyme have difficulties in coordination and walking and also show reduced muscle strength," remarked Carmen Birchmeier, one of the authors of the paper, Professor at the Max-Delbrück-Center for Molecular Medicine in Berlin, Germany, and an EMBO Member. "In addition, we were able to show that the combined activities of Bace1 and another protein, neuregulin-1 or Nrg1, are needed to sustain the muscle spindles in mice and to maintain motor coordination."
Muscle spindles are sensory organs that are found throughout the muscles of vertebrates. They detect how muscles stretch and convey the perception of body position to the brain. The researchers used genetic analyses, biochemical studies and pharmacological inhibitors to investigate how Bace1 works in mice. “If the signal strength of a specific form of neuregulin-1 known as IgNrg1 is gradually reduced, increasingly severe defects in the formation and maturation of muscle spindles are observed in mice. Furthermore, it appears that Bace1 is required for full IgNrg1 activity. The graded loss of IgNrg1 activity results in the animals having increasing difficulties with movement and coordination,” says Cyril Cheret, the first author of the work.
Drug developers are interested in stopping the Bace1 protein in its tracks because it represents a promising route to treat Alzheimer’s disease. If the protein were inhibited, it would interfere with the generation of the smaller damaging proteins that accumulate in the brain as amyloid plaques and would therefore provide some level of protection from the effects of the disease. “Our data indicate that one unwanted side effect of the long-term inhibition of Bace1 might be the disruption of muscle spindle formation and impairment of movement. This finding is relevant to scientists looking for ways to develop drugs that target the Bace1 protein and should be considered,” says Birchmeier. Several Bace1 inhibitors are currently being tested in phase II and phase III clinical trials for the treatment of Alzheimer’s disease.
Scientists from the Florida campus of The Scripps Research Institute (TSRI) have found a compound that could counter Parkinson’s disease in two ways at once.
In a new study published recently online ahead of print by the journal ACS Chemical Biology, the scientists describe a “dual inhibitor”—two compounds in a single molecule—that attacks a pair of proteins closely associated with development of Parkinson’s disease.
“In general, these two enzymes amplify the effect of each other,” said team leader Phil LoGrasso, a TSRI professor who has been a pioneer in the development of JNK inhibitors for the treatment of neurodegenerative diseases. “What we were looking for is a high-affinity, high-selectivity treatment that is additive or synergistic in its effect—a one-two punch.”
That could be what they found.
This new dual inhibitor attacks two enzymes—the leucine-rich repeat kinase 2 (LRRK2) and the c-jun-N-terminal kinase (JNK)—pronounced “junk.” Genetic testing of several thousand Parkinson’s patients has shown that mutations in the LRRK2 gene increase the risk of Parkinson’s disease, while JNK has been shown to play an important role in neuron (nerve cell) survival in a range of neurodegenerative diseases. As such, they have become highly viable targets for drugs to treat disorders such as Parkinson’s disease.
A dual inhibitor ultimately would be preferred over separate individual JNK and LRRK2 inhibitors because a combination molecule would eliminate complications of drug-drug interactions and the need to optimize individual inhibitor doses for efficacy, the study noted.
Now the team’s new dual inhibitor will need to be optimized for potency, high selectivity (which reduces off-target side effects) and bioavailability so it can be tested in animal models of Parkinson’s disease.
(Source: scripps.edu)
No matter how we jump, roll, sit, or lie down, our brain manages to maintain a visual representation of the world that stays upright relative to the pull of gravity. But a new study of rider experiences on the Hong Kong Peak Tram, a popular tourist attraction, shows that specific features of the environment can dominate our perception of verticality, making skyscrapers appear to fall.

The study is published in Psychological Science, a journal of the Association for Psychological Science.
The Hong Kong Peak Tram to Victoria Peak is a popular way to survey the Hong Kong skyline, and millions of people ride the tram every year.
“On one trip, I noticed that the city’s skyscrapers next to the tram started to appear very tilted, as if they were falling, which anyone with common sense knows is impossible,” says lead researcher Chia-huei Tseng of the University of Hong Kong. “The gasps of the other passengers told me I wasn’t the only one seeing it.”
The illusion was perplexing because, in contrast with most illusions studied in the laboratory, observers have complete access to visual cues from the outside world through the tram’s open windows.
Exploring the illusion under various conditions, Tseng and colleagues found that the perceived, or illusory, tilt was greatest on night-time rides, perhaps because of the relative absence of visual-orientation cues or a heightened sense of enclosure at night. Enhancing the tilted frame of reference within the tram car—defined by features such as oblique window frames, beams, the floor, and lighting fixtures—made the truly vertical high-rises appear to tilt in the opposite direction.
The illusion was significantly reduced by obscuring the window frame and other reference cues inside the tram car, by using wedges to adjust observers’ position, and by having them stand during the tram ride.
But no single modification was sufficient to eliminate the illusion.
“Our findings demonstrate that signals from all the senses must be consonant with each other to abolish the tilt illusion,” the researchers write. “On the tram, it seems that vision dominates verticality perception over other sensory modalities that also mediate earth gravity, such as the vestibular and tactile systems.”
The robustness of the tram illusion took the researchers by surprise:
“We took the same tram up and down for hundreds of trips, and the illusion did not reduce a bit,” says Tseng. “This suggests that our experiences and our learned knowledge about the world — that buildings should be vertical — are not enough to cancel our brain’s wrong conclusion.”
People can plan strategic movements to several different targets at the same time, even when they see far fewer targets than are actually present, according to a new study published in Psychological Science, a journal of the Association for Psychological Science.

A team of researchers at the Brain and Mind Institute at the University of Western Ontario took advantage of a pictorial illusion — known as the “connectedness illusion” — that causes people to underestimate the number of targets they see.
When people act on these targets, however, they can rapidly plan accurate and strategic reaches that reflect the actual number of targets.
Using sophisticated statistical techniques to analyze participants’ responses to multiple potential targets, the researchers found that participants’ reaches to the targets were unaffected by the presence of the connecting lines.
Thus, the “connectedness illusion” seemed to influence the number of targets they perceived but did not impact their ability to plan actions related to the targets.
These findings indicate that the processes in the brain that plan visually guided actions are distinct from those that allow us to perceive the world.
“The design of the experiments allowed us to separate these two processes, even though they normally unfold at the same time,” explained lead researcher Jennifer Milne, a PhD student at the University of Western Ontario.
“It’s as though we have a semi-autonomous robot in our brain that plans and executes actions on our behalf with only the broadest of instructions from us!”
According to Mel Goodale, professor at the University of Western Ontario and senior author on the paper, these findings “not only reveal just how sophisticated the visuomotor systems in the brain are, but could also have important implications for the design and implementation of robotic systems and efficient human-machine interfaces.”
One in four people who survive a stroke or transient ischemic attack (TIA) suffer from symptoms of post-traumatic stress disorder (PTSD) within the first year post-event, and one in nine experience chronic PTSD more than a year later. The data suggest that each year nearly 300,000 stroke/TIA survivors will develop PTSD symptoms as a result of their health scare. The study, led by Columbia University Medical Center researchers, was published today in the online edition of PLOS ONE.

“This work builds on recent findings of ours that PTSD is common among heart attack survivors and that it contributes to a doubled risk of a future cardiac event or of dying within one to three years. Our current results show that PTSD in stroke and TIA survivors may increase their risk for recurrent stroke and other cardiovascular events,” said first author Donald Edmondson, PhD, MPH, assistant professor of behavioral medicine (Center for Behavioral Cardiovascular Health) at CUMC. “Given that each event is life-threatening and that strokes/TIAs add hundreds of millions of dollars to annual health expenditures, these findings are important to both the long-term survival and health costs of these patient populations.”
“PTSD is not just a disorder of combat veterans and sexual assault survivors, but strongly affects survivors of stroke and other potentially traumatic acute cardiovascular events as well,” said Ian M. Kronish, MD, MPH, assistant professor of medicine (Center for Behavioral Cardiovascular Health) and the study’s senior author. “Surviving a life-threatening health scare can have a debilitating psychological impact, and health care providers should make it a priority to screen for symptoms of depression, anxiety, and PTSD among these patient populations.”
Stroke is the fourth-leading cause of death and the top cause of disability in the United States. According to data from the American Stroke Association, nearly 795,000 Americans each year suffer a new or recurrent stroke, and up to an additional 500,000 suffer a TIA.
PTSD is an anxiety disorder initiated by exposure to a traumatic event. Common symptoms include nightmares, avoidance of reminders of the event, and elevated heart rate and blood pressure. Chronic PTSD is defined (per the DSM-IV) as the persistence of these symptoms for three months or longer.
Since only a few studies have assessed PTSD due to stroke, Drs. Edmondson and Kronish and their colleagues performed the first meta-analysis of clinical studies of stroke- or TIA-induced PTSD. The nine studies in the meta-analysis included a total of 1,138 stroke or TIA survivors.
The study found that 23 percent, or roughly one in four, of the patients developed PTSD symptoms within the first year after their stroke or TIA, with 11 percent, or roughly one in nine, experiencing chronic PTSD more than a year later.
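The “nearly 300,000” figure quoted earlier follows directly from these numbers. A minimal back-of-the-envelope sketch (assuming, as the study does, that the 23 percent first-year rate applies across the American Stroke Association incidence figures cited below):

```python
# Rough annual estimate of US stroke/TIA survivors developing PTSD symptoms,
# combining the incidence figures cited in this article.
new_or_recurrent_strokes = 795_000   # new or recurrent strokes per year (American Stroke Association)
tias = 500_000                       # additional TIAs per year (upper estimate)
ptsd_rate_first_year = 0.23          # ~1 in 4 develop PTSD symptoms within the first year

estimated_cases = (new_or_recurrent_strokes + tias) * ptsd_rate_first_year
print(round(estimated_cases))        # 297850 — i.e., "nearly 300,000"
```

This ignores survival rates and double-counting of recurrent events, so it is an illustration of the article’s arithmetic, not an epidemiological model.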
“PTSD and other psychological disorders in stroke and TIA patients appear to be an under-recognized and undertreated problem,” said Dr. Kronish.
“Fortunately, there are good treatments for PTSD,” said Dr. Edmondson. “But first, physicians and patients have to be aware that this is a problem. Family members can also help. We know that social support is a good protective factor against PTSD due to any type of traumatic event.”
“The next step is further research to assess whether mental health treatment can reduce stroke- and TIA-induced PTSD symptoms and help these patients regain a feeling of normalcy and calm as soon as possible after their health scare,” said Dr. Edmondson.
(Source: newsroom.cumc.columbia.edu)
A line of genetically modified mice that Western University scientists call “Forrest Gump” mice—because, like the movie character, they can run far but aren’t smart—is furthering the understanding of a key neurotransmitter called acetylcholine (ACh). Marco Prado, PhD, and his team at Robarts Research Institute say the mice show what happens when too much of this neurotransmitter becomes available in the brain. Boosting ACh is a therapeutic target for Alzheimer’s disease because it’s found in reduced amounts when there’s cognitive failure. Prado’s research is published in the Journal of Neuroscience.
“We wanted to know what happens if you have more of the gene which controls how much acetylcholine is secreted by neurons,” says Prado, a Robarts scientist and professor in the Departments of Physiology and Pharmacology and Anatomy and Cell Biology at Western’s Schulich School of Medicine & Dentistry. “The response was the complete opposite of what we expected. It’s not a good thing. Acetylcholine release was increased threefold in these mice, which seemed to disturb cognitive function. But put them on a treadmill and they can run twice as far as normal mice before tiring. They’re super-athletes.” In addition to its function in modulating cognitive abilities, ACh drives muscle contraction which allowed for the marked improvement in motor endurance.
One of the tests the scientists, including first author Benjamin Kolisnyk, used is a touch-screen test for mice, which uses technology similar to a tablet’s. After initiating the test, the mice have to scan five different spots on the touch screen to see a light flash, and then run and touch that area. If they get it right, they get a reward. Compared to the control mice, the “Forrest Gump” mice failed miserably at the task. The researchers found the mice, which have the scientific name ChAT-ChR2-EYFP, had terrible attention spans, as well as dysfunction in working memory and spatial memory.
Prado interprets the research as showing that ACh is crucial for differentiating cues: when the brain is presented with a lot of simultaneous information, ACh helps it pick out what’s important. But flood the brain with ACh, and it loses the ability to discern what’s relevant. This study was funded mainly by the Canadian Institutes of Health Research.
(Source: communications.uwo.ca)

The link between circadian rhythms and aging
Human sleeping and waking patterns are largely governed by an internal circadian clock that corresponds closely with the 24-hour cycle of light and darkness. This circadian clock also controls other body functions, such as metabolism and temperature regulation.
Studies in animals have found that when that rhythm gets thrown off, health problems including obesity and metabolic disorders such as diabetes can arise. Studies of people who work night shifts have also revealed an increased susceptibility to diabetes.
A new study from MIT shows that a gene called SIRT1, previously shown to protect against diseases of aging, plays a key role in controlling these circadian rhythms. The researchers found that circadian function decays with aging in normal mice, and that boosting their SIRT1 levels in the brain could prevent this decay. Conversely, loss of SIRT1 function impairs circadian control in young mice, mimicking what happens in normal aging.
Since the SIRT1 protein itself was found to decline with aging in the normal mice, the findings suggest that drugs that enhance SIRT1 activity in humans could have widespread health benefits, says Leonard Guarente, the Novartis Professor of Biology at MIT and senior author of a paper describing the findings in the June 20 issue of Cell.
“If we could keep SIRT1 as active as possible as we get older, then we’d be able to retard aging in the central clock in the brain, and health benefits would radiate from that,” Guarente says.
Staying on schedule
In humans and animals, circadian patterns follow a roughly 24-hour cycle, directed by the circadian control center of the brain, called the suprachiasmatic nucleus (SCN), located in the hypothalamus.
“Just about everything that takes place physiologically is really staged along the circadian cycle,” Guarente says. “What’s now emerging is the idea that maintaining the circadian cycle is quite important in health maintenance, and if it gets broken, there’s a penalty to be paid in health and perhaps in aging.”
Last year, Guarente found that a robust circadian period correlated with longer lifespan in mice. That got him wondering what role SIRT1, which has been shown to prolong lifespan in many animals, might play in that phenomenon. SIRT1, which Guarente first linked with aging more than 15 years ago, is a master regulator of cell responses to stress, coordinating a variety of hormone networks, proteins and genes to help keep cells alive and healthy.
To investigate SIRT1’s role in circadian control, Guarente and his colleagues created genetically engineered mice that produce different amounts of SIRT1 in the brain. One group of mice had normal SIRT1 levels, another had no SIRT1, and two groups had extra SIRT1 — either twice or 10 times as much as normal.
Mice lacking SIRT1 had slightly longer circadian cycles (23.9 hours) than normal mice (23.6 hours), and mice with a 10-fold increase in SIRT1 had shorter cycles (23.1 hours).
In mice with normal SIRT1 levels, the researchers confirmed previous findings that when the 12-hour light/dark cycle is interrupted, younger mice readjust their circadian cycles much more easily than older ones. However, they showed for the first time that mice with extra SIRT1 do not suffer the same decline in circadian control as they age.
The researchers also found that SIRT1 exerts this control by regulating the genes BMAL and CLOCK, the two major keepers of the central circadian clock.
Enhancing circadian function
A growing body of evidence suggests that being able to respond to large or small disruptions of the light/dark cycle is important to maintaining healthy metabolic function, Guarente says.
“Essentially we experience a mini jet lag every day because the light cycle is constantly changing. The critical thing for us is to be able to adapt smoothly to these jolts,” Guarente says. “Many studies in mice say that while young mice do this perfectly well, it’s the old mice that have the problem. So that could well be true in humans.”
If so, it could be possible to treat or prevent diseases of aging by enhancing circadian function — either by delivering SIRT1 activators in the brain or developing drugs that enhance another part of the circadian control system, Guarente says.
“I think we should look at every aspect of the machinery of the circadian clock in the brain, and any intervention that can maintain that machinery with aging ought to be good,” he says. “One entry point would be SIRT1, because we’ve shown in mice that genetic maintenance of SIRT1 helps maintain circadian function.”
Some SIRT1 activators are now being tested against diabetes, inflammation and other diseases, but they are not designed to cross the blood-brain barrier and would likely not be able to reach the SCN. However, Guarente believes it could be possible to design SIRT1 activators that can get into the brain.
Roman Kondratov, an associate professor of biology at Cleveland State University, says the study raises several exciting questions regarding the potential to delay or reverse age-related changes in the brain through rejuvenation of the circadian clock with SIRT1 enhancement.
“The importance of this study is that it has both basic and potentially translational applications, taking into account the fact that pharmacological modulators of SIRT1 are currently under active study,” Kondratov says.
Researchers in Guarente’s lab are now investigating the relationship between health, circadian function and diet. They suspect that high-fat diets might throw the circadian clock out of whack, which could be counteracted by increased SIRT1 activation.
(Image: Wikimedia Commons)